925 results for "Development index"
Abstract:
The Iowa Leading Indicators Index (ILII) Annual Assessment and Update assesses how well the ILII has met the goals behind its development, gauges the validity of the existing components, considers additional components that have been suggested along the way, and carries out the annual updates necessary for such an index.
Abstract:
INTRODUCTION: For decades, clinicians dealing with immunocompromised and critically ill patients have perceived a link between Candida colonization and subsequent infection. However, the pathophysiological progression from colonization to infection was clearly established only through the formal description of the colonization index (CI) in critically ill patients. Unfortunately, the literature reflects intense confusion about the pathophysiology of invasive candidiasis and specific associated risk factors. METHODS: We review the contribution of the CI in the field of Candida infection and its development in the 20 years following its original description in 1994. The development of the CI enabled an improved understanding of the pathogenesis of invasive candidiasis and the use of targeted empirical antifungal therapy in subgroups of patients at increased risk for infection. RESULTS: The recognition of specific characteristics among underlying conditions, such as neutropenia, solid organ transplantation, and surgical and nonsurgical critical illness, has enabled the description of distinct epidemiological patterns in the development of invasive candidiasis. CONCLUSIONS: Despite its limited bedside practicality and before confirmation of potentially more accurate predictors, such as specific biomarkers, the CI remains an important way to characterize the dynamics of colonization, which increases early in patients who develop invasive candidiasis.
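The CI is commonly reported as the ratio of distinct non-blood body sites found colonized by Candida to the total number of sites sampled, with values of 0.5 or more classically flagging patients at increased risk of infection; a minimal sketch (function names are illustrative, and the 0.5 threshold is quoted from the wider literature, not this abstract):

```python
def colonization_index(colonized_sites: int, sampled_sites: int) -> float:
    """Ratio of distinct body sites found colonized to sites sampled."""
    if sampled_sites <= 0:
        raise ValueError("at least one site must be sampled")
    return colonized_sites / sampled_sites


# A CI of 0.5 or more is the threshold classically associated with
# increased risk of invasive candidiasis (hedged: from the wider literature).
def at_increased_risk(ci: float, threshold: float = 0.5) -> bool:
    return ci >= threshold
```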
Abstract:
The prognosis of community-acquired pneumonia ranges from rapid resolution of symptoms and full recovery of functional status to the development of severe medical complications and death. The pneumonia severity index is a rigorously studied prediction rule for prognosis that objectively stratifies patients into quintiles of risk for short-term mortality on the basis of 20 demographic and clinical variables routinely available at presentation. The pneumonia severity index was derived and validated with data on >50,000 patients with community-acquired pneumonia by use of well-accepted methodological standards and is the only pneumonia decision aid that has been empirically shown to safely increase the proportion of patients given treatment in the outpatient setting. Because of its prognostic accuracy, methodological rigor, and effectiveness and safety as a decision aid, the pneumonia severity index has become the reference standard for risk stratification of community-acquired pneumonia.
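The quintile stratification can be sketched as a simple mapping from the point total to a risk class; the cut-offs below follow the commonly published PSI bands and are an assumption of this sketch, not taken from the abstract:

```python
def psi_risk_class(score: int) -> int:
    """Map a pneumonia severity index point total to risk classes II-V.

    Class I is assigned clinically (patients with none of the predictors)
    before any points are totalled. The cut-offs used here follow the
    commonly published banding and should be verified against the
    original derivation before any clinical use.
    """
    if score <= 70:
        return 2
    if score <= 90:
        return 3
    if score <= 130:
        return 4
    return 5
```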
Abstract:
BACKGROUND: Chest pain can be caused by various conditions, with life-threatening cardiac disease being of greatest concern. Prediction scores to rule out coronary artery disease have been developed for use in emergency settings. We developed and validated a simple prediction rule for use in primary care. METHODS: We conducted a cross-sectional diagnostic study in 74 primary care practices in Germany. Primary care physicians recruited all consecutive patients who presented with chest pain (n = 1249) and recorded symptoms and findings for each patient (derivation cohort). An independent expert panel reviewed follow-up data obtained at six weeks and six months on symptoms, investigations, hospital admissions and medications to determine the presence or absence of coronary artery disease. Adjusted odds ratios of relevant variables were used to develop a prediction rule. We calculated measures of diagnostic accuracy for different cut-off values for the prediction scores using data derived from another prospective primary care study (validation cohort). RESULTS: The prediction rule contained five determinants (age/sex, known vascular disease, patient assumes pain is of cardiac origin, pain is worse during exercise, and pain is not reproducible by palpation), with the score ranging from 0 to 5 points. The area under the curve (receiver operating characteristic curve) was 0.87 (95% confidence interval [CI] 0.83-0.91) for the derivation cohort and 0.90 (95% CI 0.87-0.93) for the validation cohort. The best overall discrimination was with a cut-off value of 3 (positive result 3-5 points; negative result ≤2 points), which had a sensitivity of 87.1% (95% CI 79.9%-94.2%) and a specificity of 80.8% (95% CI 77.6%-83.9%). INTERPRETATION: The prediction rule for coronary artery disease in primary care proved to be robust in the validation cohort. It can help to rule out coronary artery disease in patients presenting with chest pain in primary care.
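Since the rule is a simple count of five binary determinants with a best cut-off of 3, it can be sketched directly (parameter names are illustrative; the age/sex criterion is taken as a precomputed boolean):

```python
def cad_score(age_sex_positive: bool, known_vascular_disease: bool,
              patient_assumes_cardiac: bool, worse_on_exercise: bool,
              not_reproducible_by_palpation: bool) -> int:
    """Sum the five determinants of the prediction rule (0-5 points)."""
    return sum([age_sex_positive, known_vascular_disease,
                patient_assumes_cardiac, worse_on_exercise,
                not_reproducible_by_palpation])


def cad_likely(score: int) -> bool:
    """Positive result at the best-discriminating cut-off (3-5 points)."""
    return score >= 3
```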
Abstract:
Cortical folding (gyrification) is determined during the first months of life, so that adverse events occurring during this period leave traces that will be identifiable at any age. As recently reviewed by Mangin and colleagues(2), several methods exist to quantify different characteristics of gyrification. For instance, sulcal morphometry can be used to measure shape descriptors such as the depth, length or indices of inter-hemispheric asymmetry(3). These geometrical properties have the advantage of being easy to interpret. However, sulcal morphometry relies heavily on the accurate identification of a given set of sulci and hence provides a fragmented description of gyrification. A more fine-grained quantification of gyrification can be achieved with curvature-based measurements, where smoothed absolute mean curvature is typically computed at thousands of points over the cortical surface(4). The curvature is, however, not straightforward to interpret, as it remains unclear whether there is any direct relationship between curvedness and a biologically meaningful correlate such as cortical volume or surface area. To address the diverse issues raised by the measurement of cortical folding, we previously developed an algorithm to quantify local gyrification with high spatial resolution and a straightforward interpretation. Our method is inspired by the Gyrification Index(5), a method originally used in comparative neuroanatomy to evaluate cortical folding differences across species. In our implementation, which we name the local Gyrification Index (lGI(1)), we measure the amount of cortex buried within the sulcal folds as compared with the amount of visible cortex in circular regions of interest. Given that the cortex grows primarily through radial expansion(6), our method was specifically designed to identify early defects of cortical development.
In this article, we detail the computation of the local Gyrification Index, which is now freely distributed as part of the FreeSurfer software (http://surfer.nmr.mgh.harvard.edu/, Martinos Center for Biomedical Imaging, Massachusetts General Hospital). FreeSurfer provides a set of automated tools for reconstructing the brain's cortical surface from structural MRI data. The cortical surface, extracted in the native space of the images with sub-millimeter accuracy, is then used to create an outer surface, which serves as a basis for the lGI calculation. A circular region of interest is delineated on the outer surface, and its corresponding region of interest on the cortical surface is identified using a matching algorithm, as described in our validation study(1). This process is iterated with largely overlapping regions of interest, resulting in cortical maps of gyrification for subsequent statistical comparisons (Fig. 1). Of note, another measurement of local gyrification with a similar inspiration was proposed by Toro and colleagues(7), in which the folding index at each point is computed as the ratio of the cortical area contained in a sphere to the area of a disc with the same radius. The two implementations differ in that the one by Toro et al. is based on Euclidean distances and thus considers discontinuous patches of cortical area, whereas ours uses a strict geodesic algorithm and includes only the continuous patch of cortical area opening at the brain surface in a circular region of interest.
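The two ratio definitions described above can be sketched as follows (a minimal illustration, not the FreeSurfer implementation; the area arguments are assumed to be precomputed from the surface meshes):

```python
import math


def local_gyrification_index(pial_patch_area: float,
                             outer_roi_area: float) -> float:
    """lGI: area of the cortical (pial) patch, including cortex buried in
    sulci, divided by the area of the corresponding outer-surface ROI."""
    if outer_roi_area <= 0:
        raise ValueError("outer ROI area must be positive")
    return pial_patch_area / outer_roi_area


def toro_folding_index(cortical_area_in_sphere: float,
                       radius: float) -> float:
    """Toro et al.: cortical area contained in a sphere divided by the
    area of a disc of the same radius (Euclidean neighbourhood)."""
    return cortical_area_in_sphere / (math.pi * radius ** 2)
```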
Abstract:
A fundamental tenet of neuroscience is that cortical functional differentiation is related to the cross-areal differences in cyto-, receptor-, and myeloarchitectonics that are observed in ex-vivo preparations. An ongoing challenge is to create noninvasive magnetic resonance (MR) imaging techniques that offer sufficient resolution, tissue contrast, accuracy and precision to allow for characterization of cortical architecture over an entire living human brain. One exciting development is the advent of fast, high-resolution quantitative mapping of basic MR parameters that reflect cortical myeloarchitecture. Here, we outline some of the theoretical and technical advances underlying this technique, particularly in terms of measuring and correcting for transmit and receive radio frequency field inhomogeneities. We also discuss new directions in analytic techniques, including higher resolution reconstructions of the cortical surface. We then discuss two recent applications of this technique. The first compares individual and group myelin maps to functional retinotopic maps in the same individuals, demonstrating a close relationship between functionally and myeloarchitectonically defined areal boundaries (as well as revealing an interesting disparity in a highly studied visual area). The second combines tonotopic and myeloarchitectonic mapping to localize primary auditory areas in individual healthy adults, using a similar strategy as combined electrophysiological and post-mortem myeloarchitectonic studies in non-human primates.
Abstract:
BACKGROUND: Mental disorders, common in primary care, are often associated with physical complaints. Although exposure to psychosocial stressors and the development or presence of principal mental disorders (i.e. depression, anxiety and somatoform disorders, defined here as multisomatoform disorders) are known to be correlated, a temporal association remains unproven. The study explores the onset of such disorders after exposure to psychosocial stressors in a cohort of primary care patients with at least one physical symptom. METHOD: The cohort study SODA (SOmatization, Depression and Anxiety) was conducted by 21 private-practice GPs and three fellow physicians in a Swiss academic primary care centre. GPs included patients via randomized daily identifiers. Depression, anxiety or somatoform disorders were identified by the full Patient Health Questionnaire (PHQ), a validated procedure to identify mental disorders based on DSM-IV criteria. The PHQ was also used to investigate exposure to psychosocial stressors (before the index consultation and during follow-up) and the onset of principal mental disorders after one year of follow-up. RESULTS: From November 2004 to July 2005, 1020 patients were screened for inclusion. 627 were eligible and 482 completed the PHQ one year later and were included in the analysis (77%). At one year, the prevalence of principal mental disorders was 30/153 (19.6%, 95% CI 13.6-26.8) for those initially exposed to a major psychosocial stressor and 26/329 (7.9%, 95% CI 5.2-11.4) for those not exposed. A stronger association exists between psychosocial stressors and depression (RR = 2.4) or anxiety (RR = 3.5) than multisomatoform disorders (RR = 1.8). Patients who are "bothered a lot" (subjective distress) by a stressor are 2.5 times (95% CI 1.5-4.0) more likely to experience a mental disorder at one year.
A history of psychiatric comorbidities or psychological treatment was not a confounding factor for developing a principal mental disorder after exposure to psychosocial stressors. CONCLUSION: This primary care study shows that patients with physical complaints exposed to psychosocial stressors had a higher risk for developing mental disorders one year later. This temporal association opens the field for further research in preventive care for mental diseases in primary care patients.
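The relative risk implied by the reported prevalences can be reproduced with elementary arithmetic:

```python
def relative_risk(exposed_events: int, exposed_n: int,
                  unexposed_events: int, unexposed_n: int) -> float:
    """Ratio of outcome prevalence in the exposed vs the unexposed group."""
    return (exposed_events / exposed_n) / (unexposed_events / unexposed_n)


# Prevalences reported above: 30/153 (19.6%) exposed vs 26/329 (7.9%) not.
overall_rr = relative_risk(30, 153, 26, 329)  # ~2.5
```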
Abstract:
The objective of the investigation was the development of a test that would readily identify the potential of an aggregate to cause D-cracking because of its susceptibility to critical saturation. A Press-Ur-Meter was modified by replacing the air chamber with a one-inch diameter plastic tube calibrated in milli-. It was concluded that the pore index was sufficiently reliable to determine the D-cracking potential of limestone aggregates in all but a few cases where marginal results were obtained. Consistently poor or good results were always in agreement with established service records or concrete durability testing. In those instances where marginal results are obtained, the results of concrete durability testing should be considered when making the final determination of the D-cracking susceptibility of the aggregate in question. The following applications for the pore index test have been recommended for consideration: concrete durability testing be discontinued in the evaluation process of new aggregate sources with pore index results between 0 and 20 (Class 2 durability) and over 35 (Class 1 durability); composite aggregates with intermediate pore index results of 20-35 be tested on each stone type to facilitate the possible removal of low durability stone from the production process; and additional investigation be made to evaluate the possibility of using the test to monitor and upgrade the acceptance of aggregate from sources associated with D-cracking.
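The recommended disposition bands can be sketched as a small classification helper (the function itself is illustrative; the band boundaries follow the abstract):

```python
def pore_index_disposition(pore_index: float) -> str:
    """Map a pore index result to the recommended disposition.

    Bands follow the abstract: 0-20 (Class 2 durability) and over 35
    (Class 1 durability) skip concrete durability testing; intermediate
    results (20-35) call for testing each stone type.
    """
    if pore_index <= 20:
        return "Class 2 durability: skip concrete durability testing"
    if pore_index <= 35:
        return "intermediate: test each stone type separately"
    return "Class 1 durability: skip concrete durability testing"
```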
Abstract:
Background: General practitioners play a central role in taking deprivation into consideration when caring for patients in primary care. Validated questions to identify deprivation in primary-care practices are still lacking. For both clinical and research purposes, this study therefore aims to develop and validate a standardized instrument measuring both material and social deprivation at an individual level. Methods: The Deprivation in Primary Care Questionnaire (DiPCare-Q) was developed using qualitative and quantitative approaches between 2008 and 2011. A systematic review identified 199 questions related to deprivation. Using judgmental item quality, these were reduced to 38 questions. Two focus groups (primary-care physicians and primary-care researchers), structured interviews (10 laymen), and think-aloud interviews (eight cleaning staff) assured face validity. Item response theory analysis was then used to derive the DiPCare-Q index using data obtained from a random sample of 200 patients, who were to complete the questionnaire a second time over the phone. For construct and criterion validity, the final 16 questions were administered to a random sample of 1,898 patients attending one of 47 different private primary-care practices in western Switzerland (validation set), along with questions on subjective social status (subjective SES ladder), education, source of income, welfare status, and subjective poverty. Results: Deprivation was defined in three distinct dimensions (table): material deprivation (eight items), social deprivation (five items), and health deprivation (three items). Item consistency was high in both the derivation (KR20 = 0.827) and the validation set (KR20 = 0.778). The DiPCare-Q index was reliable (ICC = 0.847). For construct validity, we showed the DiPCare-Q index to be correlated with patients' estimation of their position on the subjective SES ladder (rs = 0.539).
This position was correlated to both material and social deprivation independently suggesting two separate mechanisms enhancing the feeling of deprivation. Conclusion: The DiPCare-Q is a rapid, reliable and validated instrument useful for measuring both material and social deprivation in primary care. Questions from the DiPCare-Q are easy to use when investigating patients' social history and could improve clinicians' ability to detect underlying social distress related to deprivation.
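As a rough illustration of the three-dimension structure (8 material, 5 social, 3 health items), a sketch that scores the 16 binary items as a simple sum; this scoring is an assumption for illustration, not the published scoring rule:

```python
def dipcare_q_index(material: list, social: list, health: list) -> int:
    """Sum affirmative answers over the 16 binary items.

    Item counts (8 material, 5 social, 3 health) follow the abstract;
    scoring as a simple sum is an illustrative assumption, not the
    published item-response-theory-derived scoring.
    """
    if len(material) != 8 or len(social) != 5 or len(health) != 3:
        raise ValueError("expected 8 material, 5 social and 3 health items")
    return sum(material) + sum(social) + sum(health)
```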
Abstract:
Early epilepsy is known to worsen the developmental prognosis of young children with a congenital focal brain lesion, but its direct role is often very difficult to separate from the other variables. This requires prolonged periods of follow-up with simultaneous serial electrophysiological and developmental assessments, which are rarely obtained. We studied a male infant with a prenatal infarct in the territory of the right middle cerebral artery resulting in a left spastic hemiparesis, and an epileptic disorder (infantile spasms with transient right hemihypsarrhythmia and focal seizures) from the age of 7 months until the age of 4 years. Pregnancy and delivery were normal. A dissociated delay of early language acquisition, affecting mainly comprehension and without any autistic features, was documented. This delay was much more severe than usually expected in children with early focal lesions, and its evolution, with catch-up to normal, was correlated with the active phase of the epilepsy. We postulate that the epilepsy specifically amplified a pattern of delayed language emergence, mainly affecting lexical comprehension, reported in children with early right hemisphere damage.
Abstract:
The purpose of this study is to introduce and describe a newly developed index using foot pressure analysis to quantify the degree of equinus gait in children with cerebral palsy before and after injection with botulinum toxin. Data were captured preinjection and 12 weeks postinjection. Ten children aged 2½ to 6½ years took part (5 boys and 5 girls). Three of them had a diagnosis of spastic diplegia and 7 of congenital hemiplegia. In total, 13 limbs were analyzed. After orientation and segmentation of raw pedobarographic data, we determined a dynamic foot pressure index graded 0 to 100 that quantified the relative degree of heel and forefoot contact during stance. These data were correlated (Pearson correlation) with clinical measurements of dorsiflexion at the ankle (on a slow and fast stretch) and video observation (using the Observational Gait Scale). Pedobarograph data were strongly correlated with both the Observational Gait Scale scores (R = 0.79, P < 0.005) and clinical measurements of dorsiflexion on a fast stretch, which is reflective of spasticity (R = 0.70, P < 0.005). The index's sensitivity in detecting changes in spasticity, together with its good correlation with video observations, indicates this technique's potential validity. We found that, when manipulated and segmented appropriately and condensed into a simple ordinal index, foot pressure data provide a useful tool for tracking changes in patients with spastic equinus.
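One plausible formulation of a 0-100 heel/forefoot index is the heel's share of the total pressure impulse during stance; this is an illustrative assumption, as the abstract does not give the exact formula:

```python
def dynamic_foot_pressure_index(heel_impulse: float,
                                forefoot_impulse: float) -> float:
    """0-100 index of relative heel vs forefoot loading during stance.

    Formulating the index as the heel's share of total impulse, scaled
    to 100, is an assumption for illustration; the paper's exact
    definition may differ.
    """
    total = heel_impulse + forefoot_impulse
    if total <= 0:
        raise ValueError("no contact recorded during stance")
    return 100.0 * heel_impulse / total
```

Under this reading, a fully equinus (toe-walking) limb with no heel contact scores 0, and equal heel and forefoot loading scores 50.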
Abstract:
The objective of this work was to adapt the CROPGRO model, which is part of the DSSAT system, for simulating cowpea (Vigna unguiculata) growth and development under the soil and climate conditions of the Baixo Parnaíba region, Piauí State, Brazil. In CROPGRO, only the input parameters that define crop species, cultivar, and ecotype were changed in order to characterize the cowpea crop. Soil and climate files were created for the site considered. Field experiments without water deficit were used to calibrate the model. In these experiments, dry matter (DM), leaf area index (LAI), yield components and grain yield of cowpea (cv. BR 14 Mulato) were evaluated. The results showed good fit for DM and LAI estimates. The mean values of R² and mean absolute error (MAE) were, respectively, 0.95 and 264.9 kg ha-1 for DM, and 0.97 and 0.22 for LAI. The difference between observed and simulated values of plant phenology varied from 0 to 3 days. The model also performed well in simulating yield components, except for 100-grain weight, for which the error ranged from 20.9% to 34.3%. Considering the mean crop yield over the two years, the model presented an error of 5.6%.
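The goodness-of-fit statistics used above (MAE and R²) can be computed from observed and simulated series as:

```python
def mean_absolute_error(observed: list, simulated: list) -> float:
    """Mean of absolute deviations between observed and simulated values."""
    return sum(abs(o - s) for o, s in zip(observed, simulated)) / len(observed)


def r_squared(observed: list, simulated: list) -> float:
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot
```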
Abstract:
The goal of this project was to provide an objective methodology to support public agencies and railroads in making decisions related to consolidation of at-grade rail-highway crossings. The project team developed a weighted-index method and accompanying Microsoft Excel spreadsheet based tool to help evaluate and prioritize all public highway-rail grade crossings systematically from a possible consolidation impact perspective. Factors identified by stakeholders as critical were traffic volume, heavy-truck traffic volume, proximity to emergency medical services, proximity to schools, road system, and out-of-distance travel. Given the inherent differences between urban and rural locations, factors were considered, and weighted, differently, based on crossing location. Application of a weighted-index method allowed for all factors of interest to be included and for these factors to be ranked independently, as well as weighted according to stakeholder priorities, to create a single index. If priorities change, this approach also allows for factors and weights to be adjusted. The prioritization generated by this approach may be used to convey the need and opportunity for crossing consolidation to decision makers and stakeholders. It may also be used to quickly investigate the feasibility of a possible consolidation. Independently computed crossing risk and relative impact of consolidation may be integrated and compared to develop the most appropriate treatment strategies or alternatives for a highway-rail grade crossing. A crossing with limited- or low-consolidation impact but a high safety risk may be a prime candidate for consolidation. Similarly, a crossing with potentially high-consolidation impact as well as high risk may be an excellent candidate for crossing improvements or grade separation. The results of the highway-rail grade crossing prioritization represent a consistent and quantitative, yet preliminary, assessment. 
The results may serve as the foundation for more rigorous or detailed analysis and feasibility studies. Other pertinent site-specific factors, such as safety, maintenance costs, economic impacts, and location-specific access and characteristics, should also be considered.
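The weighted-index method described above reduces to a weighted sum of normalized factor scores; a sketch with hypothetical factor names and weights (the study's actual factors and stakeholder weights are not reproduced here):

```python
def weighted_index(factors: dict, weights: dict) -> float:
    """Combine normalized factor scores (0-1) into a single priority index.

    `factors` maps factor name -> normalized score; `weights` maps the
    same names -> stakeholder-assigned weights. Names and weights are
    illustrative, not the study's values.
    """
    if set(factors) != set(weights):
        raise ValueError("factors and weights must use the same keys")
    total_weight = sum(weights.values())
    return sum(factors[k] * weights[k] for k in factors) / total_weight


# Hypothetical rural crossing: weights and scores chosen for illustration.
example = weighted_index(
    {"traffic_volume": 0.2, "truck_volume": 0.1, "ems_proximity": 0.8},
    {"traffic_volume": 3.0, "truck_volume": 2.0, "ems_proximity": 1.0},
)
```

Because the factors are ranked independently and then weighted, changing stakeholder priorities only requires adjusting the weight dictionary, which matches the flexibility the abstract describes.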
Abstract:
Due to the low workability of slipform concrete mixtures, the science of rheology is not strictly applicable to such concrete. However, the concept of rheological behavior may still be considered useful. A novel workability test method (Vibrating Kelly Ball, or VKelly test) that quantitatively assesses the responsiveness of a dry concrete mixture to vibration, as is desired of a mixture suitable for slipform paving, was developed and evaluated. The objectives for this test method are that it be cost-effective, portable, and repeatable while reporting the suitability of a mixture for use in slipform paving. The work to evaluate and refine the test was conducted in three phases: (1) assess whether the VKelly test can signal variations in laboratory mixtures with a range of materials and proportions; (2) run the VKelly test in the field at a number of construction sites; and (3) validate the VKelly test results using the Box Test developed at Oklahoma State University for slipform paving concrete. The data collected to date indicate that the VKelly test appears to be suitable for assessing a mixture's response to vibration (workability) with low multiple-operator variability. A unique parameter, the VKelly Index, is introduced and defined; a mixture appears to be suitable for slipform paving when its VKelly Index falls in the range of 0.8 to 1.2 in./√s.
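Given the in./√s unit, the VKelly Index can be read as the slope of ball penetration versus the square root of vibration time; the least-squares fit below is an illustrative assumption, not the published procedure:

```python
import math


def vkelly_index(times_s: list, penetrations_in: list) -> float:
    """Slope of penetration (in.) vs sqrt(time) (s^0.5), fit through
    the origin by least squares.

    Reading the index as this slope is inferred from its in./sqrt(s)
    unit; consult the published test procedure for the exact regression.
    """
    x = [math.sqrt(t) for t in times_s]
    return (sum(xi * yi for xi, yi in zip(x, penetrations_in))
            / sum(xi * xi for xi in x))
```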