978 results for Core data set
The development of a core outcome set for medicines management interventions in people with dementia
Abstract:
Explanation of the Minimum Data Set (MDS), implementation of Section Q, overview of the program, local contacts and functions, Referral Agency information, and the role and assistance provided by the Long-Term Care Ombudsman.
Abstract:
Information regarding possible questions on Section Q within the Minimum Data Set.
Abstract:
Dataset for publication in PLOS One
Abstract:
Mass spectrometry (MS)-based proteomics has seen significant technical advances during the past two decades, and mass spectrometry has become a central tool in many biosciences. Despite the popularity of MS-based methods, the handling of systematic non-biological variation in the data remains a common problem. This biasing variation can arise from several sources, ranging from sample handling to differences caused by the instrumentation. Normalization is the procedure that aims to account for this biasing variation and make samples comparable. Many normalization methods commonly used in proteomics have been adapted from the DNA-microarray world. Studies comparing normalization methods on proteomics data sets using variability measures exist. However, a more thorough comparison, looking at the quantitative and qualitative differences in the performance of the different normalization methods and at their ability to preserve the true differential expression signal of proteins, is lacking. In this thesis, several popular and widely used normalization methods (linear regression normalization, local regression normalization, variance stabilizing normalization, quantile normalization, median central tendency normalization, and variants of some of the aforementioned methods), representing different strategies in normalization, are compared and evaluated with a benchmark spike-in proteomics data set. The normalization methods are evaluated in several ways. Their performance is assessed qualitatively and quantitatively, both on a global scale and in pairwise comparisons of sample groups. In addition, it is investigated whether performing the normalization globally on the whole data set or pairwise for the comparison pairs examined affects the performance of a method in normalizing the data and preserving the true differential expression signal.
Both major and minor differences in the performance of the different normalization methods were found. The way in which the normalization was performed (global normalization of the whole data set or pairwise normalization of the comparison pair) also affected the performance of some of the methods in pairwise comparisons, and differences among variants of the same methods were observed.
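As an illustration of two of the normalization strategies compared in the thesis, here is a minimal NumPy sketch (not the thesis code; the toy matrix is invented) of median central-tendency normalization and quantile normalization on a small log-intensity matrix with proteins as rows and samples as columns:

```python
import numpy as np

def median_normalize(x):
    """Median central-tendency normalization: shift each sample (column)
    so that all sample medians equal the global median (log-scale data)."""
    col_medians = np.median(x, axis=0)
    return x - col_medians + np.median(x)

def quantile_normalize(x):
    """Quantile normalization: force every sample (column) to share the
    same empirical distribution (the row-wise mean of the sorted columns)."""
    order = np.argsort(x, axis=0)           # sort order within each sample
    ranks = np.argsort(order, axis=0)       # rank of each value within its sample
    mean_sorted = np.sort(x, axis=0).mean(axis=1)  # reference distribution
    return mean_sorted[ranks]

# Toy data: 5 proteins x 3 samples, with a systematic +2 shift in sample 2.
data = np.array([[10.0, 12.0, 10.5],
                 [11.0, 13.0, 11.5],
                 [ 9.0, 11.0,  9.5],
                 [12.0, 14.0, 12.5],
                 [10.5, 12.5, 11.0]])

normed = median_normalize(data)
print(np.median(normed, axis=0))  # → [11. 11. 11.]
```

After median normalization every sample median matches; after quantile normalization the three columns share an identical value distribution, which removes the shift entirely but also flattens genuine global differences, one reason the thesis compares the strategies rather than recommending a single one.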
Abstract:
A significant amount of Expendable Bathythermograph (XBT) data has been collected in the Mediterranean Sea since 1999 in the framework of operational oceanography activities. The management and storage of such a volume of data poses significant challenges and opportunities. The SeaDataNet project, a pan-European infrastructure for marine data diffusion, provides a convenient way to avoid dispersion of these temperature vertical profiles and to facilitate access for a wider public. The XBT data flow is described, along with recent improvements in the quality-check procedures and the consistency of the available historical data set. The main features of SeaDataNet services and the advantages of using this system for long-term data archiving are presented. Finally, a focus on the Ligurian Sea is included in order to provide an example of the kind of information and final products, aimed at different users, that can easily be derived from the SeaDataNet web portal.
Abstract:
OBJECTIVE: The aim of this pilot study was to describe problems in functioning and associated rehabilitation needs in persons with spinal cord injury after the 2010 earthquake in Haiti by applying a newly developed tool based on the International Classification of Functioning, Disability and Health (ICF). DESIGN: Pilot study. SUBJECTS: Eighteen persons with spinal cord injury (11 women, 7 men) participated in the needs assessment. Eleven patients had complete lesions (American Spinal Injury Association Impairment Scale; AIS A), and one patient had tetraplegia. METHODS: Data collection included information from the International Spinal Cord Injury Core Data Set and a newly developed needs assessment tool based on ICF Core Sets. This tool assesses the level of functioning, the corresponding rehabilitation need, and the health professional required. Data were summarized using descriptive statistics. RESULTS: In body functions and body structures, patients showed typical problems following spinal cord injury. Nearly all patients showed limitations and restrictions in their activities and participation related to mobility, self-care, and aspects of social integration. Several environmental factors presented barriers contributing to these limitations and restrictions; however, the availability of products and social support were identified as facilitators. Rehabilitation needs were identified in nearly all aspects of functioning, and a multidisciplinary approach would be needed to address them. CONCLUSION: This ICF-based needs assessment provided useful information for rehabilitation planning in the context of a natural disaster. Future studies are required to test and, if necessary, adapt the assessment.
Abstract:
The aim of this paper is to measure the returns to human capital. We use a unique data set consisting of matched employer-employee information. Data on individuals' human capital include a set of 26 competences that capture the utilization of workers' skills in great detail. We can thus expand the concept of human capital and discuss which types of skills are more productive in the workplace and, hence, generate a higher payoff for workers. The rich information on firm and workplace characteristics allows us to introduce a broad range of controls and to improve on previous research in this field. This paper provides evidence that the returns to generic competences differ depending on the position of the worker within the firm. Only numeracy skills are rewarded independently of the occupational status of the worker. The level of technology used by the firm in the production process does not directly increase workers' pay, but it influences the pay-off to some of the competences. JEL Classification: J24, J31
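The kind of specification this abstract describes can be sketched with simulated data (all variable names, coefficients, and the single-competence setup are hypothetical, not the paper's model): a log-wage regression in which the return to a competence is allowed to differ by occupational status via an interaction term.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated matched employer-employee data: log wage as a function of a
# numeracy score, a manager indicator, and their interaction, so that the
# return to the competence can vary with occupational status.
numeracy = rng.normal(0.0, 1.0, n)
manager = rng.integers(0, 2, n).astype(float)
log_wage = (2.0 + 0.10 * numeracy + 0.30 * manager
            + 0.05 * numeracy * manager + rng.normal(0.0, 0.1, n))

# OLS via least squares: intercept, numeracy, manager, interaction.
X = np.column_stack([np.ones(n), numeracy, manager, numeracy * manager])
beta, *_ = np.linalg.lstsq(X, log_wage, rcond=None)

# beta[1] is the return to numeracy for non-managers;
# beta[1] + beta[3] is the return for managers.
print(np.round(beta, 2))
```

A significant interaction coefficient is the statistical signature of "returns differ by position"; the paper's finding that numeracy is rewarded independently of status corresponds to an interaction near zero for that competence.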
Abstract:
This paper provides an axiomatic framework to compare the D-core (the set of undominated imputations) and the core of a cooperative game with transferable utility. Theorem 1 states that the D-core is the only solution satisfying projection consistency, reasonableness (from above), (*)-antimonotonicity, and modularity. Theorem 2 characterizes the core, replacing (*)-antimonotonicity by antimonotonicity. Moreover, these axioms also characterize the core on the domain of convex games, totally balanced games, balanced games, and superadditive games.
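For context, the two solution concepts can be written in standard notation (textbook definitions, not reproduced from the paper). For a TU game $(N, v)$, the core is

```latex
\[
  \operatorname{Core}(v) = \Bigl\{\, x \in \mathbb{R}^N \;\Bigm|\;
    \sum_{i \in N} x_i = v(N) \ \text{and}\
    \sum_{i \in S} x_i \ge v(S) \ \text{for all}\ S \subseteq N \,\Bigr\}.
\]
```

An imputation $y$ is dominated via a coalition $S$ by an imputation $x$ when $x_i > y_i$ for every $i \in S$ and $\sum_{i \in S} x_i \le v(S)$; the D-core is the set of imputations not dominated via any coalition, and it always contains the core.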
Abstract:
The Greenland NEEM (North Greenland Eemian Ice Drilling) operation in 2010 provided the first opportunity to combine trace-gas measurements by laser spectroscopic instruments and continuous-flow analysis along a freshly drilled ice core in a field-based setting. We present the resulting atmospheric methane (CH4) record covering the time period from 107.7 to 9.5 ka b2k (thousand years before 2000 AD). Companion discrete CH4 measurements are required to transfer the laser spectroscopic data from a relative to an absolute scale. However, even on a relative scale, the high-resolution CH4 data set significantly improves our knowledge of past atmospheric methane concentration changes. New significant sub-millennial-scale features appear during interstadials and stadials, generally associated with similar changes in water isotopic ratios of the ice, a proxy for local temperature. In addition to the midpoint of Dansgaard–Oeschger (D/O) CH4 transitions usually used for cross-dating, sharp definition of the start and end of these events brings precise depth markers (with ±20 cm uncertainty) for further cross-dating with other palaeo- or ice core records, e.g. speleothems. The method also provides an estimate of CH4 rates of change. The onsets of D/O events in the methane signal show a more rapid rate of change than their endings. The rate of CH4 increase associated with the onsets of D/O events progressively declines from 1.7 to 0.6 ppbv yr−1 in the course of marine isotope stage 3. The largest observed rate of increase takes place at the onset of D/O event #21 and reaches 2.5 ppbv yr−1.
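A rate of change like the quoted ppbv yr−1 figures can be estimated from an age-concentration series by a linear fit across the transition. The sketch below uses made-up numbers, not the NEEM data; note that age (years b2k) runs opposite to elapsed time, so the fitted slope is negated.

```python
import numpy as np

# Hypothetical CH4 record across a D/O-type onset: ages in years b2k
# (decreasing toward the present) and concentrations in ppbv.
age_yr_b2k = np.array([38500.0, 38400.0, 38300.0, 38200.0, 38100.0])
ch4_ppbv   = np.array([450.0, 500.0, 560.0, 610.0, 660.0])

# Rate of change as the slope of a least-squares fit; time runs opposite
# to age, so negate the slope to get ppbv per year of elapsed time.
slope_per_age = np.polyfit(age_yr_b2k, ch4_ppbv, 1)[0]
rate_ppbv_per_yr = -slope_per_age
print(round(rate_ppbv_per_yr, 2))  # → 0.53
```

A fit over a window bracketing the onset, as here, smooths sample noise; the ±20 cm depth markers mentioned above determine how tightly that window can be placed.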