162 results for Task-Oriented Methodology
Abstract:
The GH-2000 and GH-2004 projects have developed a method for detecting GH misuse based on measuring insulin-like growth factor-I (IGF-I) and the amino-terminal pro-peptide of type III collagen (P-III-NP). The objectives were to analyze more samples from elite athletes to improve the reliability of the decision limit estimates, to evaluate whether the existing decision limits needed revision, and to validate further non-radioisotopic assays for these markers. The study included 998 male and 931 female elite athletes. Blood samples were collected according to World Anti-Doping Agency (WADA) guidelines at various sporting events, including the 2011 International Association of Athletics Federations (IAAF) World Athletics Championships in Daegu, South Korea. IGF-I was measured by the Immunotech A15729 IGF-I IRMA, the Immunodiagnostic Systems iSYS IGF-I assay and a recently developed mass spectrometry (LC-MS/MS) method. P-III-NP was measured by the Cisbio RIA-gnost P-III-P, Orion UniQ™ PIIINP RIA and Siemens ADVIA Centaur P-III-NP assays. The GH-2000 score decision limits were developed using existing statistical techniques. Decision limits were determined using a specificity of 99.99% and an allowance for uncertainty due to the finite sample size. The revised Immunotech IGF-I - Orion P-III-NP assay combination decision limit did not change significantly following the addition of the new samples. The new decision limits apply to currently available non-radioisotopic assays for IGF-I and P-III-NP in elite athletes, which should allow wider flexibility in implementing the GH-2000 marker test for GH misuse while providing some resilience against manufacturer withdrawal or change of assays. Copyright © 2015 John Wiley & Sons, Ltd.
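The decision-limit construction described above (a 99.99% specificity threshold widened to allow for finite-sample uncertainty) follows the general logic of a one-sided tolerance limit. The sketch below is a minimal illustration of that idea on hypothetical, normally distributed scores for a reference athlete population; it is not the published GH-2000 score formula or its coefficients.

```python
import numpy as np
from scipy import stats

def decision_limit(scores, specificity=0.9999, confidence=0.95):
    """One-sided upper tolerance limit: a threshold expected to exceed
    `specificity` of the reference population, with `confidence` allowance
    for estimating the mean and SD from a finite sample."""
    n = len(scores)
    mean, sd = np.mean(scores), np.std(scores, ddof=1)
    z_p = stats.norm.ppf(specificity)  # population quantile for the target specificity
    # tolerance factor from the noncentral t distribution
    k = stats.nct.ppf(confidence, df=n - 1, nc=z_p * np.sqrt(n)) / np.sqrt(n)
    return mean + k * sd

# Hypothetical reference scores (standard-normal stand-in, not real athlete data)
rng = np.random.default_rng(0)
reference = rng.normal(loc=0.0, scale=1.0, size=1900)
print(f"decision limit: {decision_limit(reference):.2f}")
```

With a larger reference sample, the tolerance factor k shrinks toward the plain 99.99% quantile, which is why adding new athlete samples can tighten (or leave unchanged) the decision limits.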
Abstract:
How a stimulus or a task alters the spontaneous dynamics of the brain remains a fundamental open question in neuroscience. One of the most robust hallmarks of task/stimulus-driven brain dynamics is the decrease of variability with respect to the spontaneous level, an effect seen across multiple experimental conditions and in brain signals observed at different spatiotemporal scales. Recently, it was observed that the trial-to-trial variability and temporal variance of functional magnetic resonance imaging (fMRI) signals decrease in the task-driven activity. Here we examined the dynamics of a large-scale model of the human cortex to provide a mechanistic understanding of these observations. The model allows computing the statistics of synaptic activity in the spontaneous condition and in putative tasks determined by external inputs to a given subset of brain regions. We demonstrated that external inputs decrease the variance, increase the covariances, and decrease the autocovariance of synaptic activity as a consequence of single node and large-scale network dynamics. Altogether, these changes in network statistics imply a reduction of entropy, meaning that the spontaneous synaptic activity outlines a larger multidimensional activity space than does the task-driven activity. We tested this model's prediction on fMRI signals from healthy humans acquired during rest and task conditions and found a significant decrease of entropy in the stimulus-driven activity. Overall, our study proposes a mechanism for increasing the information capacity of brain networks by enlarging the volume of possible activity configurations at rest and reliably settling into a confined stimulus-driven state to allow better transmission of stimulus-related information.
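The entropy comparison above can be made concrete with a differential-entropy calculation: for signals that are approximately multivariate Gaussian, the entropy depends only on the covariance matrix, so a shrinking activity space shows up directly as a drop in the covariance log-determinant. The sketch below is a generic illustration on hypothetical rest and task time series, not the authors' cortical model or data pipeline.

```python
import numpy as np

def gaussian_entropy(signals):
    """Differential entropy (nats) of a multivariate Gaussian fitted to
    `signals` of shape (time, regions): H = 0.5 * log((2*pi*e)^d |Sigma|)."""
    cov = np.cov(signals, rowvar=False)
    d = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (d * np.log(2 * np.pi * np.e) + logdet)

# Hypothetical signals: task-driven activity with reduced variance per region
rng = np.random.default_rng(1)
d, t = 20, 5000
rest = rng.normal(size=(t, d))          # broad spontaneous fluctuations
task = 0.6 * rng.normal(size=(t, d))    # input-driven, lower-variance state
print(f"rest entropy: {gaussian_entropy(rest):.1f} nats")
print(f"task entropy: {gaussian_entropy(task):.1f} nats")  # smaller value
```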
Abstract:
BACKGROUND: Developing and updating high-quality guidelines requires substantial time and resources. To reduce duplication of effort and enhance efficiency, we developed a process for guideline adaptation and assessed initial perceptions of its feasibility and usefulness. METHODS: Based on preliminary developments and empirical studies, a series of meetings with guideline experts was organised to define a process for guideline adaptation (ADAPTE) and to develop a manual and a toolkit made available on a website (http://www.adapte.org). Potential users (guideline developers and implementers) were invited to register and to complete a questionnaire evaluating their perceptions of the proposed process.
Abstract:
Ever since the inception of economics over two hundred years ago, the tools at the discipline's disposal have grown more and more sophisticated. This book provides a historical introduction to the methodology of economics through the eyes of economists. The story begins with John Stuart Mill's seminal essay from 1836 on the definition and method of political economy, which is then followed by an examination of how the actual practices of economists changed over time to such an extent that they not only altered their methods of enquiry, but also their self-perception as economists. Beginning as intellectuals and journalists operating to a large extent in the public sphere, they then transformed into experts who developed their tools of research increasingly behind the scenes. No longer did they try to influence policy agendas through public discourse; rather they targeted policymakers directly and with instruments that showed them as independent and objective policy advisors, the tools of the trade changing all the while. In order to shed light on this evolution of economic methodology, this book takes carefully selected snapshots from the discipline's history. It tracks the process of development through the nineteenth and twentieth centuries, analysing the growth of empirical and mathematical modelling. It also looks at the emergence of the experiment in economics, in addition to the similarities and differences between modelling and experimentation. This book will be relevant reading for students and academics in the fields of economic methodology, history of economics, and history and philosophy of the social sciences.
Abstract:
OBJECTIVE: To develop disease-specific recommendations for the diagnosis and management of eosinophilic granulomatosis with polyangiitis (Churg-Strauss syndrome) (EGPA). METHODS: The EGPA Consensus Task Force experts comprised 8 pulmonologists, 6 internists, 4 rheumatologists, 3 nephrologists, 1 pathologist and 1 allergist from 5 European countries and the USA. Using a modified Delphi process, a list of 40 questions was elaborated by 2 members and sent to all participants prior to the meeting. Concurrently, an extensive literature search was undertaken, with publications assigned a level of evidence according to accepted criteria. Drafts of the recommendations were circulated for review to all members until final consensus was reached. RESULTS: Twenty-two recommendations concerning the diagnosis, initial evaluation, treatment and monitoring of EGPA patients were established. The relevant published information on EGPA, antineutrophil-cytoplasm antibody-associated vasculitides, hypereosinophilic syndromes and eosinophilic asthma supporting these recommendations was also reviewed. DISCUSSION: These recommendations aim to give physicians tools for effective and individual management of EGPA patients, and to provide guidance for further targeted research.
Abstract:
The European Forum on Epilepsy Research (ERF2013), which took place in Dublin, Ireland, on May 26-29, 2013, was designed to appraise epilepsy research priorities in Europe through consultation with clinical and basic scientists as well as representatives of lay organizations and health care providers. The ultimate goal was to provide a platform to improve the lives of persons with epilepsy by influencing the political agenda of the EU. The Forum highlighted the epidemiologic, medical, and social importance of epilepsy in Europe, and addressed three separate but closely related concepts. First, possibilities were explored as to how the stigma and social burden associated with epilepsy could be reduced through targeted initiatives at EU national and regional levels. Second, ways to ensure optimal standards of care throughout Europe were specifically discussed. Finally, a need for further funding in epilepsy research within the European Horizon 2020 funding programme was communicated to politicians and policymakers participating in the forum. Research topics discussed specifically included (1) epilepsy in the developing brain; (2) novel targets for innovative diagnostics and treatment of epilepsy; (3) what is required for prevention and cure of epilepsy; and (4) epilepsy and comorbidities, with a special focus on aging and mental health. This report provides a summary of recommendations that emerged at ERF2013 about how to (1) strengthen epilepsy research, (2) reduce the treatment gap, and (3) reduce the burden and stigma associated with epilepsy. Half of the 6 million European citizens with epilepsy feel stigmatized and experience social exclusion, stressing the need for funding trans-European awareness campaigns and monitoring their impact on stigma, in line with the global commitment of the European Commission and with the recommendations made in the 2011 Written Declaration on Epilepsy. Epilepsy care has high rates of misdiagnosis and considerable variability in organization and quality across European countries, translating into huge societal cost (0.2% of GDP) and stressing the need for cost-effective programs of harmonization and optimization of epilepsy care throughout Europe. There is currently no cure or prevention for epilepsy, and seizures remain uncontrolled by current treatments in 30% of affected persons, stressing the need for pursuing research efforts in the field within Horizon 2020. Priorities should include (1) development of innovative biomarkers and therapeutic targets and strategies, from gene and cell-based therapies to technologically advanced surgical treatment; (2) addressing issues raised by pediatric and aging populations, as well as by specific etiologies and comorbidities such as traumatic brain injury (TBI) and cognitive dysfunction, toward more personalized medicine and prevention; and (3) translational studies and clinical trials built upon well-established European consortia.
Abstract:
The differentiation of workers into morphological subcastes (e.g., soldiers) represents an important evolutionary transition and is thought to improve division of labor in social insects. Soldiers occur in many ant and termite species, where they make up a small proportion of the workforce. A common assumption of worker caste evolution is that soldiers are behavioral specialists. Here, we report the first test of the "rare specialist" hypothesis in a eusocial bee. Colonies of the stingless bee Tetragonisca angustula are defended by a small group of morphologically differentiated soldiers. Contrary to the rare specialist hypothesis, we found that soldiers worked more (+34%-41%) and performed a greater variety of tasks (+23%-34%) than other workers, particularly early in life. Our results suggest a "rare elite" function of soldiers in T. angustula, that is, that they perform a disproportionately large amount of the work. Division of labor was based on a combination of temporal and physical castes, but soldiers transitioned faster from one task to the next. We discuss why the rare specialist assumption might not hold in species with a moderate degree of worker differentiation.
Abstract:
X-ray medical imaging is increasingly becoming three-dimensional (3-D). The dose to the population and its management are of special concern in computed tomography (CT). Task-based methods with model observers to assess the dose-image quality trade-off are promising tools, but they still need to be validated for real volumetric images. The purpose of the present work is to evaluate anthropomorphic model observers in 3-D detection tasks for low-contrast CT images. We scanned a low-contrast phantom containing four types of signals at three dose levels and used two reconstruction algorithms. We implemented a multislice model observer based on the channelized Hotelling observer (msCHO) with anthropomorphic channels and investigated different internal noise methods. We found a good correlation for all tested model observers. These results suggest that the msCHO can be used as a relevant task-based method to evaluate low-contrast detection for CT and optimize scan protocols to lower dose in an efficient way.
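As a rough illustration of the task-based evaluation idea, the sketch below implements a basic channelized Hotelling observer on hypothetical channelized image data (signal-present vs. signal-absent channel outputs) and reports a detectability index. It is a generic single-slice CHO under simplifying assumptions, not the authors' msCHO implementation or their anthropomorphic channel set.

```python
import numpy as np

def cho_detectability(v_signal, v_noise):
    """Channelized Hotelling observer on channel outputs.
    v_signal, v_noise: arrays of shape (n_images, n_channels)."""
    delta = v_signal.mean(axis=0) - v_noise.mean(axis=0)          # mean signal in channel space
    s = 0.5 * (np.cov(v_signal, rowvar=False) + np.cov(v_noise, rowvar=False))
    w = np.linalg.solve(s, delta)                                  # Hotelling template
    t_sig, t_noi = v_signal @ w, v_noise @ w                       # observer test statistics
    return (t_sig.mean() - t_noi.mean()) / np.sqrt(
        0.5 * (t_sig.var(ddof=1) + t_noi.var(ddof=1)))

# Hypothetical channel outputs: 200 signal-present / 200 signal-absent images, 10 channels
rng = np.random.default_rng(2)
noise = rng.normal(size=(200, 10))
signal = rng.normal(loc=0.3, size=(200, 10))  # small mean shift mimicking a low-contrast lesion
print(f"d' = {cho_detectability(signal, noise):.2f}")
```

In a dose-image quality study, d' computed this way at each dose level and reconstruction setting is the figure of merit used to compare protocols.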
Abstract:
BACKGROUND: Many publications report the prevalence of chronic kidney disease (CKD) in the general population. Comparisons across studies are hampered as CKD prevalence estimations are influenced by study population characteristics and laboratory methods. METHODS: For this systematic review, two researchers independently searched PubMed, MEDLINE and EMBASE to identify all original research articles that were published between 1 January 2003 and 1 November 2014 reporting the prevalence of CKD in the European adult general population. Data on study methodology and reporting of CKD prevalence results were independently extracted by two researchers. RESULTS: We identified 82 eligible publications and included 48 publications of individual studies for the data extraction. There was considerable variation in population sample selection. The majority of studies did not report the sampling frame used, and response rates ranged from 10 to 87%. With regard to the assessment of kidney function, 67% used a Jaffe assay, whereas 13% used the enzymatic assay for creatinine determination. Isotope dilution mass spectrometry calibration was used in 29% of studies. The CKD-EPI (52%) and MDRD (75%) equations were most often used to estimate glomerular filtration rate (GFR). CKD was defined as estimated GFR (eGFR) <60 mL/min/1.73 m(2) in 92% of studies. Urinary markers of CKD were assessed in 60% of the studies. CKD prevalence was reported by sex and age strata in 54 and 50% of the studies, respectively. In publications with a primary objective of reporting CKD prevalence, 39% reported a 95% confidence interval. CONCLUSIONS: The findings from this systematic review showed considerable variation in methods for sampling the general population and assessment of kidney function across studies reporting CKD prevalence. These results were used to provide recommendations to help optimize both the design and the reporting of future CKD prevalence studies, which will enhance comparability of study results.
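For readers unfamiliar with the GFR-estimating equations mentioned above, the sketch below computes eGFR with the widely published 2009 CKD-EPI creatinine equation and flags the eGFR < 60 mL/min/1.73 m² threshold used by most of the reviewed studies; the function name and example values are illustrative only.

```python
def ckd_epi_2009(scr_mg_dl: float, age: float, female: bool, black: bool = False) -> float:
    """Estimated GFR (mL/min/1.73 m^2) from the 2009 CKD-EPI creatinine equation."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# Illustrative example: 65-year-old woman with serum creatinine 1.1 mg/dL
egfr = ckd_epi_2009(1.1, 65, female=True)
print(f"eGFR = {egfr:.0f} mL/min/1.73 m^2, below CKD threshold (<60): {egfr < 60}")
```

Because the Jaffe and enzymatic creatinine assays and the choice of equation (CKD-EPI vs. MDRD) all shift the input and output of this calculation, the review's point about method reporting bears directly on prevalence comparability.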
Abstract:
Purpose: To show that differences in the extent to which firms engage in unrelated diversification can be attributed to differences in ownership structure. Methodology/approach: We draw on longitudinal data and use a panel analysis specification to test our hypotheses. Findings: We find that unrelated diversification destroys value; pressure-sensitive Anglo-American owners in a firm’s equity reduce unrelated diversification, whereas pressure-resistant domestic owners increase it; and the greater the firm’s free cash flow, the greater the negative effect of pressure-sensitive Anglo-American owners on unrelated diversification. Research limitations/implications: We contribute to corporate governance and strategy research by bringing in owners’ institutional origin as a shaper of owner preferences, in particular with regard to unrelated diversification. Future research may expand our investigation to more than one home institutional context and theorize on institutional-origin effects beyond the dichotomy between Anglo-American and non-Anglo-American (not oriented toward shareholder value maximization) owners. Practical implications: Policy makers, financial analysts, owners, and managers may want to reflect on the implications of ownership structure, as well as on promoting or joining corporations with particular ownership configurations. Social implications: A shareholder value-destroying strategy such as unrelated diversification has adverse consequences for society at large in terms of opportunity costs; that is, resources could be allocated to value-creating activities instead. Promoting an ownership configuration that creates value should contribute to social welfare. Originality/value: Owners may not be exclusively driven by shareholder value maximization, but can be influenced by normative beliefs (biases) stemming from the institutional context they originate from.
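The panel specification referred to above can be illustrated with a minimal two-way fixed-effects regression of a diversification measure on ownership variables. All variable names and the synthetic data below are hypothetical placeholders, not the authors' dataset or estimated model.

```python
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS

# Hypothetical firm-year panel (names and values invented for illustration)
rng = np.random.default_rng(3)
firms, years = 100, 10
idx = pd.MultiIndex.from_product([range(firms), range(2005, 2005 + years)],
                                 names=["firm_id", "year"])
df = pd.DataFrame({
    "anglo_american_own": rng.uniform(0, 0.5, len(idx)),
    "domestic_own": rng.uniform(0, 0.5, len(idx)),
    "free_cash_flow": rng.normal(size=len(idx)),
}, index=idx)
df["unrelated_div"] = (-0.3 * df["anglo_american_own"]
                       + 0.2 * df["domestic_own"]
                       + rng.normal(scale=0.1, size=len(idx)))

# Two-way fixed effects (firm and year) with firm-clustered standard errors
model = PanelOLS.from_formula(
    "unrelated_div ~ anglo_american_own + domestic_own + free_cash_flow"
    " + EntityEffects + TimeEffects", data=df)
print(model.fit(cov_type="clustered", cluster_entity=True))
```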
Abstract:
Local autonomy is a highly valued feature of good governance. The continuous attempts of many European countries to strengthen the autonomy of local government show the importance given to decentralisation and far-reaching competences at the lowest units of a state. Measuring and comparing local autonomy, however, has proven to be a difficult task. Not only are there diverging ideas about the core elements of local autonomy, there are also considerable difficulties in applying specific concepts to different countries. This project suggests a comprehensive methodology to measure local autonomy. It analyses 39 European countries and reports changes between 1990 and 2014. A network of experts on local government assessed the autonomy of local government of their respective countries on the basis of a common code book. The eleven variables measured are located on seven dimensions and can be combined into a "Local Autonomy Index" (LAI). The data show an increase of local autonomy between 1990 and 2005, especially in the new Central and Eastern European countries. Countries with a particularly high degree of local autonomy are Switzerland, the Nordic countries, Germany and Poland.
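As a toy illustration of how per-country expert codings might be aggregated into a composite index such as the LAI, the sketch below normalizes a handful of hypothetical variables to a 0-1 range, averages them within dimensions, and weights the dimensions equally. The variable names, groupings, and scores are invented for illustration and do not reproduce the published LAI code book or weighting scheme.

```python
import numpy as np

# Hypothetical codings for one country-year: (raw score, maximum score)
variables = {
    "institutional_depth":     (2, 3),
    "policy_scope":            (4, 6),
    "fiscal_autonomy":         (3, 4),
    "borrowing_autonomy":      (1, 3),
    "organisational_autonomy": (3, 4),
}
# Hypothetical grouping of variables into dimensions
dimensions = {
    "legal":        ["institutional_depth"],
    "policy":       ["policy_scope"],
    "financial":    ["fiscal_autonomy", "borrowing_autonomy"],
    "organisation": ["organisational_autonomy"],
}

def local_autonomy_index(variables, dimensions):
    """Normalize each variable to [0, 1], average within dimensions,
    then average dimensions with equal weights and rescale to 0-100."""
    dim_scores = [np.mean([variables[n][0] / variables[n][1] for n in names])
                  for names in dimensions.values()]
    return 100 * np.mean(dim_scores)

print(f"illustrative index score: {local_autonomy_index(variables, dimensions):.1f}")
```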