829 results for Conceptual designs
Abstract:
Noncommunicable diseases (NCDs) account for a growing burden of morbidity and mortality among people living with HIV in low- and middle-income countries (LMICs). HIV infection and antiretroviral therapy interact with NCD risk factors in complex ways, and research into this "web of causation" has so far been largely based on data from high-income countries. However, improving the understanding, treatment, and prevention of NCDs in LMICs requires region-specific evidence. Priority research areas include: (1) defining the burden of NCDs among people living with HIV, (2) understanding the impact of modifiable risk factors, (3) evaluating effective and efficient care strategies at individual and health systems levels, and (4) evaluating cost-effective prevention strategies. Meeting these needs will require observational data, both to inform the design of randomized trials and to replace trials that would be unethical or infeasible. Focusing on Sub-Saharan Africa, we discuss data resources currently available to inform this effort and consider key limitations and methodological challenges. Existing data resources often lack population-based samples; HIV-negative, HIV-positive, and antiretroviral therapy-naive comparison groups; and measurements of key NCD risk factors and outcomes. Other challenges include loss to follow-up, competing risk of death, incomplete outcome ascertainment and measurement of factors affecting clinical decision making, and the need to control for (time-dependent) confounding. We review these challenges and discuss strategies for overcoming them through augmented data collection and appropriate analysis. We conclude with recommendations to improve the quality of data and analyses available to inform the response to HIV and NCD comorbidity in LMICs.
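One of the methodological challenges named above, controlling for (time-dependent) confounding, is commonly addressed with inverse-probability-of-treatment weighting. Below is a minimal single-time-point sketch on simulated data (variable names and numbers are purely illustrative; the longitudinal version used in marginal structural models reweights every visit):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    n = 5000
    cd4 = rng.normal(350, 100, n)                  # baseline confounder: CD4 count
    p_art = 1 / (1 + np.exp((cd4 - 350) / 50))     # lower CD4 -> more likely on ART
    on_art = rng.binomial(1, p_art)
    # Outcome (e.g., systolic blood pressure) with a true ART effect of +3.0
    bp = 120 - 0.01 * cd4 + 3.0 * on_art + rng.normal(0, 5, n)

    # Propensity scores and inverse-probability weights
    ps = LogisticRegression().fit(cd4[:, None], on_art).predict_proba(cd4[:, None])[:, 1]
    w = np.where(on_art == 1, 1 / ps, 1 / (1 - ps))

    naive = bp[on_art == 1].mean() - bp[on_art == 0].mean()
    ipw = (np.average(bp[on_art == 1], weights=w[on_art == 1])
           - np.average(bp[on_art == 0], weights=w[on_art == 0]))
    print(f"naive = {naive:.2f}, IPW = {ipw:.2f}, truth = 3.00")

The weighted contrast approximately recovers the simulated treatment effect that the naive comparison distorts, which is the basic mechanism behind the confounding-control strategies the abstract calls for.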
Abstract:
This chapter introduces a conceptual model that combines creativity techniques with fuzzy cognitive maps (FCMs) and aims to support knowledge management methods by improving expert knowledge acquisition and aggregation. The aim of the conceptual model is to represent acquired knowledge in a manner that is as computer-understandable as possible, with the intention of developing automated reasoning in the future as part of intelligent information systems. The formally represented knowledge may thus provide businesses with intelligent information integration. To this end, we introduce and evaluate various creativity techniques against a list of attributes to identify the most suitable one to combine with FCMs. The proposed combination enables enhanced knowledge management through the acquisition and representation of expert knowledge with FCMs. Our evaluation indicates that the creativity technique known as mind mapping is the most suitable technique in our set. Finally, a scenario from stakeholder management demonstrates the combination of mind mapping with FCMs as an integrated system.
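Computationally, an FCM of the kind described above is a signed weight matrix over concepts whose activations are iterated to a fixed point. A minimal sketch of the standard FCM update (the stakeholder-management concepts and weights here are hypothetical, not taken from the chapter):

    import numpy as np

    # Hypothetical stakeholder-management FCM: W[i, j] is the influence of
    # concept i on concept j (weights are illustrative only).
    concepts = ["stakeholder_trust", "communication", "project_risk"]
    W = np.array([
        [ 0.0, 0.4, -0.5],   # trust fosters communication, dampens risk
        [ 0.6, 0.0, -0.3],   # communication builds trust, dampens risk
        [-0.4, 0.0,  0.0],   # risk erodes trust
    ])

    def squash(x, lam=1.0):
        """Sigmoid transfer function keeping activations in (0, 1)."""
        return 1.0 / (1.0 + np.exp(-lam * x))

    # Standard FCM inference: A(t+1) = f(A(t) + A(t) @ W), run to a fixed point
    A = np.array([0.8, 0.5, 0.6])   # initial concept activations
    for _ in range(25):
        A = squash(A + A @ W)
    print(dict(zip(concepts, np.round(A, 3))))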
Abstract:
This article reconceptualizes shared rule and uses novel data to measure it, thus addressing two shortcomings of the federalism literature. First, while most studies focus on self-rule, a largely neglected question is how lower-level governments can influence politics at a higher level in the absence of “second” chambers. The answer is through shared rule. A second shortcoming is that even when addressing this question, scholars concentrate on constitutional-administrative aspects of vertical intergovernmentalism, neglecting more informal, “political” dynamics. Comparing the twenty-six Swiss cantons yields two lessons for federal studies: that shared rule is multifaceted and complex, and that studying informal territorial actors as well as direct political processes is indispensable to understanding how power is actually distributed in federal political systems.
Abstract:
Factorial designs for clinical trials are often encountered in medical, dental, and orthodontic research. Factorial designs assess two or more interventions simultaneously, and the main advantage of this design is its efficiency in terms of sample size, as more than one intervention may be assessed on the same participants. However, the factorial design is efficient only under the assumption of no interaction (no effect modification) between the treatments under investigation, and this should therefore be considered at the design stage. Conversely, the factorial design may also be used for the purpose of detecting an interaction between two interventions if the study is powered accordingly. However, a factorial design powered to detect an interaction has no advantage in terms of the required sample size compared to a multi-arm parallel trial for assessing more than one intervention. The purpose of this article is to highlight the methodological issues that should be considered when planning, analysing, and reporting the simplest form of this design, the 2 × 2 factorial design. An example from the field of orthodontics using two parameters (bracket type and wire type) on maxillary incisor torque loss is used to explain the design requirements, the advantages and disadvantages of this design, and its application in orthodontic research.
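For the 2 × 2 case discussed above, the no-interaction assumption can be examined with a two-way ANOVA that includes the bracket × wire term. A minimal sketch on simulated data (factor levels and effect sizes are illustrative, not from the article):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(0)
    n = 40  # participants per cell of the 2 x 2 design
    df = pd.DataFrame({
        "bracket": np.repeat(["conventional", "self_ligating"], 2 * n),
        "wire": np.tile(np.repeat(["steel", "niti"], n), 2),
    })
    # Simulated torque loss (degrees): additive main effects, no interaction
    df["torque_loss"] = (
        5.0
        + 1.2 * (df["bracket"] == "self_ligating")
        + 0.8 * (df["wire"] == "niti")
        + rng.normal(0, 1.5, len(df))
    )

    # The C(bracket):C(wire) row of the ANOVA table tests effect modification
    model = smf.ols("torque_loss ~ C(bracket) * C(wire)", data=df).fit()
    print(anova_lm(model, typ=2))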
Abstract:
The aim of this study was to improve cage systems for maintaining adult honey bee (Apis mellifera L.) workers under in vitro laboratory conditions. To achieve this goal, we experimentally evaluated the impact of different cages, developed by scientists of the international research network COLOSS (Prevention of honey bee COlony LOSSes), on the physiology and survival of honey bees. We identified three cages that promoted good survival of honey bees. The bees from cages that exhibited greater survival had relatively lower titers of deformed wing virus, suggesting that deformed wing virus is a significant marker reflecting the stress level and health status of the host. We also determined that a leak- and drip-proof feeder was an integral part of a cage system, and a feeder modified from a 20-ml plastic syringe displayed the best result in providing a steady food supply to bees. Finally, we also demonstrated that the addition of protein to the bees' diet could significantly increase the level of vitellogenin gene expression and improve bees' survival. This international collaborative study represents a critical step toward the improvement of cage designs and feeding regimes for honey bee laboratory experiments.
Abstract:
The literature on career adaptation is vast and based on a range of different measurement approaches. The present paper aims to explore how different operationalizations of career adaptability in terms of concern, control, curiosity, and confidence are related from a conceptual and empirical standpoint. Based on a cross-sectional analysis with 1260 German university students, we established that the adaptability resources of concern, control, curiosity, and confidence are significantly related to, but empirically distinct from, measures representing adapting in terms of career planning, career decision-making difficulties, career exploration, and occupational self-efficacy. In a follow-up survey six months later, we found that the career adaptability dimensions partially mediated the effects of adaptivity (i.e., core self-evaluations and proactivity) on planning, decision-making difficulties, exploration, and self-efficacy. Interestingly, in both analyses, there was no clear match between adaptability resources and theoretically corresponding aspects of career adapting in terms of behaviors, beliefs, and barriers. The results suggest that psychological career resources in terms of concern, control, curiosity, and confidence partially mediate the effects of more context-general, trait-like adaptivity on different career-specific behavioral forms of adapting.
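The partial mediation reported above is conventionally quantified with a product-of-coefficients decomposition. A minimal sketch for one adaptivity predictor, one adaptability mediator, and one adapting outcome (simulated data; names and coefficients are illustrative, not the study's estimates):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 1260
    adaptivity = rng.normal(size=n)                   # e.g., core self-evaluations
    concern = 0.5 * adaptivity + rng.normal(size=n)   # adaptability resource
    planning = 0.3 * adaptivity + 0.4 * concern + rng.normal(size=n)

    # Path a: adaptivity -> mediator
    a = sm.OLS(concern, sm.add_constant(adaptivity)).fit().params[1]
    # Paths b and c': outcome regressed on mediator and predictor jointly
    X = sm.add_constant(np.column_stack([adaptivity, concern]))
    fit = sm.OLS(planning, X).fit()
    direct, b = fit.params[1], fit.params[2]
    print(f"indirect effect (a*b) = {a * b:.3f}, direct effect (c') = {direct:.3f}")

A nonzero direct effect alongside a nonzero indirect effect corresponds to the partial (rather than full) mediation pattern the abstract describes.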
Abstract:
Lumbar spinal instability (LSI) is a common spinal disorder and can be associated with substantial disability. The concept of defining clinically relevant classifications of disease, or 'target conditions', is used in diagnostic research. Applying this concept to LSI, we hypothesize that a set of clinical and radiological criteria can be developed to identify patients with this target condition who are at high risk of 'irreversible' decompensated LSI, for whom surgery becomes the treatment of choice. In LSI, structural deterioration of the lumbar disc initiates a degenerative cascade of segmental instability. Over time, radiographic signs become visible: traction spurs, facet joint degeneration, misalignment, stenosis, olisthesis, and de novo scoliosis. Ligaments, joint capsules, and local and distant musculature are the functional elements of the lumbar motion segment. Influenced by non-functional factors, these functional elements allow compensation of the degeneration of the motion segment. Compensation may occur at each step of the degenerative cascade but cannot reverse it. However, compensation of LSI may lead to an alleviation or resolution of clinical symptoms. Conversely, the target condition of decompensated LSI may cause the new occurrence of symptoms and pain. Functional compensation and decompensation are subject to numerous factors that can change, which makes estimating an individual's long-term prognosis difficult. Compensation and decompensation may influence radiographic signs of degeneration; for example, the degree of misalignment and segmental angulation caused by LSI is influenced by the tonus of the local musculature. This conceptual model of compensation/decompensation may help resolve the debate on the functional and psychosocial factors that influence low back pain and help establish a new definition of non-specific low back pain. Individual differences between identical structural disorders could be explained by compensated or decompensated LSI leading to changes in clinical symptoms and pain. Future spine surgery will have to carefully define and measure the functional aspects of LSI, e.g. to identify a point of no return beyond which multidisciplinary interventions no longer allow re-compensation and surgery becomes the treatment of choice.
Abstract:
XENON is a dark matter direct detection project, consisting of a time projection chamber (TPC) filled with liquid xenon as the detection medium. The construction of the next-generation detector, XENON1T, is presently taking place at the Laboratori Nazionali del Gran Sasso (LNGS) in Italy. It aims at a sensitivity to spin-independent cross sections of 2 × 10⁻⁴⁷ cm² for WIMP masses around 50 GeV/c², which requires a background reduction by two orders of magnitude compared to XENON100, the current-generation detector. An active system that is able to tag muons and muon-induced backgrounds is critical for this goal. A water Cherenkov detector of ~10 m height and diameter has therefore been developed, equipped with 8-inch photomultipliers and clad with a reflective foil. We present the design and optimization study for this detector, which has been carried out with a series of Monte Carlo simulations. The muon veto will reach very high detection efficiencies for muons (>99.5%) and for showers of secondary particles from muon interactions in the rock (>70%). Similar efficiencies will be obtained for XENONnT, the upgrade of XENON1T, which will later improve the WIMP sensitivity by another order of magnitude. With the Cherenkov water shield studied here, the background from muon-induced neutrons in XENON1T is negligible.
Abstract:
Numerous studies have reported a strong link between working memory capacity (WMC) and fluid intelligence (Gf), although views differ with respect to how closely the two constructs are related. In the present study, we used a WMC task with five levels of task demands to assess the relationship between WMC and Gf by means of a new methodological approach referred to as fixed-links modeling. Fixed-links models belong to the family of confirmatory factor analysis (CFA) and are of particular interest for experimental, repeated-measures designs. With this technique, processes systematically varying across task conditions can be disentangled from processes unaffected by the experimental manipulation. Proceeding from the assumption that the experimental manipulation in a WMC task leads to increasing demands on WMC, the processes systematically varying across task conditions can be assumed to be WMC-specific. Processes not varying across task conditions, on the other hand, are probably independent of WMC. Fixed-links models allow these two kinds of processes to be represented by two independent latent variables. In contrast to traditional CFA, where a common latent variable is derived from the different task conditions, fixed-links models facilitate a more precise, purified representation of the WMC-related processes of interest. By using fixed-links modeling to analyze data from 200 participants, we identified a non-experimental latent variable, representing processes that remained constant irrespective of the WMC task conditions, and an experimental latent variable reflecting processes that varied as a function of the experimental manipulation. This latter variable represents the increasing demands on WMC and, hence, was considered a purified measure of WMC controlled for the constant processes. Fixed-links modeling showed that both the purified measure of WMC (β = .48) and the constant processes involved in the task (β = .45) were related to Gf. Taken together, these two latent variables explained the same portion of variance in Gf as a single latent variable obtained by traditional CFA (β = .65), indicating that traditional CFA overestimates the effective relationship between WMC and Gf. Thus, fixed-links modeling provides a feasible method for a more valid investigation of the functional relationship between specific constructs.
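The core idea of fixed-links modeling, separating a component that is constant across the five task conditions from one that grows with task demand by fixing the factor loadings, can be illustrated outside a full CFA with a per-person intercept/slope decomposition. A minimal sketch on simulated data (a simplified stand-in, not the authors' actual model):

    import numpy as np

    rng = np.random.default_rng(2)
    n, k = 200, 5                      # participants, levels of task demand
    constant = rng.normal(size=n)      # load-independent processes
    wmc = rng.normal(size=n)           # WMC-specific, load-dependent processes
    gf = 0.45 * constant + 0.48 * wmc + rng.normal(scale=0.5, size=n)

    # Condition score = constant component + WMC component scaled by demand level
    loads = np.arange(1, k + 1)
    scores = (constant[:, None] + wmc[:, None] * loads
              + rng.normal(scale=0.8, size=(n, k)))

    # Fixed 'loadings': a column of ones (constant factor) and the demand levels
    X = np.column_stack([np.ones(k), loads.astype(float)])
    coef, *_ = np.linalg.lstsq(X, scores.T, rcond=None)
    intercepts, slopes = coef          # per-person estimates of the two components

    print("r(constant component, Gf):", round(np.corrcoef(intercepts, gf)[0, 1], 2))
    print("r(WMC component, Gf):     ", round(np.corrcoef(slopes, gf)[0, 1], 2))

Correlating the two recovered components with Gf separately mirrors how the purified WMC variable and the constant-process variable each contribute to the Gf relationship.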
Abstract:
The fatality risk caused by avalanches on road networks can be analysed using a long-term approach, resulting in a mean value of risk, or with emphasis on short-term fluctuations due to the temporal variability of both the hazard potential and the damage potential. In this study, the approach for analysing the long-term fatality risk was adapted to model the highly variable short-term risk. The emphasis was on the temporal variability of the damage potential and the related risk peaks. For defined hazard scenarios resulting from classified amounts of snow accumulation, the fatality risk was calculated by modelling the hazard potential and observing the traffic volume. The avalanche occurrence probability was calculated using a statistical relationship between new snow height and observed avalanche releases. The number of persons at risk was determined from the recorded traffic density. The method yielded a value for the fatality risk within the observed time frame for the studied road segment. The long-term fatality risk due to snow avalanches, as well as the short-term fatality risk, was compared to the average fatality risk due to traffic accidents. The application of the method showed that the long-term avalanche risk is lower than the fatality risk due to traffic accidents. The analyses of short-term avalanche-induced fatality risk revealed risk peaks 50 times higher than the statistical accident risk. Apart from situations with a high hazard level and high traffic density, risk peaks result both from a high hazard level combined with a low traffic density and from a high traffic density combined with a low hazard level. This provides evidence for the importance of the temporal variability of the damage potential for risk simulations on road networks. The assumed dependence of the risk calculation on the sum of precipitation within three days is a simplified model; thus, further research is needed for an improved determination of the diurnal avalanche probability. Nevertheless, the presented approach may contribute a conceptual step towards risk-based decision-making in risk management.
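The short-term risk calculation sketched above multiplies a scenario-conditional avalanche probability by the number of persons exposed at the observed traffic density. A minimal sketch with purely hypothetical numbers (none are values from the study):

    # All numbers below are illustrative placeholders, not study values.

    # P(avalanche reaching the road per hour | 3-day new-snow class)
    p_avalanche = {"low": 1e-6, "moderate": 5e-5, "high": 5e-4}

    def hourly_fatality_risk(snow_class, vehicles_per_hour,
                             segment_km=1.0, speed_kmh=60.0,
                             occupants_per_vehicle=1.5, lethality=0.2):
        # Mean number of persons inside the endangered segment at any instant
        persons_exposed = (vehicles_per_hour * (segment_km / speed_kmh)
                           * occupants_per_vehicle)
        return p_avalanche[snow_class] * persons_exposed * lethality

    # Risk peak: high hazard level combined with high traffic density ...
    print(hourly_fatality_risk("high", vehicles_per_hour=800))
    # ... but dense traffic can also produce peaks at a lower hazard level
    print(hourly_fatality_risk("moderate", vehicles_per_hour=1500))

Because the hazard term and the exposure term enter multiplicatively, a peak in either one can dominate the instantaneous risk, which is the mechanism behind the risk peaks described in the abstract.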