876 results for Allocation of resources
Abstract:
The concept of platform switching has been introduced to implant dentistry based on clinical observations of reduced peri-implant crestal bone loss. However, published data are controversial, and most studies are limited to 12 months. The aim of the present randomized clinical trial was to test the hypothesis that platform switching has a positive impact on crestal bone-level changes after 3 years. Two implants with a diameter of 4 mm were inserted crestally in the posterior mandible of 25 patients. The intraindividual allocation of the platform-switched (3.3-mm platform) and standard (4-mm platform) implants was randomized. After 3 months of submerged healing, single-tooth crowns were cemented. Patients were followed up at short intervals for monitoring of healing and oral hygiene. Statistical analysis of the influence of time and platform type on bone levels employed the Brunner-Langer model. At 3 years, the mean radiographic peri-implant bone loss was 0.69 ± 0.43 mm (platform switching) and 0.74 ± 0.57 mm (standard platform). The mean intraindividual difference was 0.05 ± 0.58 mm (95% confidence interval: -0.19, 0.29). Crestal bone-level alteration depended on time (p < .001) but not on platform type (p = .363). The present randomized clinical trial could not confirm the hypothesis of reduced peri-implant crestal bone loss when implants were restored according to the concept of platform switching.
Abstract:
This article addresses the issue of kriging-based optimization of stochastic simulators. Many of these simulators depend on factors that tune the level of precision of the response, with gains in accuracy coming at the price of computational time. The contribution of this work is twofold: first, we propose a quantile-based criterion for the sequential design of experiments, in the fashion of the classical expected improvement criterion, which allows an elegant treatment of heterogeneous response precisions. Second, we present a procedure for allocating the computational time given to each measurement, allowing a better distribution of the computational effort and increased efficiency. Finally, the optimization method is applied to an original application in nuclear criticality safety. This article has supplementary material available online. The proposed criterion is available in the R package DiceOptim.
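As a rough, self-contained illustration of how a quantile-based criterion can fold heterogeneous response precisions into the search, the Python sketch below scores candidate points by the expected improvement of a pessimistic predictive quantile. It is a simplified stand-in for, not a reproduction of, the published criterion in DiceOptim, and all posterior and noise values are hypothetical:

```python
# Simplified sketch in the spirit of a quantile-based improvement
# criterion for noisy simulators (not the paper's exact formula; see
# the R package DiceOptim for the published implementation).
import numpy as np
from scipy.stats import norm

def quantile_improvement(mu, sigma, tau, q_best, beta=0.9):
    """Expected improvement of the predictive beta-quantile.

    mu, sigma : kriging posterior mean and sd at candidate points
    tau       : observation-noise sd implied by the compute budget
    q_best    : best (lowest) quantile observed so far (minimization)
    """
    z = norm.ppf(beta)
    q = mu + z * np.sqrt(sigma**2 + tau**2)   # pessimistic quantile
    u = (q_best - q) / np.maximum(sigma, 1e-12)
    return (q_best - q) * norm.cdf(u) + sigma * norm.pdf(u)

# Hypothetical posterior over five candidates: more compute (smaller
# tau) makes a candidate's quantile, and hence its criterion, better.
mu    = np.array([1.0, 0.8, 0.9, 1.2, 0.7])
sigma = np.array([0.3, 0.4, 0.1, 0.2, 0.5])
tau   = np.array([0.2, 0.2, 0.05, 0.1, 0.4])
crit = quantile_improvement(mu, sigma, tau, q_best=1.1)
print("next evaluation at candidate", int(np.argmax(crit)))
```

Note how the noise term tau enters the quantile but not the exploration term, so candidates that received more compute are judged on sharper estimates, which is the intuition behind allocating budget unevenly across measurements.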
Abstract:
Background: WHO's 2013 revisions to its Consolidated Guidelines on antiretroviral drugs recommend routine viral load monitoring, rather than clinical or immunological monitoring, as the preferred monitoring approach on the basis of clinical evidence. However, HIV programmes in resource-limited settings require guidance on the most cost-effective use of resources in view of other competing priorities such as expansion of antiretroviral therapy coverage. We assessed the cost-effectiveness of alternative patient monitoring strategies. Methods: We evaluated a range of monitoring strategies, including clinical, CD4 cell count, and viral load monitoring, alone and together, at different frequencies and with different criteria for switching to second-line therapies. We used three independently constructed and validated models simultaneously. We estimated costs on the basis of resource use projected in the models and associated unit costs; we quantified impact as disability-adjusted life years (DALYs) averted. We compared alternatives using incremental cost-effectiveness analysis. Findings: All models show that clinical monitoring delivers significant benefit compared with a hypothetical baseline scenario with no monitoring or switching. Regular CD4 cell count monitoring confers a benefit over clinical monitoring alone, at an incremental cost that makes it affordable in more settings than viral load monitoring, which is currently more expensive. Viral load monitoring without CD4 cell count every 6-12 months provides the greatest reductions in morbidity and mortality, but incurs a high cost per DALY averted, resulting in lost opportunities to generate health gains if implemented instead of increasing antiretroviral therapy coverage or expanding antiretroviral therapy eligibility. Interpretation: The priority for HIV programmes should be to expand antiretroviral therapy coverage, first at CD4 counts below 350 cells per μL, and then at counts below 500 cells per μL, using lower-cost clinical or CD4 monitoring. At current costs, viral load monitoring should be considered only after high antiretroviral therapy coverage has been achieved. Point-of-care technologies and other factors reducing costs might make viral load monitoring more affordable in future. Funding: Bill & Melinda Gates Foundation, WHO.
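The strategy comparison above rests on incremental cost-effectiveness ratios (ICERs): the extra cost per extra DALY averted when moving from one strategy to the next most effective one. A minimal sketch with hypothetical cost and DALY figures (not values from the study):

```python
# Illustrative incremental cost-effectiveness ratio (ICER) calculation.
# All figures are hypothetical placeholders, not results from the study.

strategies = [
    # (name, total cost in USD, DALYs averted) per 1,000 patient-years
    ("no monitoring",         100_000,  0.0),
    ("clinical monitoring",   120_000, 40.0),
    ("CD4 monitoring",        150_000, 55.0),
    ("viral load monitoring", 260_000, 62.0),
]

# Sort by effectiveness, then compute the ICER of each strategy
# relative to the next-less-effective alternative.
strategies.sort(key=lambda s: s[2])
for (name0, c0, e0), (name1, c1, e1) in zip(strategies, strategies[1:]):
    icer = (c1 - c0) / (e1 - e0)
    print(f"{name1} vs {name0}: ${icer:,.0f} per DALY averted")
```

With figures shaped like these, each step up the effectiveness ladder costs more per DALY averted than the last, which is the pattern behind the recommendation to fund coverage expansion before viral load monitoring.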
Abstract:
Body weight (BW) and blood pressure (BP) have a close relationship, which has been attributed to hormonal changes. No previous study has evaluated the effect of wearing an external weight vest on BP to determine whether a simple mechanical link exists between BW and BP. Seventeen healthy volunteers underwent weight reduction (WR) through caloric restriction. Before and after WR, BW, body fat percentage and BP at rest and during exercise were measured. Before and after WR, exercise testing was performed twice, with a weight vest (10 kg) randomly allocated to one of the two tests. Linear regression was used to detect independent associations between BP and the weight vest, BW and body fat percentage. BW decreased from 89.4 ± 15.4 kg to 79.1 ± 14.0 kg following WR (P<0.001). WR led to significant decreases in BP at rest (from 130.0/85.9 mm Hg to 112.5/77.8 mm Hg, P<0.001 for systolic and diastolic BPs) and during exercise. The weight vest significantly increased BP at rest (to 136.1/90.7 mm Hg before and 125.8/84.6 mm Hg after WR) and during exercise. Linear regression analysis identified an independent association between the weight vest and BP (P=0.006 for systolic BP and P=0.009 for diastolic BP at rest). This study demonstrates that wearing an external weight vest has immediate effects on BP at rest and during exercise, independent of BW or body fat. More research is needed to understand the physiological mechanisms linking weight and BP.
Abstract:
An often-cited reason for studying the process of invasion by alien species is that the understanding sought can be used to mitigate the impacts of the invaders. Here, we present an analysis of the correlates of local impacts of established alien bird and mammal species in Europe, using a recently described metric to quantify impact. Large-bodied, habitat-generalist bird and mammal species that are widespread in their native range have the greatest impacts in their alien European ranges, supporting our hypothesis that surrogates for the breadth and the amount of resources a species uses are good indicators of its impact. However, not all surrogates are equally suitable. Impacts are generally greater for mammal species giving birth to larger litters but, in contrast, greater for bird species laying smaller clutches. There is no effect of diet breadth on impacts in birds or mammals. On average, mammals have higher impacts than birds. However, the relationships between impact and several traits show common slopes for birds and mammals, and the relationships between impact and body mass and latitude do not differ between birds and mammals. These results may help to anticipate which species would have large impacts if introduced, and so direct efforts to prevent such introductions.
Abstract:
During the last decade, medical education in the German-speaking world has been striving to become more practice-oriented. This is currently being achieved in many schools through the implementation of simulation-based instruction in Skills Labs. Simulators are thus an essential part of this type of medical training, and their acquisition and operation by a Skills Lab require a large outlay of resources. Therefore, the Practical Skills Committee of the Medical Education Society (GMA) introduced a new project, which aims to improve the flow of information between the Skills Labs and enable a transparent assessment of the simulators via an online database (the Simulator Network).
Abstract:
In several regions of the world, climate change is expected to have severe impacts on agricultural systems. Changes in land management are one way to adapt to future climatic conditions, including land-use changes and local adjustments of agricultural practices. In previous studies, options for adaptation have mostly been explored by testing alternative scenarios. Systematic explorations of land management possibilities using optimization approaches have so far mainly been restricted to studies of land and resource management under constant climatic conditions. In this study, we bridge this gap and exploit the benefits of multi-objective regional optimization for identifying optimum land management adaptations to climate change. We design a multi-objective optimization routine that integrates a generic crop model and considers two climate scenarios for 2050 in a meso-scale catchment on the Swiss Central Plateau with already limited water resources. The results indicate that adaptation will be necessary in the study area to cope with a 0–10 % decrease in productivity, a 25–35 % increase in soil loss, and a 30–45 % increase in N-leaching. Adaptation options identified here exhibit conflicts between productivity and environmental goals, but compromises are possible. Necessary management changes include (i) adjustments of crop shares, i.e. increasing the proportion of early harvested winter cereals at the expense of irrigated spring crops, (ii) widespread use of reduced tillage, (iii) allocation of irrigated areas to soils with low water-retention capacity at lower elevations, and (iv) conversion of some pre-alpine grasslands to croplands.
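The productivity-versus-environment trade-off described here is the defining feature of multi-objective optimization: the outcome is a Pareto set of compromises rather than a single optimum. The sketch below filters candidate management options to the non-dominated set; the option names and scores are hypothetical illustrations, not values from the paper:

```python
# Toy Pareto filter: keep management options not dominated on both
# objectives (maximize yield, minimize environmental damage).
# All options and numbers are hypothetical illustrations.

options = {
    # name: (relative yield, environmental damage index)
    "baseline rotation":    (1.00, 1.00),
    "more winter cereals":  (0.97, 0.80),
    "reduced tillage":      (0.95, 0.60),
    "irrigate light soils": (1.02, 0.90),
    "convert grassland":    (1.05, 1.20),
}

def dominates(a, b):
    """a dominates b if it is at least as good on both objectives
    (higher yield, lower damage) and strictly better on one."""
    return (a[0] >= b[0] and a[1] <= b[1]) and (a != b)

pareto = {
    name: scores
    for name, scores in options.items()
    if not any(dominates(other, scores) for other in options.values())
}
print(pareto)
```

In this toy instance the baseline is dominated, while the remaining options survive as genuine compromises: none improves yield without worsening the environmental score, mirroring the conflicts reported in the abstract.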
Abstract:
We study a real-world scheduling problem arising in the context of rolling ingots production. First, we review the production process and discuss peculiarities that have to be observed when scheduling a given set of production orders on the production facilities. We then show how to model this scheduling problem using prescribed time lags between operations, different kinds of resources, and sequence-dependent changeovers. A branch-and-bound solution procedure is presented in the second part. The basic principle is to relax the resource constraints by assuming infinite resource availability. Resulting resource conflicts are then resolved step by step by introducing precedence relationships among operations competing for the same resources. The algorithm has been implemented as a beam search heuristic enumerating alternative sets of precedence relationships.
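A minimal sketch of this relax-and-resolve idea, under strong simplifying assumptions (one resource of unit capacity, no time lags or changeovers, hypothetical operations), relaxes the resource constraint, schedules by earliest starts, and branches on the first conflicting pair:

```python
# Toy sketch of the relax-and-resolve principle: relax the resource
# constraint, schedule by earliest starts, then resolve conflicts by
# branching on precedence relations. Single unit-capacity resource and
# hypothetical data; the paper's procedure additionally handles time
# lags, several resource kinds, and sequence-dependent changeovers.

from itertools import combinations

durations = {"A": 3, "B": 2, "C": 4}   # hypothetical operations
precedences = {("A", "B")}             # A must finish before B

def earliest_starts(prec):
    """Longest-path earliest start times for the relaxed problem."""
    start = {op: 0 for op in durations}
    changed = True
    while changed:
        changed = False
        for i, j in prec:
            if start[j] < start[i] + durations[i]:
                start[j] = start[i] + durations[i]
                changed = True
    return start

best = [float("inf")]

def branch(prec):
    start = earliest_starts(prec)
    makespan = max(start[op] + durations[op] for op in durations)
    if makespan >= best[0]:
        return                              # bound: prune this branch
    # Find the first pair of operations that overlap on the resource.
    for i, j in combinations(sorted(durations, key=start.get), 2):
        if start[i] < start[j] + durations[j] and start[j] < start[i] + durations[i]:
            branch(prec | {(i, j)})         # resolve: i before j
            branch(prec | {(j, i)})         # ... or j before i
            return
    best[0] = makespan                      # conflict-free schedule
    print("schedule:", start, "makespan:", makespan)

branch(precedences)
```

The bound step, pruning branches whose relaxed makespan already matches the best schedule found, keeps the enumeration tractable; the beam-search heuristic mentioned in the abstract would additionally keep only the most promising alternative precedence sets at each level.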
Abstract:
• Symbioses between plant roots and mycorrhizal fungi are thought to enhance plant uptake of nutrients through a favourable exchange for photosynthates. Ectomycorrhizal fungi are considered to play this vital role for trees in nitrogen (N)-limited boreal forests.
• We followed symbiotic carbon (C)–N exchange in a large-scale boreal pine forest experiment by tracing ¹³CO₂ absorbed through tree photosynthesis and ¹⁵N injected into a soil layer in which ectomycorrhizal fungi dominate the microbial community.
• We detected little ¹⁵N in tree canopies, but high levels in soil microbes and in mycorrhizal root tips, illustrating effective soil N immobilization, especially in late summer, when tree belowground C allocation was high. Additions of N fertilizer to the soil before labelling shifted the incorporation of ¹⁵N from soil microbes and root tips to tree foliage.
• These results were tested in a model for C–N exchange between trees and mycorrhizal fungi, suggesting that ectomycorrhizal fungi transfer small fractions of absorbed N to trees under N-limited conditions, but larger fractions if more N is available. We suggest that greater allocation of C from trees to ectomycorrhizal fungi increases N retention in soil mycelium, driving boreal forests towards more severe N limitation at low N supply.
Abstract:
The Contested Floodplain tells the story of institutional changes in the management of common-pool resources (pasture, wildlife, and fisheries) among Ila and Balundwe agro-pastoralists and Batwa fishermen in the Kafue Flats of southern Zambia. It explains how and why a once rich floodplain area, managed under local common-property regimes, has become a poor man's place and a degraded resource area. Based on social-anthropological field research, the book explains how institutions that once regulated communal access to resources have given way to state property, open access, or privatization. The study focuses on historic developments from pre-colonial and colonial times up to today. Haller shows how the commons were well regulated by local institutions in the past, often embedded in religious belief systems. He then explains the transformation from common property to state property since colonial times. When the state is unable to provide well-functioning institutions due to a lack of revenue, it contributes to de facto open access and degradation of the commons. The Zambian copper-based economy has faced crisis since 1975, and many Zambians have to look for economic alternatives and find ways to profit from the lack of state control (a paradox of the present-absent state). While the state is absent, external actors use the ideology of citizenship to justify free use of resources during conflicts with local people. Within Zambian communities as well, floodplain resources are highly contested, which is illustrated through conflicts over a proposed irrigation scheme in the area.
Abstract:
The desire to promote efficient allocation of health resources and effective patient care has focused attention on home care as an alternative to acute hospital service. In particular, clinical home care is suggested as a substitute for the final days of hospital stay. This dissertation evaluates the relationship between hospital and home care services for residents of British Columbia, Canada, beginning in 1993/94, using data from the British Columbia Linked Health Database. Lengths of stay for patients referred to home care following hospital discharge are compared to those for patients not referred to home care. Ordinary least squares regression analysis adjusts for age, gender, admission severity, comorbidity, complications, income, and other patient, physician, and hospital characteristics. Home care clients tend to have longer stays in hospital than patients not referred to home care (β = 2.54, p = 0.0001). Longer hospital stays are evident for all home care client groups as well as for both older and younger patients. Sensitivity analyses for referral time to direct care and extreme lengths of stay are consistent with these findings. Two-stage regression analysis indicates that selection bias is not significant. Patients referred to clinical home care also have different health service utilization following discharge compared to patients not referred to home care. Home care nursing clients use more medical services to complement home care. Rehabilitation clients initially substitute home care for physiotherapy services but later are more likely to be admitted to residential care. All home care clients are more likely to be readmitted to hospital during the one-year follow-up period. There is also a strong complementary association between direct care referral and homemaker support. Rehabilitation clients have a greater risk of dying during the year following discharge. These results suggest that home care is currently used as a complement rather than a substitute for some acute health services. Organizational and resource issues may contribute to the longer stays of home care clients. Program planning and policies are required if home care is to provide an effective substitute for acute hospital days.
Abstract:
A population-genetic analysis is performed of a two-locus two-allele model in which the primary locus has a major effect on a quantitative trait that is under frequency-dependent disruptive selection caused by intraspecific competition for a continuum of resources. The modifier locus determines the degree of dominance at the trait level. We establish the conditions under which a modifier allele can invade and under which it becomes fixed if sufficiently frequent. In general, these are not equivalent because an unstable internal equilibrium may exist, and the condition for successful invasion of the modifier is more restrictive than that for eventual fixation from an already high frequency. However, successful invasion implies global fixation, i.e., fixation from any initial condition. Modifiers of large effect can become fixed, and also invade, in a wider parameter range than modifiers of small effect. We also study modifiers with a direct, frequency-independent deleterious fitness effect. We show that they can invade if they induce a sufficiently high level of dominance and if disruptive selection on the ecological trait is strong enough. For deleterious modifiers, successful invasion no longer implies global fixation because they can become stuck at an intermediate frequency due to a stable internal equilibrium. Although the conditions for invasion, and for fixation if sufficiently frequent, are independent of the linkage relation between the two loci, the rate of spread depends strongly on it. The present study provides further support for the view that the evolution of dominance may be an efficient mechanism to remove unfit heterozygotes that are maintained by balancing selection. It also demonstrates that an invasion analysis of mutants of very small effect is insufficient to obtain a full understanding of the evolutionary dynamics under frequency-dependent selection.
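The core intuition, that dominance rescues the heterozygote from disruptive selection, can be illustrated with a deliberately simplified single-locus toy model (not the paper's two-locus analysis; all parameter values below are hypothetical). Competition is strongest between similar phenotypes, so an intermediate heterozygote carries the heaviest competitive load; raising the degree of dominance h moves its phenotype toward a homozygote:

```python
# Toy single-locus illustration (not the paper's two-locus model):
# frequency-dependent competition for a continuum of resources
# penalizes the intermediate heterozygote phenotype; raising the
# degree of dominance h removes its fitness disadvantage.
import numpy as np

def genotype_fitness(phenos, freqs, sigma_k=2.0, sigma_c=0.6):
    """Gaussian resource (carrying-capacity) curve divided by a
    Gaussian competition load from the current phenotype mix."""
    K = np.exp(-phenos**2 / (2 * sigma_k**2))
    load = np.array([
        np.sum(freqs * np.exp(-(p - phenos)**2 / (2 * sigma_c**2)))
        for p in phenos
    ])
    return K / load

for h in (0.5, 0.75, 1.0):                       # degree of dominance of A
    phenos = np.array([-1.0, -1.0 + 2 * h, 1.0]) # aa, Aa, AA phenotypes
    p = 0.5                                      # allele frequency of A
    for _ in range(2000):                        # iterate to equilibrium
        freqs = np.array([(1 - p) ** 2, 2 * p * (1 - p), p ** 2])
        w = genotype_fitness(phenos, freqs)
        w_bar = freqs @ w
        p = (p * p * w[2] + p * (1 - p) * w[1]) / w_bar
    print(f"h={h:.2f}  p*={p:.3f}  "
          f"heterozygote relative fitness w_Aa/w_bar={w[1] / w_bar:.3f}")
</code>
```

In this sketch the heterozygote's fitness relative to the population mean rises towards 1 as h approaches 1, mirroring the argument that the evolution of dominance removes unfit intermediate heterozygotes maintained by balancing selection.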
Abstract:
We study the evolution of higher levels of dominance as a response to negative frequency-dependent selection. In contrast to previous studies, we focus on the effect of assortative mating on the evolution of dominance under frequency-dependent intraspecific competition. We analyze a two-locus two-allele model in which the primary locus has a major effect on a quantitative trait that is under a mixture of frequency-independent stabilizing selection, density-dependent selection, and frequency-dependent selection caused by intraspecific competition for a continuum of resources. The second (modifier) locus determines the degree of dominance at the trait level. Additionally, the population mates assortatively with respect to similarities in the ecological trait. Our analysis shows that the parameter region in which dominance can become established shrinks when small levels of assortment are introduced, and the degree of dominance that can be established decreases as well. In contrast, if assortment is intermediate, sexual selection for extreme types can arise, which leads to the evolution of higher levels of dominance than under random mating. For modifiers with large effects, intermediate levels of assortative mating are most favorable for the evolution of dominance, and the speed of fixation can even be higher than under random mating.
Abstract:
The goal of the current investigation was to compare two monitoring processes (judgments of learning [JOLs] and confidence judgments [CJs]) and their corresponding control processes (allocation of study time and selection of answers to maximize accuracy, respectively) in 5- to 7-year-old children (N=101). Children learned the meaning of Japanese characters and provided JOLs after a study phase and CJs after a memory test. They were given the opportunity to control their learning in self-paced study phases, and to control their accuracy by placing correct answers into a treasure chest and incorrect answers into a trash can. All three age groups gave significantly higher CJs for correct than for incorrect answers, with no age-related differences in the magnitude of this effect, suggesting robust metacognitive monitoring skills in children as young as 5. Furthermore, a link between JOLs and study time was found in the 6- and 7-year-olds, such that children spent more time studying items with low JOLs than items with high JOLs. Also, 6- and 7-year-olds, but not 5-year-olds, spent more time studying difficult items than easier items. Moreover, age-related improvements were found in children's use of CJs to guide their selection of answers: although children as young as 5 placed their most confident answers in the treasure chest and their least confident answers in the trash can, this pattern was more robust in older children. Overall, the results support the view that some metacognitive judgments may be acted upon with greater ease than others by young children.
Abstract:
Recent studies have demonstrated that the improved prognosis derived from resection of gliomas largely depends on the extent and quality of the resection, making maximum but safe resection the ultimate goal. At the same time, technical innovations and refined neurosurgical methods have rapidly improved efficacy and safety. Because gliomas derive from intrinsic brain cells, they often cannot be visually distinguished from the surrounding brain tissue during surgery, and various technologies have recently been introduced to delineate the full extent of their solid compartment. However, radical resection of infiltrative glioma puts neurological function at risk, with potentially detrimental consequences for patients' survival and quality of life. The allocation of neurological functions within the brain varies between patients and may undergo additional changes in the presence of a tumour (brain plasticity), making intra-operative localisation of eloquent areas mandatory for the preservation of essential brain functions. Combining methods that visually distinguish tumour tissue with those that detect tissues responsible for critical functions now enables resection of tumours in brain regions that were previously considered off-limits, benefiting patients by enabling a more radical resection while simultaneously lowering the risk of neurological deficits. Here we review recent and expected developments in microsurgery for glioma and their respective benefits.