835 results for common and mixed costs


Relevance: 100.00%

Abstract:

Childhood anxiety is a common and serious condition. The past decade has seen an increase in treatment-focussed research, with recent trials tending to give greater attention to parents in the treatment process. This review examines the efficacy of family-based cognitive behaviour therapy and attempts to delineate some of the factors that might have an impact on its efficacy. The choice and timing of outcome measure, age and gender of the child, level of parental anxiety, severity and type of child anxiety, and treatment format and content are scrutinised. The main conclusions are necessarily tentative, but it seems likely that Family Cognitive Behaviour Therapy (FCBT) is superior to no treatment and, for some outcome measures, also superior to Child Cognitive Behaviour Therapy (CCBT). Where FCBT is successful, the results are consistently maintained at follow-up. It appears that where a parent is anxious, and this is not addressed, outcomes are poorer. However, for children of anxious parents, FCBT is probably more effective than CCBT. What is most clear is that large, well-designed studies, examining these factors alone and in combination, are now needed.

Relevance: 100.00%

Abstract:

The aim of this study was to investigate the widely held, but largely untested, view that implicit memory (repetition priming) reflects an automatic form of retrieval. Specifically, in Experiment 1 we explored whether a secondary task (syllable monitoring), performed during retrieval, would disrupt performance on explicit (cued recall) and implicit (stem completion) memory tasks equally. Surprisingly, although cued recall incurred substantial memory and secondary task costs when performed with the syllable-monitoring task, the same manipulation had no effect on stem completion priming or on secondary task performance. In Experiment 2 we demonstrated that even when using a particularly demanding version of the stem completion task that did incur secondary task costs, the corresponding disruption to implicit memory performance was minimal. Collectively, the results are consistent with the view that implicit memory retrieval requires little or no processing capacity and does not appear to be susceptible to the effects of dividing attention at retrieval.

Relevance: 100.00%

Abstract:

The usefulness of motor subtypes of delirium is unclear due to inconsistency in subtyping methods and a lack of validation with objective measures of activity. The activity of 40 patients was measured over 24 h with a discrete accelerometer-based activity monitor. The continuous wavelet transform (CWT), with various mother wavelets, was applied to accelerometry data from three randomly selected patients with DSM-IV delirium who were readily assigned to the hyperactive, hypoactive, and mixed motor subtypes. A classification tree used the periods of overall movement, as measured by the discrete accelerometer-based monitor, as the determining factors for classifying these delirious patients. The data used to create the classification tree were based upon the minimum, maximum, standard deviation, and number of coefficient values generated over a range of scales by the CWT. The classification tree was subsequently used to define the motor subtypes of the remaining patients. The use of a classification system shows how delirium subtypes can be categorized in relation to overall motoric behavior, and the system successfully defined the motor subtypes of the other patients. Motor subtypes of delirium defined by observed ward behavior differ in electronically measured activity levels.
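
As an illustration of the kind of analysis described above, the sketch below extracts the four CWT summary features named in the abstract (minimum, maximum, standard deviation and number of coefficients) from an activity trace and feeds them to a classification tree. It is a minimal sketch only: the wavelet, scale range, epoch length and the synthetic traces are assumptions, not details from the study.

import numpy as np
import pywt                                    # PyWavelets, for the CWT
from sklearn.tree import DecisionTreeClassifier

def cwt_features(activity, scales=np.arange(1, 65), wavelet="morl"):
    # Summarise a 1-D accelerometry trace by the four CWT statistics
    # mentioned in the abstract: min, max, std and number of coefficients.
    coefs, _ = pywt.cwt(activity, scales, wavelet)
    return np.array([coefs.min(), coefs.max(), coefs.std(), coefs.size])

# Hypothetical 24 h traces (1-min epochs) for the three reference patients
rng = np.random.default_rng(0)
traces = rng.normal(size=(3, 24 * 60))
subtypes = ["hyperactive", "hypoactive", "mixed"]

tree = DecisionTreeClassifier().fit([cwt_features(t) for t in traces], subtypes)

# The fitted tree can then be applied to the remaining patients' traces
print(tree.predict([cwt_features(rng.normal(size=24 * 60))]))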

Relevance: 100.00%

Abstract:

Molecular dynamics simulations of the events following the photodissociation of CO in the myoglobin mutant L29F, in which leucine 29 is replaced by phenylalanine, are reported. Using both classical and mixed quantum-classical molecular dynamics calculations, we observed the rapid motion of CO away from the distal heme pocket to other regions of the protein, in agreement with recent experimental results. The experimentally observed and calculated infrared spectra of CO after dissociation are also in good agreement. We compared the results with data from simulations of wild-type (WT) myoglobin. As the time resolution of experimental techniques increases, theoretical methods and models can be validated at the atomic scale by direct comparison with experiment.

Relevance: 100.00%

Abstract:

A rapid thiolytic degradation and cleanup procedure was developed for analyzing tannins directly in chlorophyll-containing sainfoin (Onobrychis viciifolia) plants. The technique proved suitable for complex tannin mixtures containing catechin, epicatechin, gallocatechin, and epigallocatechin flavan-3-ol units. The reaction time was standardized at 60 min to minimize the loss of structural information resulting from epimerization and degradation of terminal flavan-3-ol units. The results were evaluated by separate analysis of extractable and unextractable tannins, which accounted for 63.6–113.7% of the in situ plant tannins. It is of note that 70% aqueous acetone extracted tannins with a lower mean degree of polymerization (mDP) than was found for tannins analyzed in situ: extractable tannins had mDP values that were between 4 and 29 units lower. The method was validated by comparing results from individual and mixed sample sets. The tannin composition of different sainfoin accessions covered a range of mDP values from 16 to 83, procyanidin/prodelphinidin (PC/PD) ratios from 19.2/80.8 to 45.6/54.4, and cis/trans ratios from 74.1/25.9 to 88.0/12.0. This is the first high-throughput screening method suitable for analyzing condensed tannin contents and structural composition directly in green plant tissue.
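
For readers unfamiliar with the quantities reported above, the short worked example below shows how mDP and the PC/PD ratio are conventionally derived from thiolysis data (terminal flavan-3-ol units released unchanged, extension units released as benzyl mercaptan adducts). The unit amounts are invented for illustration and the formulae are the standard ones, not necessarily the exact calculation used in this study.

# Hypothetical unit yields (micromoles) from one thiolysed tannin sample
terminal  = {"catechin": 2.0, "epicatechin": 1.0,
             "gallocatechin": 3.0, "epigallocatechin": 4.0}
extension = {"catechin": 20.0, "epicatechin": 30.0,
             "gallocatechin": 90.0, "epigallocatechin": 150.0}

# mean degree of polymerisation: all units divided by terminal units
total_terminal = sum(terminal.values())
mDP = (total_terminal + sum(extension.values())) / total_terminal

# procyanidin (PC) units are (epi)catechin; prodelphinidin (PD) units are (epi)gallocatechin
pc = sum(terminal[u] + extension[u] for u in ("catechin", "epicatechin"))
pd = sum(terminal[u] + extension[u] for u in ("gallocatechin", "epigallocatechin"))

print(f"mDP = {mDP:.0f}")                                    # 30
print(f"PC/PD = {100*pc/(pc+pd):.1f}/{100*pd/(pc+pd):.1f}")  # 17.7/82.3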

Relevance: 100.00%

Abstract:

As the building industry moves towards low-impact buildings, research attention is being drawn to the reduction of carbon dioxide emissions and waste. From design and construction through to operation and demolition, various building materials are used throughout the whole building lifecycle, involving significant energy consumption and waste generation. Building Information Modelling (BIM) is emerging as a tool that can support holistic design decision-making for reducing embodied carbon and waste production across the building lifecycle. This study aims to establish a framework for assessing embodied carbon and waste underpinned by BIM technology. On the basis of a review of current research, the framework comprises functional modules for embodied carbon computation and waste estimation, a knowledge base of construction and demolition methods, a repository of building component information, and an inventory of construction materials’ energy and carbon. Through both static 3D model visualisation and dynamic modelling supported by the framework, embodied energy (carbon), waste and the associated costs can be analysed within the boundaries of cradle-to-gate, construction, operation, and demolition. The proposed holistic modelling framework makes it possible to analyse embodied carbon and waste from different building lifecycle perspectives, including the associated costs. It brings existing, segmented embodied carbon and waste estimation together into a unified model, so that interactions between parameters across the different building lifecycle phases can be better understood, thereby improving design-decision support for optimal low-impact building development. The applicability of the framework is expected to be developed and tested on industrial projects in the near future.
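
A minimal sketch of how the framework's modules could fit together is given below, assuming a simple quantity-times-factor model in which embodied carbon comes from a materials inventory and waste from a per-material wastage rate; all class names, factors and quantities are illustrative assumptions rather than elements of the proposed framework.

from dataclasses import dataclass

@dataclass
class Material:                    # entry in the materials energy/carbon inventory
    name: str
    carbon_factor: float           # kgCO2e per kg (cradle-to-gate)
    wastage_rate: float            # fraction wasted during construction

@dataclass
class Component:                   # quantity taken off the BIM component repository
    material: Material
    mass_kg: float

def embodied_carbon(components):
    # cradle-to-gate embodied carbon of the modelled design
    return sum(c.mass_kg * c.material.carbon_factor for c in components)

def construction_waste(components):
    # mass of material expected to be wasted on site
    return sum(c.mass_kg * c.material.wastage_rate for c in components)

concrete = Material("concrete", carbon_factor=0.10, wastage_rate=0.05)
steel    = Material("steel",    carbon_factor=1.40, wastage_rate=0.02)
design   = [Component(concrete, 250_000), Component(steel, 12_000)]

print(f"Embodied carbon: {embodied_carbon(design)/1000:.1f} tCO2e")
print(f"Construction waste: {construction_waste(design)/1000:.1f} t")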

Relevance: 100.00%

Abstract:

The stratospheric climate and variability from simulations by sixteen chemistry-climate models are evaluated. On average the polar night jet is well reproduced, though its variability is less well reproduced, with a large spread between models. Polar temperature biases are less than 5 K except in the Southern Hemisphere (SH) lower stratosphere in spring. The accumulated area of low temperatures responsible for polar stratospheric cloud formation is accurately reproduced for the Antarctic but underestimated for the Arctic. The shape and position of the polar vortex are well simulated, as is the tropical upwelling in the lower stratosphere. There is a wide model spread in the frequency of major sudden stratospheric warmings (SSWs), late biases in the breakup of the SH vortex, and a weak annual cycle in the zonal wind in the tropical upper stratosphere. Quantitatively, “metrics” indicate a wide spread in model performance for most diagnostics, with systematic biases in many and poorer performance in the SH than in the Northern Hemisphere (NH). Correlations were found in the SH between errors in the final warming, polar temperatures, the leading mode of variability, and jet strength, and in the NH between errors in polar temperatures, frequency of major SSWs, and jet strength. Models with a stronger quasi-biennial oscillation (QBO) have stronger tropical upwelling and a colder NH vortex. Both the qualitative and quantitative analyses indicate a number of common and long-standing model problems, particularly related to the simulation of the SH and of stratospheric variability.
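
As an illustration of what such a quantitative metric can look like (a sketch of one common grading approach, not necessarily the metric used in this evaluation), a model's climatological mean for a diagnostic can be compared with observations, scaled by observed interannual variability and clipped at zero:

def grade(model_mean, obs_mean, obs_std, n=3.0):
    # 1 = perfect agreement; 0 = a bias of n observed standard deviations or more
    return max(0.0, 1.0 - abs(model_mean - obs_mean) / (n * obs_std))

# Hypothetical SH polar lower-stratospheric spring temperatures (K)
print(grade(model_mean=198.0, obs_mean=202.0, obs_std=2.0))   # ~0.33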

Relevance: 100.00%

Abstract:

The orthodox approach to incentivising Demand Side Participation (DSP) programs is that utility losses from capital, installation and planning costs should be recovered under financial incentive mechanisms which aim to ensure that utilities have the right incentives to implement DSP activities. The recent national smart metering roll-out in the UK implies that this approach needs to be reassessed, since utilities will recover the capital costs associated with DSP technology through bills. This paper introduces a reward and penalty mechanism focusing on residential users. DSP planning costs are recovered through payments from those consumers who do not react to peak signals, while those consumers who do react are rewarded by paying lower bills. Because real-time incentives to residential consumers tend to fail due to the negligible amounts associated with the net gains (and losses) of individual users, in the proposed mechanism the regulator determines benchmarks which are matched against responses to signals and caps the level of rewards/penalties to avoid market distortions. The paper presents an overview of existing financial incentive mechanisms for DSP; introduces the reward/penalty mechanism aimed at fostering DSP under the hypothesis of a smart metering roll-out; considers the costs faced by utilities for DSP programs; assesses linear rate effects and value changes; introduces compensatory weights for those consumers who have physical or financial impediments; and shows findings based on simulation runs at three discrete levels of elasticity.
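
A minimal sketch of the reward/penalty idea is given below: a household's measured peak-time reduction is compared with the regulator's benchmark and the resulting bill adjustment is capped. The benchmark, tariff and cap values are illustrative assumptions, not parameters from the paper.

def bill_adjustment(reduction_kwh, benchmark_kwh, rate_per_kwh=0.15, cap=5.0):
    # positive = reward (lower bill); negative = penalty recovering DSP planning costs
    adjustment = (reduction_kwh - benchmark_kwh) * rate_per_kwh
    return max(-cap, min(cap, adjustment))     # regulator-imposed cap

# A responsive and a non-responsive household against a 2 kWh peak benchmark
print(f"{bill_adjustment(3.5, 2.0):+.2f}")   # +0.22 (reward)
print(f"{bill_adjustment(0.0, 2.0):+.2f}")   # -0.30 (penalty)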

Relevance: 100.00%

Abstract:

Following the US model, the UK has seen considerable innovation in the funding, finance and procurement of real estate in the last decade. In the growing CMBS market, asset-backed securitisations have included $2.25 billion secured on the Broadgate office development and issues secured on Canary Wharf and the Trafford Centre regional mall. Major occupiers (the retailer Sainsbury’s, the retail bank Abbey National) have engaged in innovative sale and leaseback and outsourcing schemes. Strong claims are made concerning the benefits of such schemes – e.g. British Land were reported to have reduced their weighted cost of debt by 150 bp as a result of the Broadgate issue. The paper reports preliminary findings from a project funded by the Corporation of London and the RICS Research Foundation examining a number of innovative schemes to identify, within a formal finance framework, sources of added value and hidden costs. The analysis indicates that many of the claimed gains conceal costs – in terms of the market value of debt or flexibility of management – while others result from unusual firm or market conditions (for example, exploiting the UK long lease and the unusual shape of the yield curve). Nonetheless, there are real gains resulting from the innovations, reflecting arbitrage and institutional constraints in the direct (private) real estate market.
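
By way of illustration of the kind of claim examined, a weighted cost of debt is simply a principal-weighted average of the interest rates on each tranche of debt; the figures below are invented and bear no relation to the Broadgate issue.

# (principal, annual interest rate) for each tranche of debt - invented figures
debt = [(2.0e9, 0.055), (1.0e9, 0.075)]

weighted_cost = sum(p * r for p, r in debt) / sum(p for p, _ in debt)
print(f"weighted cost of debt: {weighted_cost:.2%}")   # 6.17%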

Relevance: 100.00%

Abstract:

Ice cloud representation in general circulation models remains a challenging task, due to the lack of accurate observations and the complexity of microphysical processes. In this article, we evaluate the ice water content (IWC) and ice cloud fraction statistical distributions from the numerical weather prediction models of the European Centre for Medium-Range Weather Forecasts (ECMWF) and the UK Met Office, exploiting the synergy between the CloudSat radar and the CALIPSO lidar. Using the last three weeks of July 2006, we analyse the global ice cloud occurrence as a function of temperature and latitude and show that the models capture the main geographical and temperature-dependent distributions, but overestimate ice cloud occurrence in the Tropics in the temperature range from −60 °C to −20 °C and in the Antarctic for temperatures higher than −20 °C, while underestimating it at very low temperatures. A global statistical comparison of the occurrence of grid-box mean IWC at different temperatures shows that both the mean and the range of IWC increase with increasing temperature. Globally, the models capture most of the IWC variability in the temperature range between −60 °C and −5 °C, and also reproduce the observed latitudinal dependencies in the IWC distribution arising from different meteorological regimes. Two versions of the ECMWF model are assessed. The recent operational version, with a diagnostic representation of precipitating snow and mixed-phase ice cloud, fails to represent the IWC distribution in the −20 °C to 0 °C range, but a new version with prognostic variables for liquid water, ice and snow is much closer to the observed distribution. The comparison of models and observations provides a much-needed analysis of the vertical distribution of IWC across the globe, highlighting the ability of the models to reproduce much of the observed variability as well as the deficiencies where further improvements are required.
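
The sketch below illustrates one of the statistics compared above, the frequency of ice cloud occurrence in temperature bins, computed from co-located temperature and IWC samples; the bin width, occurrence threshold and synthetic data are assumptions for illustration only.

import numpy as np

def occurrence_by_temperature(temp_c, iwc, bin_edges, threshold=1e-6):
    # fraction of samples in each temperature bin with IWC above the threshold
    freq = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (temp_c >= lo) & (temp_c < hi)
        freq.append(np.mean(iwc[in_bin] > threshold) if in_bin.any() else np.nan)
    return np.array(freq)

# Synthetic co-located samples: temperatures in deg C and IWC in kg/kg
rng = np.random.default_rng(1)
temp = rng.uniform(-80.0, 0.0, 10_000)
iwc = rng.exponential(1e-5, 10_000) * (rng.random(10_000) < 0.3)
print(occurrence_by_temperature(temp, iwc, np.arange(-80, 1, 20)))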

Relevance: 100.00%

Abstract:

Traditionally, siting and sizing decisions for parks and reserves reflected ecological characteristics but typically failed to consider ecological costs created from displaced resource collection, welfare costs on nearby rural people, and enforcement costs. Using a spatial game-theoretic model that incorporates the interaction of socioeconomic and ecological settings, we show how incorporating more recent mandates that include rural welfare and surrounding landscapes can result in very different optimal sizing decisions. The model informs our discussion of recent forest management in Tanzania, reserve sizing and siting decisions, estimating reserve effectiveness, and determining patterns of avoided forest degradation in Reduced Emissions from Deforestation and Forest Degradation programs.

Relevance: 100.00%

Abstract:

Background: Child social anxiety is common, and predicts later emotional and academic impairment. Offspring of socially anxious mothers are at increased risk. It is important to establish whether individual vulnerability to disorder can be identified in young children. Method: The responses of 4.5-year-old children of mothers with social phobia (N = 62) and of non-anxious mothers (N = 60) were compared, two months before school entry, using a Doll Play (DP) procedure focused on the social challenge of starting school. DP responses were examined in relation to teacher reports of anxious-depressed symptoms and social worries at the end of the child’s first school term. The role of earlier child behavioral inhibition and attachment, assessed at 14 months, was also considered. Results: Compared to children of non-anxious mothers, children of mothers with social phobia were significantly more likely to give anxiously negative responses in their school DP (OR = 2.57). In turn, negative DP predicted teacher-reported anxious-depressed and social worry problems. There were no effects of infant behavioral inhibition or attachment. Conclusion: Vulnerability in young children at risk of anxiety can be identified using Doll Play narratives.
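
For readers unfamiliar with the statistic, the odds ratio reported above (OR = 2.57) is formed from a 2x2 table of group by outcome; the split counts in the example below are invented for illustration, and only the group sizes (62 and 60) come from the abstract.

def odds_ratio(a, b, c, d):
    # (a/b) / (c/d): odds of the outcome in the index group over the comparison group
    return (a / b) / (c / d)

# e.g. 31 of 62 children of mothers with social phobia gave anxiously negative
# DP responses, versus 17 of 60 children of non-anxious mothers (invented split)
print(round(odds_ratio(31, 62 - 31, 17, 60 - 17), 2))   # 2.53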

Relevance: 100.00%

Abstract:

The Mitigation Options for Phosphorus and Sediment (MOPS) project investigated the effectiveness of within-field control measures (tramline management, straw residue management, type and direction of cultivation, and vegetative buffers) in mitigating sediment and phosphorus loss from winter-sown combinable cereal crops at three case study sites. To determine the cost of the approaches, simple financial spreadsheet models were constructed at both farm and regional levels. Taking into account crop areas, crop rotation margins per hectare were calculated to reflect the costs of crop establishment, fertiliser and agro-chemical applications, harvesting, and the associated labour and machinery costs. Variable and operating costs associated with each mitigation option were then incorporated to demonstrate the impact on the relevant crop enterprise and crop rotation margins. These costs were then compared to runoff, sediment and phosphorus loss data obtained from monitoring hillslope-length-scale field plots. Each of the mitigation options explored in this study had potential for reducing sediment and phosphorus losses from arable land under cereal crops. Sediment losses were reduced by between 9 kg ha⁻¹ and 4780 kg ha⁻¹, with corresponding reductions in phosphorus loss of 0.03 kg ha⁻¹ to 2.89 kg ha⁻¹; in percentage terms, phosphorus reductions were between 9% and 99%. Impacts on crop rotation margins also varied: minimum tillage resulted in cost savings (up to £50 ha⁻¹), whilst other options increased costs (up to £19 ha⁻¹ for straw residue incorporation). Overall, the results indicate that each of the options has potential for on-farm implementation. However, tramline management appeared to have the greatest potential for reducing runoff, sediment, and phosphorus losses from arable land (reductions of between 69% and 99%) and is likely to be considered cost-effective, with an additional cost of only £2–4 ha⁻¹, although further work is needed to evaluate alternative tramline management methods. Tramline management is also the only option not incorporated within current policy mechanisms for reducing soil erosion and phosphorus loss and, in light of its potential, is an approach that should be encouraged once further evidence is available.
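
As a back-of-envelope illustration of how option costs can be set against the monitored reductions, the sketch below computes the extra cost per kilogram of phosphorus loss avoided; pairing the £2–4 ha⁻¹ tramline cost with the 2.89 kg ha⁻¹ reduction is an assumption, although the individual figures are those quoted above.

def cost_per_kg_p_avoided(extra_cost_gbp_per_ha, p_reduction_kg_per_ha):
    # extra cost of a mitigation option per kg of phosphorus loss avoided
    return extra_cost_gbp_per_ha / p_reduction_kg_per_ha

# Tramline management: roughly GBP 2-4/ha extra cost, up to 2.89 kg P/ha avoided
for cost in (2.0, 4.0):
    print(f"GBP {cost_per_kg_p_avoided(cost, 2.89):.2f} per kg P avoided")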

Relevance: 100.00%

Abstract:

Escherichia coli, the most common cause of bacteraemia in humans in the UK, can also cause serious disease in animals. However, the population structure, virulence and antimicrobial resistance genes of isolates from the extraintestinal organs of livestock are poorly characterised. The aims of this study were to investigate the diversity of these isolates from livestock, to determine whether there was any correlation between the virulence and antimicrobial resistance genes and the genetic backbone of the bacteria, and to establish whether these isolates were similar to those isolated from humans. Here, 39 E. coli isolates from the liver (n=31), spleen (n=5) and blood (n=3) of cattle (n=34), sheep (n=3), chicken (n=1) and pig (n=1) were assigned to 19 serogroups, with O8 being the most common (n=7), followed by O101, O20 (both n=3) and O153 (n=2). They belonged to 29 multi-locus sequence types and 20 clonal complexes, with ST23 (n=7), ST10 (n=6), ST117 and ST155 (both n=3) being the most common, and were distributed among phylogenetic groups A (n=16), B1 (n=12), B2 (n=2) and D (n=9). The pattern of a subset of putative virulence genes differed in almost all isolates. No correlation between serogroups, animal hosts, MLST types, virulence and antimicrobial resistance genes was identified. The distributions of clonal complexes and virulence genes were similar to those of other extraintestinal or commensal E. coli from humans and other animals, suggesting a zoonotic potential. The diverse and various combinations of virulence genes implied that the infections were caused by different mechanisms and that infection control will be challenging.

Relevance: 100.00%

Abstract:

Gaining public acceptance is one of the main issues with large-scale low-carbon projects such as hydropower development. The World Commission on Dams has recommended that, to gain public acceptance, public involvement is necessary in the decision-making process (WCD, 2000). Because international financial institutions are financially significant actors in the planning and implementation of large-scale hydropower projects in developing-country contexts, the paper examines the ways in which public involvement may be influenced by them. Using the case study of the Nam Theun 2 Hydropower Project in Laos, the paper analyses how the public involvement facilitated by the Asian Development Bank had a bearing on procedural and distributional justice. The paper analyses the extent of public participation and the assessment of the full social and environmental costs of the project in the Cost-Benefit Analysis conducted during the project appraisal stage. It is argued that, while efforts were made to involve the public, several factors influenced procedural and distributional justice: the late contribution of the Asian Development Bank in the project appraisal stage, and the issue of non-market values and the discount rate used to calculate the full social and environmental costs.
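
To illustrate why the discount rate matters for the full social and environmental costs, the sketch below discounts the same stream of annual non-market costs at two rates; the cost level, horizon and rates are illustrative assumptions, not figures from the Nam Theun 2 appraisal.

def present_value(annual_cost, years, rate):
    # discounted sum of a constant annual cost over a fixed horizon
    return sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))

annual_environmental_cost = 1_000_000    # hypothetical USD per year
for rate in (0.03, 0.10):
    pv = present_value(annual_environmental_cost, years=25, rate=rate)
    print(f"discount rate {rate:.0%}: present value = {pv:,.0f} USD")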