158 results for Cost projections
Abstract:
This article assesses the extent to which it is ‘fair’ for the government to require owner-occupiers to draw on the equity accumulated in their home to fund their social care costs. The question is stimulated by the report of the Commission on Funding of Care and Support, Fairer Care Funding (the Dilnot Commission) and the subsequent Care Act 2014. The enquiry is located within the framework of social citizenship and the new social contract. It argues that the individualistic, contractarian approach, exemplified by the Dilnot Commission and reflected in the Act, raises questions when considered from the perspective of intergenerational fairness. We argue that our concerns with the Act could be addressed by inculcating an expectation of drawing on housing wealth to fund older age: a policy of asset-based welfare.
Abstract:
In September 2013, the 5th Assessment Report (5AR) of the Intergovernmental Panel on Climate Change (IPCC) was released. Taking the 5AR climate change scenarios into account, the World Bank published an earlier report on climate change and its impacts on selected hot spot regions, including Southeast Asia. Currently, dynamical and statistical-dynamical downscaling efforts are underway to obtain higher-resolution and more robust regional climate change projections for tropical Southeast Asia, including Vietnam. Such initiatives are formalized under the World Meteorological Organization (WMO) Coordinated Regional Climate Downscaling Experiment (CORDEX) East Asia and Southeast Asia and also take place in climate change impact projects such as the joint Vietnamese-German project “Environmental and Water Protection Technologies of Coastal Zones in Vietnam (EWATEC-COAST)”. In this contribution, the latest assessments of changes in temperature, precipitation, sea level, and tropical cyclones (TCs) under the 5AR Representative Concentration Pathway (RCP) scenarios 4.5 and 8.5 are reviewed. Special emphasis is put on changes in extreme events such as heat waves and heavy precipitation. The regional focus is Vietnam south of 16°N. A continued increase in mean near-surface temperature is projected, reaching up to 5°C at the end of this century in northern Vietnam under the high greenhouse-gas forcing scenario RCP8.5. Overall, projected changes in annual precipitation are small, but there is a tendency towards more rainfall in the boreal winter dry season. Unprecedented heat waves and an increase in extreme precipitation events are projected by both global and regional climate models. Globally, TCs are projected to decrease in number, but an increase in the intensity of peak winds and rainfall in the inner core region is estimated.
Though an assessment of changes in land-falling frequency in Vietnam is uncertain due to difficulties in assessing changes in TC tracks, some work indicates a reduction in the number of land-falling TCs in Vietnam. Sea level may rise by 75-100 cm by the end of the century, with the Vietnamese coastline experiencing a 10-15% higher rise than the global average. Given the large rice and aquaculture production in the Mekong and Red River Deltas, which are both prone to TC-related storm surges and flooding, this poses a challenge to food security and the protection of coastal populations and assets.
Abstract:
This review provides an overview of the main scientific outputs of a network (Action) supported by the European Cooperation in Science and Technology (COST) in the field of animal science, namely the COST Action Feed for Health (FA0802). Its main aims were: to develop an integrated, collaborative network of research groups focusing on the roles of feed and animal nutrition in improving animal wellbeing, as well as the quality, safety and wholesomeness of human foods of animal origin; and to examine consumer concerns and perceptions regarding livestock production systems. The COST Action Feed for Health has addressed these scientific topics over the last four years. From a practical point of view, three main fields of achievement can be identified: feed and animal nutrition; quality and functionality of foods of animal origin; and consumer perceptions. Finally, the present paper aims to provide new ideas and solutions for a range of issues associated with modern livestock production systems.
Abstract:
A virtual system that emulates an ARM-based processor machine has been created to replace a traditional hardware-based system for teaching assembly language. The proposed virtual system integrates, in a single environment, all the development tools necessary to deliver introductory or advanced courses on modern assembly language programming. The virtual system runs a Linux operating system in either a graphical or console mode on a Windows or Linux host machine. No software licenses or extra hardware are required to use the virtual system, thus students are free to carry their own ARM emulator with them on a USB memory stick. Institutions adopting this, or a similar virtual system, can also benefit by reducing capital investment in hardware-based development kits and enable distance learning courses.
Abstract:
Classical regression methods take vectors as covariates and estimate the corresponding vectors of regression parameters. When addressing regression problems with covariates of more complex form, such as multi-dimensional arrays (i.e. tensors), traditional computational models can be severely compromised by ultrahigh dimensionality as well as complex structure. By exploiting the special structure of tensor covariates, the tensor regression model provides a promising way to reduce the model’s dimensionality to a manageable level, leading to efficient estimation. Most existing tensor-based methods estimate each individual regression problem independently, based on a tensor decomposition that allows simultaneous projections of an input tensor onto more than one direction along each mode. In practice, multi-dimensional data are collected under the same or very similar conditions, so that the data share some common latent components but can also have their own independent parameters for each regression task. It is therefore beneficial to analyse regression parameters across all the regressions in a linked way. In this paper, we propose a tensor regression model based on Tucker decomposition, which simultaneously identifies both the common components of parameters shared across all the regression tasks and the independent factors contributing to each particular task. Under this paradigm, the number of independent parameters along each mode is constrained by a sparsity-preserving regulariser. Linked multiway parameter analysis and sparsity modelling further reduce the total number of parameters, with lower memory cost than tensor-based counterparts. The effectiveness of the new method is demonstrated on real data sets.
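To see why a Tucker-structured coefficient tensor keeps the parameter count manageable, the following sketch compares the number of free parameters in a full coefficient tensor with that of its Tucker factorisation (a small core tensor plus one factor matrix per mode). The dimensions and ranks are hypothetical, chosen purely for illustration; the paper's actual model and regulariser are not reproduced here.

```python
# Illustrative parameter counts: full coefficient tensor vs. Tucker form.
# Dimensions (dims) and multilinear ranks (ranks) are invented examples.

def full_params(dims):
    """Number of entries in a dense coefficient tensor of the given shape."""
    n = 1
    for d in dims:
        n *= d
    return n

def tucker_params(dims, ranks):
    """Core tensor entries plus the entries of one factor matrix per mode."""
    core = 1
    for r in ranks:
        core *= r
    factors = sum(d * r for d, r in zip(dims, ranks))
    return core + factors

dims, ranks = (64, 64, 30), (5, 5, 3)
print(full_params(dims))           # dense: 64 * 64 * 30 = 122880
print(tucker_params(dims, ranks))  # Tucker: 75 core + 730 factor entries = 805
```

With these example shapes the Tucker form needs roughly 150x fewer parameters than the dense tensor, which is the kind of reduction that makes estimation tractable.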
Abstract:
A low-cost, compact embedded design approach for actuating soft robots is presented. The complete fabrication procedure and mode of operation were demonstrated, and the performance of the complete system was shown by building a microcontroller-based hardware system used to actuate a soft robot for bending motion. The actuation system, including the electronic circuit board and actuation components, was embedded in a 3D-printed casing to ensure a compact approach to actuating soft robots. Results show the viability of the system in actuating and controlling silicone-based soft robots to achieve bending motions. Qualitative measurements of uniaxial tensile testing, bending distance and pressure were obtained. This electronic design is easy to reproduce and integrate into any specified soft robotic device requiring pneumatic actuation.
Abstract:
Virtual Reality (VR) can provide visual stimuli for EEG studies that can be altered in real time and can produce effects that are difficult or impossible to reproduce on a non-virtual experimental platform. As part of this experiment, the Oculus Rift, a commercial-grade, low-cost Head Mounted Display (HMD), was assessed as a visual stimulus platform for experiments recording EEG. Subsequently, the device was used to investigate the effect of congruent visual stimuli on Event Related Desynchronisation (ERD) due to motor imagery.
Abstract:
Background: UK National Institute for Health and Clinical Excellence guidelines for obsessive compulsive disorder (OCD) specify recommendations for the treatment and management of OCD using a stepped care approach. Steps three to six of this model recommend treatment options for people with OCD that range from low-intensity guided self-help (GSH) to more intensive psychological and pharmacological interventions. Cognitive behavioural therapy (CBT), including exposure and response prevention, is the recommended psychological treatment. However, whilst there is some preliminary evidence that self-managed therapy packages for OCD can be effective, a more robust evidence base of their clinical effectiveness, cost effectiveness and acceptability is required. Methods/Design: Our proposed study will test two different self-help treatments for OCD: 1) computerised CBT (cCBT) using OCFighter, an internet-delivered OCD treatment package; and 2) GSH using a book. Both treatments will be accompanied by email or telephone support from a mental health professional. We will evaluate the effectiveness, cost and patient and health professional acceptability of the treatments. Discussion: This study will provide more robust evidence of the efficacy, cost effectiveness and acceptability of self-help treatments for OCD. If cCBT and/or GSH prove effective, they will provide additional, more accessible treatment options for people with OCD.
Abstract:
Over the past 30 years, cost–benefit analysis (CBA) has been applied to various areas of public policies and projects. The aim of this essay is to describe the origins of CBA, classify typologies of costs and benefits, define efficiency under CBA and discuss issues associated with the use of a microeconomic tool in macroeconomic contexts.
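As a minimal numerical illustration of the efficiency criterion in CBA (a project is efficient when its discounted benefits exceed its discounted costs), the net present value of a stream of net benefits can be computed as follows. The discount rate and cash flows are invented for illustration only.

```python
def npv(rate, flows):
    """Net present value of a stream of net benefits.

    flows[t] is the net benefit (benefits minus costs) in year t,
    with t starting at 0; rate is the annual discount rate.
    """
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# Hypothetical project: 1000 upfront cost, 300 net benefit per year for 5 years.
flows = [-1000, 300, 300, 300, 300, 300]
print(round(npv(0.05, flows), 2))  # positive NPV at 5%: about 298.84
```

A positive NPV indicates the project passes the CBA efficiency test at the chosen discount rate; the essay's discussion of discount-rate choice is precisely about how sensitive this verdict is to `rate`.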
Abstract:
An FTC-DOJ study argues that state laws and regulations may inhibit the unbundling of real estate brokerage services in response to new technology. Our data show that 18 states have changed laws in ways that promote unbundling since 2000. We model brokerage costs, as measured by the number of agents, in a state-level annual panel vector autoregressive framework, a novel way of analyzing wasteful competition. Our findings support a positive relationship between brokerage costs and lagged house prices and transactions. We find that the change in full-service brokers responds negatively (by well over two percentage points per year) to legal changes facilitating unbundling.
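As a rough illustration of the vector autoregressive machinery underlying such a panel analysis (in a heavily simplified, single-unit form, with simulated rather than real brokerage data), a bivariate VAR(1) can be estimated by ordinary least squares of current values on lagged values. All numbers below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200

# Simulate a bivariate VAR(1): y_t = A @ y_{t-1} + e_t (illustrative A).
A = np.array([[0.5, 0.2],
              [0.1, 0.4]])
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.1, size=2)

# OLS estimate of A: regress y_t on y_{t-1}.
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print(np.round(A_hat, 2))  # close to the true A above
```

A panel VAR extends this by pooling many cross-sectional units (here, states) with unit-specific effects, but the lag-regression core is the same.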
Abstract:
We develop a transaction cost economics theory of the family firm, building upon the concepts of family-based asset specificity, bounded rationality, and bounded reliability. We argue that the prosperity and survival of family firms depend on the absence of a dysfunctional bifurcation bias. The bifurcation bias is an expression of bounded reliability, reflected in the de facto asymmetric treatment of family vs. nonfamily assets (especially human assets). We propose that absence of bifurcation bias is critical to fostering reliability in family business functioning. Our study ends the unproductive divide between the agency and stewardship perspectives of the family firm, which offer conflicting accounts of this firm type's functioning. We show that the predictions of the agency and stewardship perspectives can be usefully reconciled when focusing on how family firms address the bifurcation bias or fail to do so.
Abstract:
We provide a new legal perspective on the antitrust analysis of margin squeeze conduct. Building on recent economic analysis, we explain why margin squeeze conduct should be evaluated solely under adjusted predatory pricing standards. The adjustment corresponds to an increase in the cost benchmark used in the predatory pricing test, to include the opportunity costs of missed upstream sales. This can reduce the risks of both false positives and false negatives in margin squeeze cases. We justify this approach by explaining why classic arguments against above-cost predatory pricing typically do not hold in the vertical structures where margin squeezes take place, and by presenting case law evidence supporting this adjustment. Our approach can help to reconcile the divergent US and EU antitrust stances on margin squeeze.
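The adjusted test described above can be sketched in a few lines: the downstream price is compared against downstream cost plus the upstream margin forgone on missed sales. The function names and figures below are hypothetical illustrations of that logic, not a legal standard.

```python
def adjusted_benchmark(downstream_cost, upstream_margin_lost):
    """Cost benchmark raised by the opportunity cost of missed upstream sales,
    per the adjustment described above (illustrative only)."""
    return downstream_cost + upstream_margin_lost

def fails_price_cost_test(retail_price, downstream_cost, upstream_margin_lost):
    """True if the downstream price is below the adjusted benchmark."""
    return retail_price < adjusted_benchmark(downstream_cost, upstream_margin_lost)

# Hypothetical numbers: downstream cost 7, forgone upstream margin 4.
print(fails_price_cost_test(10.0, 7.0, 4.0))  # True: 10 < 7 + 4
print(fails_price_cost_test(12.0, 7.0, 4.0))  # False: 12 >= 11
```

The point of the adjustment is visible in the first case: a price of 10 would pass an unadjusted test against downstream cost alone (10 > 7) yet fail once the forgone upstream margin is counted.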
Abstract:
Projections of Arctic sea ice thickness (SIT) have the potential to inform stakeholders about accessibility to the region, but are currently rather uncertain. The latest suite of CMIP5 Global Climate Models (GCMs) produce a wide range of simulated SIT in the historical period (1979–2014) and exhibit various biases when compared with the Pan-Arctic Ice Ocean Modelling and Assimilation System (PIOMAS) sea ice reanalysis. We present a new method to constrain such GCM simulations of SIT via a statistical bias correction technique. The bias correction successfully constrains the spatial SIT distribution and temporal variability in the CMIP5 projections whilst retaining the climatic fluctuations from individual ensemble members. The bias correction acts to reduce the spread in projections of SIT and reveals the significant contributions of climate internal variability in the first half of the century and of scenario uncertainty from mid-century onwards. The projected date of ice-free conditions in the Arctic under the RCP8.5 high emission scenario occurs in the 2050s, which is a decade earlier than without the bias correction, with potentially significant implications for stakeholders in the Arctic such as the shipping industry. The bias correction methodology developed could be similarly applied to other variables to reduce spread in climate projections more generally.
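One simple variant of such a statistical bias correction rescales each model series to the reference (e.g. reanalysis) mean and standard deviation while preserving the member's own variability. The sketch below uses synthetic data and a generic mean-variance scaling; the paper's exact correction scheme may differ.

```python
import numpy as np

def bias_correct(model, ref):
    """Rescale model anomalies to the reference mean and standard deviation,
    preserving the model series' own fluctuations (generic sketch)."""
    m_mean, m_std = model.mean(), model.std()
    r_mean, r_std = ref.mean(), ref.std()
    return r_mean + (model - m_mean) * (r_std / m_std)

rng = np.random.default_rng(1)
# Synthetic stand-ins: a reanalysis-like SIT series (metres) and a biased
# GCM series that runs too thick and too variable.
ref = rng.normal(2.0, 0.5, 420)
model = rng.normal(3.1, 0.9, 420)

corrected = bias_correct(model, ref)
print(round(corrected.mean(), 2), round(corrected.std(), 2))
```

By construction the corrected series matches the reference mean and spread over the calibration period, while its year-to-year fluctuations still come from the model member, which is the property the abstract highlights.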
Abstract:
Model simulations of the next few decades are widely used in assessments of climate change impacts and as guidance for adaptation. Their non-linear nature reveals a level of irreducible uncertainty which it is important to understand and quantify, especially for projections of near-term regional climate. Here we use large idealised initial condition ensembles of the FAMOUS global climate model with a 1 %/year compound increase in CO2 levels to quantify the range of future temperatures in model-based projections. These simulations explore the role of both atmospheric and oceanic initial conditions and are the largest such ensembles to date. Short-term simulated trends in global temperature are diverse, and cooling periods are more likely to be followed by larger warming rates. The spatial pattern of near-term temperature change varies considerably, but the proportion of the surface showing a warming is more consistent. In addition, ensemble spread in inter-annual temperature declines as the climate warms, especially in the North Atlantic. Over Europe, atmospheric initial condition uncertainty can, for certain ocean initial conditions, lead to 20 year trends in winter and summer in which every location can exhibit either strong cooling or rapid warming. However, the details of the distribution are highly sensitive to the ocean initial condition chosen and particularly the state of the Atlantic meridional overturning circulation. On longer timescales, the warming signal becomes more clear and consistent amongst different initial condition ensembles. An ensemble using a range of different oceanic initial conditions produces a larger spread in temperature trends than ensembles using a single ocean initial condition for all lead times. This highlights the potential benefits from initialising climate predictions from ocean states informed by observations. 
These results suggest that climate projections need to be performed with many more ensemble members than at present, using a range of ocean initial conditions, if the uncertainty in near-term regional climate is to be adequately quantified.
Abstract:
Current state-of-the-art global climate models produce different values for Earth’s mean temperature. When comparing simulations with each other and with observations it is standard practice to compare temperature anomalies with respect to a reference period. It is not always appreciated that the choice of reference period can affect conclusions, both about the skill of simulations of past climate, and about the magnitude of expected future changes in climate. For example, observed global temperatures over the past decade are towards the lower end of the range of CMIP5 simulations irrespective of what reference period is used, but exactly where they lie in the model distribution varies with the choice of reference period. Additionally, we demonstrate that projections of when particular temperature levels are reached, for example 2K above ‘pre-industrial’, change by up to a decade depending on the choice of reference period. In this article we discuss some of the key issues that arise when using anomalies relative to a reference period to generate climate projections. We highlight that there is no perfect choice of reference period. When evaluating models against observations, a long reference period should generally be used, but how long depends on the quality of the observations available. The IPCC AR5 choice to use a 1986-2005 reference period for future global temperature projections was reasonable, but a case-by-case approach is needed for different purposes and when assessing projections of different climate variables. Finally, we recommend that any studies that involve the use of a reference period should explicitly examine the robustness of the conclusions to alternative choices.
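The dependence of anomalies on the baseline can be seen in a few lines: two reference periods yield anomaly series that differ by a constant offset equal to the difference of the baseline means, which is what shifts apparent threshold-crossing dates. The temperature series below is synthetic and purely illustrative.

```python
import numpy as np

years = np.arange(1850, 2016)
# Synthetic global-mean temperature: linear warming trend plus noise.
rng = np.random.default_rng(2)
temps = 13.5 + 0.008 * (years - 1850) + rng.normal(0, 0.1, years.size)

def anomalies(t, yrs, start, end):
    """Anomalies relative to the mean over [start, end] inclusive."""
    base = t[(yrs >= start) & (yrs <= end)].mean()
    return t - base

a_long = anomalies(temps, years, 1961, 1990)    # earlier baseline
a_recent = anomalies(temps, years, 1986, 2005)  # AR5-style baseline

# The two anomaly series differ by a constant: the difference of baselines.
offset = a_long[-1] - a_recent[-1]
print(round(offset, 3))  # positive, since the later baseline is warmer
```

Any fixed temperature threshold defined on one anomaly series is therefore crossed earlier or later on the other by however many years the trend takes to span `offset`, which is the mechanism behind the up-to-a-decade shifts noted in the abstract.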