Abstract:
We study the empirical performance of the classical minimum-variance hedging strategy, comparing several econometric models for estimating hedge ratios of crude oil, gasoline and heating oil crack spreads. Given the great variability and large jumps in both spot and futures prices, considerable care is required when processing the relevant data and accounting for the costs of maintaining and re-balancing the hedge position. We find that the variance reduction produced by all models is statistically and economically indistinguishable from the one-for-one “naïve” hedge. However, minimum-variance hedging models, especially those based on GARCH, generate much greater margin and transaction costs than the naïve hedge. Therefore we encourage hedgers to use a naïve hedging strategy on the crack spread bundles now offered by the exchange; this strategy is the cheapest and easiest to implement. Our conclusion contradicts the majority of the existing literature, which favours the implementation of GARCH-based hedging strategies.
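The classical minimum-variance hedge ratio the abstract refers to is the slope of a regression of spot-price changes on futures-price changes. A minimal sketch on synthetic data (illustrative only, not the paper's crack-spread series) shows the comparison with the one-for-one naïve hedge:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic, illustrative price changes (not the paper's data):
# futures changes, plus spot changes that track them imperfectly.
d_futures = rng.normal(0.0, 1.0, size=500)
d_spot = 0.95 * d_futures + rng.normal(0.0, 0.3, size=500)

# Classical minimum-variance hedge ratio: h* = Cov(dS, dF) / Var(dF)
h_star = np.cov(d_spot, d_futures)[0, 1] / np.var(d_futures, ddof=1)

# Variance of the hedged position under h* and under the "naive" h = 1
var_mv = np.var(d_spot - h_star * d_futures, ddof=1)
var_naive = np.var(d_spot - 1.0 * d_futures, ddof=1)
# h* minimises in-sample variance by construction, so var_mv <= var_naive;
# the paper's finding is that the difference is statistically and
# economically negligible once hedging costs are accounted for.
```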
Abstract:
We investigated the potential function of the system formed by connections between the medial prefrontal cortex and the dorsomedial striatum in aspects of attentional function in the rat. It has been reported previously that disconnection of the same corticostriatal circuit produced marked deficits in performance of a serial, choice reaction-time task while sparing the acquisition of an appetitive Pavlovian approach behaviour in an autoshaping task (Christakou et al., 2001). Here, we hypothesized that unilateral disruption of the same circuit would lead to hemispatial inattention, contrasting with the global attention deficit following complete disconnection of the system. Combined unilateral lesions of the medial prefrontal cortex (mPFC) and the medial caudate-putamen (mCPu) within the same hemisphere produced a severe and long-lasting contralesional neglect syndrome while sparing the acquisition of autoshaping. These results provide further evidence for the involvement of the medial prefrontal-dorsomedial striatal circuit in aspects of attentional function, as well as insight into the nature of neglect deficits following lesions at different levels within corticostriatal circuitry.
Abstract:
Modern health care rhetoric promotes choice and individual patient rights as dominant values. Yet we also accept that in any regime constrained by finite resources, difficult choices between patients are inevitable. How can we balance rights to liberty, on the one hand, with equity in the allocation of scarce resources on the other? For example, the duty of health authorities to allocate resources is a duty owed to the community as a whole, rather than to specific individuals. Macro-duties of this nature are founded on the notion of equity and fairness amongst individuals rather than personal liberty. They presume that if hard choices have to be made, they will be resolved according to fair and consistent principles which treat equal cases equally, and unequal cases unequally. In this paper, we argue for greater clarity and candour in the health care rights debate. With this in mind, we discuss (1) private and public rights, (2) negative and positive rights, (3) procedural and substantive rights, (4) sustainable health care rights and (5) the New Zealand booking system for prioritising access to elective services. This system aims to consider individual need and ability to benefit alongside the resources made available to elective health services, in an attempt to give the principles of equity practical effect. We describe a continuum on which the merits of those sometimes competing values, liberty and equity, can be evaluated and assessed.
Abstract:
The International System of Units (SI) is founded on seven base units (the metre, kilogram, second, ampere, kelvin, mole and candela) corresponding to the seven base quantities of length, mass, time, electric current, thermodynamic temperature, amount of substance and luminous intensity. At its 94th meeting in October 2005, the International Committee for Weights and Measures (CIPM) adopted a recommendation on preparative steps towards redefining the kilogram, ampere, kelvin and mole so that these units are linked to exactly known values of fundamental constants. We propose here that these four base units should be given new definitions linking them to exactly defined values of the Planck constant h, elementary charge e, Boltzmann constant k and Avogadro constant NA, respectively. This would mean that six of the seven base units of the SI would be defined in terms of true invariants of nature. In addition, not only would these four fundamental constants have exactly defined values but the uncertainties of many of the other fundamental constants of physics would also be either eliminated or appreciably reduced. In this paper we present the background and discuss the merits of these proposed changes, and we also present possible wordings for the four new definitions. We also suggest a novel way to define the entire SI explicitly using such definitions, without making any distinction between base units and derived units. We list a number of key points that should be addressed when the new definitions are adopted by the General Conference on Weights and Measures (CGPM), possibly by the 24th CGPM in 2011, and we discuss the implications of these changes for other aspects of metrology.
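With hindsight, the scheme proposed in this abstract was adopted (by the 26th CGPM in 2018, effective 2019) with the following exactly fixed values; the snippet records them and illustrates one consequence the paper argues for, namely that with k exact, a measured thermal energy converts to a temperature with no uncertainty contributed by the unit definition:

```python
# Exact values of the four constants as fixed in the 2019 SI revision,
# matching the roles proposed in the paper.
h = 6.62607015e-34     # Planck constant, J s    -> defines the kilogram
e = 1.602176634e-19    # elementary charge, C    -> defines the ampere
k = 1.380649e-23       # Boltzmann constant, J/K -> defines the kelvin
N_A = 6.02214076e23    # Avogadro constant, 1/mol -> defines the mole

# Example: a thermal energy of 4.14e-21 J corresponds to roughly 300 K,
# with no added uncertainty from the definition of the kelvin itself.
T = 4.14e-21 / k
```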
Abstract:
There has been considerable discussion about the merits of redefining four of the base units of the SI, including the mole. In this paper, the options for implementing a new definition for the mole based on a fixed value for the Avogadro constant are discussed. They are placed in the context of the macroscopic nature of the quantity amount of substance and the opportunity to introduce a system for molar and atomic masses with unchanged values and consistent relative uncertainties.
Abstract:
We have previously placed the solar contribution to recent global warming in context using observations and without recourse to climate models. It was shown that all solar forcings of climate have declined since 1987. The present paper extends that analysis to include the effects of the various time constants with which the Earth’s climate system might react to solar forcing. The solar input waveform over the past 100 years is defined using observed and inferred galactic cosmic ray fluxes, valid for either a direct effect of cosmic rays on climate or an effect via their known correlation with total solar irradiance (TSI), or for a combination of the two. The implications, and the relative merits, of the various TSI composite data series are discussed and independent tests reveal that the PMOD composite used in our previous paper is the most realistic. Use of the ACRIM composite, which shows a rise in TSI over recent decades, is shown to be inconsistent with most published evidence for solar influences on pre-industrial climate. The conclusions of our previous paper, that solar forcing has declined over the past 20 years while surface air temperatures have continued to rise, are shown to apply for the full range of potential time constants for the climate response to the variations in the solar forcings.
Abstract:
Rovibrational energy levels, transition frequencies, and linestrengths are computed variationally for the sulfur hydrides D2S and HDS, using ab initio potential energy and dipole surfaces. Wavenumbers for the pure rotational transitions agree with the experimental lines to within 0.2 cm−1. For the fundamental vibrational transitions, the band origins for D2S are 860.4, 1900.6, and 1912.0 cm−1 for ν2, ν1, and ν3, respectively, compared with the corresponding experimental values of 855.4, 1896.4, and 1910.2 cm−1. For HDS, we compute ν2 to be 1039.4 cm−1, compared with the experimental value of 1032.7 cm−1. The relative merits of local and normal mode descriptions for the overtone stretching band origins are discussed. Our results confirm the local mode nature of the H2S, D2S, and HDS system.
Abstract:
Purpose: The purpose of this paper is to review the rationale for 'transdiagnostic' approaches to the understanding and treatment of anxiety disorders. Methods: Database searches and examination of the reference lists of relevant studies were used to identify papers of relevance. Results: There is increasing recognition that diagnosis-specific interventions for single anxiety disorders are of less value than might appear, since a large proportion of patients have more than one co-existing anxiety disorder and the treatment of one anxiety disorder does not necessarily lead to the resolution of others. As transdiagnostic approaches have the potential to address multiple co-existing anxiety disorders, they are potentially more clinically relevant than single-disorder interventions. They may also have advantages in ease of dissemination and in treating anxiety disorder not otherwise specified. Conclusions: The merits of the various transdiagnostic cognitive-behavioral approaches that have been proposed are reviewed. Such approaches have potential benefits, particularly in striking the balance between completely idiosyncratic formulations and diagnosis-driven treatments of anxiety disorders. However, caution is needed to ensure that transdiagnostic theories and treatments benefit from progress made by research on diagnosis-specific treatments, and further empirical work is needed to identify the shared maintaining processes that need to be targeted in the treatment of anxiety disorders.
Abstract:
Retinal blurring resulting from the human eye's depth of focus has been shown to assist visual perception. Infinite focal depth within stereoscopically displayed virtual environments may cause undesirable effects: for instance, objects positioned at a distance in front of or behind the observer's fixation point will be perceived in sharp focus with large disparities, thereby causing diplopia. Although published research on the incorporation of synthetically generated Depth of Field (DoF) suggests that it might enhance perceived image quality, no quantitative evidence of perceptual performance gains exists. This may be due to the difficulty of dynamically generating synthetic DoF in which focal distance is actively linked to fixation distance. In this paper, such a system is described. A desktop stereographic display is used to project a virtual scene in which synthetically generated DoF is actively controlled from vergence-derived distance. A performance evaluation experiment was undertaken in which subjects carried out observations in a spatially complex virtual environment. The virtual environment consisted of components interconnected by pipes on a distractive background, and subjects were tasked with making an observation based on the connectivity of the components. The effects of focal depth variation under static and actively controlled focal distance conditions were investigated. The results and analysis presented show that performance gains may be achieved by the addition of synthetic DoF. The merits of the application of synthetic DoF are discussed.
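The "vergence-derived distance" driving the synthetic DoF can be sketched with simple triangulation geometry. This is an illustrative sketch, not the paper's implementation: it assumes symmetric convergence and a known interpupillary distance, and the function and parameter names are hypothetical.

```python
import math

def fixation_distance(ipd_m: float, vergence_deg: float) -> float:
    """Distance to the binocular fixation point, assuming symmetric
    convergence: the two lines of sight and the interpupillary
    baseline form an isosceles triangle, so the distance is half the
    baseline divided by the tangent of half the vergence angle."""
    half_angle = math.radians(vergence_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half_angle)

# A 0.065 m IPD with a total vergence angle of ~3.72 degrees places the
# fixation point roughly 1 m away; a DoF renderer of the kind described
# would then set its focal distance to this value each frame.
d = fixation_distance(0.065, 3.72)
```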
Abstract:
The integration of processes at different scales is a key problem in the modelling of cell populations. Owing to increased computational resources and the accumulation of data at the cellular and subcellular scales, the use of discrete, cell-level models, which are typically solved using numerical simulations, has become prominent. One of the merits of this approach is that important biological factors, such as cell heterogeneity and noise, can be easily incorporated. However, it can be difficult to efficiently draw generalizations from the simulation results, as, often, many simulation runs are required to investigate model behaviour in typically large parameter spaces. In some cases, discrete cell-level models can be coarse-grained, yielding continuum models whose analysis can lead to the development of insight into the underlying simulations. In this paper we apply such an approach to the case of a discrete model of cell dynamics in the intestinal crypt. An analysis of the resulting continuum model demonstrates that there is a limited region of parameter space within which steady-state (and hence biologically realistic) solutions exist. Continuum model predictions show good agreement with corresponding results from the underlying simulations and experimental data taken from murine intestinal crypts.
Abstract:
Food security is one of this century’s key global challenges. By 2050 the world will require increased crop production in order to feed its predicted 9 billion people. This must be done in the face of changing consumption patterns, the impacts of climate change and the growing scarcity of water and land. Crop production methods will also have to sustain the environment, preserve natural resources and support livelihoods of farmers and rural populations around the world. There is a pressing need for the ‘sustainable intensification’ of global agriculture in which yields are increased without adverse environmental impact and without the cultivation of more land. Addressing the need to secure a food supply for the whole world requires an urgent international effort with a clear sense of long-term challenges and possibilities. Biological science, especially publicly funded science, must play a vital role in the sustainable intensification of food crop production. The UK has a responsibility and the capacity to take a leading role in providing a range of scientific solutions to mitigate potential food shortages. This will require significant funding of cross-disciplinary science for food security. The constraints on food crop production are well understood, but differ widely across regions. The availability of water and good soils are major limiting factors. Significant losses in crop yields occur due to pests, diseases and weed competition. The effects of climate change will further exacerbate the stresses on crop plants, potentially leading to dramatic yield reductions. Maintaining and enhancing the diversity of crop genetic resources is vital to facilitate crop breeding and thereby enhance the resilience of food crop production. Addressing these constraints requires technologies and approaches that are underpinned by good science.
Some of these technologies build on existing knowledge, while others are completely radical approaches, drawing on genomics and high-throughput analysis. Novel research methods have the potential to contribute to food crop production through both genetic improvement of crops and new crop and soil management practices. Genetic improvements to crops can occur through breeding or genetic modification to introduce a range of desirable traits. The application of genetic methods has the potential to refine existing crops and provide incremental improvements. These methods also have the potential to introduce radical and highly significant improvements to crops by increasing photosynthetic efficiency, reducing the need for nitrogen or other fertilisers and unlocking some of the unrealised potential of crop genomes. The science of crop management and agricultural practice also needs to be given particular emphasis as part of a food security grand challenge. These approaches can address key constraints in existing crop varieties and can be applied widely. Current approaches to maximising production within agricultural systems are unsustainable; new methodologies that utilise all elements of the agricultural system are needed, including better soil management and enhancement and exploitation of populations of beneficial soil microbes. Agronomy, soil science and agroecology (the relevant sciences) have been neglected in recent years. Past debates about the use of new technologies for agriculture have tended to adopt an either/or approach, emphasising the merits of particular agricultural systems or technological approaches and the downsides of others. This has been seen most obviously with respect to genetically modified (GM) crops, the use of pesticides and the arguments for and against organic modes of production. These debates have failed to acknowledge that there is no technological panacea for the global challenge of sustainable and secure global food production.
There will always be trade-offs and local complexities. This report considers both new crop varieties and appropriate agroecological crop and soil management practices and adopts an inclusive approach. No techniques or technologies should be ruled out. Global agriculture demands a diversity of approaches, specific to crops, localities, cultures and other circumstances. Such diversity demands that the breadth of relevant scientific enquiry is equally diverse, and that science needs to be combined with social, economic and political perspectives. In addition to supporting high-quality science, the UK needs to maintain and build its capacity to innovate, in collaboration with international and national research centres. UK scientists and agronomists have in the past played a leading role in disciplines relevant to agriculture, but training in agricultural sciences and related topics has recently suffered from a lack of policy attention and support. Agricultural extension services, connecting farmers with new innovations, have been similarly neglected in the UK and elsewhere. There is a major need to review the support for and provision of extension services, particularly in developing countries. The governance of innovation for agriculture needs to maximise opportunities for increasing production, while at the same time protecting societies, economies and the environment from negative side effects. Regulatory systems need to improve their assessment of benefits. Horizon scanning will ensure proactive consideration of technological options by governments. Assessment of benefits, risks and uncertainties should be seen broadly, and should include the wider impacts of new technologies and practices on economies and societies. Public and stakeholder dialogue (with NGOs, scientists and farmers in particular) needs to be a part of all governance frameworks.
Abstract:
The addition of small quantities of nanoparticles to conventional and sustainable thermoplastics leads to property enhancements with considerable potential in many areas of application, including food packaging [1], lightweight composites and high-performance materials [2]. In the case of sustainable polymers [3], the addition of nanoparticles may well enhance properties sufficiently that the portfolio of possible applications is greatly increased. Most engineered nanoparticles are highly stable and exist as nanoparticles prior to compounding with the polymer resin. They remain as nanoparticles during the active use of the packaging material as well as in the subsequent waste and recycling streams. It is also possible to construct the nanoparticles within the polymer films during processing, from organic compounds selected to present minimal or no potential health hazards [4]. In both cases the characterisation of the resultant nanostructured polymers presents a number of challenges. Foremost amongst these are the coupled challenges of the nanoscale of the particles and the low fraction present in the polymer matrix. Very low fractions of nanoparticles are only effective if the dispersion of the particles is good. This continues to be an issue in process engineering, although poor dispersion is, of course, much easier to see than good dispersion. In this presentation we show the merits of a combined scattering (neutron and x-ray) and microscopy (SEM, TEM, AFM) approach. We explore this methodology using rod-like, plate-like and spheroidal particles, including metallic particles, plate-like and rod-like clay dispersions, and carbon-based nanoscale particles such as nanotubes and graphene flakes. We draw on a range of material systems, many explored in partnership with other members of Napolynet. The value of adding nanoscale particles is that their scale matches the scale of the structure in the polymer matrix. Although this can lead to difficulties in separating the effects in scattering experiments, in morphological studies it means that both the nanoparticles and the polymer morphology are revealed.
Abstract:
This paper seeks to elucidate the fundamental differences between the nonconservation of potential temperature and that of Conservative Temperature, in order to better understand the relative merits of each quantity for use as the heat variable in numerical ocean models. The main result is that potential temperature is found to behave similarly to entropy, in the sense that its nonconservation primarily reflects production/destruction by surface heat and freshwater fluxes; in contrast, the nonconservation of Conservative Temperature is found to reflect primarily the overall compressible work of expansion/contraction. This paper then shows how this can be exploited to constrain the nonconservation of potential temperature and entropy from observed surface heat fluxes, and the nonconservation of Conservative Temperature from published estimates of the mechanical energy budgets of ocean numerical models. Finally, the paper shows how to modify the evolution equation for potential temperature so that it is exactly equivalent to using an exactly conservative evolution equation for Conservative Temperature, as was recently recommended by IOC et al. (2010). This result should in principle allow ocean modellers to test the equivalence between the two formulations, and to indirectly investigate to what extent the budget of derived nonconservative quantities such as buoyancy and entropy can be expected to be accurately represented in ocean models.
Abstract:
Lifestyle factors are responsible for a considerable portion of cancer incidence worldwide, but credible estimates from the World Health Organization and the International Agency for Research on Cancer (IARC) suggest that the fraction of cancers attributable to toxic environmental exposures is between 7% and 19%. To explore the hypothesis that low-dose exposures to mixtures of chemicals in the environment may be combining to contribute to environmental carcinogenesis, we reviewed 11 hallmark phenotypes of cancer, multiple priority target sites for disruption in each area and prototypical chemical disruptors for all targets; this review included dose-response characterizations, evidence of low-dose effects and cross-hallmark effects for all targets and chemicals. In total, 85 examples of chemicals were reviewed for actions on key pathways/mechanisms related to carcinogenesis. Only 15% (13/85) were found to have evidence of a dose-response threshold, whereas 59% (50/85) exerted low-dose effects. No dose-response information was found for the remaining 26% (22/85). Our analysis suggests that the cumulative effects of individual (non-carcinogenic) chemicals acting on different pathways, and a variety of related systems, organs, tissues and cells, could plausibly conspire to produce carcinogenic synergies. Additional basic research on carcinogenesis and research focused on low-dose effects of chemical mixtures needs to be rigorously pursued before the merits of this hypothesis can be further advanced. However, the structure of the World Health Organization International Programme on Chemical Safety 'Mode of Action' framework should be revisited, as it has inherent weaknesses that are not fully aligned with our current understanding of cancer biology.
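The reported percentages follow directly from the tallies over the 85 reviewed chemicals; a quick arithmetic check, using only the counts quoted in the abstract:

```python
# Counts taken from the abstract itself.
total = 85
with_threshold = 13   # evidence of a dose-response threshold
low_dose = 50         # exerted low-dose effects
no_info = total - with_threshold - low_dose  # remaining chemicals

# Rounded shares of the 85 chemicals, matching the quoted 15%, 59%, 26%.
percentages = [round(100 * n / total)
               for n in (with_threshold, low_dose, no_info)]
```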