Abstract:
The problem of spurious excitation of gravity waves in the context of four-dimensional data assimilation is investigated using a simple model of balanced dynamics. The model admits a chaotic vortical mode coupled to a comparatively fast gravity wave mode, and can be initialized such that the model evolves on a so-called slow manifold, where the fast motion is suppressed. Identical twin assimilation experiments are performed, comparing the extended and ensemble Kalman filters (EKF and EnKF, respectively). The EKF uses a tangent linear model (TLM) to estimate the evolution of forecast error statistics in time, whereas the EnKF uses the statistics of an ensemble of nonlinear model integrations. Specifically, the case is examined where the true state is balanced, but observation errors project onto all degrees of freedom, including the fast modes. It is shown that the EKF and EnKF will assimilate observations in a balanced way only if certain assumptions hold, and that, outside of ideal cases (i.e., with very frequent observations), dynamical balance can easily be lost in the assimilation. For the EKF, the repeated adjustment of the covariances by the assimilation of observations can easily unbalance the TLM, and destroy the assumptions on which balanced assimilation rests. It is shown that an important factor is the choice of initial forecast error covariance matrix. A balance-constrained EKF is described and compared to the standard EKF, and shown to offer significant improvement for observation frequencies where balance in the standard EKF is lost. The EnKF is advantageous in that balance in the error covariances relies only on a balanced forecast ensemble, and that the analysis step is an ensemble-mean operation. Numerical experiments show that the EnKF may be preferable to the EKF in terms of balance, though its validity is limited by ensemble size. 
It is also found that overobserving can lead to a more unbalanced forecast ensemble and thus to an unbalanced analysis.
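The analysis step of the EnKF compared above can be sketched generically. The following perturbed-observation update is a textbook EnKF in Python/NumPy, not the paper's balanced model; the state size, ensemble size, observation operator H and observation-error variance in the usage example are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_analysis(ensemble, obs, obs_err_var, H):
    """Perturbed-observation EnKF analysis step.

    ensemble : (n_state, n_members) forecast ensemble
    obs      : (n_obs,) observation vector
    H        : (n_obs, n_state) observation operator
    """
    n_members = ensemble.shape[1]
    # forecast error covariance estimated from the ensemble spread
    X = ensemble - ensemble.mean(axis=1, keepdims=True)
    Pf = X @ X.T / (n_members - 1)
    R = obs_err_var * np.eye(H.shape[0])
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)  # Kalman gain
    # perturb the observations so the analysis ensemble keeps the right spread
    obs_pert = obs[:, None] + rng.normal(0.0, np.sqrt(obs_err_var),
                                         size=(H.shape[0], n_members))
    return ensemble + K @ (obs_pert - H @ ensemble)
```

Because the gain is built from the ensemble's own sample covariance, balance in the analysis depends only on the forecast ensemble remaining balanced, which is the property the abstract highlights as the EnKF's advantage over the TLM-based EKF.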
Abstract:
The absorption spectra of phytoplankton in the visible domain hold implicit information on the phytoplankton community structure. Here we use this information to retrieve quantitative information on phytoplankton size structure by developing a novel method to compute the exponent of an assumed power-law for their particle-size spectrum. This quantity, in combination with total chlorophyll-a concentration, can be used to estimate the fractional concentration of chlorophyll in any arbitrarily defined size class of phytoplankton. We further define and derive expressions for two distinct measures of cell size of mixed populations, namely, the average spherical diameter of a bio-optically equivalent homogeneous population of cells of equal size, and the average equivalent spherical diameter of a population of cells that follow a power-law particle-size distribution. The method relies on measurements of two quantities of a phytoplankton sample: the concentration of chlorophyll-a, which is an operational index of phytoplankton biomass, and the total absorption coefficient of phytoplankton at the red peak of the visible spectrum (676 nm). A sensitivity analysis confirms that the relative errors in the estimates of the exponent of particle size spectra are reasonably low. The exponents of phytoplankton size spectra, estimated for a large set of in situ data from a variety of oceanic environments (~ 2400 samples), are within a reasonable range, and the estimated fractions of chlorophyll in pico-, nano- and micro-phytoplankton are generally consistent with those obtained by an independent, indirect method based on diagnostic pigments determined using high-performance liquid chromatography. The estimates of cell size for in situ samples dominated by different phytoplankton types (diatoms, prymnesiophytes, Prochlorococcus, other cyanobacteria and green algae) yield nominal sizes consistent with the taxonomic classification.
To estimate the same quantities from satellite-derived ocean-colour data, we combine our method with algorithms for obtaining inherent optical properties from remote sensing. The spatial distribution of the size-spectrum exponent and the chlorophyll fractions of pico-, nano- and micro-phytoplankton estimated from satellite remote sensing are in agreement with the current understanding of the biogeography of phytoplankton functional types in the global oceans. This study contributes to our understanding of the distribution and time evolution of phytoplankton size structure in the global oceans.
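As a concrete illustration of the power-law step, the fraction of chlorophyll in a size class follows from integrating the assumed size distribution. The sketch below assumes chlorophyll per cell proportional to cell volume (D³) and illustrative diameter bounds of 0.2–50 µm; both are my assumptions for illustration, not the paper's calibration:

```python
import math

def chl_fraction(d1, d2, xi, d_min=0.2, d_max=50.0):
    """Fraction of total chlorophyll held by cells with diameters in
    [d1, d2] µm, for a power-law size distribution N(D) ~ D**(-xi) and
    chlorophyll per cell ~ D**3, normalised over [d_min, d_max]."""
    def integral(a, b):
        p = 4.0 - xi  # exponent of the antiderivative of D**(3 - xi)
        if abs(p) < 1e-12:
            return math.log(b / a)  # limiting case xi = 4
        return (b ** p - a ** p) / p
    return integral(d1, d2) / integral(d_min, d_max)
```

With an exponent ξ = 4, for example, the picophytoplankton class (0.2–2 µm) holds ln(10)/ln(250) ≈ 42% of total chlorophyll under these assumptions; steeper spectra (larger ξ) shift the fraction further toward the small cells.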
Abstract:
Forgetting immediate physical reality and being aware of one's location in the simulated world is critical to enjoyment and performance in virtual environments, be it an interactive 3D game such as Quake or an online virtual 3D community space such as Second Life. The answer to the question "where am I?" operates at two levels - whether the perceived locus is in the immediate real world as opposed to the virtual world, and whether one is aware of the spatial co-ordinates of that locus - and holds the key to any virtual 3D experience. While 3D environments, especially virtual environments, and their impact on spatial comprehension have been studied in disciplines such as architecture, it is difficult to determine the relative contributions of specific attributes such as screen size or stereoscopy towards spatial comprehension, since most studies treat the technology as a monolith (box-centered). Using as its theoretical basis the variable-centered approach put forth by Nass and Mason (1990), which breaks the technology down into its component variables and their corresponding values, this paper looks at the contributions of five variables common to most virtual environments (stereoscopy, screen size, field of view, level of realism and level of detail) to spatial comprehension and presence. The variable-centered approach can be daunting, as an increase in the number of variables can exponentially increase the number of conditions and resources required. We overcome this drawback of adopting such a theoretical approach by using a fractional factorial design for the experiment. This study has completed the first wave of data collection; the next phase starts in January 2007 and is expected to complete by February 2007. Theoretical and practical implications of the study are discussed.
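The fractional factorial device mentioned above can be illustrated with a standard two-level half-fraction. The 2^(5-1) construction below, with the fifth factor generated from the defining relation E = ABCD, is a generic textbook example and an assumption on my part, not necessarily the exact design the authors adopted:

```python
from itertools import product

def half_fraction_2_5():
    """2^(5-1) fractional factorial design: enumerate the 16 runs of a
    full 2^4 design in factors A-D, then generate the fifth factor from
    the defining relation E = ABCD (levels coded as -1/+1)."""
    return [(a, b, c, d, a * b * c * d)
            for a, b, c, d in product((-1, 1), repeat=4)]
```

This gives sixteen runs instead of the 32 of a full 2^5 design, at the price of aliasing E with the four-factor interaction ABCD - which is exactly the kind of resource saving the variable-centered approach needs.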
Abstract:
The present study addressed the hypothesis that emotional stimuli relevant to survival or reproduction (biologically emotional stimuli) automatically affect cognitive processing (e.g., attention, memory), while those relevant to social life (socially emotional stimuli) require elaborative processing to modulate attention and memory. Results of our behavioral studies showed that (1) biologically emotional images hold attention more strongly than do socially emotional images, (2) memory for biologically emotional images was enhanced even with limited cognitive resources, but (3) memory for socially emotional images was enhanced only when people had sufficient cognitive resources at encoding. Neither images’ subjective arousal nor their valence modulated these patterns. A subsequent functional magnetic resonance imaging study revealed that biologically emotional images induced stronger activity in the visual cortex and greater functional connectivity between the amygdala and visual cortex than did socially emotional images. These results suggest that the interconnection between the amygdala and visual cortex supports enhanced attention allocation to biological stimuli. In contrast, socially emotional images evoked greater activity in the medial prefrontal cortex (MPFC) and yielded stronger functional connectivity between the amygdala and MPFC than did biological images. Thus, it appears that emotional processing of social stimuli involves elaborative processing requiring frontal lobe activity.
Abstract:
This chapter explores the politics around the role of agency in the UK climate change debate. Government interventions on the demand side of consumption have increasingly involved attempts to obtain greater traction with the values, attitudes and beliefs of citizens in relation to climate change and also in terms of influencing consumer behaviour at an individual level. With figures showing that approximately 40% of the UK’s carbon emissions are attributable to household and transport behaviour, policy initiatives have progressively focused on the facilitation of “sustainable behaviours”. Evidence suggests, however, that mobilisation of pro-environmental attitudes in addressing the perceived “value-action gap” has so far had limited success. Research in this field suggests that there is a more significant and nuanced “gap” between context and behaviour; a relationship that perhaps provides a more adroit reflection of reasons why people do not necessarily react in the way that policy-makers anticipate. Tracing the development of the UK Government’s behaviour change agenda over the last decade, we posit that a core reason for the limitations of this programme relates to an excessively narrow focus on the individual. This has served to obscure some of the wider political and economic aspects of the debate in favour of a more simplified discussion. The second part of the chapter reports findings from a series of focus groups exploring some of the wider political views that people hold around household energy habits, purchase and use of domestic appliances, and transport behaviour - and discusses these insights in relation to the literature on the agenda’s apparent limitations. The chapter concludes by considering whether the aims of the Big Society approach (recently established by the UK’s Coalition Government) hold the potential to engage more directly with some of these issues or whether they merely constitute a “repackaging” of the individualism agenda.
Abstract:
Recent research into sea ice friction has focussed on ways to provide a model which maintains much of the clarity and simplicity of Amontons' law, yet also accounts for memory effects. One promising avenue of research has been to adapt the rate- and state-dependent models which are prevalent in rock friction. In such models it is assumed that there is some fixed critical slip displacement, which is effectively a measure of the displacement over which memory effects might be considered important. Here we show experimentally that a fixed critical slip displacement is not a valid assumption in ice friction, whereas a constant critical slip time appears to hold across a range of parameters and scales. As a simple rule of thumb, memory effects persist to a significant level for 10 s. We then discuss the implications of this finding for modelling sea ice friction and for our understanding of friction in general.
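For reference, the rate- and state-dependent framework adapted from rock friction is usually written as below. This is the standard Dieterich ageing-law form in conventional rock-friction notation, not necessarily the authors' exact formulation; the paper's finding amounts to replacing the fixed critical slip displacement $D_c$ with one set by a constant critical slip time $t_c$:

```latex
\mu = \mu_0 + a \ln\left(\frac{V}{V_0}\right)
            + b \ln\left(\frac{V_0 \theta}{D_c}\right),
\qquad
\frac{\mathrm{d}\theta}{\mathrm{d}t} = 1 - \frac{V \theta}{D_c},
\qquad
D_c = V \, t_c \quad \text{with } t_c \approx 10\ \mathrm{s},
```

where $\mu$ is the friction coefficient, $V$ the slip velocity, $V_0$ a reference velocity, and $\theta$ the state variable carrying the memory effect.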
Abstract:
What are the main causes of international terrorism? Despite the meticulous examination of various candidate explanations, existing estimates still diverge in sign, size, and significance. This article puts forward a novel explanation and supporting evidence. We argue that domestic political instability provides the learning environment needed to successfully execute international terror attacks. Using a yearly panel of 123 countries over 1973–2003, we find that the occurrence of civil wars increases fatalities and the number of international terrorist acts by 45%. These results hold for alternative indicators of political instability, estimators, subsamples, subperiods, and accounting for competing explanations.
Abstract:
This research report was commissioned by the DETR and examines valuation issues relating to leasehold enfranchisement and lease extension - the right for flat owners to collectively purchase the freehold or buy a longer lease. The two factors examined in detail are the yield to be applied when capitalising the ground rent, and the value of leases with a relatively short period left to run as against the value of the freehold or a new long lease, which determines the level of 'marriage value'. The research report will be of interest to all those involved in the valuation of residential leasehold property and those with an interest in legislative proposals for leasehold reform.
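The two quantities examined can be illustrated numerically. The sketch below uses the standard years'-purchase formula for capitalising a ground rent and the conventional definition of marriage value; all figures in the usage example are purely illustrative and do not come from the report:

```python
def years_purchase(yield_rate, years):
    """Years' purchase: present value of 1 per annum for `years` years,
    discounted at `yield_rate` (e.g. 0.08 for 8%)."""
    return (1 - (1 + yield_rate) ** -years) / yield_rate

def capitalised_ground_rent(rent, yield_rate, years):
    """Capital value of a ground rent: annual rent x years' purchase."""
    return rent * years_purchase(yield_rate, years)

def marriage_value(freehold_value, landlord_interest, leaseholder_interest):
    """Gain released by merging the freehold and leasehold interests:
    the merged (unencumbered) value less the sum of the separate ones."""
    return freehold_value - (landlord_interest + leaseholder_interest)
```

For example, a £100 p.a. ground rent capitalised at an 8% yield over a 99-year term is worth about £100 × 12.49 ≈ £1,249; the choice of yield and the short-lease relativity are exactly the two inputs the report investigates.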
Abstract:
It is increasingly important to know when energy is used in the home, at work and on the move. Issues of time and timing have not featured strongly in energy policy analysis and modelling, much of which has focused on estimating and reducing total average annual demand per capita. If smarter ways of balancing supply and demand are to take hold, and if we are to make better use of decarbonised forms of supply, it is essential to understand and intervene in patterns of societal synchronisation. This calls for detailed knowledge of when, and on what occasions, many people engage in the same activities at the same time, of how such patterns are changing, and of how they might be shaped. In addition, the impact of smart meters and controls partly depends on whether there is, in fact, scope for shifting the timing of what people do, and for changing the rhythm of the day. Is the scheduling of daily life an arena that policy can influence, and if so how? The DEMAND Centre has been linking time use, energy consumption and travel diary data as a means of addressing these questions, and in this working paper we present some of the issues and results arising from that exercise.
Abstract:
Much UK research and market practice on portfolio strategy and performance benchmarking relies on a sector‐geography subdivision of properties. Prior tests of the appropriateness of such divisions have generally relied on aggregated or hypothetical return data. However, the results found in aggregate may not hold when individual buildings are considered. This paper makes use of a dataset of individual UK property returns. A series of multivariate exploratory statistical techniques are utilised to test whether the return behaviour of individual properties conforms to their a priori grouping. The results suggest strongly that neither standard sector nor regional classifications provide a clear demarcation of individual building performance. This has important implications for both portfolio strategy and performance measurement and benchmarking. However, there do appear to be size and yield effects that help explain return behaviour at the property level.
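One simple way to probe whether an a priori sector/region grouping demarcates individual-building performance, in the spirit of the exploratory analysis described, is to compare average pairwise return correlations within and across the groups. This is a generic sketch on synthetic data, not the paper's dataset or its actual multivariate techniques:

```python
import numpy as np

def within_minus_between(returns, labels):
    """Mean pairwise return correlation within a priori groups minus the
    mean correlation across groups. returns: (n_assets, n_periods) array;
    labels: length-n_assets sequence of group ids. A value near zero
    suggests the grouping does not demarcate individual performance."""
    corr = np.corrcoef(returns)
    n = len(labels)
    within, between = [], []
    for i in range(n):
        for j in range(i + 1, n):
            (within if labels[i] == labels[j] else between).append(corr[i, j])
    return float(np.mean(within) - np.mean(between))
```

On returns driven by genuine group-level factors the statistic is large; if the labels cut across the true return drivers - the situation the paper suggests for standard sector and regional classifications - it collapses towards zero.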
Abstract:
In England at both strategic and operational levels, policy-makers in the public sector have undertaken considerable work on implementing the findings of the Every Child Matters report and subsequently through the Children's Act 2004. Legislation has resulted in many local authorities seeking to implement more holistic approaches to the delivery of children's services. At a strategic level this is demonstrated by the creation of integrated directorate structures providing for a range of services, from education to children's social care. Such services were generally under the management of the Director of Children's Services, holding statutory responsibilities for the delivery of services formerly divided into the three sectors of education, health and social services. At a national level, more fundamental policy developments have sought to establish a framework through which policy-makers can address the underlying causes of deprivation, vulnerability and inequality. The Child Poverty Act 2010, which gained Royal Assent in 2010, provides for a clear intention to reduce the number of children in poverty, acknowledging that ‘the best way to eradicate child poverty is to address the causes of poverty, rather than only treat the symptoms’. However, whilst the policy objectives of both pieces of legislation hold positive aspirations for children and young people, a change of policy direction through a change of government in May 2010 seems to be in direct contrast to the intended focus of these aims. This paper explores the impact of new government policy on the future direction of children's services at both the national and local levels. At the national level, we question the ability of the government to deliver the aspirations of the Child Poverty Act 2010, given the broad range of influences and factors that can determine the circumstances in which a child may experience poverty.
We argue that poverty is not simply an issue of the pressure of financial deprivation, but that economic recession and cuts in government spending will further increase the number of children living in poverty.
Abstract:
Background: Event-related desynchronization/synchronization (ERD/ERS) is a relative power decrease/increase of the electroencephalogram (EEG) in a specific frequency band during physical motor execution and mental motor imagery, and it is therefore widely used for brain-computer interface (BCI) purposes. However, what the ERD really reflects, and the specific roles of its frequency bands, are not yet agreed and remain under investigation. Understanding the underlying mechanism which causes a significant ERD would be crucial to improving the reliability of ERD-based BCI. We systematically investigated the relationship between the conditions of actual repetitive hand movements and the resulting ERD. Methods: Eleven healthy young participants were asked to close/open their right hand repetitively at three different speeds (Hold, 1/3 Hz, and 1 Hz) and four distinct motor loads (0, 2, 10, and 15 kgf). In each condition, participants repeated 20 experimental trials, each of which consisted of rest (8–10 s), preparation (1 s) and task (6 s) periods. Under the Hold condition, participants were instructed to keep clenching their hand (i.e., isometric contraction) during the task period. Throughout the experiment, EEG signals were recorded from left and right motor areas for offline data analysis. We obtained time courses of the EEG power spectrum to examine the modulation of mu- and beta-ERD/ERS by the task conditions. Results: We confirmed salient mu-ERD (8–13 Hz) and slightly weaker beta-ERD (14–30 Hz) on both hemispheres during repetitive hand grasping movements. According to a 3 × 4 ANOVA (speed × motor load), both mu- and beta-ERD during the task period were significantly weakened under the Hold condition, whereas neither the kinetic levels nor the interaction effect showed a significant difference. Conclusions: This study investigates the effect of changes in kinematics and kinetics on the resulting ERD during repetitive hand grasping movements.
The experimental results suggest that the strength of ERD may reflect the time differentiation of hand postures in the motor planning process, or the variation of proprioception resulting from hand movements, rather than the motor command generated downstream, which recruits a group of motor neurons.
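The ERD/ERS quantity discussed above is conventionally computed as a percentage band-power change relative to the rest period. The sketch below implements that generic definition with a plain periodogram; the sampling rate and band edges in the usage example are illustrative, and this is not the authors' actual processing pipeline:

```python
import numpy as np

def band_power(x, fs, f_lo, f_hi):
    """Mean periodogram power of signal x in the band [f_lo, f_hi] Hz."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].mean()

def erd_percent(rest, task, fs, f_lo=8.0, f_hi=13.0):
    """ERD/ERS in percent: negative values mean desynchronization
    (band power during the task below the rest-period baseline)."""
    p_rest = band_power(rest, fs, f_lo, f_hi)
    p_task = band_power(task, fs, f_lo, f_hi)
    return 100.0 * (p_task - p_rest) / p_rest
```

For instance, halving the amplitude of a 10 Hz (mu-band) oscillation during the task quarters its power, i.e., an ERD of -75%.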
Abstract:
Global warming has attracted attention from all over the world and has led to concern about carbon emissions. The Kyoto Protocol, the first major international regulatory emission trading scheme, was introduced in 1997 and outlined strategies for reducing carbon emissions (Ratnatunga et al., 2011). Reflecting the increased interest in carbon reduction, the Protocol came into force in 2005, and 191 nations have now ratified it (UNFCCC, 2012). Under cap-and-trade schemes, each company has its own carbon emission target. When a company's emissions exceed that target, it must either face fines or buy emission allowances from other companies. Thus, unlike most other social and environmental issues, carbon emissions can impose costs on companies, both in introducing low-emission equipment and systems and in purchasing allowances when they emit more than their targets. Despite the importance of carbon emissions to companies, carbon emission reporting still operates in an unregulated environment, and companies are only required to disclose when the issue is material either in value or in substance (Miller, 2005; Deegan and Rankin, 1997). Although there has been an increase in the volume of carbon emission disclosures in companies' financial reports and stand-alone social and environmental reports, made to show their concern for the environment and their social responsibility (Peters and Romi, 2009), the motivations behind corporate carbon emission disclosures, and whether such disclosures affect corporate environmental reputation and financial performance, have yet to be explored. The problems with carbon emissions lie on both the financial and non-financial sides of corporate governance. On the one hand, companies need to spend money on reducing emissions or pay penalties when they emit more than allowed. 
On the other hand, as the public is more interested in environmental issues than before, carbon emissions can also affect a company's image with regard to its environmental performance. The importance of carbon emission issues is beginning to be recognised by companies across industries as one of the critical issues in supply chain management (Lee, 2011): a study conducted by the Investor Responsibility Research Centre Institute for Corporate Responsibility (IRRCI) found that 80% of the companies analysed face carbon risks resulting from emissions in their supply chains, and that for over 80% of the companies analysed the majority of greenhouse gas (GHG) emissions come from electricity and other direct suppliers (Trucost, 2009). A review of the extant literature shows the increasing importance of carbon emission issues, a gap in the study of carbon reporting and disclosure, and a gap in work linking corporate environmental reputation and corporate financial performance with carbon reporting (Lohmann, 2009a; Ratnatunga and Balachandran, 2009; Bebbington and Larrinaga-Gonzalez, 2008). This study focuses on investigating the current status of UK carbon emission disclosures, the determinant factors of corporate carbon disclosure, and the relationship between carbon emission disclosures and the environmental reputation and financial performance of UK listed companies from 2004 to 2012, and explores the explanatory power of classical disclosure theories.
Abstract:
The question of where to locate teaching about the relationships between science and religion has produced a long-running debate. Currently, Science and Religious Education (RE) are statutory subjects in England and are taught in secondary schools by different teachers. This paper reports on an interview study in which 16 teachers gave their perceptions of their roles and responsibilities when teaching topics that bridge science and religion and the extent to which they collaborated with teachers in the other subject area. We found that in this sample, teachers reported very little collaboration between the curriculum areas. Although the science curriculum makes no mention of religion, all the science teachers said that their approaches to such topics were affected by their recognition that some pupils hold religious beliefs. All the RE teachers reported struggling to ensure students know of a range of views about how science and religion relate. The paper concludes with a discussion about implications for curriculum design and teacher training.
Abstract:
This paper provides an overview of interpolation of Banach and Hilbert spaces, with a focus on establishing when equivalence of norms is in fact equality of norms in the key results of the theory. (In brief, our conclusion for the Hilbert space case is that, with the right normalisations, all the key results hold with equality of norms.) In the final section we apply the Hilbert space results to the Sobolev spaces H^s(Ω) and H̃^s(Ω), for s ∈ R and an open Ω ⊂ R^n. We exhibit examples in one and two dimensions of sets Ω for which these scales of Sobolev spaces are not interpolation scales. In the cases when they are interpolation scales (in particular, if Ω is Lipschitz) we exhibit examples that show that, in general, the interpolation norm does not coincide with the intrinsic Sobolev norm and, in fact, the ratio of these two norms can be arbitrarily large.
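For context, the real-interpolation (K-method) norm whose normalisation is at issue reads, in standard notation (these are the textbook definitions; N_θ denotes the normalisation constant whose choice decides whether one obtains equality rather than mere equivalence of norms):

```latex
K(t, u) = \inf_{\substack{u = u_0 + u_1 \\ u_0 \in X_0,\ u_1 \in X_1}}
          \left( \| u_0 \|_{X_0}^2 + t^2 \| u_1 \|_{X_1}^2 \right)^{1/2},
\qquad
\| u \|_{(X_0, X_1)_{\theta, 2}}^2
  = N_\theta \int_0^\infty t^{-2\theta} K(t, u)^2 \, \frac{\mathrm{d}t}{t},
\quad 0 < \theta < 1.
```

In the Sobolev application one compares $(H^{s_0}(\Omega), H^{s_1}(\Omega))_{\theta,2}$ with $H^{s}(\Omega)$ for $s = (1-\theta) s_0 + \theta s_1$; the paper's examples show that even when the spaces coincide, the interpolation norm and the intrinsic Sobolev norm need not, and their ratio can be arbitrarily large.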