711 results for Localization real-world challenges
Abstract:
This paper examines the lead–lag relationship between the FTSE 100 index and index futures prices employing a number of time series models. Using 10-minute observations from June 1996 to 1997, it is found that lagged changes in the futures price can help to predict changes in the spot price. The best forecasting model is of the error correction type, allowing for the theoretical difference between spot and futures prices according to the cost of carry relationship. This predictive ability is in turn utilised to derive a trading strategy which is tested under real-world conditions to search for systematic profitable trading opportunities. It is revealed that although the model forecasts produce significantly higher returns than a passive benchmark, the model is unable to outperform the benchmark after allowing for transaction costs.
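The two relationships named above can be sketched generically as follows; this is a textbook illustration, not the paper's estimated equations, and the symbols (spot price S_t, futures price F_t, net cost-of-carry rate r - d, lag length p) are assumptions:

```latex
% Cost-of-carry fair value of the futures contract (continuous compounding)
F_t = S_t\, e^{(r-d)(T-t)}

% Error correction model for spot returns: the lagged spot-futures
% mispricing z_{t-1} acts as the equilibrium-correction term
z_{t-1} = \ln F_{t-1} - \ln S_{t-1} - (r-d)(T-t+1)
\Delta \ln S_t = \alpha + \gamma\, z_{t-1}
  + \sum_{i=1}^{p} \beta_i\, \Delta \ln S_{t-i}
  + \sum_{i=1}^{p} \delta_i\, \Delta \ln F_{t-i} + \varepsilon_t
```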
Abstract:
One of the most challenging tasks in financial management for large governmental and industrial organizations is Planning and Budgeting (P&B). The processes involved in P&B are cost- and time-intensive, especially when dealing with uncertainties and budget adjustments during the planning horizon. This work builds on our previous research, in which we proposed and evaluated a fuzzy approach that allows the budget to be optimized interactively beyond the initial planning stage. In this research we propose an extension that handles financial stress (i.e. drastic budget cuts) occurring during the budget period. This is done by introducing fuzzy stress parameters which are used to re-distribute the budget in order to minimize the negative impact of the financial stress. The benefits and possible issues of this approach are analyzed critically using a real-world case study from the Nuremberg Institute of Technology (NIT). Additionally, ongoing and future research directions are presented.
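The abstract does not spell out how the fuzzy stress parameters re-distribute the budget, so the sketch below is purely illustrative: it assumes a triangular fuzzy stress parameter that is defuzzified and then spread across budget lines in proportion to a flexibility weight. All names and the redistribution rule are hypothetical, not the NIT model.

```python
# Illustrative sketch only: triangular fuzzy stress parameter and a simple
# proportional re-distribution of a budget cut across flexible budget lines.

def defuzzify_triangular(a, m, b):
    """Centroid of a triangular fuzzy number (a, m, b)."""
    return (a + m + b) / 3.0

def redistribute(budgets, flexibility, stress):
    """Scale down each budget line in proportion to its flexibility weight.

    budgets:     dict of line -> planned amount
    flexibility: dict of line -> weight in [0, 1] (0 = fixed, 1 = fully cuttable)
    stress:      total amount that must be cut (e.g. the defuzzified stress)
    """
    total_flexible = sum(budgets[k] * flexibility[k] for k in budgets)
    return {
        k: budgets[k] - stress * (budgets[k] * flexibility[k]) / total_flexible
        for k in budgets
    }

# Example: a drastic mid-year cut expressed as a fuzzy number (low, modal, high)
cut = defuzzify_triangular(80_000, 100_000, 130_000)
plan = {"teaching": 500_000, "research": 300_000, "facilities": 200_000}
flex = {"teaching": 0.2, "research": 0.6, "facilities": 0.9}
print(redistribute(plan, flex, cut))
```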
Abstract:
Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in iterative parallel data mining algorithms. In particular, the analysis focuses on one of the most influential and popular data mining methods, the k-means algorithm for cluster analysis. The straightforward parallel formulation of the k-means algorithm requires a global reduction operation at each iteration step, which hinders its scalability. This work studies a different parallel formulation of the algorithm in which the requirement of global communication can be relaxed while still providing the exact solution of the centralised k-means algorithm. The proposed approach exploits a non-uniform data distribution which can either be found in real-world distributed applications or be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error, which allows a further reduction of the communication costs.
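A minimal sketch of the straightforward data-parallel formulation described above, in which every iteration ends with a global reduction of per-partition centroid sums and counts; mpi4py is assumed as the communication layer, and the paper's relaxed formulation would replace this allreduce with more localised communication:

```python
# Sketch of the baseline parallel k-means with one global reduction per
# iteration (the scalability bottleneck discussed in the abstract).
import numpy as np
from mpi4py import MPI

def parallel_kmeans(local_points, centroids, n_iter=10):
    comm = MPI.COMM_WORLD
    k, dim = centroids.shape
    for _ in range(n_iter):
        # Local step: assign each local point to its nearest centroid.
        dists = np.linalg.norm(local_points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)

        # Local partial sums and counts for each cluster.
        sums = np.zeros((k, dim))
        counts = np.zeros(k)
        for j in range(k):
            members = local_points[labels == j]
            sums[j] = members.sum(axis=0)
            counts[j] = len(members)

        # Global reduction: every process needs the global sums and counts.
        global_sums = np.empty_like(sums)
        global_counts = np.empty_like(counts)
        comm.Allreduce(sums, global_sums, op=MPI.SUM)
        comm.Allreduce(counts, global_counts, op=MPI.SUM)

        nonempty = global_counts > 0
        centroids[nonempty] = global_sums[nonempty] / global_counts[nonempty, None]
    return centroids, labels
```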
Abstract:
Understanding how and why the capability of one set of business resources, with its structural arrangements and mechanisms, works compared to another can provide competitive advantage in terms of new business processes and product and service development. However, most business models of capability are descriptive and lack a formal modelling language with which to compare capabilities qualitatively and quantitatively. Gibson's theory of affordance, the potential for action, provides a formal basis for a more robust and quantitative model, but most formal affordance models are complex and abstract and lack support for real-world applications. We aim to understand the 'how' and 'why' of business capability by developing a quantitative and qualitative model that underpins earlier work on Capability-Affordance Modelling (CAM). This paper integrates an affordance-based capability model with the formalism of Coloured Petri Nets to develop a simulation model. Using the model, we show how capability depends on the space-time path of interacting resources, the mechanism of transition and specific critical affordance factors relating to the values of the variables for resources, people and physical objects. We show how the model can identify the resource capabilities that enable the capability to inject a drug and anaesthetise a patient.
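The Coloured Petri Net formalism mentioned above can be illustrated with a minimal token game: a transition fires only when its input places hold tokens whose attribute values ('colours') satisfy a guard. The places, token attributes and guard below are hypothetical and are not the paper's CAM model:

```python
# Minimal coloured-Petri-net-style sketch: a transition fires when its input
# places contain tokens whose attributes satisfy a guard (affordance factors).

places = {
    "clinician": [{"skill": "injection", "available": True}],
    "syringe":   [{"drug": "anaesthetic", "dose_ml": 5}],
    "patient":   [{"consented": True, "anaesthetised": False}],
    "anaesthetised_patient": [],
}

def guard(clinician, syringe, patient):
    # Critical affordance factors: right skill, right drug, consent given.
    return (clinician["skill"] == "injection"
            and syringe["drug"] == "anaesthetic"
            and patient["consented"])

def fire_inject():
    """Fire the 'inject drug' transition if the guard holds on the input tokens."""
    if not (places["clinician"] and places["syringe"] and places["patient"]):
        return False
    c, s, p = places["clinician"][0], places["syringe"][0], places["patient"][0]
    if not guard(c, s, p):
        return False
    # Consume the input tokens and produce the output token.
    places["clinician"].pop(0)
    places["syringe"].pop(0)
    places["patient"].pop(0)
    p["anaesthetised"] = True
    places["anaesthetised_patient"].append(p)
    return True

print(fire_inject(), places["anaesthetised_patient"])
```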
Abstract:
As the fidelity of virtual environments (VE) continues to increase, the possibility of using them as training platforms is becoming increasingly realistic for a variety of application domains, including military and emergency personnel training. In the past, there was much debate on whether the acquisition and subsequent transfer of spatial knowledge from VEs to the real world is possible, or whether the differences in medium during training would essentially be an obstacle to truly learning geometric space. In this paper, the authors present various cognitive and environmental factors that not only contribute to this process, but also interact with each other to a certain degree, leading to a variable exposure time requirement in order for the process of spatial knowledge acquisition (SKA) to occur. The cognitive factors that the authors discuss include a variety of individual user differences, such as knowledge and experience; cognitive gender differences; aptitude and spatial orientation skill; and cognitive styles. Environmental factors discussed include size, spatial layout complexity and landmark distribution. It may seem obvious that, since every individual's brain is unique, not only through experience but also through genetic predisposition, a one-size-fits-all approach to training would be illogical. Furthermore, considering that various cognitive differences may further emerge when a certain stimulus is present (e.g. complex environmental space), it makes even more sense to understand how these factors can impact spatial memory, and to try to adapt the training session by providing visual/auditory cues as well as by changing the exposure time requirements for each individual. The impact of this research domain is important to VE training in general; however, within service and military domains, guaranteeing appropriate spatial training is critical in order to ensure that disorientation does not occur in a life-or-death scenario.
Abstract:
The extent of the surface area that is sunlit is critical for radiative energy exchanges and therefore for a wide range of applications that require urban land surface models (ULSM), ranging from human comfort to weather forecasting. Here a computationally demanding shadow-casting algorithm is used to assess the capability of a simple single-layer urban canopy model, which assumes an infinitely long rotating canyon (ILC), to reproduce sunlit areas on roofs and roads over central London. Results indicate that the sunlit road areas are well represented but somewhat smaller using an ILC, while sunlit roof areas are consistently larger, especially for dense urban areas. The largest deviations from real-world sunlit areas are found for roofs during mornings and evenings. There are also indications that sunlit fractions on walls are overestimated using an ILC during mornings and evenings. The implications of these errors depend on the application targeted. For example, (independent of albedo) ULSMs used in numerical weather prediction applying an ILC representation of the urban form will overestimate outgoing shortwave radiation from roofs due to the overestimation of the sunlit fraction of the roofs. Complications of deriving height-to-width ratios from real-world data are also discussed.
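The canyon geometry behind this comparison can be illustrated with the standard shadowing relation for a single infinitely long canyon (a generic geometric relation, not the shadow-casting algorithm used in the paper), where h/w is the canyon height-to-width ratio, theta the solar zenith angle and psi the angle between the solar azimuth and the canyon axis:

```latex
% Sunlit fraction of the canyon floor (road) for one canyon orientation
\chi_{\mathrm{road}} = \max\!\left(0,\; 1 - \frac{h}{w}\,\tan\theta\,\lvert\sin\psi\rvert\right)
% An ILC scheme averages such fractions over all orientations \psi, which is
% where deviations from the real, non-canyon roof and wall geometry arise.
```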
Abstract:
Background: Models of the development and maintenance of childhood anxiety suggest an important role for parent cognitions: that is, negative expectations of children's coping abilities lead to parenting behaviors that maintain child anxiety. The primary aims of the current study were to (1) compare expectations of child vulnerability and coping among mothers of children with anxiety disorders on the basis of whether or not mothers also had a current anxiety disorder, and (2) examine the degree to which the association between maternal anxiety disorder status and child coping expectations was mediated by how mothers interpreted ambiguous material that referred to their own experience. Methods: The association between interpretations of threat, negative emotion, and control was assessed using hypothetical ambiguous scenarios in a sample of 271 anxious and nonanxious mothers of 7- to 12-year-old children with an anxiety disorder. Mothers also rated their expectations when presented with real-life challenge tasks. Results: There was a significant association between maternal anxiety disorder status and negative expectations of child coping behaviors. Mothers' self-referent interpretations were found to mediate this relationship. Responses to ambiguous hypothetical scenarios correlated significantly with responses to real-life challenge tasks. Conclusions: Treatments for childhood anxiety disorders in the context of parental anxiety disorders may benefit from the inclusion of a component to directly address parental cognitions. Some inconsistencies were found when comparing maternal expectations in response to hypothetical scenarios with real-life challenges. This should be addressed in future research.
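The mediation analysis described can be sketched with the classic three-regression approach; the data below are simulated and the variable names are stand-ins for the study's measures, not its actual model:

```python
# Illustrative Baron-and-Kenny-style mediation sketch with simulated data:
# X = maternal anxiety status, M = self-referent threat interpretation,
# Y = negative expectation of child coping.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 271
x = rng.integers(0, 2, n).astype(float)          # anxious vs non-anxious mother
m = 0.6 * x + rng.normal(size=n)                 # mediator
y = 0.5 * m + 0.1 * x + rng.normal(size=n)       # outcome

total  = sm.OLS(y, sm.add_constant(x)).fit()                        # c path
a_path = sm.OLS(m, sm.add_constant(x)).fit()                        # a path
direct = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit()  # c' and b

print("total effect c  :", total.params[1])
print("indirect a*b    :", a_path.params[1] * direct.params[2])
print("direct effect c':", direct.params[1])
```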
Abstract:
It is widely accepted that there is a gap between design energy and real-world operational energy consumption. The behaviour of occupants is often cited as an important factor influencing building energy performance. However, its consideration, both during design and operation, is overly simplistic, often assuming a direct link between attitudes and behaviour. Alternative models of decision making from psychology highlight a range of additional influential factors and emphasise that occupants do not always act in a rational manner. Developing a better understanding of occupant decision making could help inform office energy conservation campaigns as well as models of behaviour employed during the design process. This paper assesses the contribution of various behavioural constructs to small power consumption in offices. The method is based upon the Theory of Planned Behaviour (TPB), which assumes that intention is driven by three factors: attitude, subjective norms, and perceived behavioural control. We also consider a fourth construct, habit, measured through the Self-Report Habit Index (SRHI). A questionnaire was issued to 81 participants in two UK offices. Questionnaire results for each behavioural construct were correlated against each participant's individual workstation electricity consumption. The intentional processes proposed by TPB could not account for the observed differences in occupants' interactions with small power appliances. Instead, occupants were interacting with small power "automatically", with habit accounting for 11% of the variation in workstation energy consumption. The implications for occupant behaviour models and employee engagement campaigns are discussed.
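The construct-versus-consumption analysis can be sketched as a simple per-participant correlation; the data below are simulated, and the only link to the study is that variance explained by a single predictor equals the squared correlation (11% corresponds to r of roughly 0.33):

```python
# Sketch: correlate a behavioural construct score (here the SRHI habit score)
# with metered workstation electricity use. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 81
srhi = rng.normal(size=n)                    # habit score per participant
kwh = 0.33 * srhi + rng.normal(size=n)       # workstation consumption (z-scored)

r, p = stats.pearsonr(srhi, kwh)
print(f"r = {r:.2f}, variance explained = {r**2:.2%}, p = {p:.3f}")
```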
Abstract:
The article argues for a broader conception of bioethics. The principles that dominate current thinking are generally individualistic and do not represent the real world inhabited by patients, doctors, hospitals and the NHS as a whole. Rather than focus almost exclusively on the micro-end of the analytical spectrum, bioethics, and medical lawyers in particular, should take the institutional dimensions of health and health care more seriously, i.e. use a telescope to understand the dynamics that drive the subject, not just a microscope.
Abstract:
Presents an interview with Elizabeth Nunez, author and professor. Nunez discusses the issues of migration, family, and intimacy which are the topics of her novel "Anna In-Between." She explains the demands of the publishing industry that cast a shadow over the world of the novel and the real world of Caribbean writers. This interview was translated by Maria Lusia Ruiz.
Abstract:
This paper uses a novel numerical optimization technique - robust optimization - that is well suited to solving the asset-liability management (ALM) problem for pension schemes. It requires the estimation of fewer stochastic parameters, reduces estimation risk and adopts a prudent approach to asset allocation. This study is the first to apply it to a real-world pension scheme, and the first ALM model of a pension scheme to maximise the Sharpe ratio. We disaggregate pension liabilities into three components (active members, deferred members and pensioners) and transform the optimal asset allocation into the scheme's projected contribution rate. The robust optimization model is extended to include liabilities and used to derive optimal investment policies for the Universities Superannuation Scheme (USS), benchmarked against the Sharpe and Tint, Bayes-Stein, and Black-Litterman models as well as the actual USS investment decisions. Over a 144-month out-of-sample period, robust optimization is superior to the four benchmarks across 20 performance criteria and has a remarkably stable asset allocation, essentially fix-mix. These conclusions are supported by six robustness checks.
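The kind of problem described can be illustrated with a generic robust Sharpe-ratio formulation over an ellipsoidal uncertainty set for expected returns; this is a textbook form of robust optimization, not the paper's ALM model, which additionally incorporates the three liability components:

```latex
% Nominal Sharpe-ratio maximisation over feasible portfolio weights w
\max_{w \in \mathcal{W}} \;\; \frac{\mu^{\top} w - r_f}{\sqrt{w^{\top} \Sigma\, w}}

% Robust counterpart: expected returns are only known to lie in an
% ellipsoidal uncertainty set around the estimate \hat{\mu}
\max_{w \in \mathcal{W}} \;
  \min_{\mu:\,(\mu - \hat{\mu})^{\top} \Omega^{-1} (\mu - \hat{\mu}) \le \kappa^{2}}
  \frac{\mu^{\top} w - r_f}{\sqrt{w^{\top} \Sigma\, w}}
\;=\;
\max_{w \in \mathcal{W}} \;
  \frac{\hat{\mu}^{\top} w - r_f - \kappa \sqrt{w^{\top} \Omega\, w}}{\sqrt{w^{\top} \Sigma\, w}}
```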
Abstract:
Payment cards are a useful device for measuring subjects' preferences for a good and especially their willingness to pay for it. Together with some other similar elicitation methods, payment cards are especially appropriate for both hypothetical and incentive-compatible valuations of a good, a property which has prompted many researchers to use them in studies comparing stated and revealed valuations. The Strategy Method (hereafter SM) is based on a similar principle to that of payment cards, but is aimed at eliciting a subject's full profile of responses to each of the strategies available to the rival(s).
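As a purely hypothetical sketch, the two elicitation devices can be contrasted as follows: a payment card brackets willingness to pay between the highest amount accepted and the next amount on the card, while the strategy method collects one conditional response per possible action of the rival(s). The amounts and helper names are illustrative only:

```python
# Hypothetical contrast of payment-card and strategy-method elicitation.

CARD = [0, 1, 2, 5, 10, 20, 50, 100]   # amounts printed on the payment card

def wtp_interval(highest_amount_accepted):
    """Return the (lower, upper) bound on WTP implied by the card response."""
    i = CARD.index(highest_amount_accepted)
    upper = CARD[i + 1] if i + 1 < len(CARD) else float("inf")
    return highest_amount_accepted, upper

def strategy_method_profile(respond, rival_actions):
    """Collect a conditional response for every possible rival action."""
    return {action: respond(action) for action in rival_actions}

print(wtp_interval(10))                                  # -> (10, 20)
print(strategy_method_profile(lambda offer: offer >= 5,  # accept offers >= 5
                              rival_actions=[1, 5, 10]))
```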
Abstract:
The eye movements of 24 children and 24 adults were monitored to compare how they read sentences containing plausible, implausible, and anomalous thematic relations. In the implausible condition the incongruity occurred due to the incompatibility of two objects involved in the event denoted by the main verb. In the anomalous condition the direct object of the verb was not a possible verb argument. Adults exhibited immediate disruption with the anomalous sentences as compared to the implausible sentences, as indexed by longer gaze durations on the target word. Children exhibited the same pattern of effects as adults as far as the anomalous sentences were concerned, but exhibited delayed effects of implausibility. These data indicate that while children and adults are alike in their basic thematic assignment processes during reading, children may be delayed in the efficiency with which they are able to integrate pragmatic and real-world knowledge into their discourse representation.
Abstract:
The subject of climate feedbacks focuses attention on global mean surface air temperature (GMST) as the key metric of climate change. But what does knowledge of past and future GMST tell us about the climate of specific regions? In the context of the ongoing UNFCCC process, this is an important question for policy-makers as well as for scientists. The answer depends on many factors, including the mechanisms causing changes, the timescale of the changes, and the variables and regions of interest. This paper provides a review and analysis of the relationship between changes in GMST and changes in local climate, first in observational records and then in a range of climate model simulations, which are used to interpret the observations. The focus is on decadal timescales, which are of particular interest in relation to recent and near-future anthropogenic climate change. It is shown that GMST primarily provides information about forced responses, but that understanding and quantifying internal variability is essential to projecting climate and climate impacts on regional-to-local scales. The relationship between local forced responses and GMST is often linear but may be nonlinear, and can be greatly complicated by competition between different forcing factors. Climate projections are limited not only by uncertainties in the signal of climate change but also by uncertainties in the characteristics of real-world internal variability. Finally, it is shown that the relationship between GMST and local climate provides a simple approach to climate change detection, and a useful guide to attribution studies.
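The (often linear) local-to-global relationship examined here is commonly summarised by a pattern-scaling regression of a local anomaly on the GMST anomaly; the sketch below uses synthetic decadal means, so the scaling factor and residual spread are illustrative only:

```python
# Pattern-scaling sketch: regress a local decadal-mean anomaly on the global
# mean surface air temperature (GMST) anomaly. Residual scatter stands in for
# internal variability; all numbers are synthetic.
import numpy as np

rng = np.random.default_rng(2)
gmst = np.linspace(0.0, 1.2, 15)                             # decadal GMST anomalies (K)
local = 1.4 * gmst + rng.normal(scale=0.15, size=gmst.size)  # local anomalies (K)

beta, intercept = np.polyfit(gmst, local, 1)
residual_sd = np.std(local - (beta * gmst + intercept))
print(f"local change ~ {beta:.2f} K per K of GMST; "
      f"internal-variability residual SD ~ {residual_sd:.2f} K")
```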
Abstract:
Individual-based models (IBMs) can simulate the actions of individual animals as they interact with one another and the landscape in which they live. When used in spatially explicit landscapes, IBMs can show how populations change over time in response to management actions. For instance, IBMs are being used to design strategies of conservation and of the exploitation of fisheries, and for assessing the effects on populations of major construction projects and of novel agricultural chemicals. In such real-world contexts, it becomes especially important to build IBMs in a principled fashion, and to approach calibration and evaluation systematically. We argue that insights from physiological and behavioural ecology offer a recipe for building realistic models, and that Approximate Bayesian Computation (ABC) is a promising technique for the calibration and evaluation of IBMs. IBMs are constructed primarily from knowledge about individuals. In ecological applications the relevant knowledge is found in physiological and behavioural ecology, and we approach these from an evolutionary perspective by taking into account how physiological and behavioural processes contribute to life histories, and how those life histories evolve. Evolutionary life history theory shows that, other things being equal, organisms should grow to sexual maturity as fast as possible, and then reproduce as fast as possible, while minimising per capita death rate. Physiological and behavioural ecology are largely built on these principles together with the laws of conservation of matter and energy. To complete the construction of an IBM, information is also needed on the effects of competitors, conspecifics and food scarcity; the maximum rates of ingestion, growth and reproduction; and life-history parameters. Using this knowledge about physiological and behavioural processes provides a principled way to build IBMs, but model parameters vary between species and are often difficult to measure. A common solution is to manually compare model outputs with observations from real landscapes and so obtain parameters which produce acceptable fits of model to data. However, this procedure can be convoluted and lead to over-calibrated and thus inflexible models. Many formal statistical techniques are unsuitable for use with IBMs, but we argue that ABC offers a potential way forward. It can be used to calibrate and compare complex stochastic models and to assess the uncertainty in their predictions. We describe methods used to implement ABC in an accessible way and illustrate them with examples and discussion of recent studies. Although much progress has been made, theoretical issues remain, and some of these are outlined and discussed.
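The ABC technique advocated above can be illustrated with a basic rejection sampler: draw parameters from their priors, run the model, and keep draws whose simulated summary statistics fall within a tolerance of the observed ones. The toy population model, priors and summary statistic below are stand-ins for a full IBM:

```python
# Rejection-ABC sketch for calibrating an individual-based model (IBM).
import numpy as np

rng = np.random.default_rng(3)

def simulate(growth_rate, mortality, n_years=20):
    """Toy stochastic population model standing in for a full IBM."""
    pop = 100.0
    for _ in range(n_years):
        pop *= max(0.0, 1.0 + growth_rate - mortality + rng.normal(scale=0.05))
    return pop

def summary(pop):
    return np.log(pop + 1.0)               # single summary statistic

observed = summary(simulate(0.3, 0.1))     # pretend field observation

def rejection_abc(n_draws=20_000, tolerance=0.2):
    accepted = []
    for _ in range(n_draws):
        growth = rng.uniform(0.0, 0.6)     # prior on growth rate
        mort = rng.uniform(0.0, 0.3)       # prior on per-capita mortality
        if abs(summary(simulate(growth, mort)) - observed) < tolerance:
            accepted.append((growth, mort))
    return np.array(accepted)

posterior = rejection_abc()
print(posterior.shape[0], "accepted draws;",
      "posterior means:", posterior.mean(axis=0))
```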