110 results for Localization real-world challenges
Abstract:
The reversibility of the Atlantic meridional overturning circulation (AMOC) is investigated in multi-model experiments using global climate models (GCMs) in which CO2 concentrations are increased by 1 or 2 % per annum to 2× or 4× preindustrial conditions. After a period of stabilisation, the CO2 is decreased back to preindustrial conditions. In most experiments, when the CO2 decreases, the AMOC recovers before becoming anomalously strong. This "overshoot" is up to an extra 18.2 Sv, or 104 % of its preindustrial strength, and the period with an anomalously strong AMOC can last for several hundred years. The magnitude of this overshoot is shown to be related to the build-up of salinity in the subtropical Atlantic during the previous period of high CO2 levels. The magnitude of this build-up is partly related to anthropogenic changes in the hydrological cycle. The mechanisms linking the subtropical salinity increase to the subsequent overshoot are analysed, supporting the relationship found. This understanding is used to explain differences seen in some models and scenarios. In one experiment there is no overshoot because there is little salinity build-up, partly as a result of model differences in the hydrological cycle response to increased CO2 levels and partly because of a less aggressive scenario. Another experiment has a delayed overshoot, possibly as a result of a very weak AMOC in that GCM when CO2 is high. This study identifies aspects of overshoot behaviour that are robust across a multi-model and multi-scenario ensemble, and those that differ between experiments. These results could inform an assessment of the real-world AMOC response to decreasing CO2.
Abstract:
Although Theory of International Politics is a standard-bearer for explanatory theory in international relations (IR), Waltz’s methodology has been subject to numerous quite disparate analyses. One reason why it has proved hard to pin down is that too little attention has been paid to how, in practice, Waltz approaches real-world problems. Despite his neopositivist rhetoric, Waltz applies neorealism in a notably loose, even indeterminate, fashion. There is therefore a disjunction between what he says and what he does. This is partly explained by his unsatisfactory attempt to reconcile his avowed neopositivism with his belief that international politics is characterized by organized complexity. The inconsistencies thus created also help to make sense of why competing interpretations of his methodology have emerged. Some aspects of his work do point beyond these particular methodological travails in ways that will continue to be of interest to IR theorists, but its most enduring methodological lesson may be that rhetoric and practice do not necessarily fit harmoniously together.
Abstract:
Lord Kelvin (William Thomson) made important contributions to the study of atmospheric electricity during a brief but productive period from 1859 to 1861. By 1859 Kelvin had recognised the need for “incessant recording” of atmospheric electrical parameters, and responded by inventing both the water dropper equaliser for measuring the atmospheric potential gradient (PG), and photographic data logging. The water dropper equaliser was widely adopted internationally and is still in use today. Following theoretical considerations of electric field distortion by local topography, Kelvin developed a portable electrometer, using it to investigate the PG on the Scottish island of Arran. During these environmental measurements, Kelvin may have unwittingly detected atmospheric PG changes during solar activity in August/September 1859 associated with the “Carrington event”, which is interesting in the context of his later statements that solar magnetic influence on the Earth was impossible. Kelvin’s atmospheric electricity work presents an early representative study in quantitative environmental physics, through the application of mathematical principles to an environmental problem, the design and construction of bespoke instrumentation for real-world measurements, and the recognition of limitations of the original theoretical view revealed by experimental work.
Abstract:
Students in the architecture, engineering, and construction disciplines are often challenged with visualizing and understanding the complex spatial and temporal relationships involved in designing and constructing three-dimensional (3D) structures. An evolving body of research traces the use of educational computer simulations to enhance student learning experiences through testing real-world scenarios and the development of student decision-making skills. Ongoing research at Pennsylvania State University aims to improve engineering education in construction through interactive construction project learning applications in an immersive virtual reality environment. This paper describes the first- and second-generation development of the Virtual Construction Simulator (VCS), a tool that enables students to simultaneously create and review construction schedules through 3D model interaction. The educational value and utility of VCS were assessed through surveys, focus group interviews, and a student exercise conducted in a construction management class. Results revealed that VCS is a valuable and effective four-dimensional (4D) model creation and schedule review application that fosters collaborative work and greater student task focus. This paper concludes with a discussion of the findings and the future development steps of the VCS educational simulation.
Abstract:
Forgetting immediate physical reality and being aware of one's location in the simulated world is critical to enjoyment and performance in virtual environments, be it an interactive 3D game such as Quake or an online virtual 3D community space such as Second Life. The answer to the question "where am I?" operates at two levels: whether the locus is in the immediate real world as opposed to the virtual world, and whether one is aware of the spatial coordinates of that locus; both hold the key to any virtual 3D experience. While 3D environments, especially virtual environments, and their impact on spatial comprehension have been studied in disciplines such as architecture, it is difficult to determine the relative contributions of specific attributes such as screen size or stereoscopy towards spatial comprehension, since most studies treat the technology as a monolith (box-centered). Using the variable-centered approach put forth by Nass and Mason (1990), which breaks the technology down into its component variables and their corresponding values, as its theoretical basis, this paper looks at the contributions of five variables common to most virtual environments (stereoscopy, screen size, field of view, level of realism, and level of detail) to spatial comprehension and presence. The variable-centered approach can be daunting, as an increase in the number of variables can exponentially increase the number of conditions and resources required. We overcome this drawback by using a fractional factorial design for the experiment. This study has completed the first wave of data collection; the next phase starts in January 2007 and is expected to be complete by February 2007. Theoretical and practical implications of the study are discussed.
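The saving from the fractional factorial design can be made concrete. The abstract does not give the study's actual design, so as a hedged illustration, here is a standard 2^(5-1) half-fraction for five two-level factors (the function name and the defining relation I = ABCDE are assumptions, not taken from the paper), which halves the number of conditions from 32 to 16:

```python
from itertools import product
from math import prod

def half_fraction(k):
    # 2^(k-1) half-fraction: enumerate the full factorial for the first
    # k-1 two-level factors (coded -1/+1), then set the k-th factor to
    # the product of the others (defining relation I = ABCDE for k = 5).
    runs = []
    for levels in product([-1, 1], repeat=k - 1):
        runs.append(levels + (prod(levels),))
    return runs

# Five factors (e.g. stereoscopy, screen size, field of view,
# realism, detail) need only 16 runs instead of 2**5 = 32.
design = half_fraction(5)
```

Each tuple is one experimental condition; main effects remain estimable because in this construction they are aliased only with high-order interactions.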
Abstract:
We propose the Tetra Pak case as a real-world example to study the implications of multiproduct activity for European Competition Policy. Tetra Pak, a monopolist in aseptic carton packaging of liquid food, competes with Elopak in the nonaseptic sector. The EC Commission used the effect of Tetra Pak's dominance in the aseptic sector on its rival's performance as evidence of the former's anticompetitive behavior. With linear demand and cost functions and interdependent demands, the Commission's position can be supported. However, a more general model suggests that the Commission's conclusions cannot be supported as the unique outcome of the analysis of the information available.
Abstract:
An efficient market incorporates news into prices immediately and fully. Tests for efficiency in financial markets have been undermined by information leakage. We test for efficiency in sports betting markets – real-world markets where news breaks remarkably cleanly. Applying a novel identification to high-frequency data, we investigate the reaction of prices to goals scored on the ‘cusp’ of half-time. This strategy allows us to separate the market's response to major news (a goal), from its reaction to the continual flow of minor game-time news. On our evidence, prices update swiftly and fully.
Abstract:
Many urban surface energy balance models now exist. These vary in complexity from simple schemes that represent the city as a concrete slab, to those which incorporate detailed representations of momentum and energy fluxes distributed within the atmospheric boundary layer. While many of these schemes have been evaluated against observations, with some models even compared with the same data sets, such evaluations have not been undertaken in a controlled manner to enable direct comparison. For other types of climate model, for instance the Project for Intercomparison of Land-Surface Parameterization Schemes (PILPS) experiments (Henderson-Sellers et al., 1993), such controlled comparisons have been shown to provide important insights into both the mechanics of the models and the physics of the real world. This paper describes the progress that has been made to date on a systematic and controlled comparison of urban surface schemes. The models to be considered, and their key attributes, are described, along with the methodology to be used for the evaluation.
Abstract:
This paper examines the lead–lag relationship between the FTSE 100 index and index futures price employing a number of time series models. Using 10-min observations from June 1996–1997, it is found that lagged changes in the futures price can help to predict changes in the spot price. The best forecasting model is of the error correction type, allowing for the theoretical difference between spot and futures prices according to the cost of carry relationship. This predictive ability is in turn utilised to derive a trading strategy which is tested under real-world conditions to search for systematic profitable trading opportunities. It is revealed that although the model forecasts produce significantly higher returns than a passive benchmark, the model was unable to outperform the benchmark after allowing for transaction costs.
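The error-correction term in the forecasting model rests on the cost-of-carry relationship between spot and futures prices. As a minimal sketch of that relationship (continuous compounding, illustrative parameter values; the paper's exact specification is not given in the abstract):

```python
import math

def fair_futures_price(spot, r, t_years, dividend_yield=0.0):
    # Cost-of-carry fair value: F = S * exp((r - q) * T), where r is the
    # risk-free rate, q the dividend yield, and T time to expiry in years.
    return spot * math.exp((r - dividend_yield) * t_years)

def basis_deviation(futures, spot, r, t_years, dividend_yield=0.0):
    # Deviation from fair value: the disequilibrium term an
    # error-correction model would pull back towards zero.
    return futures - fair_futures_price(spot, r, t_years, dividend_yield)
```

One plausible trading rule (not necessarily the paper's) trades when this deviation is large; the abstract reports that after transaction costs, such model-based strategies failed to beat a passive benchmark.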
Abstract:
One of the most challenging tasks in financial management for large governmental and industrial organizations is Planning and Budgeting (P&B). The processes involved in P&B are cost- and time-intensive, especially when dealing with uncertainties and budget adjustments during the planning horizon. This work builds on our previous research, in which we proposed and evaluated a fuzzy approach that allows optimizing the budget interactively beyond the initial planning stage. In this research we propose an extension that handles financial stress (i.e. drastic budget cuts) occurring during the budget period. This is done by introducing fuzzy stress parameters, which are used to redistribute the budget in order to minimize the negative impact of the financial stress. The benefits and possible issues of this approach are analyzed critically using a real-world case study from the Nuremberg Institute of Technology (NIT). Additionally, ongoing and future research directions are presented.
Abstract:
Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploit the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in iterative parallel data mining algorithms. In particular, the analysis focuses on one of the most influential and popular data mining methods, the k-means algorithm for cluster analysis. The straightforward parallel formulation of the k-means algorithm requires a global reduction operation at each iteration step, which hinders its scalability. This work studies a different parallel formulation of the algorithm where the requirement of global communication can be relaxed while still providing the exact solution of the centralised k-means algorithm. The proposed approach exploits a non-uniform data distribution which can be either found in real world distributed applications or can be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error which allows a further reduction of the communication costs.
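The global reduction the abstract refers to can be made concrete. In the straightforward parallel formulation, each worker assigns its local points to the nearest centroid and contributes per-cluster partial sums and counts, which are combined globally (an MPI_Allreduce in practice) to form the new centroids. A minimal sequential simulation of one such iteration (the function and variable names are illustrative, not from the paper):

```python
import numpy as np

def parallel_kmeans_iteration(partitions, centroids):
    # One iteration of the straightforward parallel k-means formulation;
    # each entry of `partitions` plays the role of one worker's local data.
    k, d = centroids.shape
    global_sums = np.zeros((k, d))   # combined by the global reduction
    global_counts = np.zeros(k)
    for block in partitions:
        # Local step on each worker: assign points to the nearest centroid.
        dists = np.linalg.norm(block[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Local partial sums/counts; summing them across all workers is
        # the global reduction step that limits scalability.
        for j in range(k):
            members = block[labels == j]
            global_sums[j] += members.sum(axis=0)
            global_counts[j] += len(members)
    new_centroids = centroids.copy()
    nonempty = global_counts > 0
    new_centroids[nonempty] = global_sums[nonempty] / global_counts[nonempty, None]
    return new_centroids
```

The paper's contribution is to relax exactly this global step, exploiting a non-uniform data distribution across workers, while still recovering the exact solution of the centralised k-means algorithm.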
Abstract:
Understanding how and why one set of business resources, structural arrangements, and mechanisms performs better than another can provide competitive advantage in terms of new business processes and product and service development. However, most business models of capability are descriptive and lack a formal modelling language with which to compare capabilities qualitatively and quantitatively. Gibson's theory of affordance, the potential for action, provides a formal basis for a more robust and quantitative model, but most formal affordance models are complex and abstract and lack support for real-world applications. We aim to understand the 'how' and 'why' of business capability by developing a quantitative and qualitative model that underpins earlier work on Capability-Affordance Modelling (CAM). This paper integrates an affordance-based capability model with the formalism of Coloured Petri Nets to develop a simulation model. Using the model, we show how capability depends on the space-time path of interacting resources, the mechanism of transition, and specific critical affordance factors relating to the values of the variables for resources, people, and physical objects. We show how the model can identify the capability of resources to inject a drug and anaesthetise a patient.
Abstract:
As the fidelity of virtual environments (VE) continues to increase, the possibility of using them as training platforms is becoming increasingly realistic for a variety of application domains, including military and emergency personnel training. In the past, there was much debate on whether the acquisition and subsequent transfer of spatial knowledge from VEs to the real world is possible, or whether the differences in medium during training would essentially be an obstacle to truly learning geometric space. In this paper, the authors present various cognitive and environmental factors that not only contribute to this process, but also interact with each other to a certain degree, leading to a variable exposure time requirement for the process of spatial knowledge acquisition (SKA) to occur. The cognitive factors discussed include a variety of individual user differences, such as knowledge and experience; cognitive gender differences; aptitude and spatial orientation skill; and cognitive styles. Environmental factors discussed include size, spatial layout complexity, and landmark distribution. It may seem obvious that, since every individual's brain is unique (not only through experience, but also through genetic predisposition), a one-size-fits-all approach to training would be illogical. Furthermore, considering that various cognitive differences may emerge only when a certain stimulus is present (e.g. a complex environmental space), it makes even more sense to understand how these factors can impact spatial memory, and to adapt the training session by providing visual/auditory cues as well as by changing the exposure time requirements for each individual. The impact of this research domain is important to VE training in general; however, within service and military domains, guaranteeing appropriate spatial training is critical in order to ensure that disorientation does not occur in a life-or-death scenario.
Abstract:
The extent of the surface area sunlit is critical for radiative energy exchanges and therefore for a wide range of applications that require urban land surface models (ULSMs), ranging from human comfort to weather forecasting. Here a computationally demanding shadow-casting algorithm is used to assess the capability of a simple single-layer urban canopy model, which assumes an infinitely long rotating canyon (ILC), to reproduce sunlit areas on roofs and roads over central London. Results indicate that the sunlit road areas are well represented but somewhat smaller using an ILC, while sunlit roof areas are consistently larger, especially for dense urban areas. The largest deviations from real-world sunlit areas are found for roofs during mornings and evenings. There are also indications that sunlit fractions on walls are overestimated using an ILC during mornings and evenings. The implications of these errors depend on the application targeted. For example, independent of albedo, ULSMs used in numerical weather prediction applying an ILC representation of the urban form will overestimate outgoing shortwave radiation from roofs, owing to the overestimation of the sunlit fraction of the roofs. Complications of deriving height-to-width ratios from real-world data are also discussed.
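The canyon geometry behind the sunlit-area comparison can be illustrated with a minimal sketch. Assuming vertical walls, a flat floor, and a single canyon orientation, the textbook shadow geometry gives the sunlit fraction of the road; this is an illustration of the idealisation only, not the paper's shadow-casting algorithm or any specific ULSM scheme:

```python
import math

def sunlit_road_fraction(h_w_ratio, solar_elev_deg, rel_azimuth_deg):
    # Fraction of an infinitely long canyon floor left sunlit by the
    # shadow of the sunward wall. h_w_ratio is canyon height/width,
    # rel_azimuth_deg the angle between the sun and the canyon axis.
    if solar_elev_deg <= 0.0:
        return 0.0  # sun below the horizon
    # Shadow width on the floor, normalised by canyon width.
    shadow = (h_w_ratio * math.sin(math.radians(rel_azimuth_deg))
              / math.tan(math.radians(solar_elev_deg)))
    return max(0.0, 1.0 - abs(shadow))
```

An ILC scheme in a ULSM typically averages such expressions over canyon orientations; the paper's comparison shows where that idealisation departs from sunlit areas computed from real building geometry.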
Abstract:
Background: Models of the development and maintenance of childhood anxiety suggest an important role for parent cognitions: that is, negative expectations of children's coping abilities lead to parenting behaviors that maintain child anxiety. The primary aims of the current study were to (1) compare expectations of child vulnerability and coping among mothers of children with anxiety disorders on the basis of whether or not mothers also had a current anxiety disorder, and (2) examine the degree to which the association between maternal anxiety disorder status and child coping expectations was mediated by how mothers interpreted ambiguous material that referred to their own experience. Methods: The association between interpretations of threat, negative emotion, and control was assessed using hypothetical ambiguous scenarios in a sample of 271 anxious and nonanxious mothers of 7- to 12-year-old children with an anxiety disorder. Mothers also rated their expectations when presented with real-life challenge tasks. Results: There was a significant association between maternal anxiety disorder status and negative expectations of child coping behaviors. Mothers’ self-referent interpretations were found to mediate this relationship. Responses to ambiguous hypothetical scenarios correlated significantly with responses to real-life challenge tasks. Conclusions: Treatments for childhood anxiety disorders in the context of parental anxiety disorders may benefit from the inclusion of a component to directly address parental cognitions. Some inconsistencies were found when comparing maternal expectations in response to hypothetical scenarios with real-life challenges; this should be addressed in future research.