957 results for Real blow up


Relevance:

30.00%

Publisher:

Abstract:

An anomaly in the market of 21st-century military shooters, Spec Ops: The Line offers a journey through undetermined realities and the consequences of modern warfare. In this study, the narrative is analyzed from the perspective of Jean Baudrillard’s idea that simulations have replaced our conception of reality. Both the protagonist and the player of Spec Ops unavoidably descend into a state of the hyperreal: they experience multiple possible realities within the game narrative and end up unable to comprehend what has transpired. The hyperreal is defined as the state in which it is impossible to discern reality from simulation; the simulation of reality has proliferated to the point of becoming reality itself, and the original has been lost. The excessive use of violence, the direct address of the player through breaks in the fourth wall, and a deceitful narrator all contribute to this loss of reality within the game. Although the game represents simulacra, being a simulation in itself, the object of study is the coexisting state of the hyperreal shared by protagonist and player as they try to comprehend events in the game. In the end, neither party can understand or discern with any certainty what transpired within the game.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents results from the first use of neural networks for the real-time feedback control of high-temperature plasmas in a Tokamak fusion experiment. The Tokamak is currently the principal experimental device for research into the magnetic confinement approach to controlled fusion. In the Tokamak, hydrogen plasmas, at temperatures of up to 100 million K, are confined by strong magnetic fields. Accurate control of the position and shape of the plasma boundary requires real-time feedback control of the magnetic field structure on a time-scale of a few tens of microseconds. Software simulations have demonstrated that a neural network approach can give significantly better performance than the linear technique currently used on most Tokamak experiments. The practical application of the neural network approach requires high-speed hardware, for which a fully parallel implementation of the multi-layer perceptron, using a hybrid of digital and analogue technology, has been developed.
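As a rough illustration of the approach (not the paper's hardware implementation, which ran as a hybrid digital/analogue circuit), the sketch below shows a small multi-layer perceptron mapping magnetic diagnostic signals to plasma shape parameters inside a proportional feedback step. The layer sizes, signal names and gain are assumptions for illustration only.

```python
# Minimal sketch of an MLP mapping magnetic diagnostic signals to plasma
# shape parameters inside a feedback loop. Illustrative only: layer sizes,
# signal names, and the control law are assumptions, not the paper's design.
import numpy as np

rng = np.random.default_rng(0)

class MLP:
    """Two-layer perceptron with tanh hidden units (weights assumed pre-trained)."""
    def __init__(self, n_in, n_hidden, n_out):
        self.W1 = rng.normal(0, 0.1, (n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.1, (n_out, n_hidden))
        self.b2 = np.zeros(n_out)

    def forward(self, x):
        h = np.tanh(self.W1 @ x + self.b1)
        return self.W2 @ h + self.b2  # estimated shape parameters

net = MLP(n_in=32, n_hidden=16, n_out=4)  # e.g. 32 pickup-coil signals -> 4 shape parameters

def feedback_step(diagnostics, target_shape, gain=0.5):
    """One control cycle: estimate shape from diagnostics, return coil-current correction."""
    shape_estimate = net.forward(diagnostics)
    error = target_shape - shape_estimate
    return gain * error  # proportional correction to the shaping-coil currents

correction = feedback_step(rng.normal(size=32), np.zeros(4))
```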

Relevance:

30.00%

Publisher:

Abstract:

This research is concerned with the development of distributed real-time systems, in which software is used for the control of concurrent physical processes. These distributed control systems are required to periodically coordinate the operation of several autonomous physical processes, with the property of an atomic action. The implementation of this coordination must be fault-tolerant if the integrity of the system is to be maintained in the presence of processor or communication failures. Commit protocols have been widely used to provide this type of atomicity and ensure consistency in distributed computer systems. The objective of this research is the development of a class of robust commit protocols, applicable to the coordination of distributed real-time control systems. Extended forms of the standard two-phase commit protocol, providing fault-tolerant and real-time behaviour, were developed.

Petri nets are used for the design of the distributed controllers, and the commit protocol models are embedded within these controller designs. This composition of controller and protocol model allows the complete system to be analysed in a unified manner. A common problem for Petri net based techniques is state space explosion; a modular approach to both design and analysis helps to cope with this problem. Although extensions to Petri nets that allow module construction exist, the modularisation is generally restricted to the specification, and analysis must still be performed on the (flat) detailed net. The Petri net designs for the type of distributed systems considered in this research are both large and complex. The top-down, bottom-up and hybrid synthesis techniques that are used to model large systems in Petri nets are considered, and a hybrid approach to Petri net design for a restricted class of communicating processes is developed. Designs produced using this hybrid approach are modular and allow re-use of verified modules.

In order to use this form of modular analysis, it is necessary to project an equivalent but reduced behaviour onto the modules used. These projections conceal events local to modules that are not essential for the purpose of analysis. To generate the external behaviour, each firing sequence of the subnet is replaced by an atomic transition internal to the module, and the firing of these transitions transforms the input and output markings of the module. Local events are thus concealed through the projection of the external behaviour of modules. This hybrid design approach preserves properties of interest, such as boundedness and liveness, while the systematic concealment of local events keeps the state space manageable. The approach presented in this research is particularly suited to distributed systems, as the underlying communication model is used as the basis for the interconnection of modules in the design procedure. The hybrid approach is applied to the Petri net based design and analysis of distributed controllers for two industrial applications that incorporate the robust, real-time commit protocols developed. Temporal Petri nets, which combine Petri nets and temporal logic, are used to capture and verify causal and temporal aspects of the designs in a unified manner.
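For reference, the sketch below shows the standard two-phase commit round that the extended protocols build on: the coordinator collects votes and commits only if every participant votes yes. It is a minimal illustration; the participant names, and the absence of timeouts and logging, are simplifications rather than features of the thesis's fault-tolerant, real-time variants.

```python
# Minimal sketch of the standard two-phase commit round that the extended,
# real-time protocols build on. Participant behaviour is an illustrative
# assumption, not the thesis's extended protocol.
from enum import Enum

class Vote(Enum):
    YES = "yes"
    NO = "no"

class Participant:
    def __init__(self, name, can_commit):
        self.name = name
        self.can_commit = can_commit
    def prepare(self):
        # Phase 1: vote on whether the local action can be completed atomically.
        return Vote.YES if self.can_commit else Vote.NO
    def commit(self):
        print(f"{self.name}: commit")
    def abort(self):
        print(f"{self.name}: abort")

def two_phase_commit(participants):
    """Coordinator: collect votes, then broadcast the global decision."""
    votes = [p.prepare() for p in participants]        # phase 1
    decision = all(v is Vote.YES for v in votes)
    for p in participants:                             # phase 2
        p.commit() if decision else p.abort()
    return decision

two_phase_commit([Participant("ctrl-A", True), Participant("ctrl-B", True)])
```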

Relevance:

30.00%

Publisher:

Abstract:

This research concerns the development of coordination and co-governance within three different regeneration programmes in one Midlands city over the period from 1999 to 2002. The New Labour government, in office since 1997, had an agenda for ‘joining-up’ government, part of which has had considerable impact in the area of regeneration policy. Joining-up government encompasses a set of related activities which can include the coordination of policy-making and service delivery. In regeneration, it also includes a commitment to operate through co-governance. Central government and local and regional organisations have sought to put this idea into practice by using what may be referred to as network management processes, and many characteristics of new policies are designed to address the management of networks. Network management is not new in this area; it has developed at least since the early 1990s, with the City Challenge and Single Regeneration Budget (SRB) programmes, as a way of encouraging more inclusive and effective regeneration interventions. Network management theory suggests that better management can improve decision-making outcomes in complex networks. These theories and concepts are utilised in three case studies as a way of understanding how and why regeneration attempts demonstrate real advances in inter-organisational working at certain times whilst faltering at others. Current cases are compared to the historical case of the original SRB programme as a method of assessing change. The findings suggest that:

- The use of network management can be identified at all levels of governance. As previous literature has highlighted, central government is the most important actor regarding network structuring; however, it can be argued that network structuring and game management are both practised by central and local actors.
- All three theoretical perspectives within network management (instrumental, institutional and interactive) have been identified within UK regeneration networks. All may have a role to play, with no single perspective likely to succeed on its own; each could therefore make an important contribution to the understanding of how groups can be brought together to work jointly.
- The findings support Klijn’s (1997) assertion that the institutional perspective is dominant for understanding network management processes.
- Instrumentalism continues on all sides, as the acquisition of resources remains the major driver for partnership activity.
- The level of interaction appears to be low despite the intentions for interactive decision-making.
- Overall, network management remains partial. Little attention is paid to issues of accountability or to the institutional structures which can prevent networks from implementing the policies designed by central government and/or the regional tier.

Relevance:

30.00%

Publisher:

Abstract:

Deep hole drilling is one of the most complicated metal cutting processes and one of the most difficult to perform on CNC machine tools or machining centres under conditions of limited manpower or unmanned operation. This research investigates aspects of the deep hole drilling process with small-diameter twist drills and presents a prototype system for real-time process monitoring and adaptive control. Two main research objectives are fulfilled.

The first objective is the experimental investigation of the mechanics of the deep hole drilling process, using twist drills without internal coolant supply, in the range of diameters Ø2.4 to Ø4.5 mm and working lengths of up to 40 diameters. This includes the definition of the problems associated with the low strength of these tools and the study of the mechanisms of catastrophic failure, which manifest themselves well before, and along with, the classic mechanism of tool wear. The relationships of drilling thrust and torque with the depth of penetration and the various machining conditions are also investigated, and the experimental evidence suggests that the process is inherently unstable at depths beyond a few diameters.

The second objective is the design and implementation of a system for intelligent CNC deep hole drilling, the main task of which is to ensure the integrity of the process and the safety of the tool and the workpiece. This task is achieved by interfacing the CNC system of the machine tool to an external computer which performs the following functions: on-line monitoring of the drilling thrust and torque; adaptive control of feed rate, spindle speed and tool penetration (Z-axis); indirect monitoring of tool wear by pattern recognition of variations of the drilling thrust with cumulative cutting time and drilled depth; operation as a database for tools and workpieces; and, finally, the issuing of alarms and diagnostic messages.
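A minimal sketch of the kind of adaptive control rule such a system might apply is shown below, with thrust and torque limits chosen purely for illustration; the prototype's actual monitoring and pattern-recognition logic is not reproduced here.

```python
# Minimal sketch of threshold-based adaptive control for deep hole drilling:
# retract and reduce feed when thrust or torque exceeds a safety limit.
# Signal names, limits, and the control policy are illustrative assumptions.
def adapt_feed(thrust_n, torque_ncm, feed_mm_rev,
               thrust_limit=350.0, torque_limit=60.0,
               min_feed=0.01, step=0.9):
    """Return (new_feed, retract) for one monitoring cycle."""
    if thrust_n > thrust_limit or torque_ncm > torque_limit:
        # Imminent overload: retract to clear chips, then resume at lower feed.
        return max(min_feed, feed_mm_rev * step), True
    if thrust_n < 0.6 * thrust_limit and torque_ncm < 0.6 * torque_limit:
        # Comfortable margin: cautiously restore feed toward a programmed cap.
        return min(0.05, feed_mm_rev / step), False
    return feed_mm_rev, False

feed, retract = adapt_feed(thrust_n=380.0, torque_ncm=45.0, feed_mm_rev=0.04)
```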

Relevance:

30.00%

Publisher:

Abstract:

Wireless sensor networks have been identified as one of the key technologies for the 21st century. In order to overcome their limitations in areas such as fault tolerance and energy conservation, we propose a middleware solution, In-Motes. In-Motes is a fault-tolerant platform for deploying and monitoring applications in real time. It offers the end user a number of possibilities, including the freedom to experiment with various parameters, so that the deployed applications run in an energy-efficient manner inside the network. The proposed scheme is evaluated through In-Motes EYE, an agent-based real-time In-Motes application developed for sensing acceleration variations in an environment, in order to test its merits under real-time conditions. The application was tested in a prototype road-like area for a period of four months.
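To illustrate the kind of task an In-Motes EYE agent performs, the sketch below flags acceleration-variation events from a windowed sensor stream. The mote API (`read_accel`, `send`), the window size and the threshold are hypothetical; the real middleware injects agents into MICA2-class motes rather than running Python.

```python
# Minimal sketch of an acceleration-variation sensing task of the kind an
# In-Motes agent might run. The hardware interface and thresholds are
# hypothetical illustrations, not the In-Motes EYE implementation.
from collections import deque
import random

WINDOW, THRESHOLD = 16, 0.25   # samples per window, allowed spread in g

def sense_loop(read_accel, send, cycles):
    window = deque(maxlen=WINDOW)
    for _ in range(cycles):
        window.append(read_accel())                  # one accelerometer sample
        if len(window) == WINDOW:
            spread = max(window) - min(window)
            if spread > THRESHOLD:                   # acceleration-variation event
                send({"event": "accel_variation", "spread": round(spread, 3)})

# Example with stubbed hardware: a noisy 1 g signal, reports printed to stdout.
sense_loop(lambda: random.gauss(1.0, 0.1), print, cycles=64)
```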

Relevance:

30.00%

Publisher:

Abstract:

The wider scientific community now accepts the threat of climate change as real and thus acknowledges the importance of implementing adaptation measures in a global context. In the UK, the physical effects of climate change are likely to be directly felt in the form of extreme weather events, which are predicted to escalate in number and severity under changing climatic conditions. The construction industry, which consists of supply chains running across various other industries, economies and regions, will also be affected by these events. It is therefore important that construction organisations are well prepared to withstand extreme weather events that not only affect their organisations directly but also affect their supply chains, which in turn may affect the organisation concerned. Given that more than 99% of construction sector businesses are SMEs, the area can benefit significantly from policy making aimed at improving SME resilience and coping capacity. This paper presents the literature review and synthesis of a doctoral research study undertaken to address the issue of extreme weather resilience of construction sector SMEs and their supply chains. The main contribution of the paper, to both academia and practitioners, is a synthesis model that conceptualises the factors that enhance the resilience of SMEs and their supply chains against extreme weather events. This synthesis model forms the basis of a decision-making framework that will enable SMEs both to reduce their vulnerability and to enhance their coping capacity against extreme weather. The value of the paper is further extended by the overall research design that is set forth as the way forward.

Relevance:

30.00%

Publisher:

Abstract:

Scale-up from shake flasks to bioreactors allows for the more reproducible, high-yielding production of recombinant proteins in yeast. The ability to control growth conditions through real-time monitoring facilitates further optimization of the process. The setup of a 3-L stirred-tank bioreactor for such an application is described. © 2012 Springer Science+Business Media, LLC.
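As a minimal illustration of real-time control of a growth condition (not the published 3-L setup, whose sensors, setpoints and gains are not specified here), the sketch below holds dissolved oxygen near a setpoint by adjusting stirrer speed:

```python
# Minimal sketch of a real-time monitoring loop for a stirred-tank bioreactor:
# hold dissolved oxygen (DO) near a setpoint by adjusting stirrer speed.
# Sensor/actuator functions, setpoints, and gains are illustrative assumptions.
def do_control_step(read_do, set_rpm, rpm, setpoint=30.0, gain=5.0,
                    rpm_min=200, rpm_max=1200):
    """One cycle: proportional correction of stirrer speed from the DO error."""
    error = setpoint - read_do()              # % DO saturation below target
    rpm = min(rpm_max, max(rpm_min, rpm + gain * error))
    set_rpm(rpm)                              # actuate the stirrer
    return rpm

rpm = 600
for _ in range(3):                            # a few illustrative cycles
    rpm = do_control_step(lambda: 27.0, lambda r: None, rpm)
```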

Relevance:

30.00%

Publisher:

Abstract:

In-Motes Bins is an agent-based real-time In-Motes application developed for sensing light and temperature variations in an environment. In-Motes is a mobile agent middleware that facilitates the rapid deployment of adaptive applications in Wireless Sensor Networks (WSNs). In-Motes Bins is based on the injection into the WSN of mobile agents that can migrate or clone following specific rules, performing application-specific tasks. Using In-Motes we were able to create and rapidly deploy our application on a WSN consisting of 10 MICA2 motes. The application was tested in a wine store for a period of four months. In this paper we present the In-Motes Bins application and provide a detailed evaluation of its implementation. © 2007 IEEE.
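A minimal sketch of migrate-or-clone rules of the kind such agents might follow; the thresholds and node model are hypothetical illustrations of the idea, not In-Motes' actual rule engine:

```python
# Minimal sketch of the migrate-or-clone decision rules a mobile agent might
# follow inside the WSN. Thresholds and the node model are hypothetical.
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    battery: float          # fraction of energy remaining
    has_sensor_task: bool   # does this node still need an agent?
    neighbours: list

def agent_step(host: Node):
    """Decide the agent's next action on its current host."""
    if host.battery < 0.2:
        # Host is nearly depleted: migrate to the best-powered neighbour.
        target = max(host.neighbours, key=lambda n: n.battery)
        return ("migrate", target.node_id)
    uncovered = [n for n in host.neighbours if n.has_sensor_task]
    if uncovered:
        # Neighbouring nodes still need sensing: clone onto one of them.
        return ("clone", uncovered[0].node_id)
    return ("stay", host.node_id)

n2 = Node(2, 0.9, True, [])
n1 = Node(1, 0.8, False, [n2])
print(agent_step(n1))   # -> ('clone', 2)
```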

Relevance:

30.00%

Publisher:

Abstract:

The aims of this thesis were to investigate the neuropsychological, neurophysiological, and cognitive contributors to mobility changes with increasing age. In a series of studies with adults aged 45-88 years, unsafe pedestrian behaviour and falls were investigated in relation to i) cognitive functions (including response time variability, executive function, and visual attention tests), ii) mobility assessments (including gait and balance, using motion capture cameras), iii) motor initiation and pedestrian road crossing behaviour (using a simulated pedestrian road scene), iv) neuronal and functional brain changes (using a computer-based crossing task with magnetoencephalography), and v) quality of life questionnaires (including fear of falling and restricted range of travel).

Older adults are more likely to be fatally injured at the far-side of the road than at the near-side; however, the underlying mobility and cognitive processes related to lane-specific (i.e. near-side or far-side) pedestrian crossing errors in older adults are currently unknown. The first study (Chapter 2) explored cognitive, motor initiation, and mobility predictors of unsafe pedestrian crossing behaviours. Its purpose was to determine whether collisions at the near-side and far-side would be differentially predicted by mobility indices (such as walking speed and postural sway), motor initiation, and cognitive function (including spatial planning, visual attention, and within-participant variability) with increasing age. The results suggest that near-side unsafe pedestrian crossing errors are related to processing speed, whereas far-side errors are related to spatial planning difficulties. Both near-side and far-side crossing errors were related to walking speed and motor initiation measures (specifically motor initiation variability).

The salient mobility predictors of unsafe pedestrian crossings determined in the above study were examined in Chapter 3 in conjunction with the presence of a history of falls. The purpose of this study was to determine the extent to which walking speed (indicated as a salient predictor of unsafe crossings and start-up delay in Chapter 2) and previous falls can be predicted and explained by age-related changes in mobility and cognitive function (specifically within-participant variability and spatial ability). 53.2% of walking speed variance was found to be predicted by self-rated mobility score, sit-to-stand time, motor initiation, and within-participant variability. Although a significant model was not found to predict fall history variance, postural sway and attentional set-shifting ability were found to be strongly related to the occurrence of falls within the last year.

Next, in Chapter 4, unsafe pedestrian crossing behaviour and the pedestrian predictors (both mobility and cognitive measures) from Chapter 2 were explored in terms of increasing hemispheric laterality of attentional functions and inter-hemispheric oscillatory beta power changes associated with increasing age. Elevated beta (15-35 Hz) power in the motor cortex prior to movement, and reduced beta power post-movement, have been linked to age-related changes in mobility. In addition, increasing recruitment of both hemispheres has been shown to occur in older adults and to help them perform similarly to younger adults in cognitive tasks (Cabeza, Anderson, Locantore, & McIntosh, 2002). It has been hypothesised that changes in hemispheric neural beta power may explain the presence of more pedestrian errors at the far-side of the road in older adults. The purpose of this study was to determine whether changes in age-related cortical oscillatory beta power and hemispheric laterality are linked to unsafe pedestrian behaviour in older adults. Results indicated that pedestrian errors at the near-side are linked to hemispheric bilateralisation and neural overcompensation post-movement, whereas far-side unsafe errors are linked to a failure to employ neural compensation methods (hemispheric bilateralisation).

Finally, in Chapter 5, fear of falling, life space mobility, and quality of life in old age were examined to determine their relationships with cognition, mobility (including fall history and pedestrian behaviour), and motor initiation. In addition to death and injury, mobility decline (such as the pedestrian errors in Chapter 2 and the falls in Chapter 3) and cognition can negatively affect quality of life and result in activity avoidance. Further, the number of falls in Chapter 3 was not significantly linked to mobility and cognition alone, and may be further explained by a fear of falling. The objective of this study (Study 2, Chapter 3) was to determine the role of mobility and cognition in fear of falling and life space mobility, and their impact on quality of life measures. Results indicated that missing safe pedestrian crossing gaps (potentially indicating crossing anxiety) and mobility decline were consistent predictors of fear of falling, reduced life space mobility, and quality of life variance. Social community (the total number of close family and friends) was also linked to life space mobility and quality of life. Lower cognitive function (particularly processing speed and reaction time) was found to predict variance in fear of falling and quality of life in old age.

Overall, the findings indicated that mobility decline (particularly walking speed or walking difficulty), processing speed, and intra-individual variability in attention (including motor initiation variability) are salient predictors of participant safety (mainly pedestrian crossing errors) and wellbeing with increasing age. More research is required to produce a significant model to explain the number of falls.
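For readers unfamiliar with the beta-power measure used in Chapter 4, the sketch below computes a beta-band (15-35 Hz) power envelope from a single MEG-like channel by band-pass filtering and taking the analytic amplitude. The sampling rate, filter order and synthetic signal are illustrative assumptions, not the thesis's analysis pipeline.

```python
# Minimal sketch of computing movement-related beta-band (15-35 Hz) power
# from an MEG/EEG-like signal. Sampling rate, filter order, and the synthetic
# test signal are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 600.0                                   # sampling rate in Hz (assumed)

def beta_power_envelope(signal, fs=FS, lo=15.0, hi=35.0, order=4):
    """Band-pass to the beta band, then take the analytic amplitude envelope."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    beta = filtfilt(b, a, signal)
    return np.abs(hilbert(beta)) ** 2        # instantaneous beta power

t = np.arange(0, 2.0, 1 / FS)
sig = np.sin(2 * np.pi * 22 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)
power = beta_power_envelope(sig)
pre, post = power[: t.size // 2].mean(), power[t.size // 2 :].mean()
print(f"mean beta power pre: {pre:.3f}, post: {post:.3f}")
```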

Relevance:

30.00%

Publisher:

Abstract:

We use data on exchange rates and consumer price indices, and the weighting matrix derived by Bayoumi, Lee and Jayanthi (2006), to calculate consumer price index-based real effective exchange rates (REER). The main novelties of our database are that (1) it includes data for 178 countries (many more than in any other publicly available database), plus an external REER for the euro area, using a consistent methodology; (2) it includes up-to-date REER values, such as data for January 2012; and (3) it is relatively easy to calculate the REER against any arbitrary group of countries. The annual database is complete for 172 countries and the euro area for 1992-2011, and data are available for six other countries for a shorter period. For several countries annual data are available for earlier years as well; for example, data are available for 67 countries from 1960. The monthly database is complete for 138 countries for January 1995-January 2012, and data are also available for 15 other countries for a shorter period. The indicators we calculate are freely downloadable and will be updated irregularly.
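A minimal sketch of the CPI-based REER calculation described above: a geometrically weighted average of bilateral real exchange rates against trading partners. The variable names and toy numbers are illustrative; the database's exact conventions for the numeraire and weight normalisation follow the cited methodology.

```python
# Minimal sketch of a CPI-based REER: a geometrically weighted average of
# bilateral real exchange rates. Names and toy values are illustrative only.
import numpy as np

def reer(own_fx, own_cpi, partner_fx, partner_cpi, weights):
    """REER_i = prod_j [ (own_fx*own_cpi) / (partner_fx_j*partner_cpi_j) ]^w_ij,
    with fx expressed as numeraire units per local currency and weights summing
    to one. A rise in the index means real appreciation."""
    weights = np.asarray(weights) / np.sum(weights)
    rel = (own_fx * own_cpi) / (np.asarray(partner_fx) * np.asarray(partner_cpi))
    return float(np.prod(rel ** weights))

# Toy example: one country against two partners with 60/40 trade weights.
print(reer(own_fx=1.0, own_cpi=1.10,
           partner_fx=[1.0, 0.9], partner_cpi=[1.05, 1.20],
           weights=[0.6, 0.4]))
```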

Relevance:

30.00%

Publisher:

Abstract:

This is a dissertation about urban systems; within this broad subject I tackle three issues, one that focuses on an observed inter-city relationship and two that focus on intra-city phenomena.

In Chapter II I adapt a model of the random emergence of economic opportunities from the firm growth literature to the urban dynamics setting and present several predictions for urban system dynamics. One of these predictions is that the older the city, the larger and more diversified it will be on average, which I proceed to verify empirically using two distinct datasets.

In Chapter III I analyze the residential real estate bubble that took place in Miami-Dade County from 1999 to 2006. I adopt a spatial-economic model developed for the Paris bubble episode of 1984–1993 and formulate an innovative test of the results in terms of speculative intensity, on the basis of proxies of investor activity available in my dataset. My results support the idea that the best or more expensive areas are also where the greatest speculative activity takes place and where the rapid increase in prices begins. The most significant departure from previous studies that emerges in my results is the absence of a wider gap between high-priced and low-priced areas in the peak year. I develop a measure of dispersion in value among areas and contrast the Miami-Dade and Paris episodes.

In Chapter IV I analyze the impact on tax equity of a Florida tax-limiting legislation known as Save Our Homes (SOH). I first compare homesteaded and non-homesteaded properties, and second, look within the subset of homesteaded properties. I find that non-homesteaded properties increase their share of taxes paid relative to homesteaded properties during an up market, but that this is reversed during a down market. For the subset of homesteaded properties I find that the impact of SOH on tax equity depends on differential growth rates between higher- and lower-valued homes; during times of rapid home price appreciation, in a scenario of no differential growth rates in property values, SOH increases progressivity relative to the prior system.

Relevance:

30.00%

Publisher:

Abstract:

The cortisol awakening response (CAR) is typically measured in the domestic setting. Moderate sample timing inaccuracy has been shown to result in erroneous CAR estimates, and such inaccuracy partially explains inconsistency in the CAR literature. The need for more reliable measurement of the CAR has recently been highlighted in expert consensus guidelines, which point out that fewer than 6% of published studies provided electronic monitoring of saliva sampling time in the post-awakening period. Analyses of a merged dataset of published studies from our laboratory are presented. To qualify for selection, both the time of awakening and the collection of the first sample must have been verified by electronic monitoring, and sampling must have commenced within 15 min of awakening. Participants (n = 128) were young (median age of 20 years) and healthy. Cortisol values were determined in the 45 min post-awakening period on 215 sampling days. On 127 days, the delay between verified awakening and collection of the first sample was less than 3 min (‘no delay’ group); on 45 days there was a delay of 4–6 min (‘short delay’ group); on 43 days the delay was 7–15 min (‘moderate delay’ group). Cortisol values for verified sampling times mapped accurately onto the typical post-awakening cortisol growth curve, regardless of whether sampling deviated from the desired protocol timings. This provides support for incorporating, rather than excluding, delayed data (up to 15 min) in CAR analyses. For this population the fitted cortisol growth curve equation predicted a mean cortisol awakening level of 6 nmol/l (±1 for the 95% CI) and a mean CAR rise of 6 nmol/l (±2 for the 95% CI). We also modelled the relationship between real delay and CAR magnitude when the CAR is calculated erroneously by incorrectly assuming adherence to protocol time. The findings supported a curvilinear hypothesis for the effects of sample delay on the CAR: short delays of 4–6 min between awakening and commencement of saliva sampling resulted in an overestimated CAR, whereas moderate delays of 7–15 min were associated with an underestimated CAR. The findings emphasize the need to employ electronic monitoring of sampling accuracy when measuring the CAR in the domestic setting.
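A minimal sketch of fitting a post-awakening cortisol growth curve and reading off the awakening level and CAR rise; the quadratic form and the toy values are illustrative assumptions, not the study's fitted model or data:

```python
# Minimal sketch of fitting a post-awakening cortisol growth curve and
# deriving the awakening level and CAR rise. Toy data, illustrative model.
import numpy as np

minutes = np.array([0, 15, 30, 45])               # sampling times post-awakening
cortisol = np.array([6.1, 9.8, 12.3, 11.0])       # nmol/l, illustrative values

coeffs = np.polyfit(minutes, cortisol, deg=2)     # fit c(t) = a*t^2 + b*t + c
curve = np.poly1d(coeffs)

awakening_level = curve(0)                        # predicted level at awakening
peak_t = minutes[np.argmax(curve(minutes))]
car_rise = curve(peak_t) - awakening_level        # CAR magnitude above baseline
print(f"awakening: {awakening_level:.1f} nmol/l, rise: {car_rise:.1f} nmol/l")
```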

Relevance:

30.00%

Publisher:

Abstract:

An array of Bio-Argo floats equipped with radiometric sensors has recently been deployed in various open ocean areas representative of the diversity of trophic and bio-optical conditions prevailing in so-called Case 1 waters. Around solar noon and almost every day, each float acquires 0-250 m vertical profiles of Photosynthetically Available Radiation and downward irradiance at three wavelengths (380, 412 and 490 nm). To date, more than 6500 profiles have been acquired for each radiometric channel. As these radiometric data are collected without operator control and regardless of meteorological conditions, specific and automatic data processing protocols have to be developed. Here, we present a data quality-control procedure aimed at verifying profile shapes and providing near real-time data distribution. This procedure is specifically designed to: 1) identify the main measurement issues (i.e. dark signal, atmospheric clouds, spikes and wave-focusing occurrences); and 2) validate the final data with a hierarchy of tests to ensure they are suitable for scientific use. The procedure, adapted to each of the four radiometric channels, flags each profile in a way compliant with the data management procedure used by the Argo program. The main perturbations in the light field are identified by the new protocols with good performance over the whole dataset, which highlights their potential applicability at the global scale. Finally, a comparison with modelled surface irradiances allows the accuracy of the quality-controlled irradiance values to be assessed, and any evolution over the float lifetime due to biofouling or instrumental drift to be identified.
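A minimal sketch of one test of the kind such a procedure might include: flagging spikes in a radiometric profile by comparing each sample with a running median. The window size and threshold are illustrative assumptions, not the published criteria:

```python
# Minimal sketch of a spike test for a radiometric profile: flag samples whose
# deviation from a running median exceeds a multiple of the median absolute
# deviation (MAD). Window size and threshold are illustrative assumptions.
import numpy as np

def spike_flags(profile, window=5, n_mad=4.0):
    """Return a boolean mask, True where a spike is suspected."""
    profile = np.asarray(profile, dtype=float)
    half = window // 2
    padded = np.pad(profile, half, mode="edge")
    med = np.array([np.median(padded[i:i + window]) for i in range(profile.size)])
    resid = np.abs(profile - med)
    mad = np.median(resid) + 1e-12            # guard against an all-zero MAD
    return resid > n_mad * mad

irr = np.exp(-0.05 * np.arange(0, 100))      # smooth decaying irradiance profile
irr[40] *= 5.0                               # inject one spike
print(np.where(spike_flags(irr))[0])         # -> [40]
```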
