895 results for System failures (Engineering) -- Location
Abstract:
This paper discusses an ongoing project that aims at improving the potential for resilience of a system responsible for the planning of rail engineering work delivery. It focuses on the use of a human-factors-based approach as a way to achieve this end. In particular, the paper discusses the initial data collected by means of interviews and how this process gave rise to a twofold goal: understanding how the planning process works in reality and identifying any critical aspects of the system from a Resilience Engineering perspective. Given the nature of the process under study, information flows and communication issues have been given particular attention throughout the data collection and analysis stages. Initial data confirms that the planning process is greatly reliant on the capability of people using their knowledge and skills to communicate in a dynamic informational environment. Finally, the added value of the interviews is discussed from a human factors perspective and as a means towards the aim of better understanding resilience in rail engineering planning.
Abstract:
This paper discusses an ongoing project that aims at improving the potential for resilience of a system responsible for the planning of rail engineering work delivery. This is being addressed by means of a methodology based on the observation and analysis of “real” planning activities, using resilience engineering concepts as a background. Interviews with planners have been carried out to provide an overview of the planning process and steer more in-depth investigation. Analysis of historic information and observation of planners’ main activities are underway. Given the nature of the process under study, information flows and communication issues have been given particular attention throughout the data collection and analysis stages. Initial data confirms that the planning process is greatly reliant on the capability of people using their knowledge and skills to communicate in a dynamic informational environment. Evidence was found of communication breakdowns at the boundaries of different planning levels and teams. The fact that the process is divided amongst several different areas of the organisation, often with different goals and needs, creates potential sources of conflict and tension.
Abstract:
Much of the published human factors work on risk is to do with safety, and within this is concerned with the prediction and analysis of human error and with human reliability assessment. Less has been published on human factors contributions to understanding and managing project, business, engineering and other forms of risk, and still less on jointly assessing risk to do with broad issues of ‘safety’ and broad issues of ‘production’ or ‘performance’. This paper contains a general commentary on human factors and assessment of risk of various kinds, in the context of the aims of ergonomics and concerns about being too risk averse. The paper then describes a specific project, in rail engineering, where the notion of a human factors case has been employed to analyse engineering functions and related human factors issues. A human factors issues register for potential system disturbances has been developed, prior to a human factors risk assessment, which jointly covers safety and production (engineering delivery) concerns. The paper concludes with a commentary on the potential relevance of a resilience engineering perspective to understanding rail engineering systems risk. Design, planning and management of complex systems will increasingly have to address the issue of making trade-offs between safety and production, and ergonomics should be central to this. The paper addresses the relevant issues and does so in an under-published domain – rail systems engineering work.
Abstract:
A recent area for investigation into the development of adaptable robot control is the use of living neuronal networks to control a mobile robot. The so-called Animat paradigm comprises a neuronal network (the ‘brain’) connected to an external embodiment (in this case a mobile robot), facilitating potentially robust, adaptable robot control and increased understanding of neural processes. Sensory input from the robot is provided to the neuronal network via stimulation on a number of electrodes embedded in a specialist Petri dish, a Multi-Electrode Array (MEA); accurate control of this stimulation is vital. We present software tools allowing precise, near real-time control of electrical stimulation on MEAs, with fast switching between electrodes and the application of custom stimulus waveforms. These Linux-based tools are compatible with the widely used MEABench data acquisition system. Benefits include rapid stimulus modulation in response to neuronal activity (closed loop) and batch processing of stimulation protocols.
Abstract:
In the U.K., dental students are required to train and practise on real human tissues at a very early stage of their courses. Currently, human tissues such as decayed teeth are mounted in a physical model of a human head. The problems with these models in teaching are: (1) every student operates on teeth that are always unique; (2) the process cannot be recorded for examination purposes; and (3) the same training is not repeatable. The aim of the PHATOM Project is to develop a dental training system using haptic technology. This paper documents the project background, specification, and the research and development of the first prototype system. It also discusses the research into visual display, haptic devices and haptic rendering, including stereo vision, motion parallax, volumetric modelling and surface remapping algorithms, as well as the analysis and design of the system. A new volumetric-to-surface model transformation algorithm is also introduced. The paper closes with future work on the system's development and research.
Abstract:
Planning a project with proper consideration of all necessary factors, and managing it to ensure successful implementation, faces many challenges. The initial stage of planning a project for bidding is costly and time-consuming, and usually yields poor accuracy in cost and effort predictions. On the other hand, detailed information about previous projects may be buried in piles of archived documents, making it increasingly difficult to learn from previous experience. Project portfolio management has been brought into this field with the aim of improving information sharing and management among different projects. However, the amount of information that can be shared is still limited to generic information. In this paper, we report on COBRA, a recently developed software system that automatically generates a project plan with effort estimates of time and cost based on data collected from previously completed projects. To maximise data sharing and management among different projects, we propose a method using product-based planning from the PRINCE2 methodology. (COBRA: Automated Project Information Sharing and Management System) Keywords: project management, product based planning, best practice, PRINCE2
Abstract:
This paper presents the development of an indoor localization system using camera vision. The system determines the 2D coordinates (x, y) of a team of Miabot mobile robots. Experimental results show that the system outperforms our existing sonar localizer in both accuracy and precision.
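The abstract does not describe the paper's pipeline in detail, but a typical camera-vision localizer of this kind finds each robot's marker in the image and maps its pixel centroid to floor-plane coordinates through a calibrated homography. The sketch below illustrates that mapping step only; the homography values, marker pixels and function names are illustrative assumptions, not taken from the paper.

```python
# Hypothetical calibration: homography H maps image pixels (u, v) to
# floor-plane coordinates (x, y) in metres. In practice H would be
# estimated from known calibration points on the arena floor.
H = [[0.01, 0.0, -1.6],
     [0.0, 0.01, -1.2],
     [0.0, 0.0, 1.0]]

def pixel_to_world(u, v, H):
    """Project pixel (u, v) through homography H to world (x, y)."""
    x, y, w = (row[0] * u + row[1] * v + row[2] for row in H)
    return x / w, y / w

def centroid(points):
    """Centroid (u, v) of the pixels detected as one robot's marker."""
    us, vs = zip(*points)
    return sum(us) / len(us), sum(vs) / len(vs)

# Toy detection: marker pixels clustered around image point (160, 120).
marker = [(159, 119), (160, 120), (161, 121)]
u, v = centroid(marker)
x, y = pixel_to_world(u, v, H)  # -> (0.0, 0.0) with this example H
```

With a planar floor and a fixed overhead camera, a single homography is sufficient; a tilted or moving camera would require full extrinsic calibration instead.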
Abstract:
Although climate models have been improving in accuracy and efficiency over the past few decades, it now seems that these incremental improvements may be slowing. As tera/petascale computing becomes massively parallel, our legacy codes are less suitable, and even with the increased resolution that we are now beginning to use, these models cannot represent the multiscale nature of the climate system. This paper argues that it may be time to reconsider the use of adaptive mesh refinement for weather and climate forecasting in order to achieve good scaling and representation of the wide range of spatial scales in the atmosphere and ocean. Furthermore, the challenge of introducing living organisms and human responses into climate system models is only just beginning to be tackled. We do not yet have a clear framework in which to approach the problem, but it is likely to cover such a huge number of different scales and processes that radically different methods may have to be considered. The challenges of multiscale modelling and petascale computing provide an opportunity to consider a fresh approach to numerical modelling of the climate (or Earth) system, which takes advantage of the computational fluid dynamics developments in other fields and brings new perspectives on how to incorporate Earth system processes. This paper reviews some of the current issues in climate (and, by implication, Earth) system modelling, and asks the question whether a new generation of models is needed to tackle these problems.
Abstract:
During deglaciation of the North American Laurentide Ice Sheet large proglacial lakes developed in positions where proglacial drainage was impeded by the ice margin. For some of these lakes, it is known that subsequent drainage had an abrupt and widespread impact on North Atlantic Ocean circulation and climate, but less is known about the impact that the lakes exerted on ice sheet dynamics. This paper reports palaeogeographic reconstructions of the evolution of proglacial lakes during deglaciation across the northwestern Canadian Shield, covering an area in excess of 1,000,000 km² as the ice sheet retreated some 600 km. The interactions between proglacial lakes and ice sheet flow are explored, with a particular emphasis on whether the disposition of lakes may have influenced the location of the Dubawnt Lake ice stream. This ice stream falls outside the existing paradigm for ice streams in the Laurentide Ice Sheet because it did not operate over fine-grained till or lie in a topographic trough. Ice margin positions and a digital elevation model are utilised to predict the geometry and depth of proglacial lakes impounded at the margin at 30-km increments during deglaciation. Palaeogeographic reconstructions match well with previous independent estimates of lake coverage inferred from field evidence, and results suggest that the development of a deep lake in the Thelon drainage basin may have been influential in initiating the ice stream by inducing calving, drawing down ice and triggering fast ice flow. This is the only location alongside this sector of the ice sheet where large (>3000 km²), deep (~120 m) lakes are impounded for a significant length of time, and it exactly matches the location of the ice stream.
It is speculated that the commencement of calving at the ice sheet margin may have taken the system beyond a threshold and was sufficient to trigger rapid motion, but that once initiated, calving processes and losses were insignificant to the functioning of the ice stream. It is thus concluded that proglacial lakes are likely to have been an important control on ice sheet dynamics during deglaciation of the Laurentide Ice Sheet. (C) 2004 Elsevier B.V. All rights reserved.
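The reconstruction step described above (predicting lake geometry and depth from ice-margin positions and a digital elevation model) can be reduced, in its simplest form, to ponding water against the margin up to the lowest spill point of the surrounding bed. The 1-D sketch below is only a toy illustration of that idea; the elevations and the spill-point rule are invented here, not the paper's actual GIS method.

```python
# 1-D toy: bed elevations along a profile from the ice margin (left)
# to open drainage (right). Water impounded at the margin fills the
# profile up to the lowest spill elevation; depth = surface - bed.
# All numbers are illustrative.

def lake_profile(bed, spill_elevation):
    """Depth of impounded water (m) in each cell of a bed profile."""
    return [max(0.0, spill_elevation - z) for z in bed]

bed = [80.0, 40.0, 55.0, 30.0, 90.0]   # metres a.s.l.
spill = min(bed[0], bed[-1])           # water escapes over the lower end
depths = lake_profile(bed, spill)      # per-cell lake depth
max_depth = max(depths)                # deepest point of the lake
```

In 2-D the same logic becomes a flood-fill over the DEM from the ice margin, repeated at each reconstructed margin position (every 30 km of retreat in the paper).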
Abstract:
The ultimate criterion of success for interactive expert systems is that they will be used, and used to effect, by individuals other than the system developers. A key ingredient of success in most systems is involving users in the specification and development of systems as they are being built. However, until recently, system designers have paid little attention to ascertaining user needs and to developing systems with corresponding functionality and appropriate interfaces to match those requirements. Although the situation is beginning to change, many developers do not know how to go about involving users, or else tackle the problem in an inadequate way. This paper discusses the need for user involvement and considers why many developers are still not involving users in an optimal way. It looks at the different ways in which users can be involved in the development process and describes how to select appropriate techniques and methods for studying users. Finally, it discusses some of the problems inherent in involving users in expert system development, and recommends an approach which incorporates both ethnographic analysis and formal user testing.
Abstract:
The performance of boreal winter forecasts made with the European Centre for Medium-Range Weather Forecasts (ECMWF) System 11 Seasonal Forecasting System is investigated through analyses of ensemble hindcasts for the period 1987-2001. The predictability, or signal-to-noise ratio, associated with the forecasts, and the forecast skill are examined. On average, forecasts of 500 hPa geopotential height (GPH) have skill in most of the Tropics and in a few regions of the extratropics. There is broad, but not perfect, agreement between regions of high predictability and regions of high skill. However, model errors are also identified, in particular regions where the forecast ensemble spread appears too small. For individual winters the information provided by t-values, a simple measure of the forecast signal-to-noise ratio, is investigated. For 2 m surface air temperature (T2m), highest t-values are found in the Tropics but there is considerable interannual variability, and in the tropical Atlantic and Indian basins this variability is not directly tied to the El Niño-Southern Oscillation. For GPH there is also large interannual variability in t-values, but these variations cannot easily be predicted from the strength of the tropical sea-surface-temperature anomalies. It is argued that the t-values for 500 hPa GPH can give valuable insight into the oceanic forcing of the atmosphere that generates predictable signals in the model. Consequently, t-values may be a useful tool for understanding, at a mechanistic level, forecast successes and failures. Lastly, the extent to which t-values are useful as a predictor of forecast skill is investigated. For T2m, t-values provide a useful predictor of forecast skill in both the Tropics and extratropics. Except in the equatorial east Pacific, most of the information in t-values is associated with interannual variability of the ensemble-mean forecast rather than interannual variability of the ensemble spread.
For GPH, however, t-values provide a useful predictor of forecast skill only in the tropical Pacific region.
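The abstract describes the t-value only as "a simple measure of the forecast signal-to-noise ratio". One common form of such a statistic (an assumption here, not necessarily the paper's exact definition) is the ensemble-mean anomaly divided by the standard error of the ensemble mean:

```python
import math

def t_value(ensemble):
    """Signal-to-noise t-value of one grid point's forecast ensemble:
    ensemble-mean anomaly over the standard error of that mean."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    return mean / math.sqrt(var / n)

# Toy ensemble of T2m anomalies (K) for one grid point and one winter;
# the values are made up for illustration.
anoms = [0.8, 1.1, 0.9, 1.3, 0.7, 1.2]
t = t_value(anoms)  # large |t|: strong signal relative to spread
```

Under this definition, interannual changes in t come from either the ensemble mean (the signal) or the ensemble spread (the noise); the abstract reports that for T2m most of the variability comes from the mean.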
Abstract:
The COE/EBF gene family marks a subset of prospective neurons in the vertebrate central and peripheral nervous system, including neurons deriving from some ectodermal placodes. Since placodes are often considered unique to vertebrates, we have characterised an amphioxus COE/EBF gene with the aim of using it as a marker to examine the timing and location of peripheral neuron differentiation. A single COE/EBF family member, AmphiCoe, was isolated from the amphioxus Branchiostoma floridae. AmphiCoe lies basal to the vertebrate COE/EBF genes in molecular phylogenetic analysis, suggesting that the duplications that formed the vertebrate COE/EBF family were specific to the vertebrate lineage. AmphiCoe is expressed in the central nervous system and in a small number of scattered ectodermal cells on the flanks of neurula-stage embryos. These cells become at least largely recessed beneath the ectoderm. Scanning electron microscopy was used to examine embryos in which the ectoderm had been partially peeled away. This revealed that these cells have neuronal morphology, and we infer that they are the precursors of epidermal primary sensory neurons. These characters lead us to suggest that differentiation of some ectodermal cells into sensory neurons with a tendency to sink beneath the embryonic surface represents a primitive feature that has become incorporated into placodes during vertebrate evolution. (C) 2004 Wiley-Liss, Inc.