780 results for Driver fatigue risk management
Abstract:
The aim of this study is to identify, within a public institute, a technological scenario for business recovery based on low-investment recovery solutions that simultaneously allow a reduction in operating expenses. The study was carried out using a case-study research methodology with an exploratory purpose, so as to obtain answers to the proposed research questions. For the state of the art, a literature-based approach was adopted covering business continuity planning, risk management, risk analysis, and business impact analysis. From the research carried out, with support from external consultants, it was concluded through the application of the Business Impact Analysis methodology that the systems considered extremely critical and strategic for the organisation are the e-mail system, the central directory system, the file repository, and the main business-support information system. The case study also made it possible to identify the disaster recovery scenarios best suited to the current situation of the organisation under study, since they answer the research questions.
Abstract:
As part of the Master's Degree in Occupational Risk Prevention Management, this investigation aims to identify the Psychosocial Risk Factors affecting the administrative and emergency workers of the Central and Southern Delegations of the Instituto Nacional de Emergência Médica (INEM). The study seeks to fill a gap: INEM had never carried out an assessment of Psychosocial Risk Factors, previous investigations at the institute having addressed only occupational stress. The research is exploratory and descriptive, with both quantitative and qualitative components, and was carried out through semi-structured interviews with the heads of each delegation and the application of the F-Psico version 3.0 questionnaire (Escala de Valoración de los Riesgos Psicosociales of the Instituto Nacional de Seguridad e Higiene en el Trabajo, INSHT) to the workers of INEM's Central and Southern services. The study included 185 workers, 10% belonging to the Central Delegation and 14% to the Southern Delegation. Based on the results obtained, preventive or corrective measures will be proposed with a view to eliminating or reducing the Psychosocial Risks identified.
Abstract:
Risk is inherent in all human activity and is understood as the probability of occurrence of an event that results in harm. Hence, when speaking of risk in financial matters, we refer to a potential loss of money that directly affects the financial system or one of the institutions that make it up. Accordingly, the present work compiles information on some of the risks present in financial markets. The exposition of risks such as market, operational, liquidity, legal, and reputational risk, among others, is therefore fundamental. Finally, we have chosen to present, in broad strokes, some credit risks and valid methods for their proper assessment.
Abstract:
The performance of a 2D numerical model of flood hydraulics is tested for a major event in Carlisle, UK, in 2005. This event is associated with a unique data set, with GPS surveyed wrack lines and flood extent surveyed 3 weeks after the flood. The Simple Finite Volume (SFV) model is used to solve the 2D Saint-Venant equations over an unstructured mesh of 30,000 elements representing channel and floodplain, and allowing detailed hydraulics of flow around bridge piers and other influential features to be represented. The SFV model is also used to corroborate flows recorded for the event at two gauging stations. Calibration of Manning's n is performed with a two-stage strategy, with channel values determined by calibration of the gauging station models, and floodplain values determined by optimising the fit between model results and observed water levels and flood extent for the 2005 event. RMS error for the calibrated model compared with surveyed water levels is approximately ±0.4 m, the same order of magnitude as the estimated error in the survey data. The study demonstrates the ability of unstructured mesh hydraulic models to represent important hydraulic processes across a range of scales, with potential applications to flood risk management.
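The fit statistic reported above is a plain root-mean-square difference between modelled and surveyed water levels; a minimal sketch with hypothetical values (not the Carlisle survey data):

```python
import numpy as np

def rms_error(modelled, surveyed):
    """Root-mean-square difference between modelled and surveyed water levels (m)."""
    modelled = np.asarray(modelled, dtype=float)
    surveyed = np.asarray(surveyed, dtype=float)
    return float(np.sqrt(np.mean((modelled - surveyed) ** 2)))

# Hypothetical wrack-line survey points (m above datum) vs. model output
surveyed = [12.10, 12.35, 11.90, 12.60]
modelled = [12.00, 12.70, 11.85, 12.40]
print(rms_error(modelled, surveyed))  # ~0.21 m in this made-up example
```

In the study, an RMS error of the same order as the survey error indicates the model is calibrated as well as the observations allow.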
Abstract:
This paper is from a study on specialist and trade contracting in the construction industry. The research was commissioned by CIRIA and undertaken by the University of Reading in conjunction with Sir Alexander Gibb & Partners Ltd. The purpose of the work was to provide guidance for effective and equitable practice in the management of projects where much of the work is executed, and possibly designed, by specialist and trade contractors (STCs). As part of this study, a preliminary investigation into the nature and origins of specialist contracting was undertaken, in conjunction with a survey of the problems confronting STCs. This paper presents that phase of the project.
Abstract:
Two-dimensional flood inundation modelling is a widely used tool to aid flood risk management. In urban areas, where asset value and population density are greatest, the model spatial resolution required to represent flows through a typical street network (i.e. < 10 m) often results in impractical computational cost at the whole-city scale. Explicit diffusive storage cell models become very inefficient at such high resolutions, relative to shallow water models, because the stable time step in such schemes scales as a quadratic of resolution. This paper presents the calibration and evaluation of a recently developed new formulation of the LISFLOOD-FP model, where stability is controlled by the Courant–Friedrichs–Lewy condition for the shallow water equations, such that the stable time step instead scales linearly with resolution. The case study used is based on observations during the summer 2007 floods in Tewkesbury, UK. Aerial photography is available for model evaluation on three separate days from the 24th to the 31st of July. The model covered a 3.6 km by 2 km domain and was calibrated using gauge data from high flows during the previous month. The new formulation was benchmarked against the original version of the model at 20 m and 40 m resolutions, demonstrating equally accurate performance given the available validation data but at 67 times faster computation time. The July event was then simulated at the 2 m resolution of the available airborne LiDAR DEM. This resulted in a significantly more accurate simulation of the drying dynamics compared to that simulated by the coarse resolution models, although estimates of peak inundation depth were similar.
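The time-step scaling argument can be sketched numerically (illustrative values only, not LISFLOOD-FP code): an explicit diffusive scheme's stable step scales with dx squared, while the CFL-limited shallow-water step scales linearly with dx:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def dt_diffusive(dx, dt_ref, dx_ref):
    """Stable time step of an explicit diffusive storage-cell scheme scales as
    (dx/dx_ref)^2 -- hypothetical reference values, for illustration only."""
    return dt_ref * (dx / dx_ref) ** 2

def dt_shallow_water(dx, depth, cfl=0.7):
    """CFL-limited time step for the shallow water equations,
    dt = cfl * dx / sqrt(g * h), which is linear in dx."""
    return cfl * dx / math.sqrt(G * depth)

# Halving grid spacing from 40 m to 20 m quarters the diffusive step
# but only halves the CFL-limited step:
print(dt_diffusive(20, dt_ref=10.0, dx_ref=40))  # 2.5 s (from a 10 s step at 40 m)
print(dt_shallow_water(20, depth=2.0))           # ~3.16 s
```

This is why the new formulation remains practical at street-network resolutions where the diffusive scheme's quadratic scaling becomes prohibitive.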
Abstract:
Airborne scanning laser altimetry (LiDAR) is an important new data source for river flood modelling. LiDAR can give dense and accurate DTMs of floodplains for use as model bathymetry. Spatial resolutions of 0.5 m or less are possible, with a height accuracy of 0.15 m. LiDAR gives a Digital Surface Model (DSM), so vegetation removal software (e.g. TERRASCAN) must be used to obtain a DTM. An example used to illustrate the current state of the art is the LiDAR data provided by the EA, which has been processed by their in-house software to convert the raw data to a ground DTM and a separate vegetation height map. Their method distinguishes trees from buildings on the basis of object size. EA data products include the DTM with or without buildings removed, a vegetation height map, a DTM with bridges removed, etc. Most vegetation removal software ignores short vegetation less than, say, 1 m high. We have attempted to extend vegetation height measurement to short vegetation using local height texture; typically, most of a floodplain may be covered in such vegetation. The idea is to assign friction coefficients depending on local vegetation height, so that friction is spatially varying. This obviates the need to calibrate a global floodplain friction coefficient. It is not yet clear whether the method is useful, but it is worth testing further. The LiDAR DTM is usually determined by looking for local minima in the raw data, then interpolating between these to form a space-filling height surface. This is a low-pass filtering operation, in which objects of high spatial frequency such as buildings, river embankments and walls may be incorrectly classed as vegetation. The problem is particularly acute in urban areas. A solution may be to apply pattern recognition techniques to LiDAR height data fused with other data types such as LiDAR intensity or multispectral CASI data.
We are attempting to use digital map data (Mastermap structured topography data) to help distinguish buildings from trees, and roads from areas of short vegetation. The problems involved in doing this will be discussed. A related problem, how best to merge historic river cross-section data with a LiDAR DTM, will also be considered. LiDAR data may also be used to help generate a finite element mesh. In rural areas we have decomposed a floodplain mesh according to taller vegetation features such as hedges and trees, so that, for example, hedge elements can be assigned higher friction coefficients than those in adjacent fields. We are attempting to extend this approach to urban areas, so that the mesh is decomposed in the vicinity of buildings, roads, etc., as well as trees and hedges. A dominant points algorithm is used to identify points of high curvature on a building or road, which act as initial nodes in the meshing process. A difficulty is that the resulting mesh may contain a very large number of nodes; however, the mesh generated may be useful to allow a high-resolution FE model to act as a benchmark for a more practical lower-resolution model. A further problem discussed will be how best to exploit data redundancy due to the high resolution of the LiDAR compared to that of a typical flood model. Problems occur if features have dimensions smaller than the model cell size: for a 5 m-wide embankment within a raster grid model with a 15 m cell size, the maximum height of the embankment locally could be assigned to each cell covering the embankment, but how could a 5 m-wide ditch be represented? Again, this redundancy has been exploited to improve wetting/drying algorithms using the sub-grid-scale LiDAR heights within finite elements at the waterline.
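The embankment example above can be made concrete with a minimal sketch (hypothetical heights and grid sizes, not EA data or the authors' software) of aggregating a fine DEM to coarse model cells: a per-cell maximum preserves a narrow embankment, but a ditch would need the minimum, and one value per cell cannot keep both:

```python
import numpy as np

def aggregate_heights(fine_dem, factor, how="max"):
    """Aggregate a fine-resolution DEM to coarse model cells.
    'max' preserves narrow raised features (embankments); 'min' would be
    needed to preserve narrow ditches. Assumes array dimensions divide
    exactly by `factor` (illustrative only)."""
    n, m = fine_dem.shape
    blocks = fine_dem.reshape(n // factor, factor, m // factor, factor)
    op = np.max if how == "max" else np.min
    return op(blocks, axis=(1, 3))

# 6x6 grid at 5 m spacing: a 5 m-wide embankment (crest 2.0 m) on flat ground
fine = np.zeros((6, 6))
fine[:, 2] = 2.0
coarse = aggregate_heights(fine, factor=3, how="max")  # 2x2 grid of 15 m cells
print(coarse)  # the embankment crest survives in the left-hand column of cells
```

Swapping `how="max"` for `how="min"` flattens the embankment entirely, which is the ditch problem in reverse.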
Abstract:
The paper examines how European retailers are using private standards for food safety and quality as risk management and competitive tools, and the strategic responses of leading Kenyan and other developing country supplier/exporters to such standards. Despite measures to harmonize a 'single market', the European fresh produce market is very diverse in terms of consumer preferences, structural dynamics and attention to and enforcement of food safety and other standards. Leading Kenyan fresh produce suppliers have re-positioned themselves at the high end, including 'high care', segments of the market - precisely those that are most demanding in terms of quality assurance and food safety systems. An array of factors has influenced this strategic positioning, including relatively high international freight costs, the emergence of more effective competition in mainstream product lines, relatively low labor costs for produce preparation, and strong market relationships with selected retail chains. To succeed in this demanding market segment, the industry has had to invest substantially in improved production and procurement systems, upgraded pack house facilities, and quality assurance/food safety management systems. (C) 2005 Elsevier Ltd. All rights reserved.
Abstract:
The Bahrain International Circuit (BIC) is considered one of the best international racing-car tracks in terms of technical aspects and architectural quality. Two Formula 1 races have been hosted in the Kingdom of Bahrain, in 2004 and 2005, at BIC, and the circuit recently won the award for best international racing-car circuit. This paper highlights the elements that contributed to the success of the project, from the architectural aspects, construction, challenges, tendering process, risk management, workforce, and speed of the construction method, to future prospects for harnessing solar and wind energy for sustainable electrification and water production for the circuit, i.e. making BIC a green and environmentally friendly international circuit.
Abstract:
Two-dimensional flood inundation modelling is a widely used tool to aid flood risk management. In urban areas, the model spatial resolution required to represent flows through a typical street network often results in an impractical computational cost at the city scale. This paper presents the calibration and evaluation of a recently developed formulation of the LISFLOOD-FP model, which is more computationally efficient at these resolutions. Aerial photography was available for model evaluation on three days from the 24th to the 31st of July. The new formulation was benchmarked against the original version of the model at 20 and 40 m resolutions, demonstrating equally accurate simulation given the evaluation data, but at a 67 times faster computation time. The July event was then simulated at the 2 m resolution of the available airborne LiDAR DEM. This resulted in more accurate simulation of the floodplain drying dynamics compared with the coarse resolution models, although maximum inundation levels were simulated equally well at all resolutions tested.
Abstract:
Multi-factor approaches to analysis of real estate returns have, since the pioneering work of Chan, Hendershott and Sanders (1990), emphasised a macro-variables approach in preference to the latent factor approach that formed the original basis of the arbitrage pricing theory. With increasing use of high frequency data and trading strategies, and with a growing emphasis on the risks of extreme events, the macro-variable procedure has some deficiencies. This paper explores a third way, using an alternative to the standard principal components approach: independent components analysis (ICA). ICA seeks higher moment independence and maximises in relation to a chosen risk parameter. We apply a kurtosis-maximising ICA to weekly US REIT data. The results show that ICA is successful in capturing the kurtosis characteristics of REIT returns, offering possibilities for the development of risk management strategies that are sensitive to extreme events and tail distributions.
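As an illustration of the statistic being maximised (not the authors' algorithm or data), a minimal sketch of sample excess kurtosis, which is near zero for normally distributed returns and clearly positive for fat-tailed ones:

```python
import numpy as np

def excess_kurtosis(x):
    """Sample excess kurtosis: E[((x - mu)/sigma)^4] - 3.
    Zero for a normal distribution, positive for fat-tailed return series."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    return float(np.mean(z ** 4) - 3.0)

rng = np.random.default_rng(0)
normal_returns = rng.normal(size=50_000)
fat_tailed = rng.standard_t(df=5, size=50_000)  # t(5): theoretical excess kurtosis 6
print(excess_kurtosis(normal_returns))  # near 0
print(excess_kurtosis(fat_tailed))      # clearly positive
```

A kurtosis-maximising ICA rotates the data to find components that score highest on this statistic, isolating the tail-risk drivers of the return series.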
Abstract:
A significant part of bank lending in the UK is secured on commercial property, and valuations play an important part in this process. They are an integral part of risk management within the banking sector. It is therefore important that valuations are independent and objective and are used properly to ensure that secured lending is soundly based from the perspective of both lender and borrower. The purpose of this research is to examine objectivity and transparency in the valuation process for bank lending and to identify any influences which may undermine the process. A detailed analysis of 31 valuation negligence cases has been followed by two focus groups of lenders and valuers and also questionnaire surveys of commercial lenders and valuers. Many stakeholders exist, for example lenders, borrowers and brokers, who are able to influence the process in various ways. The strongest evidence of overt influence in the process comes from the method of valuer selection, with borrowers and brokers seen to be heavily involved. There is also some evidence of influence during the draft valuation process. A significant minority of valuers feel that inappropriate pressure is applied by borrowers and brokers, yet there is no apparent part of the process that leads to this. The panel system employed by lenders is found to be a significant part of the system and merits further examination. The pressure felt by valuers needs more investigation, along with the question of whether and how the process could dispel such feelings. This is seen as particularly important in the context of bank regulation.
Abstract:
There is widespread evidence that the volatility of stock returns displays an asymmetric response to good and bad news. This article considers the impact of asymmetry on time-varying hedges for financial futures. An asymmetric model that allows forecasts of cash and futures return volatility to respond differently to positive and negative return innovations gives superior in-sample hedging performance. However, the simpler symmetric model is not inferior in a hold-out sample. A method for evaluating the models in a modern risk-management framework is presented, highlighting the importance of allowing optimal hedge ratios to be both time-varying and asymmetric.
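For context, the static minimum-variance hedge ratio that the time-varying models generalise is Cov(cash, futures) / Var(futures); a minimal sketch on synthetic returns (hypothetical series, not the article's data):

```python
import numpy as np

def hedge_ratio(cash_returns, futures_returns):
    """Static minimum-variance hedge ratio h* = Cov(s, f) / Var(f).
    A time-varying version re-estimates this from conditional (co)variance
    forecasts -- e.g. an asymmetric GARCH model, as the article discusses."""
    s = np.asarray(cash_returns, dtype=float)
    f = np.asarray(futures_returns, dtype=float)
    cov = np.cov(s, f, ddof=1)
    return float(cov[0, 1] / cov[1, 1])

# Hypothetical return series: cash moves ~0.8x with futures plus noise
rng = np.random.default_rng(1)
f = rng.normal(0, 0.01, size=1_000)
s = 0.8 * f + rng.normal(0, 0.002, size=1_000)
print(hedge_ratio(s, f))  # close to 0.8
```

Allowing the covariance and variance forecasts to respond asymmetrically to positive and negative innovations is what makes the article's optimal hedge ratios both time-varying and asymmetric.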