970 results for Road safety algorithms


Relevance: 20.00%

Abstract:

Low noise surfaces have been increasingly considered a viable and cost-effective alternative to acoustical barriers. However, road planners and administrators frequently lack information on the correlation between the type of road surface and the resulting noise emission profile. To address this problem, a method to identify and classify different types of road pavements was developed, whereby near-field road noise is analyzed using statistical learning methods. The vehicle rolling sound signal near the tires and close to the road surface was acquired by two microphones in a special arrangement which implements the Close-Proximity method. A set of features, characterizing the properties of the road pavement, was extracted from the corresponding sound profiles. A feature selection method was used to automatically select those that are most relevant in predicting the type of pavement, while reducing the computational cost. A set of different types of road pavement segments was tested and the performance of the classifier was evaluated. Results of pavement classification performed during a road journey are presented on a map, together with geographical data. This procedure leads to a considerable improvement in the quality of road pavement noise data, thereby increasing the accuracy of road traffic noise prediction models.
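
A minimal sketch of such a pipeline is shown below, assuming spectral band levels as features and an SVM as the classifier; the abstract does not specify the exact features, the feature selection method or the learning algorithm, so all names here are illustrative.

```python
# Hypothetical sketch: classifying pavement type from close-proximity (CPX)
# rolling-noise recordings. Feature choice (band levels) and the classifier
# (SVM) are assumptions; the abstract does not fix either.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def band_levels(signal, n_bands=24):
    """Average spectral level (dB) in n_bands equal-width frequency bands."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    bands = np.array_split(spectrum, n_bands)
    return np.array([10 * np.log10(b.mean() + 1e-12) for b in bands])

# X: one feature vector per road segment, y: pavement-type labels
# (e.g. 0 = dense asphalt, 1 = porous asphalt, 2 = surface dressing)
def build_classifier():
    return make_pipeline(
        StandardScaler(),
        SelectKBest(f_classif, k=10),   # keep only the most discriminative bands
        SVC(kernel="rbf"),
    )
```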

Relevance: 20.00%

Abstract:

In terms of road and airport traffic safety, the friction between the tyres and the pavement surface is one of the most important surface characteristics of pavements in wet conditions. Worldwide, considerable importance has been given to this fact, so that over the last decades several indices and devices for measuring the friction coefficient of a given pavement in the presence of water have been studied. The objective of this dissertation is to deepen the knowledge on measuring the friction coefficient of the surface of road and airport pavements, with particular focus on the study of the GripTester device. The work consisted mainly of analysing friction coefficient values of the wearing course measured on the pavement of an aerodrome by three GripTester devices, through an interlaboratory comparison trial organised specifically within the scope of this dissertation. The work carried out confirmed the importance of this comparative study: the analysis of the interlaboratory comparison results showed that there are differences in the performance of the participating devices, which should be investigated further in future comparative studies.
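
As a rough illustration of what such an interlaboratory comparison involves, the sketch below compares the mean friction coefficient reported by each GripTester against the grand mean of all participating devices; the grouping of readings and the statistics used are assumptions, not the procedure actually adopted in the dissertation.

```python
# Minimal sketch of an interlaboratory comparison of GripTester readings.
# Using per-device mean/std and bias against the grand mean is an assumption;
# the dissertation does not detail its statistics here.
import numpy as np

def compare_devices(runs):
    """runs: dict device_name -> array of friction coefficients
    measured over the same sections of the runway."""
    stats = {}
    for name, gn in runs.items():
        gn = np.asarray(gn, dtype=float)
        stats[name] = {"mean": gn.mean(), "std": gn.std(ddof=1)}
    # Spread between devices: how far each device mean lies from the
    # grand mean of all participating equipment.
    grand = np.mean([s["mean"] for s in stats.values()])
    for s in stats.values():
        s["bias_vs_grand_mean"] = s["mean"] - grand
    return stats
```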

Relevance: 20.00%

Abstract:

This work was carried out within the scope of the Final Master's Project in Civil Engineering at the Instituto Superior de Engenharia de Lisboa. It consisted of preparing the detailed design of a three-span overpass in reinforced and prestressed concrete. The constraints imposed by the local topography, the geotechnical conditions and the road alignment were taken into account. The structural elements were designed in accordance with the Portuguese regulations currently in force, namely the Regulamento de Segurança e Acções (RSA) and the Regulamento de Estruturas de Betão Armado e Pré-esforçado (REBAP). Safety was verified with respect to the Ultimate and Serviceability Limit States. The structural analysis was carried out with the SAP2000 structural analysis software.

Relevance: 20.00%

Abstract:

In recent decades, all over the world, competition in the electric power sector has deeply changed the way this sector’s agents play their roles. In most countries, the deregulation process was conducted in stages, beginning with the clients of higher voltage levels and with larger electricity consumption, and later extended to all electricity consumers. The sector liberalization and the operation of competitive electricity markets were expected to lower prices and improve quality of service, leading to greater consumer satisfaction. Transmission and distribution remain noncompetitive business areas, due to the large infrastructure investments required. However, the industry has yet to clearly establish the best business model for transmission in a competitive environment. After generation, the electricity needs to be delivered to the electrical system nodes where demand requires it, taking into consideration transmission constraints and electrical losses. If the amount of power flowing through a certain line is close to or surpasses the safety limits, then cheap but distant generation might have to be replaced by more expensive closer generation to reduce the exceeded power flows. In a congested area, the optimal price of electricity rises to the marginal cost of the local generation or to the level needed to ration demand to the amount of available electricity. Even without congestion, some power will be lost in the transmission system through heat dissipation, so prices reflect that it is more expensive to supply electricity at the far end of a heavily loaded line than close to a generation site. Locational marginal pricing (LMP), resulting from bidding competition, represents electrical and economic values at nodes or in areas that may provide economic indicator signals to the market agents. This article proposes a data-mining-based methodology that helps characterize zonal prices in real power transmission networks. To test our methodology, we used an LMP database from the California Independent System Operator for 2009 to identify economic zones. (CAISO is a nonprofit public benefit corporation charged with operating the majority of California’s high-voltage wholesale power grid.) To group the buses into typical classes that represent a set of buses with approximately the same LMP value, we used two-step and k-means clustering algorithms. By analyzing the various LMP components, our goal was to extract knowledge to support the ISO in investment and network-expansion planning.
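
A minimal sketch of the clustering step is given below, using k-means from scikit-learn on a bus-by-hour LMP matrix (the article also uses two-step clustering, which is not reproduced here); the matrix layout, the number of zones and the preprocessing are assumptions rather than details taken from the CAISO study.

```python
# Sketch of the zonal-price grouping step, assuming one row per bus and one
# column per hourly LMP observation. Column layout and n_zones are illustrative.
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def cluster_buses(lmp_matrix, n_zones=6):
    """lmp_matrix: array of shape (n_buses, n_hours) with LMP values ($/MWh).
    Returns a zone label per bus, grouping buses with similar price profiles."""
    X = StandardScaler().fit_transform(lmp_matrix)
    km = KMeans(n_clusters=n_zones, n_init=10, random_state=0)
    labels = km.fit_predict(X)
    return labels, km.cluster_centers_
```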

Relevance: 20.00%

Abstract:

OBJECTIVE: Various support measures useful for promoting joint change approaches to the improvement of both shiftworking arrangements and safety and health management systems were reviewed. A particular focus was placed on enterprise-level risk reduction measures linking working hours and management systems. METHODS: Voluntary industry-based guidelines on night and shift work for department stores and the chemical, automobile and electrical equipment industries were examined. Survey results that had led to the compilation of practicable measures to be included in these guidelines were also examined. The common support measures were then compared with ergonomic checkpoints for plant maintenance work involving irregular nightshifts. On the basis of this analysis, a new night and shift work checklist was designed. RESULTS: Both the guidelines and the plant maintenance work checkpoints were found to commonly cover multiple issues including work schedules and various job-related risks. This close link between shiftwork arrangements and risk management was important as shiftworkers in these industries considered teamwork and welfare services to be essential for managing risks associated with night and shift work. Four areas found suitable for participatory improvement by managers and workers were work schedules, ergonomic work tasks, work environment and training. The checklist designed to facilitate participatory change processes covered all these areas. CONCLUSIONS: The checklist developed to describe feasible workplace actions was suitable for integration with comprehensive safety and health management systems and offered valuable opportunities for improving working time arrangements and job content together.

Relevance: 20.00%

Abstract:

Human Computer Interaction (HCI) concerns the interaction between computers and people, and context awareness (CA) is one of its most important components. In particular, when there are sequential or continuous tasks between users and devices, among users, or among devices, the next action must be decided using the right context, and to make a sound decision all context information has to be gathered into a single structure. In this article we define that structure as the Context-Aware Matrix (CAM). However, making an exact decision is difficult because of problems such as low accuracy, overhead and bad context injected by attackers, which many researchers have been studying how to solve; moreover, the use of HCI in safety-related settings still has weak points. In this article, we propose building a CAM that includes selecting the best server in each area. As a result, moving users can be guided to the best option.
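
The article does not describe the CAM layout in detail; the sketch below assumes a simple per-area table of candidate servers with context-derived scores, purely as an illustration of selecting the best server in each area. The scoring fields (latency, load) are illustrative.

```python
# Minimal, assumed sketch of a Context-Aware Matrix (CAM) keyed by area, where
# each area keeps context-derived attributes for its candidate servers.
from dataclasses import dataclass

@dataclass
class ServerContext:
    name: str
    latency_ms: float   # lower is better
    load: float         # 0.0 .. 1.0, lower is better

def best_server(cam: dict, area: str) -> str:
    """Pick the best server for one area from the CAM rows collected for it."""
    candidates = cam[area]
    return min(candidates, key=lambda s: s.latency_ms * (1.0 + s.load)).name

cam = {"area-1": [ServerContext("s1", 20.0, 0.3), ServerContext("s2", 35.0, 0.1)]}
print(best_server(cam, "area-1"))  # -> "s1"
```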

Relevance: 20.00%

Abstract:

An atmospheric aerosol study was performed in 2008 inside an urban road tunnel in Lisbon, Portugal. Using a high volume impactor, the aerosol was collected into four size fractions (PM0.5, PM0.5-1, PM1-2.5 and PM2.5-10) and analysed for particle mass (PM), organic and elemental carbon (OC and EC), polycyclic aromatic hydrocarbons (PAH), soluble inorganic ions and elemental composition. Three main groups of compounds were discriminated in the tunnel aerosol: carbonaceous matter, a soil component and vehicle mechanical wear. Measurements indicate that Cu can be a good tracer for road traffic wear emissions. Cu levels correlate strongly with Fe, Mn, Sn and Cr, showing a highly linear, constant ratio in all size ranges, which suggests a common origin across sizes. Ratios of Cu to other elements can be used for source apportionment of the trace elements present in urban atmospheres, particularly for coarse aerosol particles.
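
A small illustration of how the Cu-to-element ratios and correlations could be checked per size fraction is given below; the column names and data layout are assumptions, not the actual dataset or code of the study.

```python
# Illustrative check of the Cu-as-tracer idea: correlate Cu with other wear
# elements and compute their ratios in each size fraction.
import pandas as pd

def wear_tracer_ratios(df: pd.DataFrame, elements=("Fe", "Mn", "Sn", "Cr")):
    """df: one row per sample, with a 'fraction' column (e.g. 'PM2.5-10')
    and elemental concentrations (assumed in ng/m3)."""
    out = {}
    for frac, grp in df.groupby("fraction"):
        out[frac] = {
            el: {"r": grp["Cu"].corr(grp[el]),          # linearity with Cu
                 "ratio": (grp[el] / grp["Cu"]).mean()}  # mean element/Cu ratio
            for el in elements
        }
    return out
```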

Relevance: 20.00%

Abstract:

This work aims at investigating the impact of treating breast cancer using different radiation therapy (RT) techniques – forwardly-planned intensity-modulated, f-IMRT, inversely-planned IMRT and dynamic conformal arc (DCART) RT – and their effects on whole-breast irradiation and on the undesirable irradiation of the surrounding healthy tissues. Two algorithms of the iPlan BrainLAB treatment planning system were compared: Pencil Beam Convolution (PBC) and commercial Monte Carlo (iMC). Seven left-sided breast cancer patients who had undergone breast-conserving surgery were enrolled in the study. For each patient, four RT techniques – f-IMRT, IMRT using 2 fields and 5 fields (IMRT2 and IMRT5, respectively) and DCART – were applied. The dose distributions in the planning target volume (PTV) and the dose to the organs at risk (OAR) were compared by analyzing dose–volume histograms; further statistical analysis was performed using IBM SPSS v20 software. For PBC, all techniques provided adequate coverage of the PTV. However, statistically significant dose differences were observed between the techniques, in the PTV, OAR and also in the pattern of dose distribution spreading into normal tissues. IMRT5 and DCART spread low doses over greater volumes of normal tissue, right breast, right lung and heart than the tangential techniques. However, IMRT5 plans improved distributions for the PTV, exhibiting better conformity and homogeneity in the target and reduced high-dose percentages in the ipsilateral OAR. DCART did not present advantages over any of the techniques investigated. Differences were also found comparing the calculation algorithms: PBC estimated higher doses for the PTV, ipsilateral lung and heart than the iMC algorithm predicted.
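
For context, the sketch below shows how a cumulative dose–volume histogram and a V_x metric can be computed from a dose grid and a structure mask; the study worked with the DVHs produced by the iPlan system, so this code is only an illustrative stand-in.

```python
# Sketch of the DVH comparison step. The dose grid and structure mask are
# placeholders, not data from the study.
import numpy as np

def cumulative_dvh(dose, mask, bin_gy=0.1):
    """dose: 3-D dose grid in Gy; mask: boolean grid of the structure (PTV or OAR).
    Returns (dose_bins, volume_percent) of the cumulative DVH."""
    d = dose[mask]
    bins = np.arange(0.0, d.max() + bin_gy, bin_gy)
    volume_percent = np.array([(d >= b).mean() * 100.0 for b in bins])
    return bins, volume_percent

def v_x(dose, mask, x_gy):
    """Volume (%) of the structure receiving at least x_gy, e.g. V20 of a lung."""
    d = dose[mask]
    return (d >= x_gy).mean() * 100.0
```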

Relevance: 20.00%

Abstract:

Introduction: Image resizing is a standard feature of Nuclear Medicine digital imaging. Upsampling is implemented by manufacturers to better fit the acquired images on the display screen, and resizing is applied whenever there is a need to increase - or decrease - the total number of pixels. This paper aims to compare the “hqnx” and “nxSaI” magnification algorithms with two interpolation algorithms – “nearest neighbor” and “bicubic interpolation” – in image upsampling operations. Material and Methods: Three distinct Nuclear Medicine images were enlarged 2 and 4 times with the different digital image resizing algorithms (nearest neighbor, bicubic interpolation, nxSaI and hqnx). To evaluate the pixel changes between the different output images, 3D whole-image plot profiles and surface plots were used in addition to the visual inspection of the 4x upsampled images. Results: In the 2x enlarged images the visual differences were not particularly noteworthy, although it was clear that bicubic interpolation gave the best results. In the 4x enlarged images the differences were significant, with the bicubic interpolated images again presenting the best results. Hqnx-resized images showed better quality than the 4xSaI and nearest neighbor interpolated images; however, their intense “halo effect” greatly degrades the definition and boundaries of the image contents. Conclusion: The hqnx and nxSaI algorithms were designed for images with clear edges, so their use on Nuclear Medicine images is clearly inadequate. Of the algorithms studied, bicubic interpolation appears to be the most suitable, and its ever wider range of applications seems to confirm this, establishing it as an efficient algorithm for multiple image types.
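
The two standard interpolation methods compared in the paper are readily available in Pillow; the sketch below upsamples an image with nearest neighbor and bicubic interpolation (hqnx and nxSaI are specialised scalers without a Pillow equivalent, so they are omitted), and the file name is a placeholder.

```python
# Minimal sketch of the upsampling comparison for the two standard
# interpolation methods studied in the paper.
from PIL import Image

def upsample(path, factor=4):
    img = Image.open(path)
    size = (img.width * factor, img.height * factor)
    nearest = img.resize(size, Image.NEAREST)   # blocky, replicates pixel values
    bicubic = img.resize(size, Image.BICUBIC)   # smooth, interpolates new values
    return nearest, bicubic

nearest, bicubic = upsample("nm_image.png", factor=4)  # hypothetical file name
```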

Relevance: 20.00%

Abstract:

Introduction: A major focus of the data mining process - especially of machine learning research - is to automatically learn to recognize complex patterns and to help make adequate decisions based strictly on the acquired data. Since imaging techniques such as MPI – Myocardial Perfusion Imaging in Nuclear Cardiology – can take up a large part of the daily workflow and generate gigabytes of data, computerized analysis of the data can offer advantages over human analysis: shorter time, homogeneity and consistency, automatic recording of analysis results, relatively low cost, etc. Objectives: The aim of this study is to evaluate the efficacy of this methodology in the evaluation of MPI Stress studies and in the decision process concerning the continuation - or not - of the evaluation of each patient. The objective pursued was to automatically classify a patient test into one of three groups: “Positive”, “Negative” and “Indeterminate”. “Positive” patients would proceed directly to the Rest part of the exam, “Negative” patients would be directly exempted from continuing, and only the “Indeterminate” group would require the clinician’s analysis, thus saving clinicians’ effort, increasing workflow fluidity at the technologists’ level and probably saving patients’ time. Methods: The WEKA v3.6.2 open-source software was used to perform a comparative analysis of three WEKA algorithms (“OneR”, “J48” and “Naïve Bayes”) in a retrospective study of the “SPECT Heart Dataset”, available at the University of California – Irvine Machine Learning Repository, using the corresponding clinical results, signed off by expert nuclear cardiologists, as the reference. For evaluation purposes, criteria such as “Precision”, “Incorrectly Classified Instances” and “Receiver Operating Characteristic (ROC) Areas” were considered. Results: The interpretation of the data suggests that the Naïve Bayes algorithm has the best performance among the three selected algorithms. Conclusions: It is believed - and apparently supported by the findings - that machine learning algorithms could significantly assist, at an intermediary level, in the analysis of the scintigraphic data obtained in MPI, namely after the Stress acquisition, eventually increasing the efficiency of the entire system and potentially easing the roles of both Technologists and Nuclear Cardiologists. In the continuation of this study, it is planned to use more patient information and to significantly increase the population under study, in order to improve system accuracy.
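
The experiment described was run in WEKA; a rough scikit-learn analogue of the same kind of comparison is sketched below (a pruned decision tree standing in for J48), with the dataset file name and column layout assumed rather than taken from the UCI repository.

```python
# Rough scikit-learn analogue of the WEKA comparison (OneR has no direct
# counterpart and is omitted). File name and target column are assumptions.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("SPECT.csv")              # hypothetical local copy of the dataset
X, y = df.drop(columns=["diagnosis"]), df["diagnosis"]

for name, model in [("decision tree (J48-like)", DecisionTreeClassifier(max_depth=5)),
                    ("naive Bayes", GaussianNB())]:
    scores = cross_val_score(model, X, y, cv=10, scoring="roc_auc")
    print(f"{name}: mean ROC AUC = {scores.mean():.3f}")
```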

Relevance: 20.00%

Abstract:

Final Project submitted for the degree of Master in Civil Engineering.