951 results for Worst-case execution-time


Relevance:

100.00%

Publisher:

Abstract:

Master's dissertation in Integrated Management of Quality, Environment and Safety

This document describes a fault-tolerance model for distributed real-time systems. The model is proposed as a reliable, flexible solution that adapts to the needs of distributed real-time systems. Fault tolerance is an extremely important aspect of building real-time systems, and applying it brings numerous benefits. A fault-tolerance-oriented design contributes to better system performance by improving key aspects such as safety, reliability, and availability. The work focuses on the prevention, detection, and tolerance of both logical (software) and physical (hardware) faults, and rests on a predominantly time-triggered architecture combined with redundancy techniques. The model is concerned with efficiency and execution costs. To this end, it also employs traditional fault-tolerance techniques, such as redundancy and migration, so as not to harm the service's execution time, that is, reducing replica recovery time when faults occur. This work proposes low-complexity run-time heuristics to determine where to replicate the components that make up the real-time software and to negotiate them through a bid-based coordination mechanism. It adapts and extends algorithms that provide solutions even when interrupted; these algorithms, reported in related research, are used to form coalitions among cooperating nodes. The proposed model masks faults through active replication techniques, both virtual and physical, with concurrent execution blocks, trying to improve or maintain the quality produced while introducing practically no significant information overhead into the system.
The model ensures that the machines chosen as migration targets for the agents iteratively improve the quality-of-service levels provided to the components, according to each machine's availability. If a new quality configuration is profitable for the overall service quality, an effort is made to accept new components at the expense of the quality of those already hosted locally. Nodes cooperating in the coalition maximize the number of parallel executions among the parallel components that compose the service, with the aim of reducing execution delays. The development of this thesis led to the proposed model and the presented results, and was supported by surveys of related research and development work, the literature, and mathematical preliminaries. The work also rests on a list of bibliographic references.
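The bid-based placement the abstract alludes to could, in spirit, look like the following greedy sketch. The node names, capacities, and QoS bids below are invented for illustration; this is not the thesis's actual algorithm, only a minimal sketch of matching components to the best-bidding host with spare capacity.

```python
def place_replicas(components, bids):
    """Greedy sketch: each component goes to the node offering the
    highest QoS bid that still has spare capacity (all names assumed)."""
    placement = {}
    capacity = {node: cap for node, (cap, _) in bids.items()}
    # Place heavier components first so they get first pick of capacity.
    for comp, load in sorted(components.items(), key=lambda kv: -kv[1]):
        candidates = [(qos, node) for node, (cap, qos) in bids.items()
                      if capacity[node] >= load]
        if not candidates:
            continue  # no feasible host; component stays unplaced
        qos, best = max(candidates)
        placement[comp] = best
        capacity[best] -= load
    return placement

comps = {"sensor": 2, "control": 3}              # component -> load
bids = {"nodeA": (4, 0.9), "nodeB": (5, 0.7)}    # node -> (capacity, QoS bid)
print(place_replicas(comps, bids))  # {'control': 'nodeA', 'sensor': 'nodeB'}
```

Here "control" takes the higher-QoS node first, leaving "sensor" to fall back to the node that still has room.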

Poster presented at the 28th GI/ITG International Conference on Architecture of Computing Systems (ARCS 2015), 24-26 March 2015, Porto, Portugal.

Because of the scientific evidence showing that arsenic (As), cadmium (Cd), and nickel (Ni) are human genotoxic carcinogens, the European Union (EU) recently set target values for their concentrations in ambient air (As: 6 ng/m3; Cd: 5 ng/m3; Ni: 20 ng/m3). The aim of our study was to determine the concentration levels of these trace elements in the Porto Metropolitan Area (PMA) in order to assess compliance with these new EU air quality standards. Fine (PM2.5) and inhalable (PM10) air particles were collected from October 2011 to July 2012 at two different locations (urban and suburban) in the PMA. Samples were analyzed for trace element content by inductively coupled plasma-mass spectrometry (ICP-MS). The study focused on differences in trace element concentrations between the two sites, and between PM2.5 and PM10, in order to gather information regarding emission sources. Except for chromium (Cr), the concentration of all trace elements was higher at the urban site. However, results for As, Cd, Ni, and lead (Pb) were well below the EU limit/target values (As: 1.49 ± 0.71 ng/m3; Cd: 1.67 ± 0.92 ng/m3; Ni: 3.43 ± 3.23 ng/m3; Pb: 17.1 ± 10.1 ng/m3) in the worst-case scenario. Arsenic, Cd, Ni, Pb, antimony (Sb), selenium (Se), vanadium (V), and zinc (Zn) were predominantly associated with PM2.5, indicating that anthropogenic sources such as industry and road traffic are the main sources of these elements. High enrichment factors (EF > 100) were obtained for As, Cd, Pb, Sb, Se, and Zn, further confirming their anthropogenic origin.
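The enrichment factors mentioned at the end are conventionally computed by normalizing an element's concentration against a crustal reference element (often Al or Fe). A minimal sketch of that standard formula follows; the reference element and all numeric values are hypothetical, not the study's data.

```python
def enrichment_factor(c_x_sample, c_ref_sample, c_x_crust, c_ref_crust):
    """Crustal enrichment factor:
    EF = (X / ref)_sample / (X / ref)_crust.
    Values far above ~10 are usually read as a dominant
    anthropogenic (non-crustal) contribution."""
    return (c_x_sample / c_ref_sample) / (c_x_crust / c_ref_crust)

# Hypothetical inputs: Pb and Al in aerosol (ng/m3) and in the
# upper crust (mg/kg); the units cancel within each ratio.
ef_pb = enrichment_factor(17.1, 50.0, 17.0, 80000.0)
print(ef_pb > 100)  # True for these illustrative values
```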

We developed an alternative culture method, which we named PKO (initials in tribute to Petroff, Kudoh and Ogawa), for isolating Mycobacterium tuberculosis from sputum for the diagnosis of pulmonary tuberculosis (TB), and compared its performance with the Swab and Petroff methods. To validate the technique, sputum samples from patients with suspected pulmonary TB were examined by acid-fast microscopy (direct and concentrated smear) and by the PKO, Swab, and Petroff methods. We found that the Petroff and PKO methods were equally effective at isolating M. tuberculosis. However, with the PKO method, 65% of the isolated strains were detected within ≤15 days, whereas with the Petroff method detection peaked in the 16-29 day interval (71%). In smear-positive samples, the average PKO isolation time was longer only than that reported for Bactec 460TB. In conclusion, excluding the pH neutralization step in PKO reduces sample manipulation, shortens culture time compared with the Petroff method, and facilitates the training of professionals involved in the laboratory diagnosis of tuberculosis.

Integrated master's dissertation in Industrial Engineering and Management

The evolution of construction has created a need for more effective equipment, capable of meeting the increasingly demanding deadlines for the completion of works. In this context, the safety and efficiency of equipment have become key aspects for optimizing the execution time of works, as well as for reducing labor costs and material losses. With the evolution of construction and construction processes, cranes have come to symbolize building construction, proving to be, in most cases, the main equipment on construction sites. Currently, some engineers show apprehension regarding the use and handling of cranes, which is natural and understandable, since an equipment failure can lead to serious or fatal accidents. The factors affecting crane safety management on construction sites were investigated, identified, classified, and ranked by degree of importance, through interviews with representatives of the general contractors of a set of selected construction sites.

Potential risks of secondary formation of polychlorinated dibenzodioxins/furans (PCDD/Fs) were assessed for two cordierite-based, wall-flow diesel particulate filters (DPFs) in which soot combustion was catalyzed with either an iron- or a copper-based fuel additive. A heavy-duty diesel engine was used as the test platform, applying the eight-stage ISO 8178/4 C1 cycle. The DPF applications neither affected engine performance nor increased NO, NO2, CO, or CO2 emissions; the latter is a metric for fuel consumption. THC emissions decreased by about 40% when deploying the DPFs. PCDD/F emissions, with a focus on tetra- to octachlorinated congeners, were compared under standard and worst-case conditions (enhanced chlorine uptake). The iron-catalyzed DPF neither increased PCDD/F emissions nor changed the congener pattern, even when traces of chlorine became available. In the case of copper, PCDD/F emissions increased by up to 3 orders of magnitude, from 22 to 200 to 12,700 pg I-TEQ/L with fuels of <2, 14, and 110 μg/g chlorine, respectively. Mainly lower-chlorinated DD/Fs were formed. Based on these substantial effects on PCDD/F emissions, the copper-catalyzed DPF system was not approved for workplace applications, whereas the iron system fulfilled all specifications of the Swiss procedure for DPF approval (VERT).

CISNE is a parallel computing system of the Computer Architecture and Operating Systems Department (DACSO). To implement queue-ordering and job-selection policies, this system needs to predict the execution time of applications. This work aims to provide the CISNE system with a method for predicting execution time based on a history in which all data about past executions are stored.
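A minimal sketch of this kind of history-based prediction, assuming runs are keyed by application name and node count (the abstract does not specify CISNE's actual scheme, so the keying and class names here are illustrative):

```python
from collections import defaultdict
from statistics import mean

class HistoricalPredictor:
    """Toy sketch: predict run time as the mean of past runs
    recorded for the same (application, node count) key."""

    def __init__(self):
        self.history = defaultdict(list)  # (app, nodes) -> [seconds]

    def record(self, app, nodes, seconds):
        self.history[(app, nodes)].append(seconds)

    def predict(self, app, nodes, default=None):
        runs = self.history.get((app, nodes))
        return mean(runs) if runs else default

p = HistoricalPredictor()
p.record("nbody", 8, 120.0)
p.record("nbody", 8, 130.0)
print(p.predict("nbody", 8))  # 125.0
```

A production version would also have to handle unseen (app, nodes) pairs and drift in the historical data; here the `default` argument stands in for that fallback.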

In terms of execution time and data use, parallel/distributed applications can behave variably, even when the same input data set is used. Certain environment-related performance aspects can dynamically affect application behavior, such as memory capacity, network latency, the number of nodes, and node heterogeneity, among others. It is important to consider that the application may run on different hardware configurations, and the application developer cannot guarantee that performance tunings for one particular system remain valid for other configurations. Dynamic analysis of applications has proven to be the best approach to performance analysis for two main reasons. First, it offers a very convenient solution from the developers' point of view while they design and evaluate their parallel applications. Second, it adapts better to the application during execution. This approach requires neither developer intervention nor even access to the application's source code. The application is analyzed at run time, and the search for possible bottlenecks and optimizations is carried out there. To optimize the execution of the bioinformatics application mpiBLAST, we analyzed its behavior to identify the parameters involved in its performance, such as: memory use, network use, I/O patterns, the file system employed, the processor architecture, the size of the biological database, the size of the query sequence, the distribution of sequences within them, the number of database fragments, and/or the granularity of the jobs assigned to each process.
Our goal is to determine which of these parameters have the greatest impact on application performance and how to adjust them dynamically to improve it. Analyzing the performance of mpiBLAST, we found data that reveal a certain level of serialization within the execution. Given the impact of how sequences are characterized within the different databases, and the relationship between worker capacity and the granularity of the current workload, both could be tuned dynamically. Other improvements include optimizations related to the parallel file system and the possibility of running on multiple multicore machines. The work grain size is influenced by factors such as the database type, the database size, and the ratio between workload size and worker capacity.
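The dynamic grain-size tuning described could be sketched as follows. The target task time, the capacity weights, and the function name are all assumptions for illustration, not part of mpiBLAST's actual interface.

```python
def tune_grain(db_fragments, worker_capacities, current_avg_task_time,
               target_task_time=30.0):
    """Sketch: scale how many database fragments each worker receives so
    task duration approaches a target, proportionally to worker capacity.
    All thresholds and names are assumptions, not mpiBLAST's API."""
    # If tasks are running twice as long as desired, halve the grain.
    scale = target_task_time / max(current_avg_task_time, 1e-9)
    total_cap = sum(worker_capacities.values())
    return {w: max(1, int(db_fragments * cap / total_cap * scale))
            for w, cap in worker_capacities.items()}

# Tasks currently average 60 s against a 30 s target, so grains shrink:
print(tune_grain(100, {"w1": 1.0, "w2": 3.0}, current_avg_task_time=60.0))
```

The key design point mirrored from the abstract is that grain size depends both on the database (its fragment count) and on the ratio between workload and worker capacity.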

Performance prediction and application behavior modeling have been the subject of extensive research aimed at estimating application performance with acceptable precision. A novel approach to predicting the performance of parallel applications is based on the concept of Parallel Application Signatures, which consists in extracting an application's most relevant parts (phases) and the number of times they repeat (weights). By executing these phases on a target machine and multiplying each phase's execution time by its weight, an estimate of the application's total execution time can be made. One problem is that an application's performance depends on the program workload. Each type of workload affects differently how an application performs on a given system, and so affects the signature execution time. Since the workloads used in most scientific parallel applications have well-known dimensions and data ranges, and the behavior of these applications is mostly deterministic, a model of how a program's workload affects its performance can be obtained. We create a new methodology to model how a program's workload affects the parallel application signature. Using regression analysis, we are able to generalize each phase's execution time and weight function, so as to predict an application's performance on a target system for any workload within a predefined range. We validate our methodology using a synthetic program, benchmark applications, and well-known real scientific applications.
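The core of the signature idea (total time ≈ Σ weight_i × phase_time_i, with each phase time regressed against workload size) can be sketched as follows. The phase measurements and coefficients below are made up; the regression here is a plain least-squares line rather than whatever model the authors actually fit.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x (no external deps)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b  # intercept a, slope b

def predict_total(phases, workload):
    """phases: list of (weight, (a, b)); phase time = a + b*workload.
    Total time is the weighted sum over all phases."""
    return sum(w * (a + b * workload) for w, (a, b) in phases)

# Hypothetical timings of one phase at three workload sizes:
a, b = fit_linear([100, 200, 300], [1.0, 2.0, 3.0])
phases = [(50, (a, b)),        # phase repeated 50 times, workload-dependent
          (10, (0.5, 0.0))]    # phase repeated 10 times, constant 0.5 s
print(round(predict_total(phases, 150), 2))  # 80.0
```

For a workload of 150 the first phase extrapolates to 1.5 s × 50 repetitions, plus 0.5 s × 10 for the constant phase, matching the weighted-sum formulation in the abstract.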

Since 2008, intelligence units of six states in the western part of Switzerland have shared a common database for the analysis of high-volume crimes. On a daily basis, events reported to the police are analysed, filtered, and classified to detect crime repetitions and interpret the crime environment. Several forensic outcomes are integrated in the system, such as matches of traces with persons, and links between scenes detected by comparing forensic case data. Systematic procedures have been established to integrate links inferred mainly through DNA profiles, shoemark patterns, and images. A statistical outlook on a retrospective dataset of series from 2009 to 2011 informs, for instance, on the number of repetitions detected or confirmed and augmented by forensic case data. The time needed to obtain forensic intelligence, relative to the type of marks treated, is seen as a critical issue. Furthermore, the underlying process of integrating forensic intelligence into the crime intelligence database raised several difficulties regarding data acquisition and the models used in the forensic databases. The solutions found and the operational procedures adopted are described and discussed. This process forms the basis for many other research efforts aimed at developing forensic intelligence models.

A basic prerequisite for in vivo X-ray imaging of the lung is the exact determination of radiation dose. Achieving resolutions of the order of micrometres may become particularly challenging owing to increased dose, which in the worst case can be lethal for the imaged animal model. A framework for linking image quality to radiation dose in order to optimize experimental parameters with respect to dose reduction is presented. The approach may find application for current and future in vivo studies to facilitate proper experiment planning and radiation risk assessment on the one hand and exploit imaging capabilities on the other.

Purpose: To evaluate the suitability of an improved version of an automatic segmentation method based on geodesic active regions (GAR) for segmenting cerebral vasculature with aneurysms from 3D X-ray reconstruction angiography (3DRA) and time-of-flight magnetic resonance angiography (TOF-MRA) images available in the clinical routine. Methods: Three aspects of the GAR method have been improved: execution time, robustness to variability in imaging protocols, and robustness to variability in image spatial resolutions. The improved GAR was retrospectively evaluated on images from patients with intracranial aneurysms in the area of the Circle of Willis, imaged with two modalities: 3DRA and TOF-MRA. Images were obtained from two clinical centers, each using different imaging equipment. Evaluation included qualitative and quantitative analyses of the segmentation results on 20 images from 10 patients. The gold standard was built from 660 cross-sections (33 per image) of vessels and aneurysms, manually measured by interventional neuroradiologists. GAR was also compared to an interactive segmentation method: iso-intensity surface extraction (ISE). In addition, since patients had been imaged with both modalities, we performed an inter-modality agreement analysis with respect to both the manual measurements and each of the two segmentation methods. Results: Both GAR and ISE differed from the gold standard within acceptable limits compared to the imaging resolution. GAR (ISE, respectively) had an average accuracy of 0.20 (0.24) mm for 3DRA and 0.27 (0.30) mm for TOF-MRA, and a repeatability of 0.05 (0.20) mm. Compared to ISE, GAR had a lower qualitative error in the vessel region and a lower quantitative error in the aneurysm region. The repeatability of GAR was superior to that of manual measurements and ISE. The inter-modality agreement was similar between GAR and the manual measurements.
Conclusions: The improved GAR method outperformed ISE both qualitatively and quantitatively and is suitable for segmenting 3DRA and TOF-MRA images from the clinical routine.

Around 11.5 × 10^6 m3 of rock detached from the eastern slope of the Santa Cruz valley (San Juan province, Argentina) in the first fortnight of January 2005. The rockslide-debris avalanche blocked the river course, resulting in the development of a lake with a maximum length of around 3.5 km. The increase in the inflow rate, from 47,000-74,000 m3/day between April and October to 304,000 m3/day between late October and the first fortnight of November, accelerated the growth of the lake. On 12 November 2005 the dam failed, releasing 24.6 × 10^6 m3 of water. The resulting outburst flood caused damage mainly to infrastructure, and affected the facilities of a hydropower dam under construction 250 km downstream from the source area. In this work we describe the causes and consequences of the natural dam formation and failure, and we dynamically model the 2005 rockslide-debris avalanche with DAN3D. Additionally, since a volume of about 24 × 10^6 m3 of rock still remains unstable on the slope, we use the results of the back analysis to forecast the formation of a future natural dam. We analyzed two potential scenarios: a partial slope failure of 6.5 × 10^6 m3, and a worst case in which all the unstable volume remaining on the slope fails. The spreading of those potential events shows that a new blockage of the Santa Cruz River is likely to occur. According to their modeled morphometry and the contributing watershed upstream of the blockage area, the resulting dams, like the 2005 one, would also be unstable. This study shows the importance of back and forward analyses, which can be carried out to obtain critical information for land-use planning, hazard mitigation, and emergency management.