Abstract:
The continuous improvement of Ethernet technologies is driving interest in extending their use to factory-floor distributed real-time applications. Indeed, a considerable amount of research work has been devoted to the timing analysis of Ethernet-based technologies in the past few years. Most of those works, however, are restricted to the analysis of subsets of the overall computing and communication system, and thus do not address timeliness in a holistic fashion. To this end, we present a simulation-based approach for extracting the temporal properties of commercial off-the-shelf (COTS) Ethernet-based factory-floor distributed systems. The framework is applied to a specific COTS technology, Ethernet/IP. We reason about the modeling and simulation of Ethernet/IP-based systems, and about the use of statistical analysis techniques to provide useful results on timeliness. The approach is part of a wider framework developed within the research project INDEPTH (INDustrial-Ethernet ProTocols under Holistic analysis).
Abstract:
Coronary artery disease (CAD) is currently one of the most prevalent diseases in the world population, and calcium deposits in the coronary arteries are a direct risk factor. These can be assessed by the calcium score (CS) application, available on a computed tomography (CT) scanner, which gives an accurate indication of the development of the disease. However, the ionising radiation applied to patients is high. This study aimed to optimise the acquisition protocol in order to reduce the radiation dose, and to explain the flow of procedures used to quantify CAD. The main differences in the clinical results when automated or semi-automated post-processing is used are shown, and the epidemiology, imaging, risk factors and prognosis of the disease are described. The software steps and the values that allow the risk of developing CAD to be predicted are presented. A 64-row multidetector dual-source CT scanner and two phantoms (pig hearts) were used to demonstrate the advantages and disadvantages of the Agatston method. The tube energy was balanced, and two measurements were obtained in each of the three experimental protocols (64, 128 and 256 mAs). Considerable changes appeared between the CS values as the protocol varied. The predefined standard protocol provided the lowest radiation dose (0.43 mGy). This study found that the variation in radiation dose between protocols, taking into consideration the dose-control systems attached to the CT equipment and image quality, was not sufficient to justify changing the default protocol provided by the manufacturer.
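The Agatston method referenced in this abstract scores each calcified lesion by its area times a density weight derived from its peak attenuation. A minimal sketch of that rule, with purely illustrative lesion values (not data from the study):

```python
# Hedged sketch of Agatston calcium scoring: each lesion above the 130 HU
# threshold contributes area (mm^2) x a density weight based on peak HU.
# Lesion values below are illustrative, not from the study.

def agatston_weight(peak_hu: float) -> int:
    """Density weighting factor for one calcified lesion (threshold 130 HU)."""
    if peak_hu < 130:
        return 0
    if peak_hu < 200:
        return 1
    if peak_hu < 300:
        return 2
    if peak_hu < 400:
        return 3
    return 4

def agatston_score(lesions: list[tuple[float, float]]) -> float:
    """Total score: sum of area * density weight over all (area, peak_hu) lesions."""
    return sum(area * agatston_weight(hu) for area, hu in lesions)

# Three hypothetical lesions: (area mm^2, peak HU)
lesions = [(4.0, 150.0), (10.0, 250.0), (2.5, 420.0)]
print(agatston_score(lesions))  # 4*1 + 10*2 + 2.5*4 = 34.0
```

Because the weight jumps at fixed HU thresholds, small changes in tube current (mAs) can shift a lesion's peak HU across a boundary, which is one reason CS values vary between acquisition protocols.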
Abstract:
OBJECTIVE: To assess the health risk of exposure to benzene for a community affected by a fuel leak. METHODS: Data regarding the fuel leak accident, which occurred in Brasília, Federal District, were obtained from the Fuel Distributor reports provided to the environmental authority. Information about the affected population (22 individuals) was obtained from focal groups of eight individuals. Length of exposure and benzene concentration in water were estimated through a groundwater flow model associated with a benzene propagation model. The risk assessment was conducted according to the Agency for Toxic Substances and Disease Registry methodology. RESULTS: A high risk perception related to the health consequences of the accident was evident in the affected community (22 individuals), probably due to the lack of assistance and poor risk communication from government authorities and the polluting agent. The community had been exposed to unsafe levels of benzene (> 5 µg/L) since December 2001, five months before they reported the leak. The mean benzene level in drinking water (72.2 µg/L) was higher than that obtained by the Fuel Distributor using the Risk-Based Corrective Action methodology (17.2 µg/L). The estimated benzene intake from the consumption of water and food reached a maximum of 0.0091 µg/kg bw/day (a cancer risk of 5 × 10⁻⁷). The level of benzene in water vapor while showering reached 7.5 µg/m³ for children (a cancer risk of 1 per 10⁴). Total cancer risk ranged from 110 to 200 per 10⁶ individuals. CONCLUSIONS: The population affected by the fuel leak was exposed to benzene levels that might have represented a health risk. Local government authorities need to develop better strategies to respond rapidly to these types of accidents in order to protect the health of the affected population and the environment.
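The cancer risk figures above follow the standard risk-assessment arithmetic: lifetime risk is the chronic daily intake multiplied by a cancer slope factor. A small sketch of that calculation; the slope factor used here (0.055 per mg/kg/day, the upper end of the commonly cited range for oral benzene exposure) is an assumption for demonstration, not necessarily the study's exact input:

```python
# Illustrative cancer-risk arithmetic (risk = intake x slope factor).
# The slope factor is an assumed value for demonstration purposes.

def lifetime_cancer_risk(intake_ug_per_kg_day: float,
                         slope_factor_per_mg_kg_day: float) -> float:
    """Lifetime excess cancer risk: chronic daily intake (converted to
    mg/kg/day) multiplied by the cancer slope factor ((mg/kg/day)^-1)."""
    intake_mg = intake_ug_per_kg_day / 1000.0
    return intake_mg * slope_factor_per_mg_kg_day

# The abstract's maximum ingestion intake, 0.0091 ug/kg bw/day, with an
# assumed slope factor of 0.055 (mg/kg/day)^-1:
risk = lifetime_cancer_risk(0.0091, 0.055)
print(f"{risk:.1e}")  # ~5.0e-07, consistent with the 5 x 10^-7 in the abstract
```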
Abstract:
The present work studies the feasibility of deploying a farm of sea-current turbines for electricity generation in Portugal. An overview of tides is given: what they are, how they are formed, and how they are predicted. The energy of sea currents is then studied, and existing ocean-current technology is presented. A model of tidal height and current velocity is also developed. The energy produced by a hypothetical park built in Sines (Portugal) is calculated; afterwards, an economic assessment is performed for two possible scenarios, together with a sensitivity analysis of NPV (Net Present Value) and LCOE (Levelized Cost of Energy). Conclusions about the feasibility of the projects are also presented. Despite being attractive due to its predictability, this energy source is not yet economically viable, as it is at an early stage of development. To encourage investment in this technology, a feed-in tariff of at least €200/MWh should be considered.
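The two economic indicators in this assessment have standard definitions: NPV discounts the project's cash flows to year zero, and LCOE is the ratio of discounted lifetime costs to discounted lifetime energy output. A minimal sketch with hypothetical figures (the cash flows, discount rate and yields below are not the study's data):

```python
# Minimal NPV / LCOE sketch; all numbers are hypothetical toy values.

def npv(rate: float, cash_flows: list[float]) -> float:
    """Net Present Value; cash_flows[0] is the year-0 (investment) flow."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def lcoe(rate: float, costs: list[float], energy_mwh: list[float]) -> float:
    """Levelised Cost of Energy: discounted total costs divided by
    discounted total energy produced (result in cost-units per MWh)."""
    disc_cost = sum(c / (1.0 + rate) ** t for t, c in enumerate(costs))
    disc_energy = sum(e / (1.0 + rate) ** t for t, e in enumerate(energy_mwh))
    return disc_cost / disc_energy

# Toy 3-year project: 1 MEUR investment, then 400 kEUR net revenue per year.
print(npv(0.08, [-1_000_000, 400_000, 400_000, 400_000]))
# LCOE of the same plant: costs vs. an annual output of 2 000 MWh.
print(lcoe(0.08, [1_000_000, 50_000, 50_000, 50_000], [0, 2_000, 2_000, 2_000]))
```

The €200/MWh tariff threshold can be read directly against LCOE: the project only clears NPV > 0 when the tariff exceeds its levelised cost.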
Abstract:
The definition and programming of distributed applications has become a major research issue due to the increasing availability of (large-scale) distributed platforms and the requirements posed by economic globalization. However, such a task requires a huge effort due to the complexity of distributed environments: a large number of users may communicate and share information across different authority domains; moreover, the “execution environment” or “computations” are dynamic, since the number of users and the computational infrastructure change over time. Grid environments, in particular, promise to be an answer to such complexity, by providing high-performance execution support to a large number of users, and resource sharing across different organizations. Nevertheless, programming in Grid environments is still a difficult task: there is a lack of high-level programming paradigms and support tools that may guide the application developer and allow reusability of state-of-the-art solutions. Specifically, the main goal of the work presented in this thesis is to contribute to the simplification of the development cycle of applications for Grid environments by bringing structure and flexibility to three stages of that cycle through a common model. The stages are: the design phase, the execution phase, and the reconfiguration phase. The common model is based on the manipulation of patterns through pattern operators, and on the division of both patterns and operators into two categories, namely structural and behavioural. Moreover, both structural and behavioural patterns are first-class entities at each of the aforesaid stages. At the design phase, patterns can be manipulated like other first-class entities such as components. This allows a more structured way to build applications by reusing and composing state-of-the-art patterns.
At the execution phase, patterns are units of execution control: it is possible, for example, to start, stop and resume the execution of a pattern as a single entity. At the reconfiguration phase, patterns can also be manipulated as single entities, with the additional advantage that it is possible to perform a structural reconfiguration while keeping some of the behavioural constraints, and vice versa. For example, it is possible to replace a behavioural pattern, which was applied to some structural pattern, with another behavioural pattern. Besides proposing the methodology for distributed application development sketched above, this thesis defines a relevant set of pattern operators. The methodology and the expressivity of the pattern operators were assessed through the development of several representative distributed applications. To support this validation, a prototype was designed and implemented, encompassing some relevant patterns and a significant part of the pattern operators defined. This prototype was based on the Triana environment; Triana supports the development and deployment of distributed applications in the Grid through a dataflow-based programming model. Additionally, this thesis presents the analysis of a mapping of some operators for execution control onto the Distributed Resource Management Application API (DRMAA). This assessment confirmed the suitability of the proposed model, as well as the generality and flexibility of the defined pattern operators.
Abstract:
This paper studies the information content of the chromosomes of 24 species. In a first phase, a scheme inspired by dynamical-system state-space representation is developed: for each chromosome, the state-space dynamical evolution is projected onto a two-dimensional chart. The plots are then analyzed and characterized from the perspective of fractal dimension. This information is integrated into two measures of each species' complexity, addressing its average and its variability. The results are in close accordance with phylogenetics, pointing to quantitative aspects of the species' genomic complexity.
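The fractal characterization of a two-dimensional chart is typically done by box counting: cover the plot with grids of decreasing cell size and fit the slope of log(occupied cells) against log(grid resolution). A hedged sketch of that estimator on synthetic points (the abstract does not specify its exact dimension estimator, and no genomic data is used here):

```python
# Hedged sketch of box-counting fractal-dimension estimation for a set of
# 2-D points in the unit square. The point set is synthetic, not genomic data.
import numpy as np

def box_counting_dimension(points: np.ndarray, sizes=(2, 4, 8, 16, 32)) -> float:
    """Estimate the fractal dimension as the slope of log N(n) vs. log n,
    where N(n) is the number of occupied cells in an n x n grid."""
    counts = []
    for n in sizes:
        # Assign each point to a grid cell and count distinct occupied cells.
        cells = np.clip(np.floor(points * n).astype(int), 0, n - 1)
        counts.append(len({tuple(c) for c in cells}))
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return slope

rng = np.random.default_rng(0)
pts = rng.random((20_000, 2))                   # uniformly filled unit square
print(round(box_counting_dimension(pts), 2))    # close to 2 for a filled plane
```

Points spread over the whole chart give a dimension near 2, points on a curve near 1; intermediate values are what distinguish the species-specific charts.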
Abstract:
Compositional schedulability analysis of hierarchical real-time systems is a well-studied problem. Various techniques have been developed to abstract the resource requirements of components in such systems, and schedulability has been addressed using these abstract representations (also called component interfaces). These approaches to compositional analysis incur resource overheads when they abstract components into interfaces. In this talk, we define notions of resource schedulability and optimality for component interfaces, and compare various approaches.
Abstract:
Existing work in the context of energy management for real-time systems often ignores the substantial cost, in terms of time and energy, of making DVFS and sleep-state decisions, or assumes very simple models. In this paper we explore the parameter space for such decisions and the possible constraints faced.
Abstract:
The diaphragm is the principal inspiratory muscle. Different techniques have been used to assess diaphragm motion. Among them, M-mode ultrasound has gained particular interest, since it is non-invasive and accessible. However, it is operator-dependent, and no objective acquisition protocol has been established. Purpose: to establish a reliable method for the assessment of diaphragmatic motion via M-mode ultrasound.
Abstract:
Environment monitoring plays an important role in occupational exposure assessment. However, due to several factors, it is done with insufficient frequency and normally does not provide the information needed to choose the most adequate safety measures to avoid or control exposure. Identifying all the tasks performed in each workplace and conducting a task-based exposure assessment help to refine the exposure characterization and reduce assessment errors. A task-based assessment can also provide a better evaluation of exposure variability than assessing personal exposures through continuous 8-hour time-weighted average measurements. Health effects related to particle exposure have mainly been investigated with mass-measuring instruments or gravimetric analysis. More recently, however, some studies have suggested that size distribution and particle number concentration may have advantages over particle mass concentration for assessing the health effects of airborne particles. Several exposure assessments were performed in different occupational settings (a bakery, a grill house, the cork industry and a horse stable), applying these two resources: task-based exposure assessment and particle number concentration by size. The task-based approach made it possible to identify the tasks with the highest exposure to the smallest particles (0.3 μm) in each occupational setting. The data obtained allow a more concrete and effective risk assessment and the identification of priorities for safety investments.
Abstract:
Alzheimer's Disease (AD) is characterized by progressive cognitive decline and dementia. Earlier diagnosis and classification of the different stages of the disease are currently the main challenges and can be assessed by neuroimaging. With this work we aim to evaluate the quality of brain regions and neuroimaging metrics as biomarkers of AD. Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox functionalities were used to study AD with T1-weighted imaging, Diffusion Tensor Imaging and 18F-AV45 PET, using data obtained from the AD Neuroimaging Initiative database: 12 healthy controls (CTRL) and 33 patients with early mild cognitive impairment (EMCI), late MCI (LMCI) and AD (11 patients/group). The metrics evaluated were gray-matter volume (GMV), cortical thickness (CThk), mean diffusivity (MD), fractional anisotropy (FA), fiber count (FiberConn), node degree (Deg), cluster coefficient (ClusC) and relative standard uptake values (rSUV). Receiver Operating Characteristic (ROC) curves were used to evaluate and compare the diagnostic accuracy of the most significant metrics and brain regions, expressed as the area under the curve (AUC). Comparisons were performed between groups. The RH-Accumbens/Deg demonstrated the highest AUC when differentiating CTRL-EMCI (82%), whereas rSUV showed the highest AUC in several brain regions when distinguishing CTRL-LMCI (99%). Regarding CTRL-AD, the highest AUCs were found with LH-STG/FiberConn and RH-FP/FiberConn (~100%). A larger number of neuroimaging metrics related to cortical atrophy with AUC > 70% was found for CTRL-AD in both hemispheres, while in earlier stages cortical metrics appeared in more confined areas of the temporal region, mainly in the LH, indicating the increasing spread of cortical atrophy that is characteristic of disease progression.
In CTRL-EMCI, several brain regions and neuroimaging metrics presented AUC > 70%, with worse results in later stages, suggesting these indicators as biomarkers for an earlier stage of MCI, although further research is necessary.
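The AUC values reported here have a simple probabilistic reading: the AUC of a metric equals the probability that a randomly chosen patient scores higher on that metric than a randomly chosen control (the Mann-Whitney U formulation). A minimal sketch with made-up values, not ADNI data:

```python
# Minimal ROC-AUC computation via the Mann-Whitney U statistic.
# The group values below are hypothetical, not data from the study.

def roc_auc(controls: list[float], patients: list[float]) -> float:
    """AUC = P(random patient value > random control value), counting
    ties as 0.5, over all patient/control pairs."""
    wins = 0.0
    for p in patients:
        for c in controls:
            if p > c:
                wins += 1.0
            elif p == c:
                wins += 0.5
    return wins / (len(patients) * len(controls))

# Hypothetical node-degree (Deg) values for two groups:
ctrl = [10, 12, 11, 13]
emci = [14, 12, 15, 16]
print(roc_auc(ctrl, emci))  # 0.90625
```

An AUC of 0.5 means the metric does not separate the groups at all; the ~100% values for FiberConn in CTRL-AD mean the two groups' values barely overlap.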
Abstract:
Fungi are essential to the survival of our global ecology, but they can pose a significant threat to the health of occupants when they grow in our buildings. Exposure to fungi in homes is a significant risk factor for a number of respiratory symptoms. Well-known illnesses caused by fungi include allergy and hypersensitivity pneumonitis. Environmental monitoring for fungi and their disease agents is an important aspect of exposure assessment, but few guidelines exist for interpreting their health impacts. This book answers the questions: How does one detect and measure the presence of indoor fungi? What is an acceptable level of indoor fungi? How do we relate this information to human health problems?
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Mestre in Engenharia Informática.
Abstract:
The work agenda includes the production of a report on different doctoral programmes in “Technology Assessment” in Europe, the US and Japan, in order to analyse collaborative post-graduation activities. Finally, proposals for a collaborative post-graduation programme between FCTUNL and ITAS-FZK will be developed through an ongoing discussion process with colleagues from ITAS.