974 results for Control-flow Analysis
Abstract:
Information security devices must preserve security properties even in the presence of faults. This in turn requires a rigorous evaluation of the system behaviours resulting from component failures, especially how such failures affect information flow. We introduce a compositional method of static analysis for fail-secure behaviour. Our method uses reachability matrices to identify potentially undesirable information flows based on the fault modes of the system's components.
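The abstract does not give the paper's construction, but the core idea of checking information flow with reachability matrices can be sketched as follows. This is a minimal, hypothetical model (the component names, the single "sanitizer" policy, and the fault mode are illustrative assumptions, not the paper's method): components form a directed flow graph, Warshall's algorithm yields the reachability matrix, and a fault is modelled as an extra edge that may create a forbidden flow.

```python
def transitive_closure(adj):
    """Boolean reachability matrix via Warshall's algorithm."""
    n = len(adj)
    reach = [row[:] for row in adj]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    return reach

def leaks(adj, source, sink, sanitizer):
    """True if `source` reaches `sink` along a path avoiding `sanitizer`."""
    n = len(adj)
    # Drop every edge touching the sanitizing component, then check reachability.
    pruned = [[0 if sanitizer in (i, j) else adj[i][j] for j in range(n)]
              for i in range(n)]
    return bool(transitive_closure(pruned)[source][sink])

# Illustrative system: 0 = secret source, 1 = encryption unit, 2 = external output.
normal = [[0, 1, 0],
          [0, 0, 1],
          [0, 0, 0]]
faulty = [row[:] for row in normal]
faulty[0][2] = 1   # assumed fault mode: the encryption unit is bypassed

print(leaks(normal, 0, 2, sanitizer=1))  # False: all flow passes encryption
print(leaks(faulty, 0, 2, sanitizer=1))  # True: undesirable direct flow
```

Composition then amounts to combining the matrices of subsystems before taking the closure, which is what makes the analysis scale over component fault modes.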
Abstract:
The article proposes a model for managing information about program flow analysis for conducting computer experiments with program transformations. It considers the architecture and context of the flow-analysis subsystem within the framework of the Specialized Knowledge Bank on Program Transformations and describes the language for representing flow-analysis methods in the knowledge bank.
Abstract:
Graphite is a mineral commodity used as the anode material in lithium-ion batteries (LIBs), and its global demand is expected to increase significantly in the future due to the forecast growth of the global electric-vehicle market. The graphite currently used to produce LIBs is a mix of synthetic and natural graphite: the former is produced by the crystallization of petroleum by-products, while the latter comes from mining, which raises concerns related to pollution, social acceptance, and health. This MSc work determines the compositional and textural characteristics of natural, synthetic, and recycled graphite using SEM-EDS, XRF, XRD, and TEM analytical techniques, and couples these data with dynamic Material Flow Analysis (MFA) models that predict the future global use of graphite, in order to test the hypothesis that natural graphite will no longer be used in the global LIB market. The mineral analyses reveal that the synthetic graphite samples contain fewer impurities than the natural graphite, which has a rolled internal structure similar to that of the recycled material. Recycled graphite, however, shows fractures and discontinuities of the graphene layers caused by the recycling process, although its rolled internal structure can help Li-ion migration through the fractures. Three dynamic MFA studies were conducted to test distinct scenarios that include graphite recycling over the period 2022-2050; irrespective of the scenario considered, synthetic graphite demand will increase, driven by the limited stock of battery scrap available. Hence, I conclude that both natural and recycled graphite will remain in use in the LIB market at least until 2050, when the stock of recycled graphite production should be sufficient to supersede natural graphite. In addition, improvements in the dismantling and recycling processes are necessary to improve the quality of recycled graphite.
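The mechanics of a dynamic MFA can be illustrated with a toy stock-and-flow loop. All figures below (battery lifetime, recovery rate, demand growth) are invented placeholders, not the thesis data; the point is only the structure: recycled supply in a given year is a delayed, lossy echo of demand from one battery lifetime earlier, which is why it lags total demand for decades.

```python
LIFETIME = 10        # assumed battery service life in years (illustrative)
RECYCLE_RATE = 0.5   # assumed fraction of retired graphite recovered (illustrative)

def simulate(first=2022, last=2050, demand0=1.0, growth=0.08):
    """Toy dynamic MFA: demand grows geometrically; batteries retire after
    LIFETIME years and a fraction of their graphite returns as recycled supply."""
    demand, recycled = {}, {}
    for i, year in enumerate(range(first, last + 1)):
        demand[year] = demand0 * (1 + growth) ** i
        retiring = demand.get(year - LIFETIME, 0.0)  # inflow LIFETIME years ago
        recycled[year] = RECYCLE_RATE * retiring
    return demand, recycled

demand, recycled = simulate()
# Early on no scrap exists, so primary (natural + synthetic) graphite must
# cover all demand; even by 2050 recycled supply covers only part of it.
print(recycled[2022], recycled[2050] / demand[2050])
```

Under these placeholder parameters the recycled share never exceeds `RECYCLE_RATE / (1 + growth)**LIFETIME`, which mirrors the abstract's conclusion that primary graphite stays in the market through 2050.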
Abstract:
With the development of new technologies, Air Traffic Control in the vicinity of the airport has shifted from purely visual control to the use of radar, sensors, and related systems. As industry moves toward the so-called Industry 4.0, new tools could likewise be introduced in this domain to facilitate the work of Air Traffic Controllers. The European Union launched an innovative project to support the digitalization of the European sky by means of the Single European Sky ATM Research (SESAR) programme, the foundation on which the Single European Sky (SES) is based, in order to improve existing technologies and transform Air Traffic Management in Europe. Within this framework, the Resilient Synthetic Vision for Advanced Control Tower Air Navigation Service Provision (RETINA) project, started in 2016, studied the possibility of applying new tools within the conventional control tower to reduce air traffic controller workload, exploiting advances in augmented-reality technologies. After the validation of RETINA, the Digital Technologies for Tower (DTT) project was established, and the solution proposed by the University of Bologna aimed, among other things, to introduce Safety Nets in a Head-Up visualization. The aim of this thesis is to analyze the Safety Nets in use within the control tower and, by developing a working concept, to implement them in a Head-Up view to be tested by Air Traffic Control Operators (ATCOs). The results of the technical test show that the concept works and could lead to implementation in a real environment, as it improves air traffic controllers' working conditions even when low-visibility conditions apply.
Abstract:
Mainstream programming languages provide built-in exception handling mechanisms to support robust and maintainable implementation of exception handling in software systems. Most of these modern languages, such as C#, Ruby, Python and many others, are often claimed to have more appropriate exception handling mechanisms: they reduce programming constraints on exception handling to favor agile changes in the source code. These languages provide what we call maintenance-driven exception handling mechanisms. The adoption of these mechanisms is expected to improve software maintainability without hindering software robustness. However, there is still little empirical knowledge about the impact that adopting these mechanisms has on software robustness. This work addresses this gap by conducting an empirical study aimed at understanding the relationship between changes in C# programs and their robustness. In particular, we evaluated how changes in the normal and exceptional code were related to exception handling faults. We applied a change impact analysis and a control flow analysis to 100 versions of 16 C# programs. The results showed that: (i) most of the problems hindering software robustness in those programs are caused by changes in the normal code; (ii) many potential faults were introduced even when improving exception handling in C# code; and (iii) faults are often facilitated by the maintenance-driven flexibility of the exception handling mechanism. Moreover, we present a series of change scenarios that decrease program robustness.
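The study above analyzed C# programs; as a language-neutral illustration of the kind of control-flow check involved, here is a minimal sketch using Python's standard `ast` module (an assumption for demonstration, not the study's tooling) that flags handlers which catch everything, a typical fault enabled by flexible, maintenance-driven exception handling.

```python
import ast

def bare_handlers(source):
    """Return line numbers of handlers that catch everything
    (`except:` or `except Exception:`), a common robustness fault."""
    flagged = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.ExceptHandler):
            t = node.type
            if t is None or (isinstance(t, ast.Name) and t.id == "Exception"):
                flagged.append(node.lineno)
    return flagged

sample = """
try:
    risky()
except:          # swallows every exception, including programming errors
    pass
"""
print(bare_handlers(sample))  # [4]
```

Running such a check across successive versions of a program is a simple way to observe how maintenance changes in normal code silently widen the reach of existing handlers.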
Abstract:
We provide an abstract command language for real-time programs and outline how a partial correctness semantics can be used to compute execution times. The notions of a timed command, refinement of a timed command, the command traversal condition, and the worst-case and best-case execution time of a command are formally introduced and investigated with the help of an underlying weakest liberal precondition semantics. The central result is a theory for the computation of worst-case and best-case execution times from the underlying semantics based on supremum and infimum calculations. The framework is applied to the analysis of a message transmitter program and its implementation. (c) 2005 Elsevier B.V. All rights reserved.
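The sup/inf structure of the execution-time calculation can be sketched concretely. This is not the paper's weakest-liberal-precondition semantics, only an assumed miniature command language (the `basic`/`seq`/`choice` constructors are invented for illustration): sequencing adds times, while choice takes the supremum for the worst case and the infimum for the best case.

```python
def wcet(cmd):
    """Worst-case execution time: sequencing adds, choice takes the max."""
    kind = cmd[0]
    if kind == "basic":                  # ("basic", time)
        return cmd[1]
    if kind == "seq":                    # ("seq", c1, c2)
        return wcet(cmd[1]) + wcet(cmd[2])
    if kind == "choice":                 # ("choice", c1, c2), e.g. if/else
        return max(wcet(cmd[1]), wcet(cmd[2]))
    raise ValueError(f"unknown command {kind!r}")

def bcet(cmd):
    """Best-case execution time: sequencing adds, choice takes the min."""
    kind = cmd[0]
    if kind == "basic":
        return cmd[1]
    if kind == "seq":
        return bcet(cmd[1]) + bcet(cmd[2])
    if kind == "choice":
        return min(bcet(cmd[1]), bcet(cmd[2]))
    raise ValueError(f"unknown command {kind!r}")

# A transmitter-like fragment: fixed setup, then a fast or a slow send path.
prog = ("seq", ("basic", 2), ("choice", ("basic", 3), ("basic", 7)))
print(wcet(prog), bcet(prog))  # 9 5
```

The traversal condition in the paper plays the role that branch selection plays here: it determines which paths are feasible and hence which times enter the supremum and infimum.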
Abstract:
A flow system designed with solenoid micro-pumps is proposed for fast and greener spectrophotometric determination of free glycerol in biodiesel. Glycerol was extracted from samples without using organic solvents. The determination involves glycerol oxidation by periodate, yielding formaldehyde, followed by formation of the colored product (3,5-diacetyl-1,4-dihydrolutidine) upon reaction with acetylacetone. The coefficient of variation, sampling rate, and detection limit were estimated as 1.5% (20.0 mg L⁻¹ glycerol, n = 10), 34 h⁻¹, and 1.0 mg L⁻¹ (99.7% confidence level), respectively. A linear response was observed from 5 to 50 mg L⁻¹, with reagent consumption estimated as 345 µg of KIO₄ and 15 mg of acetylacetone per determination. The procedure was successfully applied to the analysis of biodiesel samples, and the results agreed with the batch reference method at the 95% confidence level. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
The exploitation of aqueous biphasic extraction is proposed for the first time in flow analysis. This extraction strategy stands out as environmentally attractive since it is based on the use of two immiscible phases that are intrinsically aqueous: the organic solvents of traditional liquid-liquid extraction are no longer used, being replaced by non-toxic, non-flammable, and non-volatile ones. A single interface flow analysis (SIFA) system was implemented to carry out the extraction process owing to its favourable operational characteristics, which include a high automation level and simplicity of operation, the establishment of a dynamic interface where mass transfer occurs between the two immiscible aqueous phases, and versatile control over the extraction process, namely the extraction time. The application selected to demonstrate the feasibility of SIFA for aqueous biphasic extraction was the pre-concentration of lead. After extraction, lead reacted with 8-hydroxyquinoline-5-sulfonic acid and the resulting product was determined by a fluorimetric detector included in the flow manifold. The SIFA single interface was therefore used both as the extraction (enrichment) and the reaction interface. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
In this work a downscaled multicommuted flow injection analysis setup for photometric determination is described. The setup consists of a flow system module and a LED-based photometer, with a total internal volume of about 170 µL. The system was tested by developing an analytical procedure for the photometric determination of iodate in table salt using N,N-diethyl-p-phenylenediamine (DPD) as the chromogenic reagent. Accuracy was assessed by applying the paired t-test between results obtained using the proposed procedure and a reference method, and no significant difference at the 95% confidence level was observed. Other profitable features were also achieved: a low reagent consumption of 7.3 µg DPD per determination; a linear response ranging from 0.1 up to 3.0 mg L⁻¹ IO₃⁻; a relative standard deviation of 0.9% (n = 11) for samples containing 0.5 mg L⁻¹ IO₃⁻; a detection limit of 17 µg L⁻¹ IO₃⁻; a sampling throughput of 117 determinations per hour; and a waste generation of 600 µL per determination. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Intensity Modulated Radiotherapy (IMRT) is a technique introduced to shape dose distributions more precisely to the tumour, providing higher dose escalation in the volume to be irradiated while simultaneously decreasing the dose to the organs at risk, which consequently reduces treatment toxicity. This technique is widely used in prostate and head and neck (H&N) tumours. Given the complexity of the technique and the high doses involved, it is necessary to ensure safe and secure administration of the treatment through quality control programmes for IMRT. The purpose of this study was to statistically evaluate the quality control measurements made for IMRT plans of prostate and H&N patients before the beginning of treatment, analysing their variations, the percentage of rejected and repeated measurements, the means, standard deviations, and proportional relations.