990 results for Last planner system


Relevance:

30.00%

Publisher:

Abstract:

The research activity carried out during the PhD course in Electrical Engineering belongs to the branch of electric and electronic measurements. The main subject of this thesis is a distributed measurement system to be installed in medium-voltage power networks, together with the method developed to analyze the data acquired by the measurement system and to monitor power quality. Chapter 2 illustrates the growing interest in power quality in electrical systems, reporting the international research activity on the problem and the relevant standards and guidelines that have been issued. The quality of the voltage provided by utilities, and influenced by customers at the various points of a network, emerged as an issue only in recent years, in particular as a consequence of energy market liberalization. Traditionally, the quality of the delivered energy was associated mostly with its continuity, so reliability was the main characteristic to be ensured in power systems. Nowadays, the number and duration of interruptions are the "quality indicators" most commonly perceived by customers; for this reason, a short section is also dedicated to network reliability and its regulation. In this context it should be noted that, although the measurement system developed during the research activity belongs to the field of power quality evaluation systems, the information registered in real time by its remote stations can also be used to improve system reliability. Given the vast range of power quality degrading phenomena that can occur in distribution networks, the study focuses on electromagnetic transients affecting line voltages.
The outcome of this study is the design and realization of a distributed measurement system that continuously monitors the phase signals at different points of a network, detects the occurrence of transients superposed on the fundamental steady-state component, and registers the time of occurrence of such events. The data set is then used to locate the source of the transient disturbance propagating along the network lines. Most of the oscillatory transients affecting line voltages are due to faults occurring somewhere in the distribution system, and they are observed before the protection equipment intervenes. An important conclusion is that the method can improve the reliability of the monitored network, since knowing the location of a fault allows the energy manager to minimize both the area of the network to be disconnected for protection purposes and the time spent by technical staff to recover from the abnormal condition and/or the damage. The part of the thesis presenting the results of this study is structured as follows: chapter 3 deals with the propagation of electromagnetic transients in power systems, defining the characteristics and causes of the phenomena and briefly reporting the theory and approaches used to study transient propagation. The state of the art on methods to detect and locate faults in distribution networks is then presented. Finally, attention is paid to the particular technique adopted for this purpose in the thesis and to the methods developed on that basis. Chapter 4 reports the configuration of the distribution networks to which the fault location method has been applied by means of simulations, together with the results obtained case by case. In this way the performance of the location procedure is tested first under ideal and then under realistic operating conditions.
Chapter 5 presents the measurement system designed to implement the transient detection and fault location method. The hardware of the measurement chain of every acquisition channel in the remote stations is described. The global measurement system is then characterized by considering the non-ideal aspects of each device that contribute to the final combined uncertainty on the estimated position of the fault in the network under test. This parameter is computed, by means of a numerical procedure, according to the Guide to the Expression of Uncertainty in Measurement. The last chapter describes a device designed and realized during the PhD activity to replace the commercial capacitive voltage divider in the conditioning block of the measurement chain. This study aims to provide an alternative to the transducer in use, with equivalent performance at lower cost; the economic impact of the investment in the whole measurement system would thus be significantly reduced, making the method much more feasible to apply.
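The two-end traveling-wave principle behind this kind of fault location can be sketched as follows. The line length, wave speed, fault position and timing jitter below are illustrative assumptions, not values from the thesis; the Monte Carlo loop is only a crude stand-in for the GUM-based uncertainty evaluation described above.

```python
import random

def locate_fault(t_a, t_b, length_km, v_km_per_us):
    """Distance of the fault from end A, given the transient arrival times
    (microseconds) time-tagged at the two line ends:
    t_a - t_b = (2x - L)/v  =>  x = (L + v*(t_a - t_b)) / 2."""
    return (length_km + v_km_per_us * (t_a - t_b)) / 2.0

# Illustrative scenario: 30 km line, wave speed ~0.29 km/us, fault at 10 km.
L, v, x_true = 30.0, 0.29, 10.0
t0 = 5.0                        # unknown fault inception time (us)
t_a = t0 + x_true / v           # arrival at end A
t_b = t0 + (L - x_true) / v     # arrival at end B
x_hat = locate_fault(t_a, t_b, L, v)          # recovers ~10 km

# Crude Monte Carlo propagation of the time-tagging uncertainty:
random.seed(0)
sigma_t = 0.1                   # assumed 100 ns standard jitter per station
samples = [locate_fault(t_a + random.gauss(0.0, sigma_t),
                        t_b + random.gauss(0.0, sigma_t), L, v)
           for _ in range(10000)]
mean_x = sum(samples) / len(samples)
```

The accuracy of the estimate is dominated by the time-tagging uncertainty of the remote stations, which is why the combined uncertainty analysis of chapter 5 matters.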

Abstract:

During the last few years, several methods have been proposed to study and evaluate characteristic properties of the human skin using non-invasive approaches. Mostly, these methods cover aspects related either to dermatology, to analyze skin physiology and evaluate the effectiveness of medical treatments of skin diseases, or to dermocosmetics and cosmetic science, for example to evaluate the effectiveness of anti-aging treatments. For these purposes a routine approach must be followed. Although very accurate, high-resolution measurements can be achieved with conventional methods, such as optical or mechanical profilometry, their use is quite limited, primarily because of the high cost of the required instrumentation, which is in turn usually cumbersome; both are limitations for routine analysis. This thesis investigates the feasibility of a non-invasive skin characterization system based on the analysis of capacitive images of the skin surface. The system relies on a portable CMOS capacitive device that produces a capacitance map of the skin micro-relief with a resolution of 50 microns/pixel. To extract characteristic features of the skin topography, image analysis techniques such as watershed segmentation and wavelet analysis have been used to detect the main structures of interest: the wrinkles and plateaus of the typical micro-relief pattern. To validate the method, the features extracted from a dataset of skin capacitive images acquired during dermatological examinations of a group of healthy volunteers have been compared with the age of the subjects involved, showing good correlation with skin ageing. A detailed comparison of the output of the capacitive sensor with optical profilometry of silicone replicas of the same skin area has revealed the potential and some limitations of this technology.
Applications to follow-up studies, as needed to objectively evaluate the effectiveness of treatments in a routine manner, are also discussed.
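The validation step described above amounts to correlating an extracted topographic feature with subject age. A minimal sketch of that computation, with a hypothetical wrinkle-width feature and invented per-subject values (the real features and dataset are those of the thesis):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-subject values: age (years) vs. a wrinkle-width feature
# (arbitrary units) extracted from the segmented capacitance maps.
ages    = [22, 28, 35, 41, 47, 55, 63, 70]
feature = [0.31, 0.35, 0.33, 0.42, 0.45, 0.52, 0.55, 0.61]
r = pearson_r(ages, feature)    # close to 1 if the feature tracks ageing
```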

Abstract:

The preparation of conformationally hindered molecules and their study by dynamic NMR (DNMR) and computational methods are the core of my thesis. In the first chapter, the conformations and stereodynamics of symmetrically ortho-disubstituted aryl carbinols and aryl ethers are described. In the second chapter, the structures of axially chiral atropisomers of hindered biphenyl carbinols are studied. In the third chapter, the steric barriers and the π-barrier of 1,8-di-arylbiphenylenes are determined. Interesting atropisomers found in the cases of arylanthrones, arylanthraquinones and arylanthracenes are reported in the fourth chapter. By the combined use of dynamic NMR, ECD spectroscopy and DFT computations, the conformations and absolute configurations of 2-naphthyl alkyl sulfoxides are studied in the fifth chapter. In the last chapter, a new synthetic route to α,α′-arylated secondary or tertiary alcohols via lithiated O-benzyl carbamates carrying an N-aryl substituent, together with DFT calculations to determine the cyclic intermediate, is reported. This work was done in the research group of Prof. Jonathan Clayden at the University of Manchester.
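A standard DNMR calculation behind such barrier determinations is the estimate of ΔG‡ from the coalescence temperature of two exchanging signals, via k_c = πΔν/√2 and the Eyring equation. A sketch of the usual approximate formula for uncoupled two-site exchange (the coalescence temperature and shift difference below are illustrative, not values from the thesis):

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def coalescence_barrier_kJ(tc_K, dnu_Hz):
    """Free-energy barrier (kJ/mol) from the coalescence temperature Tc and
    the slow-exchange shift difference dnu, using the common approximation
    dG = R * Tc * (22.96 + ln(Tc / dnu))."""
    return R * tc_K * (22.96 + math.log(tc_K / dnu_Hz)) / 1000.0

# Illustrative: signals 100 Hz apart coalescing at 298 K -> barrier ~60 kJ/mol.
dg = coalescence_barrier_kJ(298.0, 100.0)
```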

Abstract:

In the last decade, the demand for structural health monitoring expertise has increased exponentially in the United States. The aging issues that most transportation structures are experiencing can put in serious jeopardy the economy of a region, or of a country. At the same time, the monitoring of structures is a central topic of discussion in Europe, where the preservation of historical buildings has been addressed over the last four centuries. More recently, various concerns arose about the security performance of civil structures after tragic events such as 9/11 or the 2011 Japan earthquake: engineers look for designs able to resist exceptional loadings due to earthquakes, hurricanes and terrorist attacks. After events of this kind, the assessment of the remaining life of the structure is at least as important as the initial performance design. It thus appears very clear that the introduction of reliable and accessible damage assessment techniques is crucial for localizing issues and for correct and immediate rehabilitation. System identification is a branch of the more general control theory. In civil engineering, this field covers the techniques needed to find mechanical characteristics such as stiffness or mass starting from the signals captured by sensors. The objective of Dynamic Structural Identification (DSI) is to determine, from experimental measurements, the fundamental modal parameters of a generic structure in order to characterize its dynamic behavior via a mathematical model. Knowledge of these parameters is helpful in the model updating procedure, which produces corrected theoretical models through experimental validation. The main aim of this technique is to minimize the differences between the theoretical model results and in-situ measurements of dynamic data.
The updated model therefore becomes a very effective control tool for the rehabilitation of structures or for damage assessment. Instrumenting a whole structure is sometimes unfeasible, because of the high cost involved or because it is not physically possible to reach every point of the structure; numerous scholars have tried to address this problem, generally with one of two methods. In the first case, because of the limited number of sensors, time histories are gathered only at some locations, then the instruments are moved to other locations and the procedure is repeated. Otherwise, if the number of sensors is sufficient and the structure does not have a complicated geometry, it is usually enough to detect only the first principal modes. These two problems are well presented in the works of Balsamo [1], for the application to a simple system, and Jun [2], for the analysis of a system with a limited number of sensors. Once the system identification has been carried out, the actual system characteristics become accessible. A frequent practice is to create an updated FEM model and assess whether or not the structure fulfills the required functions. The objective of this work is to present a general methodology to analyze large structures using limited instrumentation while, at the same time, obtaining the most information possible about the identified structure without recalling methodologies of difficult interpretation. A general framework for the state-space identification procedure via the OKID/ERA algorithms is developed and implemented in Matlab. Some simple examples are then proposed to highlight the principal characteristics and advantages of this methodology. Finally, a new algebraic manipulation for a prolific use of substructuring results is developed and implemented.
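The core of the ERA step can be sketched as follows: a minimal SISO version in Python rather than the Matlab framework of the thesis, with an invented second-order test system. The algorithm stacks the impulse-response (Markov) parameters into Hankel matrices, truncates the SVD at the chosen model order, and realizes a state matrix whose eigenvalues give the modal frequencies and damping.

```python
import numpy as np

def era(markov, order, p=10):
    """Eigensystem Realization Algorithm (SISO sketch): realize a discrete
    state matrix A from Markov parameters h_k = C A^(k-1) B, k >= 1,
    passed as markov = [h_1, h_2, ...]."""
    H0 = np.array([[markov[i + j] for j in range(p)] for i in range(p)])
    H1 = np.array([[markov[i + j + 1] for j in range(p)] for i in range(p)])
    U, s, Vt = np.linalg.svd(H0)
    U, s, Vt = U[:, :order], s[:order], Vt[:order, :]
    S = np.diag(1.0 / np.sqrt(s))
    return S @ U.T @ H1 @ Vt.T @ S      # realized A, up to similarity

# Illustrative oscillatory system with eigenvalues 0.9 +/- 0.1j.
A = np.array([[0.9, 0.1], [-0.1, 0.9]])
B = np.array([[1.0], [0.0]])
C = np.array([[1.0, 0.0]])
h = [(C @ np.linalg.matrix_power(A, k) @ B).item() for k in range(25)]
A_id = era(h, order=2)
# The eigenvalues of A_id match those of A (hence frequencies and damping).
```

In practice the Markov parameters are not given directly but recovered from measured input/output data, which is the role of the OKID stage mentioned above.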

Abstract:

Summary: Leisure activities such as mountain biking, sport climbing, hang-gliding and paragliding, or windsurfing are relatively new phenomena of the last fifteen to twenty-five years. Their development and differentiation are largely confined to societies of the Western industrial type, whose basis is a functionally differentiated social system. Using the observational instrument of recent sociological systems theory, the emergence and ever-advancing differentiation of trend and nature sports is first analyzed and interpreted. It turns out that these new leisure activities, through access to one's own corporeality, contribute to a high degree to the identity formation of individuals in such social systems. From a geographical point of view, the question arises as to the significance of this development for the landscape. Here, landscape is explicitly understood not as something that exists 'in itself', but as something that is 'experienced' or 'grasped', i.e. constructed anew, again and again, in the eye of the beholder. For practitioners of trend and nature sports, the view of the world is structured by their sport, at least whenever it comes to choosing a venue for their particular sporting activity. The natural landscape (or cultural landscape) thus becomes a 'sports landscape'. The concept of 'landscape' then appears, in the sense of Werlen's 'everyday geography-making', as a perspective on the world of objects that must be created anew permanently and is therefore constantly changing.
In the eyes of practitioners of trend and nature sports, every object and every landscape, if you will the whole world, can thus present itself as one single large sports ground. If everything can potentially be a sports venue, the question arises how so-called top spots emerge: those venues that are of outstanding importance within the respective sport, that have climbed, so to speak, the top rung of the career ladder, and that everyone who wants to count for something in the sport-specific scene must have visited. This question is pursued by means of a case study. The study area chosen is already at an advanced stage of development into a top spot, a landscape that already has a 'career as a sports landscape' to show. The peripherally located small US town of Moab in southeastern Utah (ca. 5,000 inhabitants) has developed since the beginning of the 1990s into an international scene meeting point for mountain biking. The most important mountain bike trail (the Slickrock Bike Trail) alone attracts more than 200,000 mountain bike tourists per year. Besides mountain biking, river rafting plays an important role in tourism, and in recent years a small sport climbing scene has also established itself there, which will surely grow in importance in the coming years.

Abstract:

Flicker is a power quality phenomenon consisting of cyclic instability of light intensity resulting from supply voltage fluctuation, which in turn can be caused by disturbances introduced during power generation, transmission or distribution. The standard EN 61000-4-15, recently adopted by the IEEE as IEEE Standard 1453, relies on the analysis of the supply voltage, which is processed according to a suitable model of the lamp – human eye – brain chain. For the lamp, an incandescent 60 W, 230 V, 50 Hz source is assumed. The human eye – brain model is represented by the so-called flicker curve. This curve was determined several years ago by statistically analyzing the results of tests in which people were subjected to flicker with different combinations of magnitude and frequency. The limitations of this standard approach to flicker evaluation are essentially two. First, the provided index of annoyance, Pst, can be related to an actual tiredness of the human visual system only if that particular incandescent lamp is used. Moreover, the implemented response to flicker is "subjective", given that it relies on the answers of people about their feelings. In the last 15 years, many scientific contributions have tackled these issues by investigating the possibility of developing a novel model of the eye–brain response to flicker and overcoming the strict dependence of the standard on the kind of light source. In this light, this thesis presents an important contribution towards a new flickermeter. An improved visual system model using a physiological parameter, namely the mean value of the pupil diameter, is presented, thus allowing a more "objective" representation of the response to flicker. The system used both to generate flicker and to measure the pupil diameter is illustrated, along with the results of several experiments performed on volunteers.
The intent is to demonstrate that the measurement of this geometrical parameter can give reliable information about the response of the human visual system to light flicker.
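For reference, the final block of the standard flickermeter condenses the instantaneous flicker sensation into the short-term severity index Pst through a multipoint percentile formula. A sketch of that last step, with coefficients and smoothed percentiles as given in IEC/EN 61000-4-15 (the rest of the measurement chain, from voltage to instantaneous sensation, is omitted here):

```python
import numpy as np

def pst(s):
    """Short-term flicker severity from a 10-minute record of instantaneous
    flicker sensation samples s. P(x) is the level exceeded x% of the time;
    several percentiles are smoothed by local averaging per the standard."""
    P = lambda x: np.percentile(s, 100.0 - x)
    p0_1 = P(0.1)
    p1s  = (P(0.7) + P(1.0) + P(1.5)) / 3.0
    p3s  = (P(2.2) + P(3.0) + P(4.0)) / 3.0
    p10s = (P(6.0) + P(8.0) + P(10.0) + P(13.0) + P(17.0)) / 5.0
    p50s = (P(30.0) + P(50.0) + P(80.0)) / 3.0
    return float(np.sqrt(0.0314 * p0_1 + 0.0525 * p1s + 0.0657 * p3s
                         + 0.28 * p10s + 0.08 * p50s))

# Sanity check: a constant sensation of 1 perceptibility unit gives
# Pst = sqrt(sum of the coefficients) ~ 0.71.
value = pst(np.ones(60000))
```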

Abstract:

One of the ways in which the legal system has responded to different sets of problems is the blurring of the traditional boundaries of criminal law, both procedural and substantive. This study explores under what conditions this trend improves society's welfare, focusing on two distinguishing sanctions in criminal law: incarceration and social stigma. In analyzing how incarceration affects an individual's incentive to violate a legal standard, we consider the crucial role of the time constraint. This aspect has not been fully explored in the law and economics literature, especially with respect to whether it is beneficial to impose a fine or a prison term. We observe that when individuals are heterogeneous with respect to wealth and wage income, and when the level of activity can be considered a normal good, only the middle-wage and middle-income groups can be adequately deterred by a regime of fixed fines alone. The existing literature considers only the case of the very poor, deemed judgment proof. Since imprisonment is a socially costly way to deprive individuals of their time, alternatives may be sought, such as the imposition of discriminatory monetary fines, partial incapacitation and other alternative sanctions. According to traditional legal theory, criminal law is obeyed mainly not because of the monetary sanctions but because of the stigma arising from the community's moral condemnation that accompanies conviction or mere suspicion. However, it is not sufficiently clear whether social stigma always accompanies a criminal conviction. We address this issue by identifying the circumstances in which a criminal conviction carries an additional social stigma.
Our results show that social stigma accompanies a conviction under two conditions: first, when the law coincides with society's social norms; and second, when the prohibited act provides information on an unobservable attribute or trait of an individual that is crucial in establishing or maintaining social relationships beyond mere economic relationships. Thus, even if the social planner does not impose the social sanction directly, the impact of social stigma can still be influenced by the probability of conviction and the level of the monetary fine imposed, as well as by the varying degree of correlation between the legal standard violated and the social traits or attributes of the individual. In this respect, criminal law serves as an institution that facilitates cognitive efficiency in the process of imposing the social sanction, to the extent that the rest of society is boundedly rational and uses judgment heuristics. Paradoxically, using criminal law to invoke stigma for the violation of a legal standard may also undermine its strength. In sum, our analysis reveals that the scope of criminal law is narrow both for deterrence and for cognitive efficiency. While there are conditions under which the enforcement of criminal law may increase social welfare, particularly with respect to incarceration and stigma, we have also identified the channels through which these sanctions affect behavior. Since such mechanisms can be replicated in less costly ways, society should first seek to employ those institutions, turning to criminal law only as a last resort.
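The fixed-fine result can be illustrated with a toy expected-sanction calculation (all functional forms and numbers are my own illustrative assumptions, not the model of the study): an individual is deterred when the expected sanction p·min(F, wealth) exceeds the gain from violation; the poor escape full payment because they are judgment proof, while for the rich a fixed fine is small relative to their gain.

```python
def deterred(gain, wealth, fine, p):
    """True if the expected sanction, capped at collectable wealth,
    exceeds the gain from violating the standard."""
    return p * min(fine, wealth) > gain

p, fine = 0.5, 100.0
# (wealth, gain-from-violation): gain assumed to grow with income/activity.
groups = {"poor":   (30.0, 20.0),
          "middle": (500.0, 40.0),
          "rich":   (5000.0, 80.0)}
result = {g: deterred(gain, w, fine, p) for g, (w, gain) in groups.items()}
# Only the middle group ends up deterred by the fixed fine.
```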

Abstract:

The subject of this thesis is the development of a gas chromatography (GC) system for non-methane hydrocarbons (NMHCs) and the measurement of samples within the project CARIBIC (Civil Aircraft for the Regular Investigation of the atmosphere Based on an Instrument Container, www.caribic-atmospheric.com). Air samples collected at cruising altitude from the upper troposphere and lowermost stratosphere contain hydrocarbons at low levels (ppt range), which imposes substantial demands on detection limits. Full automation made it possible to maintain constant conditions during sample processing and analysis; it also allows overnight operation, thus saving time. Gas chromatography with flame ionization detection (FID), together with a dual-column approach, enables simultaneous detection with almost equal per-carbon-atom response for all hydrocarbons except ethyne. The first part of this thesis presents technical descriptions of the individual parts of the analytical system; besides the sample treatment and calibration procedures, the sample collector is described. The second part deals with the analytical performance of the GC system, discussing the tests that were made. Finally, the results of the measurement flights are assessed in terms of data quality, and two flights are discussed in detail. The analytical performance is characterized by the detection limit and uncertainty for each compound, by tests of the conditioning of the calibration mixture and of the carbon dioxide trap to determine their influence on the analyses, and by comparing the responses of the calibrated substances over the period when the flight samples were analyzed. A comparison of the two systems shows good agreement; however, because of the insufficient capacity of the CO2 trap, the signal of one column was suppressed by carbon dioxide breakthrough to such an extent that its results proved unreliable.
Plausibility tests for the internal consistency of the data sets are based on common patterns exhibited by tropospheric NMHCs. All tests show that samples from the first flights do not comply with the expected pattern. Additionally, detected alkene artefacts suggest potential problems with storage or contamination in all measurement flights. The last two flights, # 130-133 and # 166-169, pass the tests, and a detailed analysis of them is therefore made. Samples were analyzed in terms of their origin (troposphere vs. stratosphere, backward trajectories) and their ageing (NMHC ratios), and detected plumes were compared with the chemical signatures of Asian outflows. In the last chapter, future development of the presented system, with a focus on separation, is outlined. An extensive appendix documents all important aspects of the dissertation, from a theoretical introduction through illustrations of sample treatment to overview diagrams for the measured flights.
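The ageing estimate from an NMHC ratio is conventionally a "photochemical clock": for two hydrocarbons removed by OH at different rates, the air-mass age follows from the decay of their ratio, t = ln(R0/R) / ((kA − kB)·[OH]). A sketch with representative literature rate constants and an assumed mean OH concentration (values are illustrative, not from the thesis):

```python
import math

def photochemical_age_s(r0, r, k_a, k_b, oh):
    """Air-mass age (s) from the decay of the ratio r = [A]/[B], where A
    reacts faster with OH than B: r(t) = r0 * exp(-(k_a - k_b) * oh * t)."""
    return math.log(r0 / r) / ((k_a - k_b) * oh)

# Representative OH rate constants (cm^3 molecule^-1 s^-1, ~298 K):
k_toluene, k_benzene = 5.6e-12, 1.2e-12
oh = 1.0e6          # assumed mean OH concentration, molecule cm^-3

# An emission ratio of 2.0 observed decayed to 1.0:
t = photochemical_age_s(2.0, 1.0, k_toluene, k_benzene, oh)
t_days = t / 86400.0   # on the order of a couple of days
```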

Abstract:

Over the last 60 years, computers and software have fostered incredible advances in every field. Nowadays, however, these systems are so complicated that it is difficult, if not impossible, to understand whether they meet some requirement or are able to show some desired behaviour or property. This dissertation introduces a Just-In-Time (JIT) a posteriori approach to conformance checking, which identifies any deviation from the desired behaviour as soon as possible so that corrections can be applied. The declarative framework that implements our approach, entirely developed on the promising open source forward-chaining Production Rule System (PRS) named Drools, consists of three components: 1. a monitoring module based on a novel, efficient implementation of Event Calculus (EC); 2. a general-purpose hybrid reasoning module (the first of its kind) merging temporal, semantic, fuzzy and rule-based reasoning; 3. a logic formalism based on the concept of expectations, introducing Event-Condition-Expectation rules (ECE-rules) to assess the global conformance of a system. The framework is also accompanied by an optional module that provides Probabilistic Inductive Logic Programming (PILP). By shifting the conformance check from after execution to just in time, this approach combines the advantages of many a posteriori and a priori methods proposed in the literature. Quite remarkably, if the corrective actions are explicitly given, the reactive nature of the methodology makes it possible to reconcile any deviation from the desired behaviour as soon as it is detected. In conclusion, the proposed methodology advances the state of conformance checking, helping to fill the gap between humans and increasingly complex technology.
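The expectation idea can be illustrated with a minimal event monitor in Python, a toy analogue of ECE-rules rather than the Drools implementation of the thesis: a triggering event raises an expectation that a matching event occurs within a deadline, and the monitor flags a violation as soon as the deadline has passed. The order/payment rule below is an invented example.

```python
def check_conformance(events, rules):
    """events: (timestamp, name) pairs; rules: maps a trigger event name
    to (expected_event_name, deadline). Returns the names of the violated
    expectations, each detected as soon as its deadline has passed."""
    pending, violations = [], []
    for t, name in sorted(events):
        # Expire overdue expectations first: detected "just in time".
        for exp in [x for x in pending if t > x[1]]:
            violations.append(exp[0])
            pending.remove(exp)
        # Fulfil one matching pending expectation, if any.
        match = next((x for x in pending if x[0] == name), None)
        if match:
            pending.remove(match)
        # This event may itself raise a new expectation.
        if name in rules:
            expected, deadline = rules[name]
            pending.append((expected, t + deadline))
    violations.extend(x[0] for x in pending)   # still unmet at end of trace
    return violations

# Toy rule: an order must be followed by a payment within 30 time units.
rules = {"order_placed": ("payment", 30)}
events = [(0, "order_placed"), (10, "payment"),
          (40, "order_placed"), (100, "shipping")]
violations = check_conformance(events, rules)  # the second order's payment
```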

Abstract:

Parkinson's disease is a neurodegenerative disorder due to the death of the dopaminergic neurons of the substantia nigra of the basal ganglia. The process that leads to these neural alterations is still unknown. Parkinson's disease affects the motor sphere above all, with a wide array of impairments such as bradykinesia, akinesia, tremor and postural instability, and singular phenomena such as freezing of gait. Moreover, in the last few years more attention has been paid to the fact that the degeneration in the basal ganglia circuitry induces not only motor but also cognitive alterations, not necessarily implying dementia, and that dopamine loss has further implications through dopamine-driven synaptic plasticity. At present, no neuroprotective treatment is available, and even if dopamine-replacement therapies as well as electrical deep brain stimulation are able to improve the life conditions of patients, they often present side effects in the long term and cannot recover the neural loss, which continues to advance. In this thesis both motor and cognitive aspects of Parkinson's disease and of the basal ganglia circuitry are investigated. First, the sensory and balance issues of Parkinson's disease are addressed by means of a new instrumented method, based on inertial sensors, that provides further information about postural control and the postural strategies used to attain balance; this newly developed approach is then applied to assess balance control in mild and severe patients, both ON and OFF levodopa replacement. Given the inability of levodopa to recover balance issues, and the new physiological findings that underline the importance of non-dopaminergic neurotransmitters in Parkinson's disease, an original computational model was then developed focusing on acetylcholine, the most promising neurotransmitter according to physiology, and its role in synaptic plasticity.
The rationale of this thesis is that a multidisciplinary approach can give insight into features of Parkinson's disease that are still unresolved.
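Instrumented posturography with a trunk-worn inertial sensor typically reduces the acceleration trace to a handful of sway features. A minimal sketch of two common ones, RMS sway amplitude and mean absolute jerk; the synthetic sway signal and the feature choice are illustrative, not the protocol of the thesis.

```python
import math

def sway_metrics(acc, dt):
    """RMS amplitude and mean absolute jerk of an acceleration trace,
    two features commonly extracted in accelerometer-based posturography."""
    n = len(acc)
    mean = sum(acc) / n
    rms = math.sqrt(sum((a - mean) ** 2 for a in acc) / n)
    jerk = sum(abs(acc[i + 1] - acc[i]) for i in range(n - 1)) / ((n - 1) * dt)
    return rms, jerk

# Illustrative trace: 0.4 Hz anterior-posterior sway sampled at 100 Hz.
dt = 0.01
acc = [0.05 * math.sin(2 * math.pi * 0.4 * i * dt) for i in range(1000)]
rms, jerk = sway_metrics(acc, dt)
```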

Abstract:

Summary: Primo Levi's book Il sistema periodico, published in 1975 (The Periodic Table; German translation 1986), was voted the "best science book ever" in a 2006 survey by Imperial College London. Levi's book is undoubtedly one of the most famous literary works with chemical content; virtually every chemist knows it or has heard of it. It is organized into 21 element stories, the most famous of which is the last, about carbon. In what follows I attempt to show that this story had a model, namely the tale 'Lebensgeschichte eines Kohlenstoffatoms' ('Life Story of a Carbon Atom') by Hermann Römpp, published in 1946 as a Kosmos booklet by Franckh. Römpp, who during the 'Third Reich' had praised and justified the anti-Semitic and eugenic measures of the National Socialists in several writings, published this story, alone in his oeuvre, not under his own name but under the pseudonym "Dr. Helmut Schmid". Demonstrating that Römpp's story was a model for Levi should enable a deeper reading of Levi's tale.

Abstract:

One of the most serious problems of modern medicine is the growing emergence of antibiotic resistance among pathogenic bacteria. In this situation, different and innovative approaches for treating infections caused by multidrug-resistant bacteria are imperatively required. Bacteriophage therapy is one of the most fascinating approaches to be taken into account. It consists of the use of bacteriophages, viruses that infect bacteria, to defeat specific bacterial pathogens. Phage therapy is not a new idea: it was widely used around the world in the 1930s and 1940s to treat various infectious diseases, and it is still used in Eastern Europe and the former Soviet Union. Nevertheless, Western scientists mostly lost interest in the use and study of phage therapy and abandoned it after the discovery and spread of antibiotics. The advancement of scientific knowledge in recent years, together with encouraging results from recent animal studies using phages to treat bacterial infections and, above all, the urgent need for novel and effective antimicrobials, has prompted additional rigorous research in this field. In particular, in the synthetic biology laboratory of the department of Life Sciences at the University of Warwick, a novel approach was adopted, starting from the original concept of phage therapy, to study a concrete alternative to antibiotics. The innovative idea of the project is to develop experimental methodologies that make it possible to engineer a programmable synthetic phage system using a combination of directed evolution, automation and microfluidics. The main aim is to make "the therapeutics of tomorrow individualized, specific, and self-regulated" (Jaramillo, 2015). In this context, one of the most important key points is bacteriophage quantification.
In this research work, therefore, a mathematical model describing the complex dynamics of biological systems involving continuous growth of bacteriophages, modulated by the performance of the host organisms, was implemented as algorithms in a working MATLAB program. The developed program is able to predict unknown phage concentrations much faster than the classical overnight plaque assay. Moreover, it gives meaning and explanation to the obtained data by making inference about the parameters of the model, which are representative of the bacteriophage-host interaction.
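A standard starting point for such phage-host growth models is a pair of coupled ODEs for susceptible bacteria S and free phage P: logistic host growth, mass-action adsorption, and burst-size amplification. The sketch below integrates one such model with a simple Euler scheme in Python; the model form and all parameter values are merely illustrative (the actual model and MATLAB implementation are those of the thesis).

```python
def phage_host(S0, P0, r=0.5, K=1e9, k=1e-9, burst=100.0, m=0.1,
               dt=0.01, T=10.0):
    """Euler integration of dS/dt = r*S*(1 - S/K) - k*S*P and
    dP/dt = burst*k*S*P - m*P (latent period neglected for brevity).
    Units: concentrations per ml, time in hours."""
    S, P = S0, P0
    for _ in range(int(T / dt)):
        dS = r * S * (1.0 - S / K) - k * S * P
        dP = burst * k * S * P - m * P
        S, P = max(S + dS * dt, 0.0), max(P + dP * dt, 0.0)
    return S, P

# Illustrative run: 1e8 cells/ml challenged with 1e4 phage/ml for 10 h.
S_end, P_end = phage_host(1e8, 1e4)
# The phage amplify by orders of magnitude while the host population collapses.
```

Fitting the parameters of such a model to growth curves is what allows an unknown phage concentration to be inferred without waiting for an overnight plaque count.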

Abstract:

The safe operation of nighttime flight missions is enhanced by Night Vision Imaging System (NVIS) equipment. This has been clear to the military since the 1970s and to civil helicopter operators since the 1990s. In recent months, even Italian Emergency Medical Service (EMS) operators have been requesting Night Vision Goggles (NVG), devices that amplify the ambient light. In order to fly with this technology, helicopters have to be NVIS-approved. The author has supported a company in quantifying, through a feasibility study, the potential of undertaking this certification activity. First, the NVG devices and their working principles are described; then the specifications governing the processes required to make a helicopter NVIS-approved are analyzed. The noteworthy difference between the military specifications and the civilian ones highlights non-negligible gaps in the latter. The activity of NVIS certification could be a good investment, because the following targets have been achieved: reduction of the certification cost, of the operating time and of the number of non-compliances.

Abstract:

In recent years, systems engineering has become one of the major research domains. The complexity of systems has increased constantly, and nowadays Cyber-Physical Systems (CPS) are a category of particular interest: these are systems composed of a cyber part (computer-based algorithms) that monitors and controls physical processes. Their development and simulation are both complex because of the importance of the interaction between the cyber and the physical entities: many models, written in different languages, need to exchange information with each other. Normally an orchestrator takes care of the simulation of the models and of the exchange of information. This orchestrator is developed manually, which is tedious, lengthy work. Our proposition is to generate the orchestrator automatically through co-modeling, i.e. by modeling the coordination itself. Before achieving this ultimate goal, it is important to understand the mechanisms and de facto standards that could be used in a co-modeling framework. I therefore studied a technology employed for co-simulation in industry: FMI. In order to better understand the FMI standard, I implemented an automatic export, in the FMI format, of models realized in an existing software for discrete modeling: TimeSquare. I also developed a simple physical model in the existing open source OpenModelica tool. Finally, to understand how an orchestrator works, I developed a simple one: this will be useful in the future for generating an orchestrator automatically.
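The job of such an orchestrator can be sketched as a fixed-step co-simulation master that exchanges the models' outputs at each communication point and then advances both models by one macro step, a toy analogue of an FMI master algorithm. The room/thermostat pair below is an invented example, not a model from the thesis.

```python
class Room:
    """Physical model: room temperature with a heater input (Euler step)."""
    def __init__(self, temp=15.0):
        self.temp = temp
    def step(self, dt, heater_on):
        power = 2.0 if heater_on else 0.0      # deg/min supplied when heating
        loss = 0.1 * (self.temp - 10.0)        # leakage toward 10 deg ambient
        self.temp += (power - loss) * dt
        return self.temp

class Thermostat:
    """Cyber model: bang-bang controller with a 19-21 deg dead band."""
    def __init__(self):
        self.on = False
    def step(self, dt, temp):
        if temp < 19.0:
            self.on = True
        elif temp > 21.0:
            self.on = False
        return self.on

def orchestrate(room, ctrl, dt=0.5, t_end=120.0):
    """Fixed-step master: exchange outputs, then advance both models by dt."""
    t, temp = 0.0, room.temp
    while t < t_end:
        heater = ctrl.step(dt, temp)   # cyber part reads the physical output
        temp = room.step(dt, heater)   # physical part reads the cyber output
        t += dt
    return temp

final_temp = orchestrate(Room(), Thermostat())
# After the initial transient, the temperature oscillates around the dead band.
```

Generating this loop automatically from a coordination model, instead of hand-writing it for every pair of models, is precisely the co-modeling goal stated above.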

Abstract:

To this day, the role of oxygen in the development of the fetus remains controversial. It is still believed that lack of oxygen in utero might be responsible for some of the known congenital cardiovascular malformations. Over the last two decades, detailed research has given us new insights and a better understanding of embryogenesis and fetal growth. Most importantly, it has repeatedly demonstrated that oxygen plays only a minor role in early intrauterine development. After organogenesis has taken place, hypoxia becomes more important during the second and third trimesters of pregnancy, when fetal growth occurs. This review briefly addresses the causes and mechanisms leading to intrauterine hypoxia and their impact on the fetal cardiovascular system.