863 results for set based design
Abstract:
Failure analysis has been, throughout the years, a fundamental tool in the aerospace sector, supporting assessments performed by sustainment and design engineers, mainly concerning failure modes and material suitability. The predicted service life of aircraft often exceeds 40 years, and the design assured life rarely accounts for all the in-service loads and environmental threats that aging aircraft must withstand throughout their service lives. From the most conservative safe-life conceptual design approaches to the most recent on-condition based design approaches, assessing the condition and predicting the failure modes of components and materials are essential for developing adequate preventive and corrective maintenance actions, as well as for accomplishing and optimizing scheduled aircraft maintenance programs. Moreover, as the operational conditions of aircraft may vary significantly from operator to operator (especially for military aircraft), it is necessary to assess whether the defined maintenance programs are adequate to guarantee the continued reliability and safe usage of the aircraft, preventing catastrophic failures that carry significant maintenance and repair costs and that may lead to the loss of human lives. Failure analysis and material investigations performed as part of aircraft accident and incident investigations therefore arise as powerful tools of the utmost importance for safety assurance and cost reduction within the aeronautical and aerospace sectors. The Portuguese Air Force (PRTAF) has operated different aircraft throughout its long existence, in some cases operating a particular type of aircraft for more than 30 years, gathering a great amount of expertise in assessing failure modes of aircraft materials; conducting aircraft accident and incident investigations (sometimes with the participation of the aircraft manufacturers and/or other operators); and developing design and repair solutions for in-service problems. This paper addresses several studies to support the thesis that failure analysis plays a key role in flight safety improvement within the PRTAF. It presents a short summary of developed
Abstract:
Antonio Salieri’s La calamita de’ cuori (1774) warrants musicological attention for what it can tell us about Salieri’s compositional craft and what it reveals about the development of form in Viennese Italian-language comic opera of the mid- and late-eighteenth century. In Part I of this dissertation, I explore the performance history of La calamita, present the first plot synopsis and English translation of the libretto, and describe the variants between Carlo Goldoni’s 1752 libretto and the revised version created for Salieri’s opera. I have collated Salieri’s holograph score, Österreichische Nationalbibliothek, Vienna, Mus. Hs. 16.508, with four copies having different relationships to it, and I propose a stemma that represents the relationships between these five sources. The analyses in Part II contribute to our understanding of formal practices in eighteenth-century drammi giocosi. My study of Salieri’s La calamita reveals his reliance on a clearly defined binary structure, referred to in this dissertation as “operatic binary form,” in almost half of the arias, ensembles, and instrumental movements of this opera. Salieri’s consistent use of operatic binary form led me to explore its use in drammi giocosi by other prominent composers of this time, including Baldassare Galuppi’s La calamita de’ cuori (1752), Wolfgang Amadeus Mozart’s Il dissoluto punito, ossia Il Don Giovanni (1787), and selected arias by Pasquale Anfossi, Florian Leopold Gassmann, Giuseppe Gazzaniga, Franz Joseph Haydn, Giovanni Paisiello, and Niccolò Piccinni dating from 1760 to 1774. This study showed that Salieri and his peers adhered to a recognizable tonal plan and set of design elements in their operatic binary forms, and that their arias fall into three distinct categories defined by the tonality at the beginning of the second half of the binary structure. The analysis presented here adds to our present understanding of operatic form in mid- and late-century drammi giocosi and shows that in La calamita de’ cuori, Salieri was following the normative formal procedures of his time.
Abstract:
Cassava root is the main staple for 70% of the population in Mozambique, particularly in inaccessible rural areas, but is known to be low in iron. Anaemia is a public health problem among mothers and preschool children in Mozambique, and up to 40% of these cases are probably due to dietary iron deficiency. The World Health Organization (WHO) and the Food and Agriculture Organization of the United Nations (FAO) recognize the fortification of foodstuffs as an effective method to remedy dietary deficiencies of micronutrients, including iron. Cassava mahewu, a non-alcoholic fermented beverage, is prepared at subsistence level from cassava roots using indigenous procedures. The aim of the study was to standardize mahewu fermentation and to investigate whether the type of cassava fermented, or the iron compound used for fortification, affected the final product. Roots of sweet and bitter varieties of cassava from four districts (Rapale, Meconta, Alto Molocue and Zavala) in Mozambique were peeled, dried and pounded to prepare flour. Cassava flour was cooked and fermented under controlled conditions (45°C for 24 h). The fermentation period and temperature were set based on the findings of a pilot study, which showed that an end-point pH of about 4.5 was regularly reached after 24 h at 45°C. Cassava mahewu was fortified with ferrous sulfate (FeSO4·7H2O) or ferrous fumarate (C4H2FeO4) at the beginning (time zero) and at the end of fermentation (24 h). The amount of iron added to the mahewu was based on the average of the approved range of iron used for the fortification of maize meal. The mean pH at the end-point was 4.5, with 0.29% titratable acidity. The pH and acidity differed from those reported in previous studies on maize mahewu, whereas the solid extract of 9.65% was found to be similar. Lactic acid bacteria (LAB) and yeast growth were not significantly different in mahewu fortified with either of the iron compounds. There was no significant difference between cassava mahewu made from bitter or sweet varieties. A standard method for the preparation and iron fortification of cassava mahewu was developed. It is recommended that fortification occur at the end of fermentation when done at household level.
Abstract:
Facing a new stance, not only within the Ecuadorian penal system but in most Latin American legislations, with European and North American origins, stands a criminal policy of agility, efficiency, negotiation, effectiveness and speed, aimed at resolving the criminal disputes that arise daily by means of special procedures distinct from the traditional procedure known as the Ordinary Procedure (Procedimiento Ordinario). This work therefore seeks to analyze and establish, on the basis of the Código Orgánico Integral Penal, the special procedures, focusing our study on the Abbreviated Procedure (Procedimiento Abreviado) with regard to its rules, application and effectiveness, offering a concise analysis of its background, nature and conduct, and arguing, on the basis of constitutional principles, for the correct and adequate application of this novel procedure. To that end, Chapter I addresses the criminal process and its historical overview in Ecuador, followed by an analysis of constitutional principles; Chapter II then covers the procedural subjects who intervene in criminal proceedings; Chapter III deals with the special procedures, concluding in Chapter IV with the study of the Abbreviated Procedure itself.
Abstract:
In this report, we develop an intelligent adaptive neuro-fuzzy controller using adaptive neuro-fuzzy inference system (ANFIS) techniques. We begin with a standard proportional-derivative (PD) controller and use the PD controller data to train the ANFIS system, developing a fuzzy controller. We then propose and validate a method to implement this control strategy on commercial off-the-shelf (COTS) hardware. We analyze the choice of filters for attitude estimation, a choice limited by the complexity of the filter and by the computing ability and memory constraints of the microcontroller. Simplified Kalman filters are found to estimate attitude well under these constraints. Using model-based design techniques, the models are implemented on an embedded system, enabling the deployment of fuzzy controllers on enthusiast-grade controllers. We evaluate the feasibility of the proposed control strategy in a model-in-the-loop simulation. We then propose a rapid prototyping strategy that allows us to deploy these control algorithms on a system combining an ARM-based microcontroller and two Arduino-based controllers. We use the code generation capabilities of MATLAB/Simulink together with multiple open-source projects to deploy code to an ARM Cortex-M4 based controller board, and we also evaluate this strategy on an ARM Cortex-A8 based board and on a much less powerful Arduino-based flight controller. We conclude by demonstrating the feasibility of fuzzy controllers on COTS hardware, pointing out the limitations of current hardware and suggesting hardware that we think would be better suited to memory-heavy controllers.
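As an illustration of the kind of simplified attitude filter the report refers to, the following is a minimal single-axis Kalman filter that fuses gyro rates with accelerometer-derived angles. This is a generic textbook formulation sketched in Python, not the report's embedded implementation, and all noise parameters are illustrative.

```python
# A minimal sketch (not the report's code) of a simplified Kalman filter
# for attitude: state is [angle, gyro_bias], measurements are
# accelerometer-derived angles. Noise parameters are illustrative.
import numpy as np

def simple_attitude_kf(gyro_rates, accel_angles, dt,
                       q_angle=0.001, q_bias=0.003, r_meas=0.03):
    angle, bias = 0.0, 0.0
    P = np.zeros((2, 2))          # state covariance
    estimates = []
    for rate, meas in zip(gyro_rates, accel_angles):
        # Predict: integrate the bias-corrected gyro rate.
        angle += dt * (rate - bias)
        P[0, 0] += dt * (dt * P[1, 1] - P[0, 1] - P[1, 0] + q_angle)
        P[0, 1] -= dt * P[1, 1]
        P[1, 0] -= dt * P[1, 1]
        P[1, 1] += q_bias * dt
        # Update with the accelerometer-derived angle measurement.
        S = P[0, 0] + r_meas
        K = np.array([P[0, 0] / S, P[1, 0] / S])
        y = meas - angle
        angle += K[0] * y
        bias += K[1] * y
        P -= np.outer(K, [P[0, 0], P[0, 1]])
        estimates.append(angle)
    return np.array(estimates)
```

The two-element state keeps the memory footprint small, which is why filters of this form fit on memory-constrained microcontrollers.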
Abstract:
We propose and demonstrate, for the first time to the best of our knowledge, the use of a 45° tilted fiber grating (TFG) as an in-fiber lateral diffraction element in an efficient and fiber-compatible spectrally encoded imaging (SEI) system. Under proper polarization control, the TFG has significantly enhanced diffraction efficiency (93.5%) due to strong tilted reflection. Our conceptually new fiber-optics-based design eliminates the need for bulky and lossy free-space diffraction gratings, significantly reduces the volume and cost of the imaging system, improves energy efficiency, and increases system stability. As a proof-of-principle experiment, we use the proposed system to perform one-dimensional (1D) line-scan imaging of a custom-designed three-slot sample, and the results show that the reconstructed image matches well with the actual sample. The angular dispersion of the 45° TFG is measured to be 0.054°/nm, and the lateral resolution of the SEI system is measured to be 28 μm in our experiment.
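A small numeric sketch of how a measured angular dispersion maps wavelength to lateral position in an SEI system, using the small-angle relation x(λ) ≈ f·θ(λ). The dispersion value is taken from the abstract; the focal length and source bandwidth are assumed values, not the paper's.

```python
# Sketch: wavelength-to-position mapping in spectrally encoded imaging.
import numpy as np

D_deg_per_nm = 0.054    # angular dispersion of the 45° TFG (from the abstract)
f_mm = 30.0             # focal length of the imaging lens (assumed)
bandwidth_nm = 40.0     # source bandwidth (assumed)

# Small-angle mapping: lateral field of view covered by the swept spectrum.
theta_span_rad = np.deg2rad(D_deg_per_nm * bandwidth_nm)
fov_um = f_mm * 1e3 * theta_span_rad
print(f"Lateral field of view ~ {fov_um:.0f} um")

# Lateral step for a given spectral sampling, e.g. 0.1 nm.
dx_um = f_mm * 1e3 * np.deg2rad(D_deg_per_nm * 0.1)
print(f"Lateral step per 0.1 nm ~ {dx_um:.2f} um")
```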
Abstract:
The physical environment can influence older people's health and well-being, and is often mentioned as an important factor for person-centred care. Due to high levels of frailty, many older people spend a majority of their time within care facilities and depend on the physical environment for support in their daily lives. However, the quality of the physical environment is rarely evaluated, and knowledge is sparse in terms of how well the environment meets the needs of older people. This is partly due to the lack of valid and reliable instruments that could provide important information on environmental quality. Aim: The aim of this thesis was to study the quality of the physical environment in Swedish care facilities for older people, and how it relates to residents' activities and well-being. Methods: The thesis comprises four papers in which both qualitative and quantitative methods were used. Study I involved the translation and adaptation of the Sheffield Care Environment Assessment Matrix (SCEAM) into a Swedish version (S-SCEAM). Several methods were used, including forward and backward translation, tests of validity via expert consultation, and reliability tests. In Study II, S-SCEAM was used to assess the quality of the environment, and descriptive data were collected from 20 purposively sampled residential care facilities (RCFs). Study III was a comparative case study conducted at two RCFs using observations, interviews and S-SCEAM to examine how the physical environment relates to older people's activities and interactions. In Study IV, multilevel modeling was used to determine the association between the quality of the physical environment and the psychological and social well-being of older people living in RCFs. The data in the thesis were analysed using qualitative content analysis and descriptive, bivariate and multilevel statistics. Results: A specific result was the production of the Swedish version of SCEAM. The instrument contains 210 items structured into eight domains reflecting the needs of older people. When S-SCEAM was used, the results showed substantial variation in the quality of the physical environment between and within RCFs. In general, private apartments and dining areas had high quality, whereas the overall building layout and outdoor areas had lower quality. Older people's safety was supported in the majority of facilities, whereas cognitive support and privacy had lower quality. Further, the results showed that environmental quality in terms of cognitive support was associated with residents' social well-being. Specific environmental features, such as building design and space size, were also noted, through observation, as influencing residents' activities, and several barriers were found that seemed to restrict residents' full use of the environment. Conclusions: This thesis contributes to the growing evidence-based design field. S-SCEAM can be used in future research on the association between the environment and people's health and well-being. The instrument could also serve as a guide in the planning and design of new RCFs.
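A minimal sketch of the Study IV analysis idea, with residents nested within facilities and a random intercept per facility; the variable names, file name, and covariate are hypothetical, not the thesis's actual model specification.

```python
# Sketch of a two-level model: residents (level 1) in facilities (level 2),
# with an S-SCEAM domain score as a facility-level predictor of well-being.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("residents.csv")  # hypothetical file: one row per resident

# Random intercept for facility; fixed effects for the facility-level
# cognitive-support score and a resident-level covariate (both assumed names).
model = smf.mixedlm("social_wellbeing ~ cognitive_support + age",
                    data=df, groups=df["facility_id"])
result = model.fit()
print(result.summary())
```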
Abstract:
Direction-of-arrival (DOA) techniques are a promising avenue for increasing the capacity of telecommunication systems and services by enabling better estimation of the mobile radio channel. They also make it possible to track cellular users precisely in order to steer antenna beams in their direction. In this context, this thesis describes, step by step, the implementation of the high-level MUSIC (MUltiple SIgnal Classification) algorithm on an FPGA platform in order to determine, in real time, the angle of arrival of one or more sources incident on an antenna array. The development approach used is rapid control prototyping (RCP) with the Xilinx™ System Generator (XSG) tools and the Nutaq™ MBDK (Model Based Design Kit). This approach relies on high-level, model-based programming to automatically generate low-level code. Particular attention is paid to the method chosen to solve the eigenvalue and eigenvector decomposition of the complex covariance matrix using the Jacobi algorithm, and the architecture implementing it in the FPGA (Field Programmable Gate Array) is detailed. Furthermore, it is shown that MUSIC cannot provide a useful estimate of source positions without prior calibration of the antenna array. The G-matrix calibration technique used in this project is therefore presented, along with its implementation model. Finally, the experimental results of the system, tested in a real environment in the presence of one source and then of two strongly correlated sources, are illustrated and analyzed.
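A compact floating-point sketch of the MUSIC steps the thesis implements: sample covariance estimation, eigendecomposition of the Hermitian covariance matrix (performed by the Jacobi algorithm on the FPGA; NumPy's eigh is used here instead), and a pseudospectrum search over candidate angles for a uniform linear array. This is an assumed NumPy reimplementation for illustration, not the fixed-point FPGA architecture.

```python
# Sketch of the MUSIC pseudospectrum for a uniform linear array.
import numpy as np

def music_spectrum(X, n_sources, d=0.5, angles=np.linspace(-90, 90, 361)):
    """X: (n_antennas, n_snapshots) complex snapshots; d: element spacing
    in wavelengths. Returns candidate angles and the MUSIC pseudospectrum."""
    n_antennas = X.shape[0]
    R = X @ X.conj().T / X.shape[1]            # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(R)       # Hermitian eigenproblem
    En = eigvecs[:, :n_antennas - n_sources]   # noise subspace (smallest eigenvalues)
    p = np.empty(angles.size)
    for i, theta in enumerate(np.deg2rad(angles)):
        # Steering vector for a plane wave arriving at angle theta.
        a = np.exp(2j * np.pi * d * np.arange(n_antennas) * np.sin(theta))
        p[i] = 1.0 / np.real(a.conj() @ En @ En.conj().T @ a)
    return angles, p  # peaks of p give the estimated DOAs
```

The calibration issue noted above enters through the steering vector: if the real array response deviates from this ideal model, the pseudospectrum peaks shift or vanish.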
Abstract:
Intelligent systems are now inherent to society, supporting synergistic human-machine collaboration. Beyond economic and climate factors, energy consumption is strongly affected by the performance of computing systems, and poor software functioning may invalidate any improvement attempt. In addition, data-driven machine learning algorithms are the basis for human-centered applications, with interpretability being one of the most important features of computational systems. Software maintenance is a critical discipline for supporting automatic and life-long system operation. As most software registers its inner events by means of logs, log analysis is an approach to keeping systems operational. Logs are characterized as Big Data assembled in large-flow streams, and are unstructured, heterogeneous, imprecise, and uncertain. This thesis addresses fuzzy and neuro-granular methods to provide maintenance solutions applied to anomaly detection (AD) and log parsing (LP), dealing with data uncertainty and identifying ideal time periods for detailed software analyses. LP provides deeper semantic interpretation of the anomalous occurrences. The solutions evolve over time and are general-purpose, being highly applicable, scalable, and maintainable. Granular classification models, namely the Fuzzy set-Based evolving Model (FBeM), the evolving Granular Neural Network (eGNN), and the evolving Gaussian Fuzzy Classifier (eGFC), are compared on the AD problem. The evolving Log Parsing (eLP) method is proposed to approach automatic parsing of system logs. All the methods perform recursive mechanisms to create, update, merge, and delete information granules according to the data behavior. For the first time in the evolving intelligent systems literature, the proposed method, eLP, is able to process streams of words and sentences. Regarding AD accuracy, FBeM achieved (85.64 ± 3.69)%; eGNN reached (96.17 ± 0.78)%; eGFC obtained (92.48 ± 1.21)%; and eLP reached (96.05 ± 1.04)%. Besides being competitive, eLP additionally generates a log grammar and presents a higher level of model interpretability.
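A toy sketch of the evolving-granular idea behind methods such as FBeM and eGFC, not the authors' algorithms: interval granules are created and expanded as the stream arrives, and samples far from every existing granule are flagged as anomalous. The distance threshold rho is illustrative, and merging/deletion of granules is omitted for brevity.

```python
# Toy evolving-granular anomaly detector over a data stream.
import numpy as np

class Granule:
    def __init__(self, x):
        self.lower = x.copy()
        self.upper = x.copy()

    def distance(self, x):
        # Distance to the hyperbox (0 if x lies inside it).
        return np.linalg.norm(np.maximum(self.lower - x, 0)
                              + np.maximum(x - self.upper, 0))

    def expand(self, x):
        self.lower = np.minimum(self.lower, x)
        self.upper = np.maximum(self.upper, x)

def evolving_anomaly_detector(stream, rho=0.5):
    """Yield True for samples far from every existing granule."""
    granules = []
    for x in stream:
        x = np.asarray(x, dtype=float)
        if not granules:
            granules.append(Granule(x)); yield False; continue
        nearest = min(granules, key=lambda g: g.distance(x))
        if nearest.distance(x) <= rho:
            nearest.expand(x)                # known behavior: update granule
            yield False
        else:
            granules.append(Granule(x))      # novel region: create granule
            yield True                       # flag as anomalous
```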
Abstract:
Nowadays, product development in all its phases plays a fundamental role in the industrial chain. The need for a company to compete at a high level and to respond quickly to market demands, and therefore to engineer the product quickly and with a high level of quality, has driven the adoption of new, more advanced methods and processes. In recent years, industry has been moving away from 2D-based design and production and toward Model Based Definition. With this approach, increasingly complex systems become easier to deal with and, above all, cheaper to obtain. Thanks to Model Based Definition, it is possible to share data in a lean and simple way across the entire engineering and production chain of the product; the great advantage of this approach is precisely the uniqueness of the information. In this thesis work, this approach has been exploited in the context of tolerances with the aid of CAD/CAT software. Tolerance analysis, or dimensional variation analysis (DVA), is a way to understand how sources of variation in part dimensions and assembly constraints propagate across parts and assemblies, and how that variation affects the ability of a design to meet its requirements. Critically, tolerances directly affect the cost and performance of products. Worst Case Analysis (WCA) and statistical analysis (RSS) are the two principal methods in DVA. The thesis aims to show the advantages of statistical dimensional analysis by creating and examining various case studies, using PTC CREO software for CAD modeling and CETOL 6σ for tolerance analysis. Moreover, a comparison between manual and 3D analysis is provided, focusing on the information lost in the 1D case. The results obtained highlight the need to use this approach from the early stages of the product design cycle.
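A minimal sketch of the two stack-up methods the thesis compares, using generic textbook formulas on an invented 1D tolerance chain rather than CETOL 6σ output: worst case adds tolerances linearly, while RSS adds them in quadrature under an assumption of independent, centered distributions.

```python
# WCA vs. RSS stack-up on an illustrative 1-D tolerance chain (mm).
import math

nominals   = [10.0, 25.0, 8.0, 12.5]   # nominal dimensions (invented)
tolerances = [0.10, 0.20, 0.05, 0.15]  # symmetric tolerances (invented)

nominal_stack = sum(nominals)

# Worst Case Analysis: tolerances add linearly (guaranteed but pessimistic).
wca = sum(tolerances)

# Statistical (Root Sum of Squares): independent, centered distributions,
# so the stack tolerance adds in quadrature.
rss = math.sqrt(sum(t**2 for t in tolerances))

print(f"Nominal stack: {nominal_stack:.2f} mm")
print(f"WCA stack tolerance: +/-{wca:.3f} mm")
print(f"RSS stack tolerance: +/-{rss:.3f} mm")  # tighter than WCA
```

The gap between the two numbers is the design margin that statistical analysis recovers, which is what makes the RSS approach cheaper when requirements allow it.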
Abstract:
A design methodology for monolithic integration of inductor-based DC-DC converters is proposed in this paper. A power loss model of the power stage, including the drive circuits, is defined in order to optimize efficiency. Based on this model, and taking a 0.35 µm CMOS process as reference, a buck converter was designed and fabricated. For a given set of operating conditions, the defined power loss model allows optimization of the design parameters of the power stage, including the gate-driver tapering factor and the width of the power MOSFETs. Experimental results obtained from a buck converter at 100 MHz switching frequency are presented to validate the proposed methodology.
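An illustrative sketch of the width trade-off such a power loss model captures, using generic loss equations rather than the paper's extracted model: on-resistance (hence conduction loss) scales inversely with MOSFET width while gate charge (hence gate-drive loss) scales with it, so total loss has a minimum. All device parameters below are assumed.

```python
# Sketch: sizing a power MOSFET by minimizing conduction + gate-drive loss.
import numpy as np

I_rms = 0.3          # RMS switch current in A (assumed)
f_sw = 100e6         # switching frequency in Hz, as in the paper
V_dd = 3.3           # gate-drive supply in V (assumed)
r_on_unit = 1.0      # on-resistance of a 1 mm wide device, ohm*mm (assumed)
q_g_unit = 0.1e-9    # gate charge per mm of gate width, C/mm (assumed)

W = np.linspace(0.2, 20, 1000)              # candidate gate widths (mm)
P_cond = I_rms**2 * (r_on_unit / W)         # conduction loss, scales as 1/W
P_gate = q_g_unit * W * V_dd * f_sw         # gate-drive loss, scales as W
P_total = P_cond + P_gate

W_opt = W[np.argmin(P_total)]
print(f"Optimal width ~ {W_opt:.2f} mm, minimum loss ~ {P_total.min()*1e3:.0f} mW")
```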
Abstract:
Context: Ovarian tumor (OT) typing is a competency expected from pathologists, with significant clinical implications. OT, however, come in numerous types, some rather rare, with the consequence that some departments have few opportunities for practice. Aim: Our aim was to design a tool for pathologists to train in typing less common OT. Method and Results: Representative slides of 20 less common OT were scanned (NanoZoomer Digital, Hamamatsu®) and the diagnostic algorithm proposed by Young and Scully was applied to each case (Young RH and Scully RE, Seminars in Diagnostic Pathology 2001, 18: 161-235), to include: recognition of morphological pattern(s); shortlisting of differential diagnoses; proposition of relevant immunohistochemical markers. The next steps of this project will be: evaluation of the tool in several post-graduate training centers in Europe and Québec; improvement of its design based on evaluation results; diffusion to a larger public. Discussion: In clinical medicine, solving many cases is recognized as being of the utmost importance for a novice to become an expert. This project relies on virtual slide technology to provide pathologists with a learning tool aimed at increasing their skills in OT typing. After due evaluation, this model might be extended to other uncommon tumors.
Abstract:
B2B document handling is moving from paper to electronic networks and the electronic domain very rapidly. Moving, handling and transforming large electronic business documents demands a lot from the systems handling them. This paper explores new technologies such as SOA, event-driven systems and the ESB, and a scalable, event-driven enterprise service bus is created to demonstrate these new approaches to message handling. As an end result, we have a small but fully functional messaging system with several different components. This was also the first larger in-house Java project, so along the way we developed our own set of best practices for Java development, covering configuration, tooling, code repositories, class naming and much more.
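A toy sketch of the event-driven bus idea, in Python for brevity even though the paper's system is Java: components subscribe to message types, and the bus routes each incoming business document to the matching handlers. A real ESB would add queuing, persistence, and transformation stages.

```python
# Minimal publish/subscribe bus illustrating event-driven message routing.
from collections import defaultdict
from typing import Callable

class EventBus:
    def __init__(self):
        self._handlers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, msg_type: str, handler: Callable[[dict], None]) -> None:
        self._handlers[msg_type].append(handler)

    def publish(self, msg_type: str, message: dict) -> None:
        # Deliver the document to every component subscribed to this type.
        for handler in self._handlers[msg_type]:
            handler(message)

bus = EventBus()
bus.subscribe("invoice", lambda m: print("archiving", m["id"]))
bus.subscribe("invoice", lambda m: print("transforming", m["id"]))
bus.publish("invoice", {"id": "B2B-001"})
```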
Abstract:
Virtual screening is a central technique in drug discovery today. Millions of molecules can be tested in silico with the aim of selecting only the most promising ones for experimental testing. The topic of this thesis is ligand-based virtual screening tools, which take existing active molecules as the starting point for finding new drug candidates. One goal of this thesis was to build a model that gives the probability that two molecules are biologically similar as a function of one or more chemical similarity scores. Another important goal was to evaluate how well different ligand-based virtual screening tools are able to distinguish active molecules from inactive ones. One more criterion set for the virtual screening tools was their applicability to scaffold hopping, i.e. finding new active chemotypes. In the first part of the work, a link was defined between the abstract chemical similarity score given by a screening tool and the probability that the two molecules are biologically similar. These results help to decide objectively which virtual screening hits to test experimentally. The work also resulted in a new type of data fusion method for use with two or more tools. In the second part, five ligand-based virtual screening tools were evaluated, and their performance was found to be generally poor. Three reasons for this were proposed: false negatives in the benchmark sets, active molecules that do not share the same binding mode, and activity cliffs. In the third part of the study, a novel visualization and quantification method is presented for evaluating the scaffold-hopping ability of virtual screening tools.
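A conceptual sketch of the first part's idea: map a chemical similarity score to a probability of biological similarity by fitting a logistic curve on labeled molecule pairs. The data here is synthetic and the model generic, not the thesis's fitted model.

```python
# Sketch: similarity score -> P(biologically similar) via logistic fit.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic training pairs: Tanimoto-like scores in [0, 1] and a label
# saying whether the pair shares biological activity (invented ground truth).
scores = rng.uniform(0, 1, 2000).reshape(-1, 1)
p_true = 1 / (1 + np.exp(-10 * (scores.ravel() - 0.6)))
labels = (rng.uniform(0, 1, 2000) < p_true).astype(int)

model = LogisticRegression().fit(scores, labels)

# P(similar | score) for a few screening hits: a basis for deciding
# objectively which hits to test experimentally.
for s in (0.4, 0.6, 0.8):
    p = model.predict_proba([[s]])[0, 1]
    print(f"score {s:.1f} -> P(similar) = {p:.2f}")
```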