911 results for ENGINEERING ANALYSIS


Relevance: 30.00%

Abstract:

Concern about quality control has grown remarkably in recent times, which has led to the search for methods capable of effectively addressing reliability analysis as a branch of Statistics. Managers, researchers and engineers must understand that 'statistical thinking' is not just a set of statistical tools. They should approach 'statistical thinking' from a 'system' perspective, that is, by developing systems that combine specific statistical tools and other methodologies for a given activity. The aim of this article is to encourage them (engineers, researchers and managers) to develop this new way of thinking.

Relevance: 30.00%

Abstract:

Wind farms have been extensively simulated with engineering models for the estimation of wind speed and power deficits inside wind farms. These models were initially designed for a few wind turbines located in flat terrain. Other models, based on the parabolic approximation of the Navier–Stokes equations, were later developed, making the operational resolution of large wind farms in flat terrain and offshore sites more realistic and feasible. These models have proven accurate enough when solving wake effects in this type of environment. Nevertheless, few analyses exist on how complex terrain can affect the behaviour of wind farm wake flow. Recent numerical studies have demonstrated that topographical wakes induce a significant effect on wind turbine wakes compared to flat terrain. This circumstance has motivated the development of elliptic CFD models that allow global simulation of wind turbine wakes in complex terrain. An accurate simplification for the analysis of wind turbine wakes is the actuator disk technique. Coupling this technique with CFD wind models enables the estimation of wind farm wakes while preserving the extraction of axial momentum inside the wind farm. This paper describes the analysis and validation of the elliptical wake model CFDWake 1.0 against experimental data from an operating wind farm located in complex terrain. The analysis also reports whether or not it is possible to linearly superimpose the effects of terrain and wind turbine wakes. It also represents one of the first attempts to observe how engineering models perform in large complex-terrain wind farms.
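The engineering wake models mentioned above can be illustrated with the classic Jensen (Park) model, a flat-terrain approximation; the rotor radius, induction factor, decay constant and turbine layout below are illustrative values for a sketch, not parameters of CFDWake 1.0:

```python
import math

def jensen_deficit(x, r0, a=1/3, k=0.075):
    """Fractional velocity deficit at distance x downstream of a rotor
    of radius r0 (Jensen/Park engineering model, flat-terrain assumption)."""
    return 2 * a / (1 + k * x / r0) ** 2

def waked_speed(U0, deficits):
    """Combine individual turbine deficits by root-sum-square superposition,
    a common engineering choice for multiple overlapping wakes."""
    total = math.sqrt(sum(d ** 2 for d in deficits))
    return U0 * (1 - total)

# Free wind 8 m/s; two upstream turbines 200 m and 500 m away, 40 m rotors
d1 = jensen_deficit(200.0, 40.0)
d2 = jensen_deficit(500.0, 40.0)
u = waked_speed(8.0, [d1, d2])
```

The nearer turbine contributes the larger deficit, and the waked speed stays below the free-stream value, which is the qualitative behaviour such models are built to reproduce.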

Relevance: 30.00%

Abstract:

Knowledge of the development of hydrographic networks can be useful for a number of research works in hydraulic engineering. We thus intend to analyse the cartography of the first work that systematically encompasses the entire hydrographic network: Tomas Lopez's Geographic Atlas of Spain (1787). To achieve this goal, we first analyse, by way of a Geographic Information System (GIS), both the present and the aforementioned historical cartographies. In comparing them, we use the then-existing population centres that correspond to modern ones. The aim is to compare the following research variables in the hydrographic network: former toponyms, length of riverbeds and distance to population centres. The results of this study show the variation in the riverbeds and the probable change in their denomination.

Relevance: 30.00%

Abstract:

Alzheimer's disease (AD) is the most common cause of dementia. Over the last few years, a considerable effort has been devoted to exploring new biomarkers. Nevertheless, a better understanding of brain dynamics is still required to optimize therapeutic strategies. In this regard, the characterization of mild cognitive impairment (MCI) is crucial, due to the high conversion rate from MCI to AD. However, only a few studies have focused on the analysis of magnetoencephalographic (MEG) rhythms to characterize AD and MCI. In this study, we assess the ability of several parameters derived from information theory to describe spontaneous MEG activity from 36 AD patients, 18 MCI subjects and 26 controls. Three entropies (Shannon, Tsallis and Rényi entropies), one disequilibrium measure (based on the Euclidean distance, ED) and three statistical complexities (based on the Lopez Ruiz–Mancini–Calbet complexity, LMC) were used to estimate the irregularity and statistical complexity of MEG activity. Statistically significant differences between AD patients and controls were obtained with all parameters (p < 0.01). In addition, statistically significant differences between MCI subjects and controls were achieved by ED and LMC (p < 0.05). In order to assess the diagnostic ability of the parameters, a linear discriminant analysis with a leave-one-out cross-validation procedure was applied. The accuracies reached 83.9% and 65.9% in discriminating AD and MCI subjects from controls, respectively. Our findings suggest that MCI subjects exhibit an intermediate pattern of abnormalities between normal aging and AD. Furthermore, the proposed parameters provide a new description of brain dynamics in AD and MCI.
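As a rough sketch of the irregularity and complexity measures named above (not the paper's exact estimation pipeline, which operates on MEG recordings), the three entropies and an LMC-style complexity can be computed for a discrete probability distribution:

```python
import math

def shannon(p):
    """Shannon entropy (natural log) of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def tsallis(p, q=2.0):
    """Tsallis entropy of order q."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def renyi(p, q=2.0):
    """Renyi entropy of order q."""
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

def lmc(p):
    """LMC-style statistical complexity: normalized Shannon entropy
    times the disequilibrium (squared distance to the uniform law)."""
    n = len(p)
    disequilibrium = sum((pi - 1.0 / n) ** 2 for pi in p)
    return (shannon(p) / math.log(n)) * disequilibrium

# A uniform (maximally irregular) distribution vs. a peaked one
uniform = [0.25] * 4
peaked = [0.85, 0.05, 0.05, 0.05]
```

All three entropies are highest for the uniform law, while the LMC complexity vanishes for it (zero disequilibrium), which is the contrast these measures exploit when comparing patient groups.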

Relevance: 30.00%

Abstract:

A method to study some neuronal functions, based on the use of Feynman diagrams as employed in many-body theory, is reported. An equation obtained from neuron cable theory is the basis for the method. The Green's function for this equation is obtained under some simple boundary conditions. An excitatory signal, with different conditions concerning height and pulse duration, is employed as the input signal. Different responses have been obtained.
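For reference, the textbook form of this setup for an infinite cable in dimensionless variables (a common simple boundary condition; the paper's exact conditions may differ) is:

```latex
\frac{\partial V}{\partial T} = \frac{\partial^2 V}{\partial X^2} - V,
\qquad
G(X,T) = \frac{1}{\sqrt{4\pi T}}\, e^{-T - X^2/(4T)}, \quad T > 0,
```

so that the response to an input current $I$ is the convolution
$V(X,T) = \int\!\!\int G(X - X', T - T')\, I(X', T')\, dX'\, dT'$, which is the quantity evaluated for the different excitatory pulses.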

Relevance: 30.00%

Abstract:

The use of probabilistic methods to analyse the reliability of structures is being applied to a variety of engineering problems, due to the possibility of establishing the failure probability on rational grounds. In this paper we present the application of classical reliability theory to analyse the safety of underground tunnels.
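A minimal sketch of the classical reliability computation: the failure probability of a limit state g = R − S estimated by crude Monte Carlo, with illustrative normal distributions for resistance and load (not values from the paper):

```python
import random

def failure_probability(n=100_000, seed=1):
    """Crude Monte Carlo estimate of Pf = P(R - S <= 0), the classical
    reliability measure for a limit state g = R - S; the resistance and
    load parameters below are illustrative assumptions."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        R = rng.gauss(10.0, 1.0)   # resistance: mean 10, std 1
        S = rng.gauss(6.0, 1.5)    # load effect: mean 6, std 1.5
        if R - S <= 0:
            failures += 1
    return failures / n

pf = failure_probability()
```

For this linear Gaussian limit state the exact answer is available in closed form, Pf = Φ(−β) with reliability index β = 4/√(1² + 1.5²) ≈ 2.22, i.e. Pf ≈ 0.013, which the sampling estimate approaches as n grows.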

Relevance: 30.00%

Abstract:

FBGs are excellent strain sensors because of their small size and multiplexing capability. Tens to hundreds of sensors may be embedded into a structure, as has already been demonstrated. Nevertheless, they only afford strain measurements at local points, so unless the damage affects the strain readings in a distinguishable manner, it will go undetected. Principal Component Analysis (PCA) is a multivariable analysis technique that reduces a complex data set to a lower dimension and reveals the hidden patterns that underlie it. This paper shows the experimental results obtained on the wing of a UAV, instrumented with 32 FBGs, before and after small damages were introduced. The PCA algorithm was able to distinguish the damage cases, even for small cracks.
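A minimal two-feature sketch of the PCA idea, assuming hypothetical strain readings that lie near a straight baseline; a damaged state shows up as a large residual from the principal axis (the actual study uses 32 FBG channels and full PCA, not this 2-D closed form):

```python
import math

def pca_2d(data):
    """Principal axis of 2-feature samples via the closed-form
    eigendecomposition of the 2x2 covariance matrix."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in data) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)  # principal direction
    return (mx, my), (math.cos(theta), math.sin(theta))

def residual(sample, mean, axis):
    """Distance from a sample to the principal axis (a Q-like statistic);
    large residuals flag readings inconsistent with the baseline model."""
    dx, dy = sample[0] - mean[0], sample[1] - mean[1]
    proj = dx * axis[0] + dy * axis[1]
    return math.hypot(dx - proj * axis[0], dy - proj * axis[1])

# Hypothetical baseline (undamaged) readings from two FBG channels
baseline = [(float(i), 2.0 * i + 0.1 * (-1) ** i) for i in range(10)]
mean, axis = pca_2d(baseline)
```

A reading consistent with the baseline, such as (5.0, 10.0), has a near-zero residual, while a damage-like reading such as (5.0, 14.0) stands well off the principal axis.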

Relevance: 30.00%

Abstract:

This paper presents the rationale for building up a Telematics Engineering curriculum. Telematics is a strongly computing-oriented area; hence, the authors initially intended to apply the common requirements described in the computing curricula elaborated by the ACM/IEEE-CS Joint Curriculum Task Force. This experience has revealed some problematic aspects of the ACM/IEEE-CS proposal. From the analysis of these problems, a model to guide the selection and, especially, the approach of the Telematics curriculum contents is proposed. This model can easily be generalized to other strongly computing-oriented curricula, whose number is growing every day.

Relevance: 30.00%

Abstract:

Systems Engineering (SE in the following) has not received much attention as a subject matter in engineering curricula. There are several dozen universities around the world offering programs (most of them at the graduate level) on systems science and engineering. However, SE is, per se, rarely found among the courses offered by engineering schools. This observation does not strictly mean that systems concepts are left aside. For example, it is usual to find specialized courses for systems of particular classes (e.g., courses on software systems engineering in computing curricula) or for particular phases of the system life cycle (e.g., courses on systems analysis). Even so, these kinds of courses tend to over-emphasize the importance of specific methodologies and, in consequence, to divert attention from the realm of systemness.

Relevance: 30.00%

Abstract:

According to UN projections, world population will grow to 9,200 million people between 2007 and 2050. In fact, for the first time in history, in 2008 the world urban population became larger than the rural population. The increase of urban areas and their transport infrastructures has influenced agricultural land use through irreversible change, especially when such land remains as periurban vacant land, losing its character and identity. In the Europe of the nineties, the traditional urban-rural gradient, characterized by a neat contact between both land types, became so complex that it changed into a gradient in which it is difficult to separate urban and rural land uses [Antrop 2004]. A literature review was made of the methodologies used for urban-rural gradient analysis. One of these methodologies was selected, which integrates ecological characterization based on the use of spatial metrics and geographical characterization based on spatial components. The cartographical sources used were Corine Land Cover at 1:100000 scale and the Spanish Land Use Information System at 1:25000 scale. The urban-rural gradient paradigm is an analysis methodology, coming from landscape ecology, which makes it possible to investigate how urbanization provokes changes in ecological patterns and processes in the landscape [Hahs and McDonnell 2006]. The present research adapts this methodology to study the urban-rural gradient in the outskirts of Madrid, Toledo and Guadalajara. Both scales (1:25000 and 1:100000) were used simultaneously to reach the following objectives: 1) analysis of landscape pattern dynamics in relation to distance to the town centre and major infrastructures; 2) analysis of landscape pattern dynamics in the fringe of protected areas. The paper presents a new approach to the urban-rural relationship which allows better planning and management of urban areas.

Relevance: 30.00%

Abstract:

Helium Brayton cycles have been studied as power cycles for both fission and fusion reactors, obtaining high thermal efficiency. This paper studies several technological schemes of helium Brayton cycles applied to the HiPER reactor proposal. Since HiPER integrates technologies available in the short term, its working conditions result in a very low maximum temperature of the energy sources, which limits the thermal performance of the cycle. The aim of this work is to analyze the potential of helium Brayton cycles as power cycles for HiPER. Several helium Brayton cycle configurations have been investigated with the purpose of raising the cycle thermal efficiency under the working conditions of HiPER. The effects of inter-cooling and reheating have specifically been studied. Sensitivity analyses of the key cycle parameters and component performances on the maximum thermal efficiency have also been carried out. The addition of several inter-cooling stages in a helium Brayton cycle has allowed a maximum thermal efficiency of over 36% to be obtained, and the inclusion of a reheating process may yield a further increase of nearly 1 percentage point, reaching 37%. These results confirm that helium Brayton cycles are to be considered among the power cycle candidates for HiPER.
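The cycle calculation can be sketched for a single recuperated helium Brayton loop with non-ideal turbomachinery; the temperatures, pressure ratio, component efficiencies and recuperator effectiveness below are illustrative assumptions, not HiPER design values:

```python
def recuperated_brayton_efficiency(T1, T3, r, gamma=5/3,
                                   eta_c=0.9, eta_t=0.9, eps=0.95):
    """Thermal efficiency of a recuperated helium Brayton cycle with
    isentropic compressor/turbine efficiencies eta_c/eta_t and
    recuperator effectiveness eps. Temperatures in kelvin; helium is
    treated as an ideal gas with constant cp (gamma = 5/3)."""
    k = (gamma - 1) / gamma
    T2 = T1 * (1 + (r ** k - 1) / eta_c)    # compressor outlet
    T4 = T3 * (1 - eta_t * (1 - r ** -k))   # turbine outlet
    T5 = T2 + eps * (T4 - T2)               # recuperator outlet, cold side
    q_in = T3 - T5                          # heat added per unit m*cp
    w_net = (T3 - T4) - (T2 - T1)           # net specific work per unit m*cp
    return w_net / q_in

# Low source temperature (873 K) reflecting short-term technology limits
eta = recuperated_brayton_efficiency(300.0, 873.0, 2.5)
```

With these assumed parameters the sketch lands in the mid-30s percent range, i.e. the same regime the paper reports; adding inter-cooling stages reduces compression work and is how the configurations studied push the efficiency higher.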

Relevance: 30.00%

Abstract:

In the last few years, technical debt has been used as a useful means of making visible the intrinsic cost of internal software quality weaknesses. This visibility is made possible by quantifying this cost. Specifically, technical debt is expressed in terms of two main concepts: principal and interest. The principal is the cost of eliminating or reducing the impact of a so-called technical debt item in a software system, whereas the interest is the recurring cost, over a time period, of not eliminating a technical debt item. Previous works on technical debt are mainly focused on estimating principal and interest, and on performing a cost-benefit analysis. This cost-benefit analysis allows one to determine whether removing technical debt is profitable and to prioritize which items incurring technical debt should be fixed first. Nevertheless, in these previous works technical debt is flat over time. However, the introduction of new factors to estimate technical debt may produce non-flat models that allow us to make more accurate predictions. These factors should be used to estimate principal and interest, and to perform cost-benefit analyses related to technical debt. In this paper, we take a step forward by introducing uncertainty about the interest, as well as the time frame factor, so that it becomes possible to depict a number of possible future scenarios. Estimations obtained without considering the possible evolution of the interest over time may be less accurate, as they consider simplistic scenarios without changes.
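The principal-versus-interest trade-off described above can be sketched as follows; the flat model of earlier works is recovered with zero interest growth, while a positive growth rate gives one of the non-flat scenarios argued for here (all values are hypothetical):

```python
def accumulated_interest(interest0, growth, releases):
    """Interest paid over a planning horizon of `releases` periods;
    growth = 0 recovers the flat-interest model, growth > 0 models a
    debt item whose carrying cost rises over time."""
    return sum(interest0 * (1 + growth) ** t for t in range(releases))

def worth_fixing(principal, interest0, growth, releases):
    """Cost-benefit rule: repaying the principal now is profitable when
    the interest accumulated over the horizon exceeds it."""
    return accumulated_interest(interest0, growth, releases) > principal

# Hypothetical debt item: principal 12, initial interest 2 per release
flat = accumulated_interest(2.0, 0.0, 5)     # flat scenario
rising = accumulated_interest(2.0, 0.25, 5)  # one non-flat scenario
```

Under the flat model this item is not worth fixing within five releases (10 < 12), but the rising-interest scenario flips the decision, which is exactly why modelling the evolution of interest changes the prioritization.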

Relevance: 30.00%

Abstract:

The objective of this study was to propose a multi-criteria optimization and decision-making technique to solve food engineering problems. This technique was demonstrated using experimental data obtained on the osmotic dehydration of carrot cubes in a sodium chloride solution. The Aggregating Functions Approach, the Adaptive Random Search Algorithm, and the Penalty Functions Approach were used in this study to compute the initial set of non-dominated or Pareto-optimal solutions. Multiple non-linear regression analysis was performed on a set of experimental data in order to obtain particular multi-objective functions (responses), namely water loss, solute gain, rehydration ratio, three different colour criteria of the rehydrated product, and sensory evaluation (organoleptic quality). Two multi-criteria decision-making approaches, the Analytic Hierarchy Process (AHP) and the Tabular Method (TM), were used simultaneously to choose the best alternative among the set of non-dominated solutions. The multi-criteria optimization and decision-making technique proposed in this study can facilitate the assessment of criteria weights, giving rise to a fairer, more consistent, and adequate final compromise solution or food process. This technique can be useful to food scientists in research and education, as well as to engineers involved in the improvement of a variety of food engineering processes.
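The first step, filtering the candidate solutions down to the non-dominated (Pareto-optimal) set from which AHP and TM then choose, can be sketched as follows; the objective vectors are hypothetical and all objectives are taken as minimized:

```python
def dominates(a, b):
    """a dominates b when a is no worse in every objective and strictly
    better in at least one (all objectives to be minimized)."""
    return all(x <= y for x, y in zip(a, b)) and \
           any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated solutions."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical two-objective trade-offs among four candidate processes
solutions = [(1.0, 4.0), (2.0, 2.0), (3.0, 1.0), (3.0, 3.0)]
front = pareto_front(solutions)
```

Only (3.0, 3.0) is dominated here, by (2.0, 2.0); the remaining three candidates form the trade-off set handed to the decision-making stage.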

Relevance: 30.00%

Abstract:

The present work covers the first validation efforts of the EVA Tracking System for the assessment of minimally invasive surgery (MIS) psychomotor skills. Instrument movements were recorded for 42 surgeons (4 experts, 22 residents, 16 novice medical students) and analyzed for a box trainer peg transfer task. Construct validity was established for 7/9 motion analysis parameters (MAPs). Concurrent validity was determined for 8/9 MAPs against the TrEndo Tracking System. Finally, automatic determination of surgical proficiency based on the MAPs was sought with 3 different approaches to supervised classification (LDA, SVM, ANFIS), with accuracies of 61.9%, 83.3% and 80.9%, respectively. Results reflect not only the validity of EVA for skills assessment, but also the relevance of motion analysis of instruments in the determination of surgical competence.
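The leave-one-out protocol behind such accuracy figures can be sketched with a simple nearest-centroid stand-in for the classifiers used in the study; the 1-D motion scores and group labels below are hypothetical:

```python
def loo_accuracy(samples, labels):
    """Leave-one-out cross-validation of a nearest-class-centroid
    classifier on 1-D motion-analysis scores: each sample is held out,
    centroids are fit on the rest, and the held-out sample is scored."""
    hits = 0
    for i in range(len(samples)):
        groups = {}
        for j, (s, l) in enumerate(zip(samples, labels)):
            if j != i:
                groups.setdefault(l, []).append(s)
        centroids = {l: sum(v) / len(v) for l, v in groups.items()}
        pred = min(centroids, key=lambda l: abs(samples[i] - centroids[l]))
        hits += pred == labels[i]
    return hits / len(samples)

# Hypothetical path-length scores: experts move less than novices
scores = [3.1, 2.9, 3.3, 6.8, 7.2, 7.0]
groups = ["expert"] * 3 + ["novice"] * 3
```

Because every sample serves once as the test case, the estimate uses all the data without letting any sample train its own prediction, which matters for the small expert group sizes typical of these studies.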

Relevance: 30.00%

Abstract:

Laparoscopic instrument tracking systems are a key element in image-guided interventions, which require high accuracy to be used in a real surgical scenario. In addition, these systems are a suitable option for the objective assessment of laparoscopic technical skills based on instrument motion analysis. This study presents a new approach that improves the accuracy of a previously presented system, which applies an optical pose tracking system to laparoscopic practice. A design enhancement of the artificial markers placed on the laparoscopic instrument, as well as an improvement of the calibration process, are presented as a means to achieve more accurate results. A technical evaluation has been performed in order to compare the accuracy of the previous design and the new approach. Results show a remarkable improvement in the fluctuation error throughout the measurement platform. Moreover, the accumulated distance error and the inclination error have been improved. The tilt range covered by the system is the same for both approaches, from 90° to 7.5°. The relative position error is better for the new approach, mainly at close distances to the camera system.