Abstract:
An n++-GaAs/p++-AlGaAs tunnel junction with a peak current density of 10,100 A cm^-2 is developed. This device is a tunnel junction for multijunction solar cells, grown lattice-matched on standard GaAs or Ge substrates, with the highest peak current density ever reported. The voltage drop at a current density equivalent to operation of the multijunction solar cell at up to 10,000 suns is below 5 mV. Trap-assisted tunnelling is proposed to be behind this performance, which cannot be justified by simple band-to-band tunnelling. The metal-organic vapour-phase epitaxy growth conditions, which are at the limits of the transport-limited regime, and the heavy tellurium doping levels are the proposed origins of the defects enabling trap-assisted tunnelling. The hypothesis of trap-assisted tunnelling is supported by the observed annealing behaviour of the tunnel junctions, which cannot be explained in terms of dopant diffusion or passivation. For the integration of these tunnel junctions into a triple-junction solar cell, AlGaAs barrier layers are introduced to suppress the formation of parasitic junctions, but this is found to significantly degrade the performance of the tunnel junctions. However, the annealed tunnel junctions with barrier layers still exhibit a peak current density higher than 2,500 A cm^-2 and a voltage drop at 10,000 suns of around 20 mV, which are excellent properties for tunnel junctions and mean they can serve as low-loss interconnections in multijunction solar cells working at ultra-high concentrations.
Abstract:
The failure locus, the characteristics of the stress–strain curve and the damage localization patterns were analyzed in a polypropylene nonwoven fabric under in-plane biaxial deformation. The analysis was carried out by means of a homogenization model developed within the context of the finite element method. It provides the constitutive response for a mesodomain of the fabric corresponding to the area associated with a finite element, and takes into account the main deformation and damage mechanisms observed experimentally. It was found that the failure locus in stress space was accurately predicted by the von Mises criterion, and that failure took place by localization of damage into a crack perpendicular to the main loading axis.
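The reported failure locus follows the von Mises criterion, which for a biaxial (plane) stress state reduces to an equivalent stress sqrt(s1² − s1·s2 + s2²) reaching the material strength. A minimal sketch of evaluating that criterion (the stress and strength values are illustrative, units arbitrary; this is not the paper's homogenization model):

```python
import math

def von_mises_plane_stress(s1: float, s2: float) -> float:
    """Equivalent von Mises stress for a biaxial (plane) stress state,
    given the two in-plane principal stresses s1 and s2."""
    return math.sqrt(s1**2 - s1 * s2 + s2**2)

def fails(s1: float, s2: float, sigma_y: float) -> bool:
    """Failure is predicted when the equivalent stress reaches the strength."""
    return von_mises_plane_stress(s1, s2) >= sigma_y

# Equibiaxial tension (s1 == s2) gives an equivalent stress equal to s1,
# so the failure locus passes through the point (sigma_y, sigma_y).
print(fails(30.0, 30.0, 30.0))   # equibiaxial point on the locus -> True
print(fails(10.0, -5.0, 30.0))   # well inside the ellipse -> False
```

The locus traced by `von_mises_plane_stress(s1, s2) == sigma_y` is the familiar ellipse in the (s1, s2) stress space against which the simulated failure points can be compared.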
Abstract:
The direct application of existing models for seed germination may often be inadequate in the context of ecology and forestry germination experiments, because basic model assumptions are violated and the variables available to forest managers are rarely used. In this paper, we present a method which addresses these shortcomings. The approach is illustrated through a case study of Pinus pinea L. Our findings also shed light on the role of germination in the general failure of natural regeneration in managed forests of this species. The presented technique consists of a mixed regression model based on survival analysis, in which climate and stand covariates were tested. Data for fitting the model were gathered from a 5-year germination experiment in a mature, managed P. pinea stand in the Northern Plateau of Spain, in which two different stand densities can be found. The model predictions proved to be unbiased and highly accurate when compared with the training data. Germination in P. pinea was controlled by thermal variables at stand level. At microsite level, low densities negatively affected the probability of germination. A time lag in the response was also detected. Overall, the proposed technique provides a reliable alternative for germination modelling in ecology and forestry studies by using accessible, suitable variables. The P. pinea case study highlights the importance of producing unbiased predictions. In this species, the occurrence and timing of germination suggest a very different regeneration strategy from the one assumed by forest managers until now, which may explain the high failure rate of natural regeneration in managed stands. In addition, these findings provide valuable information for the management of P. pinea under climate-change conditions.
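The paper's model is a mixed regression based on survival analysis; as a minimal, covariate-free illustration of the survival-analysis viewpoint (the monitoring data below are hypothetical, not the Pinus pinea dataset), a Kaplan-Meier estimate of the probability that a seed has not yet germinated by time t can be computed as:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survivor estimate.
    times: observation time of each seed; events: 1 if it germinated
    at that time, 0 if censored (never germinated during the trial).
    Returns a list of (t, S(t)) pairs at each germination time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, out = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)          # germinations at t
        removed = sum(1 for tt, _ in data if tt == t)    # leave the risk set
        if d > 0:
            surv *= 1.0 - d / n_at_risk
            out.append((t, surv))
        n_at_risk -= removed
        i += removed
    return out

# Hypothetical monitoring data: 6 seeds, two still ungerminated (censored) at t=5.
times  = [1, 2, 2, 3, 5, 5]
events = [1, 1, 1, 1, 0, 0]
print(kaplan_meier(times, events))   # [(1, 0.8333...), (2, 0.5), (3, 0.3333...)]
```

The regression model in the paper goes further by letting climate and stand covariates shift this survivor curve, but the estimator above shows the time-to-event structure the method is built on.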
Abstract:
Critical infrastructures support everyday activities in modern societies, facilitating the exchange of services and goods of various kinds. Their functioning is the result of the integration of diverse technologies, systems and organizations into a complex network of interconnections. The benefits of networking are accompanied by new threats and risks. In particular, because of the increased interdependency, disturbances and failures may propagate and destabilize the whole infrastructure network. This paper presents a methodology for the resilience analysis of networked systems of systems. Resilience generalizes the concept of the stability of a system around a state of equilibrium with respect to a disturbance, together with its ability to prevent, resist and recover from it. The methodology provides a tool for the analysis of off-equilibrium conditions that may occur in a single system and propagate through the network of dependencies. The analysis is conducted in two stages. The first stage is qualitative: it identifies the resilience scenarios, i.e. the sequences of events, triggered by an initial disturbance, which include failures and the system response. The second stage is quantitative: the most critical scenarios can be simulated, for the desired parameter settings, in order to check whether they are successfully handled, i.e. recovered to nominal conditions, or whether they end in network failure. The proposed methodology aims at providing effective support to resilience-informed design.
Abstract:
In recent years significant efforts have been devoted to the development of advanced data analysis tools, both to predict the occurrence of disruptions and to investigate the operational spaces of devices, with the long-term goal of advancing the understanding of the physics of these events and of preparing for ITER. On JET, the latest generation of the disruption predictor, called APODIS, has been deployed in the real-time network during the last campaigns with the new metallic wall. Even though it was trained only with discharges with the carbon wall, it has reached very good performance, with both missed alarms and false alarms of the order of a few percent (and strategies to improve the performance have already been identified). Since predicting the type of disruption is also considered very important for the optimisation of the mitigation measures, a new clustering method, based on the geodesic distance on a probabilistic manifold, has been developed. This technique allows automatic classification of an incoming disruption with a success rate better than 85%. Various other manifold-learning tools, particularly Principal Component Analysis and Self-Organising Maps, are also producing very interesting results in the comparative analysis of the JET and ASDEX Upgrade (AUG) operational spaces, on the route to developing predictors capable of extrapolating from one device to another.
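Of the manifold-learning tools mentioned, Principal Component Analysis is the easiest to sketch. The snippet below uses synthetic data standing in for an operational database (all sizes and values are hypothetical, not JET or AUG data) and extracts the dominant directions by SVD of the centered data matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for an operational database: 200 discharges
# described by 5 correlated plasma parameters driven by 2 latent factors.
latent = rng.normal(size=(200, 2))
mixing = rng.normal(size=(2, 5))
X = latent @ mixing + 0.05 * rng.normal(size=(200, 5))

# PCA via SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)   # fraction of variance per component
print(explained)                  # the first two components dominate
```

Projecting `Xc` onto the first rows of `Vt` gives the low-dimensional map of the operational space in which the two devices can be compared.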
Abstract:
The road to the automation of agricultural processes passes through the safe operation of autonomous vehicles. This requirement is already met for ground mobile units, but it is still not well defined for aerial robots (UAVs), mainly because the relevant standards and legislation are diffuse or even non-existent. Defining a common, global policy is therefore the challenge to tackle, and this characterization has to be addressed from field experience. Accordingly, this paper presents work done in this direction, based on the analysis of the most common sources of hazards when using UAVs for agricultural tasks. The work, based on the ISO 31000 standard, has been carried out by applying a three-step structure that integrates identification, assessment and reduction procedures. The paper describes how this method has been applied to analyze previous accidents and malfunctions during UAV operations in order to obtain the real causes of failure. This has made it possible to highlight common risks and hazard sources and to propose specific guards and safety measures for the agricultural context.
Abstract:
A reliability analysis method is proposed that starts with the identification of all variables involved. These are divided into five groups: (a) variables fixed by codes, such as the design values of loads and strengths and their corresponding partial safety coefficients; (b) geometric variables defining the dimensions of the main elements involved; (c) cost variables, including the possible damages caused by failure; (d) random variables, such as loads, strengths, etc.; and (e) the variables defining the statistical model, such as the family of distributions and its corresponding parameters. Once the variables are known, the Π-theorem is used to obtain a minimum equivalent set of non-dimensional variables, which is used to define the limit states. This allows a reduction in the number of variables involved and a better understanding of their coupling effects. Two minimum-cost criteria are used for selecting the project dimensions: one is based on a bounded probability of failure, and the other on the total cost, including the damages of the possible failure. Finally, the method is illustrated by means of an application.
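The Π-theorem step can be sketched numerically: the number of independent dimensionless groups equals the number of variables minus the rank of the dimensional matrix. The variables below (a point load, a span, an elastic modulus and a deflection in a beam-type problem) are a hypothetical illustration, not the paper's variable set:

```python
import numpy as np

# Columns: one variable each; rows: exponents in the base dimensions (F, L).
# Hypothetical beam-type problem: point load P [F], span L [L],
# elastic modulus E [F L^-2], deflection d [L].
dim_matrix = np.array([
    [1, 0,  1, 0],   # force exponent of P, L, E, d
    [0, 1, -2, 1],   # length exponent of P, L, E, d
])

n_vars = dim_matrix.shape[1]
rank = np.linalg.matrix_rank(dim_matrix)
n_groups = n_vars - rank   # Buckingham Pi: independent dimensionless groups
print(n_groups)            # -> 2, e.g. d/L and P/(E*L**2)
```

The null space of the dimensional matrix gives the exponents of each dimensionless group, which is how the limit states can be rewritten in a reduced set of variables.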
Abstract:
A seismic hazard study of the island of "La Hispaniola" is presented in connection with the land tenure situation in the region, in order to define high-risk priority areas for which land management recommendations are proposed. The seismic hazard assessment has been carried out following the probabilistic method with a seismogenic zonation, including the major faults of the region as independent units. In order to identify the priority areas, the study takes into account, besides the seismic hazard assessment, the map of static Coulomb failure stress changes and the landslide hazard map.
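The static Coulomb failure stress change used for such maps combines the shear and normal stress changes resolved on a receiver fault. A minimal sketch (sign convention with unclamping positive; the effective friction value 0.4 is a commonly assumed figure, not one taken from this study):

```python
def coulomb_stress_change(d_tau: float, d_sigma_n: float,
                          mu_eff: float = 0.4) -> float:
    """Change in static Coulomb failure stress on a receiver fault.
    d_tau: shear stress change in the fault slip direction (MPa);
    d_sigma_n: normal stress change, positive for unclamping (MPa);
    mu_eff: effective friction coefficient (hypothetical value)."""
    return d_tau + mu_eff * d_sigma_n

# Positive values bring the receiver fault closer to failure.
print(coulomb_stress_change(0.5, 0.3))   # 0.5 + 0.4*0.3 = 0.62 MPa
```

Mapping this quantity over a grid of receiver faults after a large earthquake highlights the regions where subsequent failure is promoted, which is the input combined here with the landslide hazard map.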
Abstract:
The use of probabilistic methods to analyse the reliability of structures is being applied to a variety of engineering problems because they make it possible to establish the failure probability on rational grounds. In this paper we present the application of classical reliability theory to analyse the safety of underground tunnels.
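Classical reliability theory expresses safety as the probability that the resistance R falls below the load effect S. A minimal Monte Carlo sketch, assuming independent normal R and S (the distributions and numbers are illustrative, not the tunnel case study), checked against the closed-form answer:

```python
import math
import random

def mc_failure_probability(mu_r, sd_r, mu_s, sd_s, n=200_000, seed=42):
    """Monte Carlo estimate of Pf = P(R - S < 0) for independent
    normally distributed resistance R and load effect S."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n)
                   if rng.gauss(mu_r, sd_r) - rng.gauss(mu_s, sd_s) < 0.0)
    return failures / n

# For normal R and S the exact answer is Phi(-beta) with
# beta = (mu_r - mu_s) / sqrt(sd_r**2 + sd_s**2).
beta = (100.0 - 70.0) / math.sqrt(8.0**2 + 6.0**2)   # = 3.0
pf_exact = 0.5 * math.erfc(beta / math.sqrt(2))      # Phi(-3), about 1.35e-3
pf_mc = mc_failure_probability(100.0, 8.0, 70.0, 6.0)
print(pf_exact, pf_mc)
```

For realistic limit-state functions with many correlated variables the closed form disappears, and sampling (or FORM/SORM-type approximations) is what makes the "rational grounds" of the abstract computable.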
Abstract:
This bachelor's thesis presents a prototype of a hybrid cross-platform mobile application for Android and iOS. Hybrid mobile applications are a combination of mobile web applications and native mobile applications. They are built partially with web technologies, and they can also access native features and sensors of the device. To the user they look like native applications, as they are downloaded from the application stores and installed on the device. The prototype consists of the migration of the financial news module of a bank's current mobile applications, reimplementing it as a hybrid application using one of the frameworks available on the market for that purpose. Developing applications in a hybrid way can help reduce cost and effort when targeting more than one platform. The goal of the project is to evaluate the advantages and disadvantages that hybrid development offers in terms of cost and effort reduction and the final result of the application. The project starts with an analysis of successfully released hybrid applications, using the examples of LinkedIn, Facebook and the Financial Times, with emphasis on the technologies used, the network data transmitted and the problems encountered during development.
This analysis is followed by a comparison of the most popular hybrid cross-platform development frameworks in terms of their approaches, supported platforms, programming languages, access to native features and licenses. The outcome of this first stage is the choice of the development framework that best fits the requirements of the project, namely PhoneGap, followed by a deeper analysis of its architecture, features and components. The next stage analyzes the company's current applications in order to extract the necessary source code and adapt it to the architecture of the prototype. For the realization of the application, the mechanism that PhoneGap offers for accessing the native layer of the device is used: plugins. A custom plugin is designed and developed to access the native layer of each targeted platform. Once the prototype is finished for Android, it is migrated and adapted to the iOS platform. Finally, the prototypes are evaluated in terms of ease and time of development, performance, functionality, and look and feel.
Abstract:
We analyze the gain-switching dynamics of two-section tapered lasers by means of a simplified three-rate-equation model. The goal is to improve the understanding of the underlying physics and to optimize the device geometry to achieve high-power, short-duration optical pulses.
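The paper's three-rate-equation model for the two-section device is not reproduced here, but the gain-switching mechanism itself can be illustrated with a generic dimensionless two-equation (class-B) rate-equation sketch, with all parameter values hypothetical:

```python
import math

def gain_switch(P=4.0, A=50.0, dt=5e-4, steps=40_000):
    """Forward-Euler integration of dimensionless class-B rate equations:
        dn/dt = P - n - n*s        (carriers: pump, decay, stimulated emission)
        ds/dt = A * s * (n - 1)    (photons: net gain above threshold n = 1)
    The pump P is switched on at t = 0 with the photon density near zero,
    so the carriers overshoot threshold and a short, intense pulse forms."""
    n, s = 0.0, 1e-6
    s_hist = []
    for _ in range(steps):
        dn = (P - n - n * s) * dt
        ds = A * s * (n - 1.0) * dt
        n, s = n + dn, s + ds
        s_hist.append(s)
    return s_hist

s_hist = gain_switch()
s_steady = 4.0 - 1.0    # steady-state photon density for P = 4 is P - 1
print(max(s_hist) > s_steady)   # the gain-switched spike overshoots steady state
```

The overshoot of the photon density above its steady-state value is the signature of gain switching; in the paper's three-equation model the second carrier reservoir of the tapered section modifies the pulse shape and energy.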
Abstract:
Case-based reasoning (CBR) is a unique tool for the evaluation of possible failure of firms (EOPFOF) because of its ease of interpretation and implementation. Ensemble computing, a variation of group decision-making in society, provides a potential means of improving the predictive performance of CBR-based EOPFOF. This research aims to integrate bagging and proportion case-basing with CBR to generate a proportion-bagging CBR method for EOPFOF. Diverse multiple case bases are first produced by multiple case-basing, in which a volume parameter is introduced to control the size of each case base. Then, the classic case retrieval algorithm is implemented to generate diverse member CBR predictors. Majority voting, the most frequently used mechanism in ensemble computing, is finally used to aggregate the outputs of the member CBR predictors in order to produce the final prediction of the CBR ensemble. In an empirical experiment, we statistically validated the results of the CBR ensemble from multiple case bases by comparing them with those of multivariate discriminant analysis, logistic regression, classic CBR, the best member CBR predictor and a bagging CBR ensemble. The results for Chinese EOPFOF three years prior to failure indicate that the new CBR ensemble, which significantly improved CBR's predictive ability, outperformed all the comparative methods.
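A minimal sketch of the bagging-CBR idea described above: bootstrap case bases whose size is controlled by a volume parameter, a classic nearest-neighbour retrieval as the member predictor, and majority voting for aggregation. All data, parameter values and the distance measure are hypothetical stand-ins, not the paper's configuration:

```python
import random
from collections import Counter

def retrieve_predict(case_base, query, k=3):
    """Classic CBR retrieval: the k nearest cases by squared Euclidean
    distance; the prediction is the majority label among them."""
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(c[0], query))
    nearest = sorted(case_base, key=dist)[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

def bagging_cbr(cases, query, n_members=25, volume=1.0, seed=7):
    """Bagging CBR: each member retrieves from a bootstrap case base whose
    size is set by the volume parameter; member outputs are aggregated by
    majority voting."""
    rng = random.Random(seed)
    size = max(1, int(volume * len(cases)))
    votes = [retrieve_predict([rng.choice(cases) for _ in range(size)], query)
             for _ in range(n_members)]
    return Counter(votes).most_common(1)[0][0]

# Toy firm data: (features, label) with 1 = failure, 0 = healthy.
cases = [((0.10, 0.20), 0), ((0.20, 0.10), 0), ((0.15, 0.25), 0),
         ((0.90, 0.80), 1), ((0.80, 0.90), 1), ((0.85, 0.95), 1)]
print(bagging_cbr(cases, (0.88, 0.90)))   # query near the failure cluster
```

Varying `volume` below 1.0 yields the smaller, more diverse case bases that the abstract's proportion case-basing is meant to produce.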
Abstract:
This paper summarizes the research activities on the behaviour of concrete and concrete structures subjected to blast loading carried out by the Department of Materials Science of the Technical University of Madrid (UPM). These activities comprise the design and construction of a test bench that allows testing up to four planar concrete specimens with a single explosion, the study of the performance of different protection concepts for concrete structures and, finally, the development of a numerical model for the simulation of concrete structural elements subjected to blast. To date, six different types of concrete have been studied, from plain normal-strength concrete to high-strength concrete, including fibre-reinforced concretes with different types of fibres. The numerical model is based on the Cohesive Crack Model approach, and has been developed for the LS-DYNA finite element code through a user-programmed subroutine. Despite its simplicity, the model is able to predict the failure patterns of the tested concrete slabs with a high level of accuracy.
Abstract:
The optical and radio-frequency spectra of a monolithic master-oscillator power-amplifier emitting at 1.5 µm have been analyzed over a wide range of steady-state injection conditions. The analysis of the spectral maps reveals that, under low injection current of the master oscillator, the device operates in two essentially different operation modes depending on the current injected into the amplifier section. The regular operation mode, with predominance of the master oscillator, alternates with lasing of the compound-cavity modes allowed by the residual reflectance of the amplifier front facet. The quasi-periodic occurrence of these two regimes as a function of the amplifier current has been consistently interpreted in terms of a thermally tuned competition between the modes of the master oscillator and the compound-cavity modes.
Abstract:
Virtual testing of composite materials has emerged as a new concept within the aerospace industry. It has a very large potential to reduce the large certification costs and the long development times associated with experimental campaigns, which involve the testing of a large number of panels, sub-components and components. The aim of virtual testing is to replace some experimental tests by high-fidelity numerical simulations. This work is a contribution to the multiscale approach developed at Institute IMDEA Materials to predict the mechanical behavior of a composite laminate from the properties of the ply and the interply. Continuum Damage Mechanics (CDM) formulates intraply damage at the material constitutive level. Intraply CDM is combined with cohesive elements to model interply damage. A CDM model was developed, implemented and applied to simple mechanical tests of laminates: low- and high-velocity impact, tension of coupons, and shear deformation. The analysis of the results and the comparison with experiments indicated that the performance was reasonably good for the impact tests, but insufficient in the other cases. To overcome the limitations of CDM, the kinematics of the discrete finite element approximation was enhanced to include mesh-embedded discontinuities: the eXtended Finite Element Method (X-FEM). The X-FEM was adapted to an explicit time integration scheme and was able to reproduce qualitatively the physical failure mechanisms in a composite laminate. However, the results revealed an inconsistency in the formulation that leads to erroneous quantitative results. Finally, the traditional X-FEM was reviewed, and a new method was developed to overcome its limitations: the stable cohesive X-FEM.
The properties of the new method were studied in detail, and it was demonstrated that the new method is robust and can be implemented in an explicit finite element formulation, providing a new tool for damage simulation in composite materials.