929 results for System reliability index
Resumo:
This article presents the findings of a field study that was non-experimental, observational, correlational, basic, mixed-data and micro-sociological, based on surveys. The object of study is to identify learning types, and the unit of analysis was 529 high school students between 16 and 21 years old. Its purpose is to understand the impact of rote, guided, self-directed and meaningful learning and its degree of achievement, as well as the learning outcomes of a differentiated curriculum based on David Ausubel's ideas, associated with the different economic specialties (MINEDUC, 1998) in which the study population is trained. To collect data, the TADA-DO2 test was used, which has a Cronbach reliability index of 0.911. From the results, it can be stated that there is a significant association (α = 0.05) between the learning types and the expected learning outcomes of the differentiated training plan, for both males and females. It is difficult to state that the training of middle-level technicians leads to successful employment.
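For context, the Cronbach reliability index cited above is computed from an item-score matrix as α = k/(k−1) · (1 − Σ var(item)/var(total)). A minimal sketch with toy data follows; the actual TADA-DO2 scores are not available here, so the numbers are illustrative only:

```python
import statistics

def cronbach_alpha(scores: list[list[float]]) -> float:
    """Cronbach's alpha for rows = respondents, columns = items."""
    k = len(scores[0])
    items = list(zip(*scores))                       # column-wise item scores
    item_var = sum(statistics.variance(col) for col in items)
    total_var = statistics.variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_var / total_var)

# Toy data: 6 respondents, 4 items (illustrative; not the TADA-DO2 data).
scores = [
    [3, 4, 3, 4],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
    [1, 2, 1, 2],
    [5, 4, 5, 5],
]
print(round(cronbach_alpha(scores), 3))
```

Values close to 1 indicate high internal consistency of the test items; 0.911 is therefore a strong reliability figure.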
Resumo:
In recent decades, there has been increasing interest in systems comprised of several autonomous mobile robots, and as a result there has been substantial development in the field of Artificial Intelligence, especially in Robotics. Several studies in the literature from the scientific community focus on the creation of intelligent machines and devices capable of imitating the functions and movements of living beings. Multi-Robot Systems (MRS) can often deal with tasks that are difficult, if not impossible, to accomplish with a single robot. In the context of MRS, one of the main challenges is the need to control, coordinate and synchronize the operation of multiple robots to perform a specific task. This requires the development of new strategies and methods which allow us to obtain the desired system behavior in a formal and concise way. This PhD thesis studies the coordination of multi-robot systems and, in particular, addresses the problem of the distribution of heterogeneous multi-tasks. The main interest in these systems is to understand how, from simple rules inspired by the division of labor in social insects, a group of robots can perform tasks in an organized and coordinated way. We are mainly interested in truly distributed or decentralized solutions in which the robots themselves, autonomously and individually, select a particular task so that all tasks are optimally distributed. In general, to distribute the multi-tasks among a team of robots, the robots have to synchronize their actions and exchange information. Under this approach we can speak of multi-task selection instead of multi-task assignment, meaning that the agents or robots select the tasks instead of being assigned a task by a central controller. The key element in these algorithms is the estimation of the stimuli and the adaptive update of the thresholds. 
This means that each robot performs this estimate locally, depending on the load or the number of pending tasks to be performed. In addition, the evaluation of the results of each approach is of particular interest: the results are compared after introducing noise into the number of pending loads, in order to simulate the robots' error in estimating the real number of pending tasks. The main contribution of this thesis lies in the approach based on self-organization and division of labor in social insects. An experimental scenario for the coordination problem among multiple robots, the robustness of the approaches and the generation of dynamic tasks are presented and discussed. The particular issues studied are: Threshold models: experiments conducted to test the response threshold model, with the objective of analyzing the system performance index for the problem of the distribution of heterogeneous multi-tasks in multi-robot systems; additive noise in the number of pending loads has also been introduced and dynamic tasks have been generated over time. Learning automata methods: experiments to test the learning automata-based probabilistic algorithms. The approach was tested to evaluate the system performance index with additive noise and with dynamic task generation for the same problem of the distribution of heterogeneous multi-tasks in multi-robot systems. Ant colony optimization: the goal of the experiments presented is to test the ant colony optimization-based deterministic algorithms to achieve the distribution of heterogeneous multi-tasks in multi-robot systems. In the experiments performed, the system performance index is evaluated by introducing additive noise and dynamic task generation over time.
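As a rough illustration of the response threshold model mentioned above, the sketch below shows how a robot's probability of engaging a task grows with the task's pending load. The stimulus and threshold values are hypothetical; this is not the thesis's code, only a minimal sketch of the classic fixed-threshold rule:

```python
import random

# Sketch of a fixed response threshold model (in the spirit of division of
# labor in social insects); stimulus and threshold values are hypothetical.
def response_probability(stimulus: float, threshold: float, n: int = 2) -> float:
    """Probability that a robot engages a task: s^n / (s^n + theta^n)."""
    return stimulus**n / (stimulus**n + threshold**n)

def select_task(pending, thresholds):
    """One robot's decision step: stimulus grows with a task's pending load."""
    for task, load in pending.items():
        if random.random() < response_probability(load, thresholds[task]):
            return task      # the robot selects this task
    return None              # the robot stays idle this step

random.seed(0)
pending = {"transport": 8.0, "clean": 1.0}       # pending load per task
thresholds = {"transport": 2.0, "clean": 2.0}    # identical thresholds
choices = [select_task(pending, thresholds) for _ in range(1000)]
# The heavily loaded task attracts far more of the robot's decision steps.
print(choices.count("transport") > choices.count("clean"))
```

Adaptive threshold variants, as studied in the thesis, would additionally lower a robot's threshold for tasks it performs and raise it for tasks it does not.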
Resumo:
A tunnel face may collapse if the applied support pressure is lower than a limit value called the 'critical' or 'collapse' pressure. In this work, an advanced rotational failure mechanism generated point-by-point is developed to compute the collapse pressure for tunnel faces in layered (or stratified) grounds or in materials that follow a non-linear failure criterion. The proposed solution is an upper-bound solution in the framework of limit analysis which extends the most advanced face failure mechanism in the literature. The excavation of a tunnel in a layered ground, or in materials with a non-linear failure criterion, leads to a spatial variability of the strength properties. Because of this, the rotational mechanism recently proposed by Mollon et al. (2011b) for Mohr-Coulomb soils is generalized so that it can consider local values of the friction angle and of the cohesion. For layered soils, the mechanism is extended to consider the possibility of partial collapse; the proposed methodology is the first solution with a partial collapse mechanism that can fit the stratification. Similarly, the use of a non-linear failure criterion requires introducing, as additional optimization variables, the distribution of normal stresses along the failure surface. A 3D numerical model is employed to validate the predictions of the limit analysis mechanism, demonstrating that it provides, with a significantly reduced computational effort, good predictions of the critical pressure, of the type of collapse (global or partial) in layered soils, and of its geometry. The validated mechanism is then employed to conduct parametric studies of the influence of several geometrical and mechanical parameters on the face stability of tunnels in layered soils. It is also employed to develop simple design charts that provide the face collapse pressure of tunnels driven by TBM in low-quality rock masses and to study the influence of the construction method on face stability. Furthermore, a reliability analysis of the stability of a tunnel face driven in a highly fractured rock mass is performed, analyzing how different assumptions about distribution types and correlation structures affect the reliability results. In addition, the sensitivity of the reliability index to changes in the random variables is studied, identifying the most relevant variables for engineering design. Finally, an experimental study is carried out using a small-scale laboratory model. The model represents half the tunnel, cut vertically through the tunnel axis, so that displacements of soil particles can be recorded by a digital image correlation technique. The tests were performed with dry sand, and displacements are controlled by a piston that supports the soil. The results of the model tests are compared with the predictions of the limit analysis mechanism, and a reasonable agreement, according to the literature, is obtained both between the shapes of the failure surfaces and between the observed and computed collapse pressures.
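As a loose illustration of the kind of reliability analysis described above, the Monte Carlo sketch below estimates a probability of face failure and the corresponding reliability index. The limit-state function, distributions, and coefficients are purely hypothetical stand-ins, not those of the thesis:

```python
import random
import statistics

# Monte Carlo sketch of a face-stability reliability analysis. The limit
# state g = (applied support pressure) - (collapse pressure) is a made-up
# linear stand-in; distributions and coefficients are illustrative only.
random.seed(42)

def g(phi_deg: float, c_kpa: float) -> float:
    """Hypothetical limit state: g > 0 means the face is stable."""
    applied_pressure = 30.0                               # kPa, assumed
    collapse_pressure = 65.0 - 1.2 * phi_deg - 0.9 * c_kpa
    return applied_pressure - collapse_pressure

n = 100_000
failures = sum(
    1
    for _ in range(n)
    if g(random.gauss(30.0, 3.0), random.gauss(10.0, 2.0)) < 0
)
pf = failures / n                              # probability of failure
beta = -statistics.NormalDist().inv_cdf(pf)    # reliability index
print(f"pf = {pf:.4f}, beta = {beta:.2f}")
```

A sensitivity study such as the one in the thesis would repeat this while perturbing each random variable's distribution to see which one moves beta the most.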
Resumo:
The construction method using load-bearing precast concrete wall panels is economically viable but relatively new on the Brazilian national scene, especially in the case of precast elements. The uncertainties regarding the peculiarities of this method, as well as the new Brazilian standard for precast panels, still under development, motivate a probabilistic analysis of the available design criteria. Using structural reliability techniques, it is possible to propagate the uncertainties of the variables into a final response, the reliability index, in a fully probabilistic calculation. In this work, this technique is applied, with statistical information on cast-in-place concrete slabs, to verify more realistically the safety of the design criteria imposed by the Precast Concrete Institute Design Handbook - Precast and Prestressed Concrete - 7th Edition (2010) for the transient phases (demolding, transport and lifting) and by the Brazilian standard ABNT NBR 6118:2014 - Design of concrete structures for the in-service phase. The work then proceeds to a critical analysis of the results, as well as suggestions for reducing their variability, mainly through the calibration of new partial safety factors, a process for which this work can serve as a basis.
Resumo:
The use of microprocessor-based systems is gaining importance in application domains where safety is a must. For this reason, there is a growing concern about the mitigation of SEU and SET effects. This paper presents a new hybrid technique aimed at protecting both the data and the control flow of embedded applications running on microprocessors. On the one hand, the approach is based on software redundancy techniques for correcting errors produced in the data. On the other hand, control-flow errors can be detected by reusing the on-chip debug interface present in most modern microprocessors. Experimental results show an important increase in system reliability, of more than two orders of magnitude, in terms of mitigation of both SEUs and SETs. Furthermore, the overheads incurred by our technique are perfectly acceptable in low-cost systems.
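One classic software redundancy technique for protecting data against SEUs is variable triplication with majority voting. The sketch below illustrates only that general idea; the paper's actual hybrid technique, including its reuse of the on-chip debug interface for control-flow checking, is not reproduced here:

```python
from collections import Counter

# Sketch of data protection by triplication with majority voting; a single
# corrupted copy (a simulated SEU) is masked on read. Illustrative only.
class TriplicatedInt:
    """Store three copies of a value; read back by majority vote."""
    def __init__(self, value: int):
        self.copies = [value, value, value]

    def write(self, value: int) -> None:
        self.copies = [value, value, value]

    def read(self) -> int:
        # The majority value wins, masking a single upset copy.
        return Counter(self.copies).most_common(1)[0][0]

x = TriplicatedInt(7)
x.copies[1] ^= 0b100   # simulate a bit flip (SEU) in one copy
print(x.read())        # majority vote still recovers the stored value
```

In real hardened software the voter would also scrub (rewrite) the corrupted copy, so that a second upset cannot outvote the correct value later.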
Resumo:
"The Bestool system : subject-index for a private library. Second edition": p. [17]-63.
Resumo:
The worldwide trend toward the deregulation of the electricity generation and transmission industries has led to dramatic changes in system operation and planning procedures. The optimum approach to transmission-expansion planning in a deregulated environment is an open problem, especially when the responsibilities of the organisations carrying out the planning work need to be addressed. To date, there is a consensus that the system operator and network manager perform the expansion planning work in a centralised way. However, with increasing input from the electricity market, the objectives, constraints and approaches toward transmission planning should be carefully designed to ensure system reliability as well as to meet market requirements. A market-oriented approach for transmission planning in a deregulated environment is proposed. Case studies using the IEEE 14-bus system and the Australian national electricity market grid are performed. In addition, the proposed method is compared with a traditional planning method to further verify its effectiveness.
Resumo:
Market administrators hold the vital role of maintaining sufficient generation capacity in their respective electricity markets. However, without the jurisdiction to dictate the types, locations and timing of new generation, the reliability of the system may be compromised by the delayed entry of new generation. This paper illustrates a new generation-investment methodology that can effectively present the expected returns from the pool market, while concurrently searching for the type and placement of a new generator to fulfil system reliability requirements.
Resumo:
This thesis examines experimentally options for optical fibre transmission over oceanic distances. Its format follows the chronological evolution of ultra-long haul optical systems, commencing with opto-electronic regenerators as repeaters, progressing to optically amplified NRZ systems and finally solitonic propagation. In each case recirculating loop techniques are deployed to simplify the transmission experiments. Advances in high speed electronics have allowed regenerators operating at 10 Gbit/s to become a practical reality. By augmenting such devices with optical amplifiers it is possible to greatly enhance the repeater spacing. Work detailed in this thesis has culminated in the propagation of 10 Gbit/s data over 400,000 km with a repeater spacing of 160 km. System reliability and robustness are enhanced by the use of a directly modulated DFB laser transmitter and total insensitivity of the system to the signal state of polarisation. Optically amplified ultra-long haul NRZ systems have taken on particular importance with the impending deployment of TAT 12/13 and TPC 5. The performance of these systems is demonstrated to be primarily limited by analogue impairments such as the accumulation of amplifier noise, polarisation effects and optical non-linearities. These degradations may be reduced by the use of appropriate dispersion maps and by scrambling the transmitted state of signal polarisation. A novel high speed optically passive polarisation scrambler is detailed for the first time. At bit rates in excess of 10 Gbit/s it is shown that these systems are severely limited and do not offer the advantages that might be expected over regenerated links. Propagation using solitons as the data bits appears particularly attractive since the dispersive and non-linear effects of the fibre allow distortion free transmission. However, the generation of pure solitons is difficult but must be achieved if the uncontrolled transmission distance is to be maximised. 
This thesis presents a new technique for the stabilisation of an erbium fibre ring laser that has allowed propagation of 2.5 Gbit/s solitons to the theoretical limit of ~18,000 km. At higher bit rates temporal jitter becomes a significant impairment and, to allow an increase in the aggregate line rate, multiplexing in both time and polarisation domains has been proposed. These techniques are shown to be of only limited benefit in practical systems, and ultimately some form of soliton transmission control is required. The thesis demonstrates synchronous retiming by amplitude modulation that has allowed 20 Gbit/s data to propagate 125,000 km error free with an amplifier spacing approaching the soliton period. Ultimately the speed of operation of such systems is limited by the electronics used and, thus, a new form of soliton control is demonstrated using all-optical techniques to achieve synchronous phase modulation.
Resumo:
We report on the operational parameters that are required to fabricate buried, microstructured waveguides (WGs) in a z-cut lithium niobate crystal by the method of direct femtosecond laser inscription using a high-repetition-rate, chirped-pulse oscillator system. Refractive index contrasts as high as −0.0127 have been achieved for individual modification tracks. The results pave the way for developing microstructured WGs with low-loss operation across a wide spectral range, extending into the mid-infrared region up to the end of the transparency range of the host material.
Resumo:
We review our recent work on the numerical design and optimisation of buried, micro-structured waveguides (WGs) that can be formed in a lithium niobate (LiNbO3) crystal by the method of direct femtosecond laser inscription. We also report on the possibility of fabricating such WGs using a high-repetition-rate, chirped-pulse oscillator system. Refractive index contrasts as high as -0.0127 have been achieved for individual modification tracks. The results pave the way for developing micro-structured WGs with low-loss operation across a wide spectral range, extending into the mid-infrared region up to the end of the transparency range of the host material. © 2014 IEEE.
Resumo:
Long-term foetal surveillance is often recommended. Hence, fully non-invasive acoustic recording through the maternal abdomen represents a valuable alternative to ultrasonic cardiotocography. Unfortunately, the recorded heart sound signal is heavily loaded with noise, so the determination of the foetal heart rate raises serious signal processing issues. In this paper, we present a new algorithm for foetal heart rate estimation from foetal phonocardiographic recordings. Filtering is employed as a first step of the algorithm to reduce the background noise. A block for enhancing the first heart sounds is then used to further reduce the other components of the foetal heart sound signal. A complex logic block, guided by a number of rules concerning foetal heart beat regularity, is proposed as a subsequent block for the detection of the most probable first heart sounds among several candidates. A final block is used for exact first heart sound timing and, in turn, foetal heart rate estimation. The filtering and enhancing blocks are implemented by means of different techniques, so that different processing paths are proposed. Furthermore, a reliability index is introduced to quantify the consistency of the estimated foetal heart rate and, based on statistical parameters, a software quality index is designed to indicate the most reliable analysis procedure (that is, the one that combines the best processing path with the most accurate time mark of the first heart sound and provides the lowest estimation errors). The algorithm's performance has been tested on phonocardiographic signals recorded in a local private gynaecology practice from a sample group of about 50 pregnant women. The phonocardiographic signals were recorded simultaneously with ultrasonic cardiotocographic signals in order to compare the two foetal heart rate series (the one estimated by our algorithm and the one provided by the cardiotocographic device). 
Our results show that the proposed algorithm, and some analysis procedures in particular, provides reliable foetal heart rate signals, very close to the reference cardiotocographic recordings. © 2010 Elsevier Ltd. All rights reserved.
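As a simplified illustration of the final estimation step, the sketch below derives a beat-to-beat foetal heart rate from detected first heart sound (S1) times and computes a crude regularity-based reliability figure. The timestamps and the index definition are illustrative, not those of the paper:

```python
# Sketch of foetal heart rate (FHR) estimation from detected S1 times,
# with a simple plausibility-based reliability figure. Illustrative only.
def fhr_from_s1_times(s1_times_s: list[float]) -> tuple[list[float], float]:
    """Return beat-to-beat FHR (bpm) and the fraction of plausible beats."""
    intervals = [b - a for a, b in zip(s1_times_s, s1_times_s[1:])]
    fhr = [60.0 / dt for dt in intervals]
    # A beat is "plausible" if it falls in a physiological foetal range.
    plausible = [110.0 <= bpm <= 180.0 for bpm in fhr]
    reliability = sum(plausible) / len(plausible)
    return fhr, reliability

# Simulated S1 detections: ~140 bpm, with one spurious (noise) detection
# near t = 1.07 s that splits a beat into two implausibly short intervals.
s1 = [0.00, 0.43, 0.86, 1.07, 1.29, 1.72, 2.15]
fhr, rel = fhr_from_s1_times(s1)
print(round(rel, 2))
```

The paper's logic block goes further: rather than just flagging implausible beats, it uses regularity rules to choose the most probable S1 among candidates before the rate is computed.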
Resumo:
The high cost of batteries has led to investigations into using second-life ex-transportation batteries for grid support applications. Vehicle manufacturers currently all have different specifications for battery chemistry, arrangement of cells, capacity and voltage. With anticipated new developments in battery chemistry, which could also affect these parameters, there are as yet no standards defining parameters in second-life applications. To overcome issues relating to sizing, and to prevent future obsolescence of the rest of the energy storage system, a cascaded topology with an operating-envelope design approach has been used to connect modules together. This topology offers advantages in terms of system reliability. The design methodology is validated through a set of experimental results, resulting in surface maps of converter operation (efficiency and inductor ripple current). The use of a pre-defined module operating envelope also offers advantages for developing new operational strategies, both for hybrid battery energy systems and for hybrid systems including other energy sources such as solar power.
Resumo:
The Three-Layer distributed mediation architecture, designed by the Secure System Architecture laboratory, employs a layered framework of presence, integration, and homogenization mediators. The architecture does not have any central component that may affect system reliability. A distributed search technique was adapted in the system to increase its reliability. An Enhanced Chord-like algorithm (E-Chord) was designed and deployed in the integration layer. The E-Chord is a skip-list algorithm based on a Distributed Hash Table (DHT), which is a distributed but structured architecture. A DHT is distributed in the sense that no central unit is required to maintain indexes, and it is structured in the sense that indexes are distributed over the nodes in a systematic manner. Each node maintains three kinds of routing information: a frequency list, a successor/predecessor list, and a finger table. No node in the system maintains all indexes, and each node knows about some other nodes in the system. These nodes, also called composer mediators, are connected in a P2P fashion. 
A special composer mediator called the global mediator initiates the keyword-based matching decomposition of the request using the E-Chord. It generates an Integrated Data Structure Graph (IDSG) on the fly, creates association and dependency relations between nodes in the IDSG, and then generates a Global IDSG (GIDSG). The GIDSG is a plan that guides the global mediator in integrating the data. It is also used to stream data from the mediators in the homogenization layer, which are connected to the data sources. The connectors start sending data to the global mediator just after the global mediator creates the GIDSG and just before it sends the answer to the presence mediator. 
Using the E-Chord and the GIDSG made the mediation system more scalable than using a central global schema repository, since all the composers in the integration layer are capable of handling and routing requests. Also, when a composer fails, it only minimally affects the entire mediation system.
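A minimal sketch of Chord-style finger-table routing, in the spirit of (but much simpler than) the E-Chord described above; the identifier space and ring membership are illustrative, and the E-Chord's frequency lists and other state are not modeled:

```python
# Sketch of Chord-style DHT routing state. Each node's finger table points
# at successor(node + 2**i), giving O(log N) routing without a central index.
M = 4                      # identifier space: 2**M slots (0..15)
NODES = [1, 4, 9, 12]      # node identifiers on the ring, sorted

def successor(key: int) -> int:
    """First node clockwise from key on the identifier ring."""
    k = key % (2**M)
    for n in NODES:
        if n >= k:
            return n
    return NODES[0]        # wrap around past the top of the ring

def finger_table(node: int) -> list[int]:
    """Entry i is the node responsible for identifier node + 2**i."""
    return [successor(node + 2**i) for i in range(M)]

print(finger_table(1))     # each node knows only a few other nodes
print(successor(10))       # key 10 is stored on node 12
```

This is why no node needs all the indexes: a lookup hops through finger-table entries, roughly halving the remaining ring distance at each step, and the loss of one node perturbs only its neighbours' routing state.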