840 results for operational reliability


Relevance:

20.00%

Publisher:

Abstract:

The assessment of the RAMS (Reliability, Availability, Maintainability and Safety) performance of a system generally includes the evaluation of the "importance" of its components and/or of the basic parameters of the model through the use of Importance Measures. The analytical equations proposed in this study allow the estimation of the first-order Differential Importance Measure on the basis of the Birnbaum measures of the components, under the hypothesis of uniform percentage changes of the parameters. Aging phenomena are introduced into the model by assuming exponential-linear or Weibull distributions for the failure probabilities. An algorithm based on a combination of Monte Carlo simulation and Cellular Automata is applied to evaluate the performance of a networked system made up of source nodes, user nodes and directed edges subject to failure and repair. Importance Sampling techniques are used to estimate the first- and total-order Differential Importance Measures through a single simulation of the system's "operational life". All the output variables are computed simultaneously on the basis of the same sequence of involved components, event types (failure or repair) and transition times. The failure/repair probabilities are forced to be the same for all components; the transition times are sampled either from the unbiased probability distributions or from biased ones, for instance to ensure the occurrence of at least one failure within the system's operational life. The algorithm allows for different types of maintenance actions: corrective maintenance, performed either immediately upon component failure or, for hidden failures that are not detected until an inspection, upon finding that the component has failed; and preventive maintenance, performed at a fixed interval. A restoration factor can be used to determine the age of the component after a repair or any other maintenance action.
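For illustration, the sketch below estimates by Monte Carlo the Birnbaum measures on which the Differential Importance Measure builds, for a toy three-component system; the structure function, Weibull parameters, and sample size are invented for the example and are not taken from the study.

```python
import math
import random

def system_works(state):
    # Toy structure: component 0 in series with the parallel pair (1, 2).
    return state[0] and (state[1] or state[2])

def weibull_failure_prob(scale, shape, t):
    # Probability that a Weibull(scale, shape) lifetime ends before time t.
    return 1.0 - math.exp(-((t / scale) ** shape))

def birnbaum_mc(q, n_samples=100_000, rng=random.Random(42)):
    """Monte Carlo estimate of the Birnbaum importance of each component.

    q[i] is the failure probability of component i at the mission time;
    I_B(i) = P(system up | i up) - P(system up | i down).
    """
    importance = []
    for i in range(len(q)):
        up = down = 0
        for _ in range(n_samples):
            state = [rng.random() > qj for qj in q]
            state[i] = True
            up += system_works(state)
            state[i] = False
            down += system_works(state)
        importance.append((up - down) / n_samples)
    return importance

# Mission time t = 1000 h, hypothetical Weibull aging parameters per component.
q = [weibull_failure_prob(2000, 1.5, 1000),
     weibull_failure_prob(1500, 1.2, 1000),
     weibull_failure_prob(2500, 0.9, 1000)]
print(birnbaum_mc(q))
```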

Relevance:

20.00%

Publisher:

Abstract:

Slender and lighter footbridges are becoming more and more popular to meet the transportation demand and the aesthetic requirements of modern society. The widespread presence of such structures has become possible thanks to the availability of new lightweight materials that are still capable of carrying heavy loads. These kinds of structures are therefore particularly sensitive to vibration serviceability problems, especially those induced by human activities. As a consequence, it has become imperative to study the dynamic behaviour of such slender pedestrian bridges in order to define their modal characteristics. As an alternative to a Finite Element Analysis to find natural frequencies, damping and mode shapes, the so-called Operational Modal Analysis is a valid tool to obtain these parameters through an ambient vibration test. This work provides a useful insight into the Operational Modal Analysis technique and reports the investigation of the CEME Skywalk, a pedestrian bridge located at the University of British Columbia in Vancouver, Canada. Furthermore, human-induced vibration tests have been performed, and the dynamic characteristics derived from these tests have been compared with those from the ambient vibration tests. The effect of the dynamic properties of the two buildings supporting the CEME Skywalk on the dynamic behaviour of the bridge has also been investigated.
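As a minimal illustration of the idea behind an ambient vibration test, the sketch below picks natural frequencies from the power spectral density of a synthetic acceleration record; this is simple peak picking, not the specific Operational Modal Analysis algorithm used in the work.

```python
import numpy as np
from scipy.signal import welch, find_peaks

def pick_natural_frequencies(accel, fs, n_peaks=3):
    """Estimate natural frequencies from an ambient vibration record
    by peak picking on the Welch power spectral density."""
    freqs, psd = welch(accel, fs=fs, nperseg=4096)
    peaks, props = find_peaks(psd, height=psd.max() * 0.05)
    # Keep the n_peaks most prominent spectral peaks, sorted by frequency.
    order = np.argsort(props["peak_heights"])[::-1][:n_peaks]
    return np.sort(freqs[peaks[order]])

# Synthetic 10-minute record: two modes at 2.1 Hz and 5.4 Hz buried in noise.
fs = 200.0
t = np.arange(0, 600, 1 / fs)
accel = (np.sin(2 * np.pi * 2.1 * t) + 0.5 * np.sin(2 * np.pi * 5.4 * t)
         + np.random.default_rng(0).normal(0, 1.0, t.size))
print(pick_natural_frequencies(accel, fs, n_peaks=2))
```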

Relevance:

20.00%

Publisher:

Abstract:

Next-generation electronic devices have to guarantee high performance while being less power-consuming and highly reliable, for application domains ranging from entertainment to business. In this context, multicore platforms have proven to be the most efficient design choice, but new challenges have to be faced. The ever-increasing miniaturization of components produces unexpected variations in technological parameters as well as wear-out, characterized by soft and hard errors. Although hardware techniques, which lend themselves to application at design time, have been studied with the objective of mitigating these effects, they are not sufficient; adaptive software techniques are therefore necessary. In this thesis we focus on multicore task allocation strategies that minimize energy consumption while meeting performance constraints. We first devise a technique based on an Integer Linear Programming (ILP) formulation, which provides the optimal solution but cannot be applied online because its algorithm is too time-consuming; we then propose a two-step sub-optimal technique that can be applied online. We demonstrate the effectiveness of the latter solution through an exhaustive comparison against the optimal solution, state-of-the-art policies, and variability-agnostic task allocations, by running multimedia applications on the virtual prototype of a next-generation industrial multicore platform. We also address the problem of performance and lifetime degradation. We first focus on embedded multicore platforms and propose an idleness distribution policy that increases the expected lifetime of the cores by duty-cycling their activity; we then investigate the use of micro thermoelectric coolers in general-purpose multicore processors to control the temperature of the cores at runtime, with the objective of meeting lifetime constraints without performance loss.
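A minimal sketch of an ILP formulation of the kind described above, written with the PuLP solver library: binary variables assign each task to one core, the objective minimizes total energy, and a per-core load bound stands in for the performance constraint. All task data are invented; the thesis's actual model is richer.

```python
import pulp

# Hypothetical per-task energy (J) and execution time (ms) on each core;
# none of these numbers come from the thesis.
E = [[3, 5], [4, 2], [6, 6], [2, 4]]   # E[t][c]: energy of task t on core c
T = [[8, 5], [6, 9], [7, 7], [5, 6]]   # T[t][c]: time of task t on core c
tasks, cores = range(len(E)), range(len(E[0]))
DEADLINE = 15  # per-core load bound (ms), standing in for performance

prob = pulp.LpProblem("energy_aware_task_allocation", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (tasks, cores), cat="Binary")

# Objective: total energy of the chosen assignment.
prob += pulp.lpSum(E[t][c] * x[t][c] for t in tasks for c in cores)
# Each task runs on exactly one core.
for t in tasks:
    prob += pulp.lpSum(x[t][c] for c in cores) == 1
# Performance constraint: the load assigned to each core fits the deadline.
for c in cores:
    prob += pulp.lpSum(T[t][c] * x[t][c] for t in tasks) <= DEADLINE

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([(t, c) for t in tasks for c in cores if x[t][c].value() == 1])
```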

Relevance:

20.00%

Publisher:

Abstract:

Visual perception relies on a two-dimensional projection of the viewed scene on the retinas of both eyes. Thus, visual depth has to be reconstructed from a number of different cues that are subsequently integrated to obtain robust depth percepts. Existing models of sensory integration are mainly based on the reliabilities of individual cues and disregard potential cue interactions. In the current study, an extended Bayesian model is proposed that takes into account both cue reliability and cue consistency. Four experiments were carried out to test this model's predictions. Observers had to judge visual displays of hemi-cylinders with an elliptical cross-section, constructed to allow for an orthogonal variation of several competing depth cues. In Experiments 1 and 2, observers estimated the cylinder's depth as defined by shading, texture, and motion gradients, while the degree of consistency among these cues was systematically varied. The extended Bayesian model provided a better fit to the empirical data than the traditional model, which disregards covariations among cues. To circumvent the potentially problematic assessment of single-cue reliabilities, Experiment 3 used a multiple-observation task, which allowed perceptual weights to be estimated from multiple-cue stimuli. Using the same multiple-observation task, the integration of stereoscopic disparity, shading, and texture gradients was examined in Experiment 4. Less reliable cues turned out to be downweighted in the combined percept. Moreover, a specific influence of cue consistency was revealed: shading and disparity seemed to be processed interactively, while other cue combinations could be well described by additive integration rules. These results suggest that cue combination in visual depth perception is highly flexible and depends on single-cue properties as well as on interrelations among cues. The extension of the traditional cue combination model is defended in terms of the necessity for robust perception in ecologically valid environments, and the current findings are discussed in the light of emerging computational theories and neuroscientific approaches.
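For reference, the traditional reliability-based model mentioned above is commonly written as a variance-weighted average of the single-cue estimates (this is the standard textbook formulation, not quoted from the study):

\[
\hat{S} \;=\; \sum_i w_i\,\hat{S}_i,
\qquad
w_i \;=\; \frac{1/\sigma_i^2}{\sum_j 1/\sigma_j^2},
\]

where \(\hat{S}_i\) is the depth estimate delivered by cue \(i\) and \(\sigma_i^2\) its variance. One common way to account for covariations among cues, in the spirit of the extended model, is to derive the weights from the full covariance matrix \(\Sigma\), e.g. \(w = \Sigma^{-1}\mathbf{1} / (\mathbf{1}^\top \Sigma^{-1} \mathbf{1})\); whether this matches the proposed model exactly is not stated in the abstract.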

Relevance:

20.00%

Publisher:

Abstract:

After the development of power electronics converters, the number of transformers subjected to non-sinusoidal stresses (including DC) has increased in applications such as HVDC links and traction (electric train power cars). The effects of non-sinusoidal voltages on transformer insulation have been investigated by many researchers, but some issues still remain to be understood. Some of those issues are tackled in this thesis by studying the behaviour of partial discharge (PD) phenomena in Kraft paper, pressboard and mineral oil under different voltage conditions: AC, DC, AC+DC, notched AC and square waveforms. From the point of view of converter transformers, it was found that the combined effect of AC and DC voltages produces higher stresses in the pressboard than those present under pure DC voltages. The electrical conductivity of the dielectric systems under DC and AC+DC conditions proved to be a critical parameter, so its measurement and analysis were included in all the experiments. Regarding notched voltages, the RMS reduction caused by the notches (which depends on the firing and overlap angles) seems to increase the PD inception voltage (PDIV). However, the experimental results show that once PD activity has incepted, the notches increase the PD repetition rate and magnitude, producing a higher degradation rate of the paper. On the other hand, the reduction of mineral oil stocks and its relatively low flash point, as well as environmental issues, are factors pushing towards the use of esters as transformer insulating fluids. This PhD thesis therefore also covers the study of two different esters with the aim of validating their use in traction transformers, using mineral oil as a benchmark. The complete set of dielectric tests performed on the three fluids shows that esters behave better than mineral oil in practically all the investigated conditions, so their application in traction transformers is possible and encouraged.
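To make the notch effect concrete, the sketch below computes the RMS of a sine wave with one idealized commutation notch per half-cycle (voltage forced to zero between the firing angle and the end of the overlap); the notch model and the angles are simplifications for illustration only, not the waveforms used in the experiments.

```python
import numpy as np

def notched_rms(firing_deg, overlap_deg, amplitude=1.0, n=100_000):
    """RMS of a sine wave with one commutation notch per half-cycle.

    Simplified model: the voltage is forced to zero for `overlap_deg`
    electrical degrees starting at `firing_deg` in each half-cycle;
    real converter notches have more complex shapes.
    """
    theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
    v = amplitude * np.sin(theta)
    deg = np.degrees(theta) % 180  # same notch position in both half-cycles
    v[(deg >= firing_deg) & (deg < firing_deg + overlap_deg)] = 0.0
    return np.sqrt(np.mean(v ** 2))

# RMS drops as the notch widens (hypothetical angles, in degrees).
for overlap in (0, 10, 30, 60):
    print(overlap, round(notched_rms(firing_deg=60, overlap_deg=overlap), 4))
```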

Relevance:

20.00%

Publisher:

Abstract:

This thesis was written at Sacmi Imola S.C., specifically within the Closures division, which designs and builds lines for the production of various types of caps. The purpose of this work is to describe the development of a product traceability system; systems of this kind, initially adopted in the food sector, are gaining increasing importance in other production fields as well, since they play a strategic role in the manufacture of products with high levels of performance and quality, able to stand out in a modern market characterized by worldwide competition and great attention to customer needs. In Sacmi's specific case, the traceability system targets a press, the CCM (Continuous Compression Moulder), built by the company for the production of thermoplastic caps through compression moulding technology. In particular, the system focuses on the moulds of the CCM machine, which are its critical elements from both a technical and an economic point of view. In general terms, a traceability system consists of two fundamental components: the first is an identification system that makes the units to be traced distinguishable and locatable, while the second is a data collection system capable of gathering the desired information. These data are then stored in a dedicated database and attributed to the corresponding entities by exploiting the properties of the identification system. The first step when developing a traceability system within an already consolidated production context is the reconstruction of the company's production process: this means identifying all the company departments that take part in the process and will be affected by the introduction of the new system. Once the actors have been defined, it is also necessary to understand how they are connected by flows of material and information. Sacmi's production process was characterized by the almost total absence of a structured information flow supporting the material flow, and the traceability system filled exactly this gap. The system must integrate seamlessly into the company's production context: the right compromise must be found regarding the amount of information to be collected, which must guarantee proper coverage of the whole process without weighing it down excessively. Data collection should be based on standard procedures that ensure the repeatability of the information-gathering operations. As might be expected, the introduction of numerous changes within an already structured context brought out a number of problems, such as storage difficulties and production delays; these must be resolved by asking an additional effort of the departments involved or, in the medium to long term, by evolving and refining the system with leaner solutions.
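As a minimal sketch of the two components described above (unit identification plus data collection into a database), the snippet below logs events against a mould identifier in SQLite; all table, field and identifier names are hypothetical and not taken from Sacmi's actual system.

```python
import sqlite3
from datetime import datetime, timezone

# In-memory database standing in for the dedicated traceability database.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE mould_events (
        mould_id   TEXT NOT NULL,  -- identification system: unique mould code
        event_time TEXT NOT NULL,  -- when the information was collected
        station    TEXT NOT NULL,  -- production step that produced the data
        event      TEXT NOT NULL   -- the collected information
    )""")

def record_event(mould_id, station, event):
    """Data collection system: attribute one piece of information to a unit."""
    db.execute("INSERT INTO mould_events VALUES (?, ?, ?, ?)",
               (mould_id, datetime.now(timezone.utc).isoformat(),
                station, event))

record_event("CCM-M-0042", "assembly", "mould installed on press")
record_event("CCM-M-0042", "quality", "dimensional check passed")
for row in db.execute("SELECT * FROM mould_events WHERE mould_id = ?",
                      ("CCM-M-0042",)):
    print(row)
```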

Relevance:

20.00%

Publisher:

Abstract:

In this thesis we focus on optimization and simulation techniques applied to solve strategic, tactical and operational problems arising in the healthcare sector. First, we present three applications to the Emilia-Romagna Public Health System (SSR) developed in collaboration with the Agenzia Sanitaria e Sociale dell'Emilia-Romagna (ASSR), a regional center for innovation and improvement in health. The Agenzia launched a strategic campaign aimed at introducing Operations Research techniques as decision-making tools to support technological and organizational innovations. The three applications focus on the forecasting and funding of medical specialty positions, the extension of the breast screening program, and operating theater planning. The case studies exploit the potential of combinatorial optimization, discrete event simulation and system dynamics techniques to solve resource-constrained problems arising within the Emilia-Romagna territory. We then present an application, in collaboration with the Dipartimento di Epidemiologia del Lazio, that focuses on allocating the population's demand for services to regional emergency departments. Finally, a simulation-optimization approach, developed in collaboration with the INESC TECH center of Porto, to evaluate matching policies for the kidney exchange problem is discussed.
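As one concrete example of the kind of matching policy mentioned for the kidney exchange problem, the sketch below solves the restricted two-way exchange case as a maximum matching on a compatibility graph with networkx; the compatibility data are invented, and real exchange programs (and the thesis's simulation-optimization approach) also consider longer cycles and chains.

```python
import networkx as nx

# Two-way kidney exchange as maximum matching: nodes are incompatible
# patient-donor pairs; an edge means the two pairs can swap donors.
# (i, j) in `compatible` means pair i's donor suits pair j's recipient.
compatible = {(0, 1), (1, 0), (1, 2), (2, 1), (0, 3), (3, 0), (2, 3), (3, 2)}
n_pairs = 4

G = nx.Graph()
G.add_nodes_from(range(n_pairs))
for i in range(n_pairs):
    for j in range(i + 1, n_pairs):
        # An exchange needs compatibility in both directions.
        if (i, j) in compatible and (j, i) in compatible:
            G.add_edge(i, j)

# Each matched edge is one executed two-way exchange.
print(nx.max_weight_matching(G, maxcardinality=True))
```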

Relevance:

20.00%

Publisher:

Abstract:

Coral reefs are the most biodiverse ecosystems of the ocean and provide notable ecosystem services. Nowadays they face a number of local anthropogenic threats, and environmental change is threatening their survival on a global scale. Large-scale monitoring is necessary to understand environmental changes and to implement useful conservation measures. Governmental agencies are often underfunded and cannot sustain the necessary large-scale spatial and temporal monitoring. To overcome these economic constraints, scientists can in some cases engage volunteers in environmental monitoring. Citizen Science enables the collection and analysis of scientific data at larger spatial and temporal scales than otherwise possible, addressing issues that would otherwise be logistically or financially unfeasible. "STE: Scuba Tourism for the Environment" was a volunteer-based Red Sea coral reef biodiversity monitoring program, in which SCUBA divers and snorkelers collected data on 72 taxa by completing survey questionnaires after their dives. In my thesis, I evaluated the reliability of the data collected by volunteers, comparing their questionnaires with those completed by professional scientists. Validation trials showed a sufficient level of reliability, indicating that non-specialists performed similarly to conservation volunteer divers on accurate transects. Using the data collected by volunteers, I developed a biodiversity index that revealed spatial trends across the surveyed areas. The project results provided important feedback to the local authorities on the current health status of Red Sea coral reefs and on the effectiveness of environmental management. I also analysed the spatial and temporal distribution of each surveyed taxon, identifying abundance trends related to anthropogenic impacts. Finally, I evaluated the effectiveness of the project in increasing the environmental education of volunteers and showed that participation in the STE project significantly increased both knowledge of coral reef biology and ecology and awareness of the impacts of human behaviour on the environment.
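The abstract does not specify which biodiversity index was developed, so the sketch below uses the standard Shannon index as a stand-in, computed from invented sighting counts of the kind a volunteer questionnaire might yield.

```python
import math
from collections import Counter

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over the observed taxa."""
    total = sum(counts.values())
    return -sum((n / total) * math.log(n / total)
                for n in counts.values() if n > 0)

# Invented sightings per taxon from one batch of volunteer questionnaires.
sightings = Counter({"butterflyfish": 40, "parrotfish": 25, "grouper": 5,
                     "moray eel": 2, "sea turtle": 1})
print(round(shannon_index(sightings), 3))
```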

Relevance:

20.00%

Publisher:

Abstract:

This work aims to evaluate the reliability of river levee systems by calculating the probability of "failure" of given levee stretches under different loads, using probabilistic methods based on fragility curves obtained through the Monte Carlo method. Overtopping and piping are considered as failure mechanisms, since they are the most frequent, and the analysis addresses the main levee system of the Po River, with a primary focus on the reach between Piacenza and Cremona in the lower-middle Padana Plain. The novelty of this approach is that it checks the reliability of individual embankment stretches, not just a single cross-section, while taking into account the variability of the levee geometry from one stretch to another. For each levee stretch analysed, the work also considers a probability distribution of the load variables involved in the definition of the fragility curves, which is influenced by the differences in the topography and morphology of the riverbed along the analysed reach. A classification is proposed, for both failure mechanisms, to give an indication of the reliability of the levee system based on the information obtained from the fragility curve analysis. To this end, a hydraulic model has been developed in which a 500-year flood is simulated to determine the residual failure hazard for each levee stretch at the corresponding water depth; the results are then compared with the proposed classification. This work also aims to act as an interface between Applied Geology and Environmental Hydraulic Engineering, two fields whose close collaboration is needed to improve the estimation of hydraulic risk.
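A minimal sketch of how a Monte Carlo fragility curve for overtopping can be built: the levee crest elevation is sampled from an assumed distribution and the failure probability is estimated for each water level. The distribution and its parameters are illustrative, not calibrated on the Po River stretches.

```python
import numpy as np

rng = np.random.default_rng(1)

def fragility_curve(levels, crest_mean, crest_std, n=20_000):
    """Monte Carlo fragility curve for overtopping: P(failure | water level).

    The levee crest elevation is treated as uncertain (normal); failure
    occurs when the water level exceeds the sampled crest elevation.
    """
    crest = rng.normal(crest_mean, crest_std, n)
    return [(crest < h).mean() for h in levels]

levels = np.linspace(8.0, 12.0, 9)  # water levels (m), hypothetical
probs = fragility_curve(levels, crest_mean=10.0, crest_std=0.5)
for h, p in zip(levels, probs):
    print(f"h = {h:4.1f} m  ->  P(failure) = {p:.3f}")
```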

Relevance:

20.00%

Publisher:

Abstract:

The 5th generation of mobile networking introduces the concept of "network slicing": the network is "sliced" horizontally, and each slice complies with different requirements in terms of network parameters such as bandwidth and latency. This technology is built on logical rather than physical resources and relies on the virtual network as the main abstraction through which a logical resource is obtained. Network Function Virtualisation (NFV) provides logical resources for virtual network functions, enabling the virtual network concept; it relies on Software Defined Networking (SDN) as the main technology to realize the virtual network as a resource, and it also defines the virtual network infrastructure with all the components needed to meet the network slicing requirements. SDN itself uses cloud computing technology to realize the virtual network infrastructure, and NFV also uses virtual computing resources to deploy virtual network functions instead of requiring custom hardware and software for each network function. The key to network slicing is the differentiation of slices in terms of Quality of Service (QoS) parameters, which relies on the ability to perform QoS management in a cloud computing environment. QoS in cloud computing denotes the levels of performance, reliability and availability offered. QoS is fundamental for cloud users, who expect providers to deliver the advertised quality characteristics, and for cloud providers, who need to find the right trade-off between the QoS levels they can offer and their operational costs. While QoS properties received constant attention before the advent of cloud computing, the performance heterogeneity and resource isolation mechanisms of cloud platforms have significantly complicated QoS analysis, prediction, and assurance. This is prompting several researchers to investigate automated QoS management methods that can leverage the high programmability of hardware and software resources in the cloud.
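As a small illustration of slice differentiation through QoS parameters, the sketch below defines a hypothetical per-slice QoS descriptor; all field names and values are invented for the example and do not correspond to any 3GPP or NFV specification.

```python
from dataclasses import dataclass

@dataclass
class SliceQoSProfile:
    """Hypothetical per-slice QoS descriptor illustrating how slices on
    the same physical network can carry very different requirements."""
    name: str
    max_latency_ms: float      # end-to-end latency bound
    min_bandwidth_mbps: float  # guaranteed throughput
    availability: float        # e.g. 0.99999 for "five nines"

# Three example slices with contrasting requirements.
slices = [
    SliceQoSProfile("enhanced-mobile-broadband", 50.0, 100.0, 0.999),
    SliceQoSProfile("ultra-reliable-low-latency", 1.0, 10.0, 0.99999),
    SliceQoSProfile("massive-iot", 1000.0, 0.1, 0.99),
]
for s in slices:
    print(f"{s.name}: <= {s.max_latency_ms} ms, >= {s.min_bandwidth_mbps} Mb/s,"
          f" availability {s.availability}")
```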

Relevance:

20.00%

Publisher:

Abstract:

Recent studies have shown that the nociceptive withdrawal reflex threshold (NWR-T) and the electrical pain threshold (EP-T) are reliable measures in pain-free populations. However, it is necessary to investigate the reliability of these measures in patients with chronic pain in order to translate these techniques from the laboratory to the clinic. The aims of this study were to determine the test-retest reliability of the NWR-T and EP-T after single and repeated (temporal summation) electrical stimulation in a group of patients with chronic low back pain, and to investigate the association between the NWR-T and the EP-T. To this end, 25 patients with chronic pain participated in three identical sessions, separated by 1 week on average, in which the NWR-T and the EP-T to single and repeated stimulation were measured. Test-retest reliability was assessed using the intra-class correlation coefficient (ICC), the coefficient of variation (CV), and Bland-Altman analysis. The association between the thresholds was assessed using the coefficient of determination (r²). The results showed good-to-excellent reliability for both the NWR-T and the EP-T in all cases, with average ICC values ranging from 0.76 to 0.90 and average CV values ranging from 12.0% to 17.7%. The association between thresholds was better after repeated stimulation than after single stimulation, with average r² values of 0.83 and 0.56, respectively. In conclusion, the NWR-T and the EP-T are reliable tools for assessing the sensitivity of spinal nociceptive pathways in patients with chronic pain.
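For reference, the sketch below implements the two-way random-effects, absolute-agreement ICC(2,1), one common ICC variant for test-retest designs; the abstract does not state which variant was used, and the threshold data below are invented.

```python
import numpy as np

def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    data: n_subjects x k_sessions array of threshold measurements.
    Standard ANOVA-based formula (Shrout & Fleiss).
    """
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    ms_rows = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((data.mean(axis=0) - grand) ** 2) / (k - 1)
    sse = np.sum((data - data.mean(axis=1, keepdims=True)
                  - data.mean(axis=0, keepdims=True) + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))
    return (ms_rows - mse) / (ms_rows + (k - 1) * mse
                              + k * (ms_cols - mse) / n)

# Invented NWR thresholds (mA) for 5 patients over 3 sessions.
thresholds = [[12.1, 12.8, 11.9],
              [18.5, 17.9, 18.8],
              [9.4, 10.1, 9.8],
              [15.0, 14.2, 15.3],
              [11.2, 11.9, 11.5]]
print(round(icc_2_1(thresholds), 3))
```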

Relevance:

20.00%

Publisher:

Abstract:

An appropriate classification of end-stage ankle arthritis is needed to assist surgeons in its management. Outcomes research also requires a classification system to stratify patients appropriately.

Relevance:

20.00%

Publisher:

Abstract:

Background: To assess the criterion and construct validity of the KIDSCREEN-10 well-being and health-related quality of life (HRQoL) score, a short version of the KIDSCREEN-52 and KIDSCREEN-27 instruments. Methods: The child self-report and parent report versions of the KIDSCREEN-10 were tested in a sample of 22,830 European children and adolescents aged 8-18 and their parents (n = 16,237). Correlations with the KIDSCREEN-52 and associations with other generic HRQoL measures, physical and mental health, and socioeconomic status were examined. Score differences by age, gender, and country were investigated. Results: Correlations between the 10-item KIDSCREEN score and the KIDSCREEN-52 scales ranged from r = 0.24 to 0.72 (r = 0.27 to 0.72) for the self-report (proxy-report) version. Coefficients below r = 0.5 were observed only for the KIDSCREEN-52 dimensions Financial Resources and Being Bullied. Cronbach's alpha was 0.82 (0.78) and test-retest reliability was ICC = 0.70 (0.67) for the self-report (proxy-report) version. Correlations between other self-completed children's HRQoL questionnaires and the KIDSCREEN-10 ranged from r = 0.43 to 0.63 for the KIDSCREEN children's self-report and from r = 0.22 to 0.40 for the KIDSCREEN parent proxy report. Known-group differences in HRQoL between physically/mentally healthy and ill children were observed in the KIDSCREEN-10 self and proxy scores. Associations with self-reported psychosomatic complaints were r = -0.52 (-0.36) for the KIDSCREEN-10 self-report (proxy-report). Statistically significant differences in KIDSCREEN-10 self and proxy scores were found by socioeconomic status, age, and gender. Conclusions: Our results indicate that the KIDSCREEN-10 provides a valid measure of a general HRQoL factor in children and adolescents, but the instrument does not adequately represent most of the single dimensions of the original KIDSCREEN-52. Test-retest reliability was slightly below the a priori defined thresholds.
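For reference, the Cronbach's alpha reported above can be computed with the standard formula sketched below; the simulated item scores are invented and unrelated to the KIDSCREEN data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an n_respondents x n_items score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Simulate 200 respondents answering 10 items driven by one shared trait.
rng = np.random.default_rng(7)
latent = rng.normal(size=(200, 1))
scores = latent + rng.normal(scale=0.8, size=(200, 10))
print(round(cronbach_alpha(scores), 2))
```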

Relevance:

20.00%

Publisher:

Abstract:

This in situ study evaluated the discriminatory power and reliability of methods for dental plaque quantification, and the relationship between visual indices (VI) and a fluorescence camera (FC) in detecting plaque.