981 results for Test-problem Generator
Abstract:
Since the first underground nuclear explosion, carried out in 1958, the analysis of the seismic signals generated by these sources has allowed seismologists to refine the travel times of seismic waves through the Earth and to verify the accuracy of location algorithms (the ground truth for these sources was often known). Long international negotiations have been devoted to limiting the proliferation and testing of nuclear weapons. In particular, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was opened for signature in 1996; although it has been signed by 178 States, it has not yet entered into force. The Treaty underlines the fundamental role of seismological observations in verifying compliance, by detecting and locating seismic events and identifying the nature of their sources. A precise determination of the hypocentral parameters is the first step in discriminating whether a given seismic event is natural or not. If a specific event is deemed suspicious by the majority of the States Parties, the Treaty contains provisions for conducting an on-site inspection (OSI) in the area surrounding the epicenter of the event, located through the International Monitoring System (IMS) of the CTBT Organization. An OSI is expected to include the use of passive seismic techniques in the area of the suspected clandestine underground nuclear test: high-quality seismological systems are thought to be capable of detecting and locating very weak aftershocks triggered by underground nuclear explosions in the first days or weeks following the test. This PhD thesis deals with the development of two different seismic location techniques. The first, known as the double-difference joint hypocenter determination (DDJHD) technique, is aimed at locating closely spaced events at a global scale; the locations it produces have high relative accuracy, although the absolute location of the whole cluster remains uncertain.
We eliminate this problem by introducing a priori information: the known location of a selected event. The second technique concerns reliable estimates of the back azimuth and apparent velocity of seismic waves from local events of very low magnitude recorded by a tripartite array at a very local scale. For both techniques, we have used cross-correlation of digital waveforms to minimize the errors linked to incorrect phase picking. The cross-correlation method relies on the similarity between the waveforms of a pair of events at the same station (at the global scale), and on the similarity between the waveforms of the same event at two different sensors of the tripartite array (at the local scale). After preliminary simulation-based tests of the reliability of our location techniques, we applied both methodologies to real seismic events. The DDJHD technique was applied to a seismic sequence that occurred in the Turkey-Iran border region, using data recorded by the IMS. Initially, the algorithm was applied to the differences among the original arrival times of the P phases, without cross-correlation. We found that the considerable geometric scatter visible in the standard locations (namely those produced by the analysts of the International Data Center (IDC) of the CTBT Organization, taken as our reference) was substantially reduced by our technique. This is what we expected, since the methodology was applied to a sequence of events whose hypocenters can be assumed to lie close together, belonging to the same seismic structure. Our results highlight the main advantage of this methodology: the systematic errors affecting the arrival times are removed, or at least reduced.
The introduction of cross-correlation did not bring evident improvements to our results: the two sets of locations (without and with cross-correlation) are very similar to each other. In other words, cross-correlation did not substantially improve the precision of the manual picks; the picks reported by the IDC are probably good enough that the random picking error matters less than the systematic error on travel times. A further explanation for the modest contribution of cross-correlation is that the events in our data set generally do not have a good signal-to-noise ratio (SNR): the selected sequence is composed of weak events (magnitude 4 or smaller), and the signals are strongly attenuated because of the large distance between the stations and the hypocentral area. At the local scale, in addition to cross-correlation, we performed signal interpolation to improve the time resolution. The resulting algorithm was applied to data collected during an experiment carried out in Israel between 1998 and 1999. The results point to the following conclusions: a) it is necessary to correlate waveform segments corresponding to the same seismic phases; b) it is not essential to select the exact first arrivals; and c) relevant information can also be obtained from the maximum-amplitude wavelet of the waveforms (particularly under poor SNR conditions). Another remarkable feature of our procedure is that it requires little processing time, so the user can check the results immediately. During a field survey, this makes a quasi-real-time check possible, allowing immediate optimization of the array geometry if the early results suggest it.
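The core operation in both applications above is estimating the relative time shift between two similar waveforms from the peak of their cross-correlation. A minimal sketch of that idea (illustrative only; the signal and lag below are made up, not thesis data):

```python
# Estimate the delay of waveform b relative to waveform a by maximizing
# their cross-correlation over a range of integer lags.

def cross_correlate(a, b, max_lag):
    """Return (best_lag, scores): the lag maximizing the dot product of the
    overlapping samples, plus the score for every tested lag."""
    scores = {}
    for lag in range(-max_lag, max_lag + 1):
        s = 0.0
        for i in range(len(a)):
            j = i + lag
            if 0 <= j < len(b):
                s += a[i] * b[j]
        scores[lag] = s
    best = max(scores, key=scores.get)
    return best, scores

# A toy "wavelet" and a copy of it delayed by 3 samples: the correlation
# peak recovers the shift without any manual phase picking.
pulse = [0.0, 0.2, 1.0, 0.4, -0.3, -0.1, 0.0]
sig_a = pulse + [0.0] * 8
sig_b = [0.0] * 3 + pulse + [0.0] * 5

lag, _ = cross_correlate(sig_a, sig_b, max_lag=6)   # → 3
```

In practice the waveforms would be windowed around the same seismic phase (conclusion (a) above) and interpolated for sub-sample resolution.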
Abstract:
BTES (borehole thermal energy storage) systems exchange thermal energy by conduction with the surrounding ground through borehole materials. The spatial variability of the geological properties and the space-time variability of the hydrogeological conditions affect the real power rate of the heat exchangers and, consequently, the amount of energy extracted from, or injected into, the ground. For this reason, it is not an easy task to identify the underground thermal properties to use at the design stage. At the current state of technology, the Thermal Response Test (TRT) is the in situ test that characterizes ground thermal properties with the highest degree of accuracy, but it does not fully solve the problem of characterizing the thermal properties of a shallow geothermal reservoir, simply because it characterizes only the neighborhood of the heat exchanger at hand, and only for the duration of the test. Different analytical and numerical models exist for the characterization of shallow geothermal reservoirs, but they are still inadequate and not exhaustive: more sophisticated models must be taken into account, and a geostatistical approach is needed to tackle natural variability and estimate uncertainty. The approach adopted for reservoir characterization is the "inverse problem", typical of oil & gas field analysis: we create different realizations of the thermal properties by direct sequential simulation and find the one that best fits the real production data (fluid temperature over time). The software used to simulate heat production is FEFLOW 5.4 (Finite Element subsurface FLOW system). A geostatistical reservoir model was set up based on thermal property data from the literature and on spatial variability hypotheses, and a real TRT was tested. We then analyzed and used two other codes (SA-Geotherm and FV-Geotherm), which are two implementations of the same numerical model as FEFLOW (the Al-Khoury model).
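The selection step of the inverse approach described above can be sketched as follows: among candidate realizations of the thermal properties, keep the one whose simulated fluid-temperature curve best fits the measured record. The "simulated" curves and measurements below are made-up stand-ins for FEFLOW output; only the misfit logic is illustrated.

```python
import math

def rmse(sim, obs):
    """Root-mean-square misfit between a simulated and an observed curve."""
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs))

# Measured fluid temperatures over time (hypothetical values, deg C)
observed = [14.0, 15.2, 16.0, 16.5, 16.9]

# Simulated curves from three thermal-property realizations (hypothetical)
realizations = {
    "real_1": [13.0, 14.0, 14.8, 15.3, 15.7],
    "real_2": [14.1, 15.1, 16.1, 16.4, 16.8],
    "real_3": [15.0, 16.5, 17.5, 18.2, 18.8],
}

best = min(realizations, key=lambda k: rmse(realizations[k], observed))
```

In the thesis the candidate curves come from running the forward heat-production model on each direct-sequential-simulation realization; the misfit criterion may of course be richer than a plain RMSE.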
Abstract:
The work presented in this thesis focuses on the open-ended coaxial-probe frequency-domain reflectometry technique for measuring the complex permittivity of dispersive dielectric multilayer materials at microwave frequencies. An effective dielectric model is introduced and validated to extend the applicability of this technique to multilayer materials in an on-line system context. In addition, the thesis presents: 1) a numerical study of the imperfect contact at the probe-material interface, 2) a review of the available models and techniques, and 3) a new classification of the extraction schemes, with guidelines on how they can be used to improve the overall performance of the probe according to the problem requirements.
Abstract:
This thesis proposes design methods and test tools for optical systems that may be used in an industrial environment, where not only precision and reliability but also ease of use is important. The approach has been conceived to be as general as possible, although the present work studies the design of a portable device for automatic identification applications, because this doctorate has been funded by Datalogic Scanning Group s.r.l., a world-class producer of barcode readers. The main functional components of the complete device are the electro-optical imaging, illumination, and pattern generator systems. For the electro-optical imaging system, a characterization tool and an analysis tool have been developed to check whether the desired performance of the system has been achieved. Moreover, two design tools for optimizing the imaging system have been implemented. The first optimizes just the core of the system, the optical part, improving its performance while ignoring all other contributions and generating a good starting point for the optimization of the whole complex system. The second optimizes the system taking its behavior into account with a model as close to reality as possible, including optics, electronics, and detection. For the illumination and pattern generator systems, two tools have been implemented. The first allows the design of free-form lenses described by an arbitrary analytical function and excited by an incoherent source, and can provide custom illumination conditions for all kinds of applications. The second is a new method for designing Diffractive Optical Elements excited by a coherent source for large pattern angles, using the Iterative Fourier Transform Algorithm. The design tools have been validated, whenever possible, by comparing the performance of the designed systems with that of fabricated prototypes; in other cases, simulations have been used.
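The Iterative Fourier Transform Algorithm mentioned above alternates between two constraint planes: the element plane (phase-only, so unit amplitude) and the far field (target amplitude). A minimal 1-D Gerchberg-Saxton-style sketch with a naive DFT and a made-up two-spot target; the thesis's method for large pattern angles is more elaborate than this.

```python
import cmath, math, random

N = 16
random.seed(1)

def dft(x):
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

# Target far field: two spots, scaled so the total energy matches a
# unit-amplitude element plane (Parseval: sum |F|^2 = N * sum |f|^2 = N^2).
target = [0.0] * N
target[3] = target[12] = N / math.sqrt(2.0)

field = [cmath.exp(2j * math.pi * random.random()) for _ in range(N)]
errors = []
for _ in range(30):
    F = dft(field)
    errors.append(sum((abs(Fk) - tk) ** 2 for Fk, tk in zip(F, target)))
    # Far-field constraint: keep the phase, impose the target amplitude
    F = [tk * Fk / abs(Fk) if abs(Fk) > 1e-12 else complex(tk)
         for Fk, tk in zip(F, target)]
    f = idft(F)
    # Element-plane constraint: phase-only (unit amplitude)
    field = [z / abs(z) if abs(z) > 1e-12 else 1.0 for z in f]
```

The recorded far-field error is non-increasing for this error-reduction scheme, which is what makes the iteration usable as a design loop.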
Abstract:
This dissertation presents the theory and the work that led to the construction of a high-voltage, high-frequency arbitrary waveform voltage generator. The generator has been specifically designed to supply power to a wide range of plasma actuators. The system has been completely designed, manufactured, and tested at the Department of Electrical, Electronic and Information Engineering of the University of Bologna. The generator is based on the single-phase cascaded H-bridge multilevel topology and comprises 24 elementary units connected in series to form the typical staircase output voltage waveform of a multilevel converter. The generator can produce 49 voltage levels; each level is 600 V, making the peak-to-peak output voltage 28.8 kV. The large number of levels provides high resolution in the output voltage, which makes it possible to generate arbitrary waveforms. The maximum operating frequency is 20 kHz. A study of the relevant literature shows that this is the first time a cascaded multilevel converter of such dimensions has been constructed; isolation and control challenges had to be solved to realize the system. The biggest problem of current power-supply technology for plasma actuators is load matching: resonant converters, the most widely used power supplies, are seriously affected by it. The manufactured generator completely solves this issue, providing a consistent voltage output independently of the connected load. This is important when running tests and comparing results, because all measurements should be comparable and not dependent on matching issues. The use of a multilevel converter to power a plasma actuator is a real technological breakthrough that has provided, and will continue to provide, very significant experimental results.
Abstract:
The characterization of contaminated sediments is a complex problem. This work identifies a characterization methodology that accounts both for the characteristics of the contamination, with analyses aimed at determining the total contaminant content, and for the mobility of the pollutants themselves. An adequate characterization strategy can be applied to the evaluation of remediation treatments; to this end, the soil-washing treatment was evaluated by investigating the characteristics of the dredged sediments and of the process outputs (sands and fine fraction), and by comparing the characteristics of the output sand with those of sands commonly used for various applications. It was considered necessary to investigate compatibility from the chemical, granulometric, and morphological points of view. To investigate mobility, the leaching tests defined at both the international and Italian (UNI) levels were applied, and the technology needed to run leaching tests effectively was developed, automating the management of the pH-stat test UNI CEN 14997. This was necessary because of the difficulty of managing the test manually, owing to timing requirements that an operator can hardly meet. Redox conditions influence pollutant mobility; in particular, air ageing of anoxic sediments causes appreciable changes in the oxidation state of some components, increasing their mobility. This is therefore an aspect to consider when identifying adequate storage and disposal conditions, and an experimental campaign was carried out for this purpose.
Abstract:
In technical design processes in the automotive industry, digital prototypes are rapidly gaining importance because they allow design errors to be detected in early development stages. The technical design process includes the computation of swept volumes for maintainability analysis and clearance checks. The swept volume is very useful, for example, to identify problem areas where a safety distance might not be kept; with the explicit construction of the swept volume, an engineer gets evidence of how the shape of components that come too close has to be modified. In this thesis a concept for the approximation of the outer boundary of a swept volume is developed. For safety reasons, it is essential that the approximation be conservative, i.e., that the swept volume is completely enclosed by the approximation; on the other hand, one wishes to approximate the swept volume as precisely as possible. We show that the one-sided Hausdorff distance is the adequate error measure for the approximation when the intended uses are clearance checks, continuous collision detection, and maintainability analysis in CAD. We present two implementations that apply the concept and generate a manifold triangle mesh approximating the outer boundary of a swept volume. Both algorithms have two phases: a sweeping phase, which generates a conservative voxelization of the swept volume, and the actual mesh generation, which is based on restricted Delaunay refinement. This approach ensures a high precision of the approximation while respecting conservativeness. The benchmarks for our tests include real-world scenarios from the automotive industry. Further, we introduce a method to relate parts of an already computed swept volume boundary to the triangles of the generator that come closest during the sweep; we use this to verify, as well as to colorize, meshes resulting from our implementations.
Abstract:
In this work we study a polyenergetic and multimaterial model for breast image reconstruction in Digital Tomosynthesis, taking into account the variety of materials forming the object and the polyenergetic nature of the X-ray beam. The modelling of the problem leads to a high-dimensional nonlinear least-squares problem that, being an ill-posed inverse problem, requires some kind of regularization. We test two main classes of methods: the Levenberg-Marquardt method (together with the Conjugate Gradient method for computing the descent direction) and two limited-memory BFGS-like methods (L-BFGS). We perform experiments for different values of the regularization parameter (constant or varying at each iteration), tolerances, and stopping conditions. Finally, we analyse the performance of the methods by comparing relative errors, numbers of iterations, run times, and the quality of the reconstructed images.
Abstract:
Wideband current measurement is an area of broad interest in electronics, in which the demand for low-cost systems has grown considerably over time. This interest is driven by the increasingly frequent use of high-frequency switched power systems, which require isolated, high-performance current sensors. This work considers a new-generation current-measurement sensor capable of performance beyond the state of the art. The goal of the thesis is therefore to create a stable setup for performing measurements on the produced chips. Starting from a feasibility study, the required components, and the trade-offs needed to keep costs down, a flexible solution suited to the required measurements is reached. The thesis opens with an introduction to the physical effects involved and a description of the core component, moving on to functional tests performed on provisional setups to obtain the basic information needed to proceed. Finally, two printed circuit boards are designed and built to meet the project requirements.
Abstract:
This thesis presents a new metaheuristic for solving the symmetric Traveling Salesman Problem (TSP). The method, called the bionomic algorithm, is a variant of the genetic algorithm that uses an innovative way of generating the parents set. The thesis proposes several crossover methods specific to the TSP that can, however, be easily extended to other combinatorial optimization problems. These methods were tested on a set of test problems; the computational results show the efficiency of the proposed methods. In particular, one of the methods dominates the others both in the quality of the solutions produced and in computing time.
Abstract:
In this article we propose a bootstrap test for the probability of ruin in the compound Poisson risk process. We adopt the P-value approach, which leads to a more complete assessment of the underlying risk than the probability of ruin alone. We provide second-order accurate P-values for this testing problem and consider both parametric and nonparametric estimators of the individual claim amount distribution. Simulation studies show that the suggested bootstrap P-values are very accurate and outperform their analogues based on the asymptotic normal approximation.
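The quantity under test above is the ruin probability of the compound Poisson risk process, i.e. the probability that the surplus U(t) = u + ct − S(t) ever goes negative. A minimal Monte Carlo sketch (not the bootstrap procedure itself), checked against the closed form ψ(u) = (1/(1+θ))·exp(−θu/(μ(1+θ))) that holds for exponential claim amounts; all parameter values are illustrative.

```python
import math, random

random.seed(7)
lam, mu, c, u = 1.0, 1.0, 1.5, 2.0    # claim rate, mean claim, premium rate, initial surplus
theta = c / (lam * mu) - 1.0           # safety loading (here 0.5)

def ruined(horizon=100.0):
    """Simulate one surplus path; ruin can only occur at claim instants."""
    t, claims = 0.0, 0.0
    while True:
        t += random.expovariate(lam)          # next claim arrival
        if t > horizon:
            return False                      # survived the (finite) horizon
        claims += random.expovariate(1.0 / mu)  # exponential claim, mean mu
        if u + c * t - claims < 0:
            return True

n = 20000
est = sum(ruined() for _ in range(n)) / n

# Cramer-Lundberg closed form for exponential claims
psi_exact = (1.0 / (1.0 + theta)) * math.exp(-theta * u / (mu * (1.0 + theta)))
```

The bootstrap test in the article replaces the known parameters here with (parametric or nonparametric) estimates from claim data and resamples to get a P-value for a hypothesized ruin probability.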
Abstract:
Burnside posed the question of whether there exist groups having an outer automorphism that behaves, in a certain specific way, like an inner automorphism; we define such automorphisms to be nearly inner. NI-groups are fairly rare. With the aid of the computer algebra system Magma, and in particular of its small groups database, we set out to test this hypothesis.
Abstract:
BACKGROUND: Congestive heart failure (CHF) is a major public health problem. The use of B-type natriuretic peptide (BNP) tests shows promising diagnostic accuracy. Herein, we summarize the evidence on the accuracy of BNP tests in the diagnosis of CHF and compare the performance of rapid enzyme-linked immunosorbent assay (ELISA) and standard radioimmunosorbent assay (RIA) tests. METHODS: We searched electronic databases and the reference lists of included studies, and we contacted experts. Data were extracted on the study population, the type of test used, and methods. Receiver operating characteristic (ROC) plots and summary ROC curves were produced and negative likelihood ratios pooled. Random-effect meta-analysis and metaregression were used to combine data and explore sources of between-study heterogeneity. RESULTS: Nineteen studies describing 22 patient populations (9 ELISA and 13 RIA) and 9093 patients were included. The diagnosis of CHF was verified by echocardiography, radionuclide scan, or echocardiography combined with clinical criteria. The pooled negative likelihood ratio overall from random-effect meta-analysis was 0.18 (95% confidence interval [CI], 0.13-0.23). It was lower for the ELISA test (0.12; 95% CI, 0.09-0.16) than for the RIA test (0.23; 95% CI, 0.16-0.32). For a pretest probability of 20%, which is typical for patients with suspected CHF in primary care, a negative result of the ELISA test would produce a posttest probability of 2.9%; a negative RIA test, a posttest probability of 5.4%. CONCLUSIONS: The use of BNP tests to rule out CHF in primary care settings could reduce demand for echocardiography. The advantages of rapid ELISA tests need to be balanced against their higher cost.
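The posttest probabilities quoted above follow from Bayes' rule in odds form: posttest odds = pretest odds × likelihood ratio. A quick check of the 2.9% (ELISA) and 5.4% (RIA) figures from the abstract:

```python
def posttest_probability(pretest_p, likelihood_ratio):
    """Bayes' rule in odds form: convert to odds, apply the LR, convert back."""
    odds = pretest_p / (1.0 - pretest_p)
    post_odds = odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

p_elisa = posttest_probability(0.20, 0.12)   # negative ELISA, pooled LR- = 0.12
p_ria   = posttest_probability(0.20, 0.23)   # negative RIA,   pooled LR- = 0.23
```

Both values round to the abstract's figures, which shows how the pooled negative likelihood ratios translate into the rule-out performance claimed for primary care.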
Abstract:
OBJECTIVE: Caring for a loved one with Alzheimer disease is a highly stressful experience that is associated with significant depressive symptoms. Previous studies indicate a positive association between problem behaviors in patients with Alzheimer disease (e.g., repeating questions, restlessness, and agitation) and depressive symptoms in their caregivers. Moreover, the extant literature indicates a robust negative relationship between escape-avoidance coping (i.e., avoiding people, wishing the situation would go away) and psychiatric well-being. The purpose of this study was to test a mediational model of the associations between patient problem behaviors, escape-avoidance coping, and depressive symptoms in Alzheimer caregivers. METHODS: Ninety-five spousal caregivers (mean age: 72 years) completed measures assessing their loved ones' frequency of problem behaviors, escape-avoidance coping, and depressive symptoms. A mediational model was tested to determine if escape-avoidant coping partially mediated the relationship between patient problem behaviors and caregiver depressive symptoms. RESULTS: Patient problem behaviors were positively associated with escape-avoidance coping (beta = 0.38, p < 0.01) and depressive symptoms (beta = 0.26, p < 0.05). Escape-avoidance coping was positively associated with depressive symptoms (beta = 0.33, p < 0.01). In a final regression analysis, the impact of problem behaviors on depressive symptoms was less after controlling for escape-avoidance coping. Sobel's test confirmed that escape-avoidance coping significantly mediated the relationship between problem behaviors and depressive symptoms (z = 2.07, p < 0.05). CONCLUSION: Escape-avoidance coping partially mediates the association between patient problem behaviors and depressive symptoms among elderly caregivers of spouses with dementia. This finding provides a specific target for psychosocial interventions for caregivers.
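Sobel's test, used above to confirm mediation, checks whether the indirect effect a·b (problem behaviors → coping → depression) differs from zero via z = a·b / sqrt(b²·se_a² + a²·se_b²). The path coefficients below are the abstract's; the standard errors are hypothetical, chosen only to illustrate the computation (the abstract reports z = 2.07 but not the SEs).

```python
import math

def sobel_z(a, se_a, b, se_b):
    """Sobel z statistic for the indirect effect a*b in a mediation model."""
    return (a * b) / math.sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)

a, b = 0.38, 0.33          # behaviors -> coping, coping -> depression (from the abstract)
se_a, se_b = 0.12, 0.11    # hypothetical standard errors
z = sobel_z(a, se_a, b, se_b)
significant = abs(z) > 1.96   # two-tailed test at alpha = 0.05
```

With plausible SEs the statistic lands near the reported value and clears the 1.96 threshold, matching the abstract's conclusion of partial mediation.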
Abstract:
OBJECTIVES: Reactivation of latent tuberculosis (TB) in inflammatory bowel disease (IBD) patients treated with antitumor necrosis factor-alpha medication is a serious problem. Currently, TB screening includes chest x-rays and a tuberculin skin test (TST). The interferon-gamma release assay (IGRA) QuantiFERON-TB Gold In-Tube (QFT-G-IT) shows better specificity for diagnosing TB than the skin test. This study evaluates the two test methods among IBD patients. METHODS: Both TST and IGRA were performed on 212 subjects (114 Crohn's disease, 44 ulcerative colitis, 10 indeterminate colitis, 44 controls). RESULTS: Eighty-one percent of IBD patients were under immunosuppressive therapy; 71% of all subjects were vaccinated with Bacille Calmette Guérin; 18% of IBD patients and 43% of controls tested positive with the skin test (P < 0.0001). Vaccinated controls tested positive more often with the skin test (52%) than did vaccinated IBD patients (23%) (P = 0.011). Significantly fewer immunosuppressed patients tested positive with the skin test than did patients not receiving therapy (P = 0.007); 8% of patients tested positive with the QFT-G-IT test (14/168) compared to 9% (4/44) of controls. Test agreement was significantly higher in the controls (P = 0.044) compared to the IBD group. CONCLUSIONS: Agreement between the two test methods is poor in IBD patients. In contrast to the QFT-G-IT test, the TST is negatively influenced by immunosuppressive medication and vaccination status, and should thus be replaced by the IGRA for TB screening in immunosuppressed patients having IBD.