817 results for "Error of measurement"


Relevance:

90.00%

Publisher:

Abstract:

Background: Acute respiratory distress syndrome (ARDS) is associated with high in-hospital mortality. Alveolar recruitment followed by ventilation at optimal titrated PEEP may reduce ventilator-induced lung injury and improve oxygenation in patients with ARDS, but the effects on mortality and other clinical outcomes remain unknown. This article reports the rationale, study design, and analysis plan of the Alveolar Recruitment for ARDS Trial (ART). Methods/Design: ART is a pragmatic, multicenter, randomized (concealed), controlled trial, which aims to determine whether maximum stepwise alveolar recruitment associated with PEEP titration is able to increase 28-day survival in patients with ARDS compared to conventional treatment (ARDSNet strategy). We will enroll adult patients with ARDS of less than 72 h duration. The intervention group will receive an alveolar recruitment maneuver, with stepwise increases of PEEP reaching 45 cmH2O and a peak pressure of 60 cmH2O, followed by ventilation with optimal PEEP titrated according to the static compliance of the respiratory system. In the control group, mechanical ventilation will follow a conventional protocol (ARDSNet). In both groups, we will use volume-controlled mode with low tidal volumes (4 to 6 mL/kg of predicted body weight), targeting a plateau pressure <= 30 cmH2O. The primary outcome is 28-day survival, and the secondary outcomes are: length of ICU stay; length of hospital stay; pneumothorax requiring chest tube during the first 7 days; barotrauma during the first 7 days; mechanical ventilation-free days from days 1 to 28; and ICU, in-hospital, and 6-month survival. ART is an event-guided trial planned to last until 520 events (deaths within 28 days) are observed. This number of events allows detection of a hazard ratio of 0.75 with 90% power and a two-tailed type I error of 5%. All analyses will follow the intention-to-treat principle. Discussion: If the ART strategy with maximum recruitment and PEEP titration improves 28-day survival, this will represent a notable advance in the care of ARDS patients. Conversely, if the ART strategy is similar or inferior to the current evidence-based strategy (ARDSNet), this should also change current practice, as many institutions routinely employ recruitment maneuvers and set PEEP levels according to some titration method.
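As a rough, independent check of the event target quoted above, the required number of events for a two-arm log-rank comparison can be approximated with Schoenfeld's formula. The sketch below assumes 1:1 allocation, which the abstract does not state explicitly, and is not the trial's own published calculation.

```python
# Schoenfeld approximation for the number of events needed to detect a hazard
# ratio of 0.75 with 90% power at a two-sided 5% significance level.
# 1:1 allocation between the ART and ARDSNet arms is an assumption.
from math import log, ceil
from scipy.stats import norm

alpha, power, hr = 0.05, 0.90, 0.75
p1 = p2 = 0.5                     # assumed allocation fractions

z_alpha = norm.ppf(1 - alpha / 2)
z_beta = norm.ppf(power)

events = (z_alpha + z_beta) ** 2 / (p1 * p2 * log(hr) ** 2)
print(ceil(events))               # ~508 deaths, close to the 520-event target
```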

Relevance:

90.00%

Publisher:

Abstract:

Aim. The aim of this study was to evaluate whether the combination of elastic stockings and a short period of walking in the late afternoon reduces leg edema. Methods. Volume changes in the legs of sixteen patients (32 limbs), who walked on a treadmill for 30 minutes wearing elastic compression stockings, were analyzed in a quantitative, cross-over study randomized by order of arrival at the clinic. Patients underwent volumetry using the water displacement technique and were subsequently required to put on 20/30 made-to-measure compression stockings (Sigvaris). The patients walked on a treadmill for 30 minutes and, after removing the stockings, volumetry of the legs was performed again. Legs were assessed using the CEAP classification and divided into groups. Analysis of variance was used for statistical analysis, with an alpha error of 5% considered acceptable. Results. When participants walked wearing compression stockings, there was a reduction in leg volume. When the CEAP classification was evaluated, a statistically significant difference was noted for the CEAP C0, C1 and C2 categories between legs using stockings and those that did not. Conclusion. Compression stockings have a synergistic effect with walking in the late afternoon, reducing edema of the lower limbs. [Int Angiol 2012;31:490-3]
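For illustration only, the sketch below shows the kind of comparison described above: per-leg volume change after the walk, analysed across CEAP categories with a one-way ANOVA at the 5% level. All numbers, group sizes and effect sizes are synthetic placeholders, not study data.

```python
# One-way ANOVA on synthetic leg-volume changes (mL) grouped by CEAP class.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(5)
delta_c0 = rng.normal(-30, 10, 12)   # hypothetical CEAP C0 legs
delta_c1 = rng.normal(-45, 10, 10)   # hypothetical CEAP C1 legs
delta_c2 = rng.normal(-60, 10, 10)   # hypothetical CEAP C2 legs

stat, p = f_oneway(delta_c0, delta_c1, delta_c2)
print(stat, p, p < 0.05)             # reject the null at alpha = 5% if p < 0.05
```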

Relevance:

90.00%

Publisher:

Abstract:

OBJECTIVE: Wilson's disease (WD) is an inborn error of metabolism caused by abnormalities of the copper-transporting protein encoding gene ATP7B. In this study, we examined ATP7B for mutations in a group of patients living in southern Brazil. METHODS: Thirty-six WD subjects were studied and classified according to their clinical and epidemiological data. In 23 subjects the ATP7B gene was analyzed. RESULTS: Fourteen distinct mutations were detected in at least one of the alleles. The c.3207C>A substitution at exon 14 was the most common mutation (allelic frequency = 37.1%), followed by c.3402delC at exon 15 (allelic frequency = 11.4%). The mutations c.2018-2030del13 at exon 7 and c.4093InsT at exon 20 are reported for the first time. CONCLUSION: The c.3207C>A substitution at exon 14 was the most common mutation, with an allelic frequency of 37.1%; it is also the most common mutation described in Europe.

Relevance:

90.00%

Publisher:

Abstract:

In the context of a “testing laboratory”, one of the most important aspects to deal with is the measurement result. Whenever decisions are based on measurement results, it is important to have some indication of the quality of the results. In every area concerned with noise measurement, many standards are available, but without an expression of uncertainty it is impossible to judge whether two results are in compliance or not. ISO/IEC 17025 is an international standard related to the competence of calibration and testing laboratories. It contains the requirements that testing and calibration laboratories have to meet if they wish to demonstrate that they operate a quality system, are technically competent and are able to generate technically valid results. ISO/IEC 17025 deals specifically with the requirements for the competence of laboratories performing testing and calibration and for the reporting of the results, which may or may not contain opinions and interpretations of the results. The standard requires appropriate methods of analysis to be used for estimating the uncertainty of measurement. From this point of view, for a testing laboratory performing sound power measurements according to specific ISO standards and European Directives, the evaluation of measurement uncertainties is the most important factor to deal with. Sound power level measurement according to ISO 3744:1994, performed with a limited number of microphones distributed over a surface enveloping a source, is affected by a certain systematic error and a related standard deviation. Comparing measurements carried out with different microphone arrays is difficult because the results are affected by systematic errors and standard deviations that depend on the number of microphones arranged on the surface, their spatial positions and the complexity of the sound field. A statistical approach can give an overview of the differences between sound power levels evaluated with different microphone arrays and an evaluation of the errors that affect this kind of measurement. In contrast to the classical approach, which tends to follow the ISO GUM, this thesis presents a different point of view on the problem of comparing results obtained from different microphone arrays.
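To make the measurand concrete, the sketch below shows the basic ISO 3744-style computation of the sound power level from microphone readings on an enveloping surface: an energy average of the sound pressure levels plus the surface-area term. The background-noise (K1) and environmental (K2) corrections are deliberately omitted, and the microphone values are invented.

```python
# Basic sound power level from SPLs measured on an enveloping surface.
import numpy as np

def sound_power_level(spl_db, surface_area_m2, s0=1.0):
    """Sound power level (dB) from microphone SPLs on a measurement surface."""
    spl_db = np.asarray(spl_db, dtype=float)
    # energy (not arithmetic) average of the microphone levels
    lp_mean = 10.0 * np.log10(np.mean(10.0 ** (spl_db / 10.0)))
    return lp_mean + 10.0 * np.log10(surface_area_m2 / s0)

# Example: 10 microphones on a hemispherical surface of radius 2 m
mics = [78.2, 77.9, 79.1, 78.5, 77.4, 78.8, 79.3, 78.0, 77.7, 78.6]
area = 2.0 * np.pi * 2.0 ** 2                     # hemisphere: S = 2*pi*r^2
print(round(sound_power_level(mics, area), 1))    # L_W in dB re 1 pW
```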

Relevance:

90.00%

Publisher:

Abstract:

The goal of the NA48 experiment at CERN is to measure the parameter Re(epsilon'/epsilon) of direct CP violation with a precision of 2x10^-4. The experimentally accessible quantity is the double ratio R, which is formed from the decays of the KL and KS into two neutral and two charged pions, respectively. To a good approximation, R = 1 - 6 Re(epsilon'/epsilon).
NA48 uses a weighting of the KL events to reduce the sensitivity to the detector acceptance. To cross-check the standard analysis used so far, an analysis without event weighting was carried out. The result of this unweighted analysis is presented in this thesis. By dispensing with event weighting, the statistical part of the total error can be reduced considerably. Since the limiting channel is the decay of the long-lived kaon into two neutral pions, using the full number of KL decays is a worthwhile goal. In the course of this work, however, it turned out that the systematic error of the acceptance correction cancels this gain again.

The result of this work for the data from the years 1998 and 1999 without event weighting is
Re(epsilon'/epsilon) = (17.91 +- 4.41 (syst.) +- 1.36 (stat.)) x 10^-4.
This clearly confirms the existence of direct CP violation. The result is compatible with the published NA48 result. The test of the analysis strategy used so far at NA48 has thus been carried out successfully.
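The approximation quoted above, R = 1 - 6 Re(epsilon'/epsilon), can be checked numerically with the result of the unweighted analysis; the short sketch below is only an illustration of that relation.

```python
# Relation between the double ratio R and direct CP violation used by NA48.
re_eps = 17.91e-4                 # central value quoted in this work
R = 1.0 - 6.0 * re_eps
print(R)                          # ~0.9893, i.e. about 1.1% below unity

# Inverting the relation recovers Re(epsilon'/epsilon) from a measured R:
print((1.0 - R) / 6.0)
```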

Relevance:

90.00%

Publisher:

Abstract:

CONCLUSIONS The focus of this work was the investigation of anomalies in Tg and dynamics at polymer surfaces. The thermally induced decay of hot-embossed polymer gratings is studied using laser diffraction and atomic force microscopy (AFM). Monodisperse PMMA and PS are selected in the Mw ranges of 4.2 to 65.0 kg/mol and 3.47 to 65.0 kg/mol, respectively. Two different modes of measurement were used: one mode uses temperature ramps to obtain an estimate of the near-surface glass temperature, Tdec,0; the other mode investigates the dynamics at a constant temperature above Tg. The temperature-ramp experiments reveal Tdec,0 values very close to the Tg,bulk values, as determined by differential scanning calorimetry (DSC). The PMMA of 65.0 kg/mol shows a decreased value of Tg, while the PS samples of 3.47 and 10.3 kg/mol (Mw [...]) of an amplitude reduction and a final state is smaller in the case of PS than in the case of PMMA. This suggests a higher degree of cooperation between the polymer chains for PMMA than for PS chains, even near the surface. A reduction of the investigated near-surface region by using smaller grating constants and AFM did not show a change in the near-surface Tdec,0. Decay experiments were performed at a variety of constant temperatures. Master plots are produced by shifting the decay curves, on the logarithmic time scale, with respect to a reference curve at Tref. From this procedure shift factors were extracted. An Arrhenius analysis of the shift factors reveals a decreasing non-Arrhenius (fragile) behavior with molecular weight for PMMA. PS is fragile for all Mw, as expected for linear polymers. Non-Arrhenius behavior allows one to fit the shift factors to the Williams-Landel-Ferry (WLF) equation. The WLF parameters for the varying molecular weights of PMMA and PS were extracted and compared to the values from bulk rheology measurements. Assuming c1 = 16 +/- 2 at Tg, as suggested by Angell, the glass temperature was determined from the dynamic decay experiments. Within the experimental errors, the values for Tg,surf(c1=16) and Tg,bulk(c1=16) tend to be smaller than Tdec,0 and Tg,bulk from temperature-ramp and DSC measurements, but confirm the course of the values with increasing Mw. The comparison of the fragilities (temperature dependence of the polymer properties at Tg) near the surface and in the bulk shows a higher fragility for PS near the surface and a lower one for PMMA with molecular weights of 4.2 and 65.0 kg/mol. The different surface behavior of PS is traced back to a lower degree of cooperation and a larger free volume fraction.
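A minimal sketch of the WLF fit mentioned above, assuming the usual form log10(a_T) = -C1 (T - Tref) / (C2 + T - Tref); the temperatures, shift factors and reference temperature below are synthetic placeholders, not measured values.

```python
# Fit synthetic time-temperature shift factors to the WLF equation.
import numpy as np
from scipy.optimize import curve_fit

def wlf(T, c1, c2, Tref=400.0):                 # Tref in K (assumed)
    return -c1 * (T - Tref) / (c2 + (T - Tref))

T = np.array([405.0, 410.0, 415.0, 420.0, 425.0])     # decay temperatures
log_aT = np.array([-0.9, -1.7, -2.4, -3.0, -3.5])      # extracted shift factors

(c1, c2), _ = curve_fit(wlf, T, log_aT, p0=(16.0, 50.0))
print(c1, c2)   # WLF parameters to compare with bulk rheology values
```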

Relevance:

90.00%

Publisher:

Abstract:

The research performed during the PhD course was intended to assess innovative applications of near-infrared spectroscopy in reflectance (NIR) in the beer production chain. The purpose is to measure by NIR the "malting quality" (MQ) parameter of barley, to monitor the malting process, and to determine whether a certain type of barley is suitable for the production of beer and spirits. Moreover, NIR will be applied to monitor the brewing process. First of all, it was possible to check the quality of the raw materials, such as barley, maize and barley malt, using a rapid, non-destructive and reliable method with a low error of prediction. The most interesting result obtained at this level was that the repeatability of the NIR calibration models developed was comparable with that of the reference method. Moreover, for malt, new kinds of validation were used in order to estimate the real predictive power of the proposed calibration models and to understand the long-term effects. Furthermore, the precision of all the calibration models developed for malt evaluation was estimated and statistically compared with the reference methods, with good results. Then, new calibration models were developed for monitoring the malting process, measuring the moisture content and other malt quality parameters during germination. Moreover, it was possible to obtain by NIR an estimate of the "malting quality" (MQ) of barley and to predict whether its germination will be rapid and uniform and whether a certain type of barley is suitable for the production of beer and spirits. Finally, the NIR technique was applied to monitor the brewing process, using correlations between NIR spectra of beer and analytical parameters, and to assess beer quality. These innovative results are potentially very useful for the actors involved in the beer production chain, especially the calibration models suitable for the control of the malting process and for the assessment of the "malting quality" of barley, which should be investigated further in future studies.
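As an illustration of the kind of NIR calibration model described above, the sketch below fits a partial least squares (PLS) regression linking reflectance spectra to a reference parameter and estimates the error of prediction by cross-validation. The data are random placeholders, and the choice of PLS with five components is an assumption, not the thesis' actual chemometric setup.

```python
# PLS calibration of synthetic NIR spectra against a reference analysis.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))                        # 60 samples x 200 wavelengths
y = X[:, 50] * 0.8 + rng.normal(scale=0.1, size=60)   # reference values (synthetic)

model = PLSRegression(n_components=5)
y_cv = cross_val_predict(model, X, y, cv=10).ravel()
rmsecv = mean_squared_error(y, y_cv) ** 0.5
print(f"RMSECV = {rmsecv:.3f}")                       # cross-validated prediction error
```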

Relevance:

90.00%

Publisher:

Abstract:

In geologically active regions, stress redistributions in minerals and rocks induce micromechanical and seismic processes, through which a weak natural electromagnetic radiation is emitted in the low-frequency range. The electromagnetic emissions of non-conducting minerals are due to dielectric polarization caused by several physical effects. A directed mechanical stress leads to an equally directed electromagnetic emission. The sources of the electromagnetic emissions are known, but they cannot yet be unambiguously assigned to the various processes in nature, which is why the phenomenon is referred to in the following as a seismo-electromagnetic (SEM) phenomenon. With the newly developed NPEMFE method (Natural Pulsed Electromagnetic Field of Earth), the electromagnetic pulses can be registered without ground contact. Regions of the Earth's crust with stress redistributions (e.g. tectonically active faults, potential landslides, sinkholes, mining subsidence, roof falls) can be identified and delimited as anomalies. Based on the current state of knowledge of these processes, landslides as well as unconsolidated and solid rocks in which stress redistributions take place were surveyed with a newly developed measuring instrument, the "Cereskop", in the German low mountain ranges (Rhineland-Palatinate, Germany) and in the Alpine region (Vorarlberg, Austria, and the Principality of Liechtenstein), and the measurement results obtained were related to classical methods from engineering geology, geotechnics and geophysics. Under field conditions, there was largely good agreement between the anomalies mapped with the "Cereskop" and the stress zones identified with the conventional methods. On the basis of the present state of knowledge and taking ambiguities into account, the measurement results are analysed and critically assessed.

Relevance:

90.00%

Publisher:

Abstract:

Background. Phenylketonuria is the most prevalent inborn error of amino acid metabolism. It is an autosomal recessive disorder resulting from mutations in the phenylalanine hydroxylase (PAH) gene. Phenotypes can vary from mild hyperphenylalaninemia to severe phenylketonuria which, if untreated, results in severe mental retardation. Thanks to neonatal screening programmes, early detection and prompt dietary intervention (a lifelong phenylalanine-restricted diet) have made it possible to avoid neurocognitive complications. Recently, a new therapy has become widely used: oral supplementation with the PAH cofactor (BH4), which can alleviate the burden of the diet. Genotype-phenotype correlation is a reliable tool to predict the metabolic phenotype, in order to establish a better tailored diet and to assess potential responsiveness to BH4 therapy. Aim. Molecular analysis of the PAH gene, evaluation of genotype-phenotype correlation and prediction of BH4 responsiveness in a group of HPA patients living in Emilia Romagna. Patients and methods. We studied 48 patients affected by PAH deficiency in regular follow-up at our Metabolic Centre. We performed the molecular analysis of these patients using genomic DNA extracted from peripheral blood samples. Results. We obtained a full genotypic characterization of 46 patients. We found 87 mutant alleles and 35 different mutations, the most frequent being IVS10-11 G>A (19.3%), R261Q (9.1%), R158Q (9.1%), R408Q (6.8%) and A403V (5.7%), including 2 new ones (L287, N223Y) never described previously. Notably, we found 15 mutations already identified in BH4-responsive patients, according to the literature. We found 42 different genotypic combinations, most of them in single patients and involving a BH4-responsive mutation. Conclusion. BH4 responsiveness is shown by a considerable number of PAH-deficient hyperphenylalaninemic patients. This treatment, combined with a less restricted diet or as monotherapy, can reduce nutritional complications and improve the quality of life of these patients.

Relevance:

90.00%

Publisher:

Abstract:

This thesis is a collection of essays related to the topic of innovation in the service sector. This structure serves the purpose of singling out some of the relevant issues and trying to tackle them, first reviewing the state of the literature and then proposing a way forward. Three relevant issues have therefore been selected: (i) the definition of innovation in the service sector and the connected question of the measurement of innovation; (ii) the issue of productivity in services; (iii) the classification of innovative firms in the service sector. Facing the first issue, chapter II shows how the initial breadth of the original Schumpeterian definition of innovation was narrowed and then transferred from manufacturing to the service sector in a reduced, technology-centred form. Chapter III tackles the issue of productivity in services, discussing the difficulties of measuring productivity in a context where the output is often immaterial. We reconstruct the dispute over Baumol's cost disease argument and propose two different ways forward for research on productivity in services: redefining the output along the lines of a characteristics approach; and redefining the inputs, in particular analysing which kinds of inputs are worth saving. Chapter IV derives an integrated taxonomy of innovative service and manufacturing firms, using data from the 2008 CIS survey for Italy. This taxonomy is based on the enlarged definition of “innovative firm” deriving from the Schumpeterian definition of innovation and classifies firms using cluster analysis techniques. The result is a four-cluster solution in which firms are differentiated by the breadth of the innovation activities in which they are involved. Chapter V reports the main conclusions of each of the previous chapters and the points worth further research in the future.
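Purely as an illustration of the clustering step described for chapter IV, the sketch below standardises a few invented innovation-activity indicators and partitions firms into four groups with k-means; the real analysis uses the 2008 CIS microdata for Italy, which are not reproduced here.

```python
# k-means grouping of firms described by synthetic innovation indicators.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# columns (invented): R&D intensity, training, new-to-market products, organisational change
firms = rng.random((300, 4))

X = StandardScaler().fit_transform(firms)
labels = KMeans(n_clusters=4, n_init=10, random_state=1).fit_predict(X)
print(np.bincount(labels))   # size of each of the four clusters
```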

Relevance:

90.00%

Publisher:

Abstract:

The verification of numerical models is indispensable for improving quantitative precipitation forecasting (QPF). The aim of the present work is the development of new methods for verifying the precipitation forecasts of the regional model of MeteoSwiss (COSMO-aLMo) and of the global model of the European Centre for Medium-Range Weather Forecasts (ECMWF). For this purpose, a novel observational data set for Germany with hourly resolution was generated and applied. For the evaluation of the model forecasts, the new quality measure "SAL" was developed. The novel observational data set for Germany, highly resolved in time and space, is created with the disaggregation method developed during MAP (Mesoscale Alpine Programme). The idea is to combine the high temporal resolution of the radar data (hourly) with the accuracy of the precipitation amounts from station measurements (within the measurement errors). This disaggregated data set offers new possibilities for the quantitative verification of precipitation forecasts. For the first time, an area-wide analysis of the diurnal cycle of precipitation was carried out. It showed that no diurnal cycle exists in winter and that this is well reproduced by COSMO-aLMo. In summer, by contrast, a clear diurnal cycle is found both in the disaggregated data set and in COSMO-aLMo, although the precipitation maximum in COSMO-aLMo sets in too early, between 11-14 UTC compared to 15-20 UTC in the observations, and is clearly overestimated by a factor of about 1.5. A new quality measure was developed because conventional, gridpoint-based error measures no longer do justice to model development. SAL consists of three independent components and is based on the identification of precipitation objects (threshold-dependent) within a region (e.g. a river catchment). Differences between the modelled and observed precipitation fields are computed with respect to structure (S), amplitude (A) and location (L) within the region. SAL was tested extensively with idealized and real examples. SAL detects and confirms known model deficits such as the diurnal-cycle problem or the simulation of too many relatively weak precipitation events. It provides additional insight into the characteristics of the errors, e.g. whether they are mainly errors in amplitude, in the displacement of a precipitation field, or in structure (e.g. stratiform versus small-scale convective). With SAL, daily and hourly precipitation sums of COSMO-aLMo and of the ECMWF model were verified. In a statistical sense, SAL shows a good quality of the COSMO-aLMo forecasts especially for stronger (and thus societally relevant) precipitation events, compared to weak precipitation. The comparison of the two models showed that the global model predicts more widespread precipitation and hence larger objects, while COSMO-aLMo shows clearly more realistic precipitation structures. Given the resolutions of the models this is not surprising, but it could not be demonstrated with conventional error measures. The methods developed in this work are very useful for the verification of QPF by models with high temporal and spatial resolution. The use of the disaggregated observational data set together with SAL as a quality measure provides new insights into QPF and allows more appropriate statements about the quality of precipitation forecasts.
Future applications of SAL lie in the verification of the new generation of numerical weather prediction models, which explicitly simulate the life cycle of deep convective cells.
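For concreteness, the amplitude component A of SAL compares the domain-averaged precipitation of model and observations as a normalised difference (values between -2 and +2, with 0 being perfect). The sketch below follows the published definition of this component; the structure (S) and location (L) components additionally require identifying precipitation objects and are not shown.

```python
# Amplitude component of SAL on a toy 2x2 precipitation field.
import numpy as np

def sal_amplitude(model_field, obs_field):
    d_mod = float(np.mean(model_field))
    d_obs = float(np.mean(obs_field))
    return (d_mod - d_obs) / (0.5 * (d_mod + d_obs))

obs = np.array([[0.0, 2.0], [4.0, 2.0]])     # mm/h, toy observation field
mod = 1.5 * obs                              # model overestimates by a factor of 1.5
print(sal_amplitude(mod, obs))               # 0.4, i.e. an overestimation
```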

Relevance:

90.00%

Publisher:

Abstract:

The present study concerns the acoustical characterisation of Italian historical theatres. It started from ISO 3382, which provides guidelines for the measurement of a well-established set of room acoustic parameters inside performance spaces. Nevertheless, the peculiarity of Italian historical theatres calls for a more specific approach. The Charter of Ferrara goes in this direction, aiming at qualifying the sound field in this kind of hall, and the present work pursues the way forward. In order to understand how the acoustical qualification should be done, the Bonci Theatre in Cesena was taken as a case study. In September 2012 acoustical measurements were carried out in the theatre, recording monaural and binaural impulse responses at each seat in the hall. The values of the time criteria, energy criteria and psycho-acoustical and spatial criteria were extracted according to ISO 3382. Statistics were performed and a 3D model of the theatre was realised and tuned. Statistical investigations were carried out on the whole set of measurement positions and on carefully chosen reduced subsets; it turned out that these subsets are representative only of the “average” acoustics of the hall. Normality tests were carried out to verify whether EDT, T30 and C80 could be described with some degree of reliability by a theoretical distribution. Different results were found, according to the varying assumptions underlying each test. An attempt was then made to relate the numerical results that emerged from the statistical analysis to the perceptual sphere. Looking for “acoustically equivalent areas”, relative difference limens were considered as threshold values; no rule of thumb emerged. Finally, the significance of the usual representation through mean values and standard deviations, which may be meaningful for normally distributed data, was investigated.
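As background for the parameters named above, the sketch below shows one common way to obtain a reverberation-time value such as T30 from a measured impulse response: Schroeder backward integration of the squared response and a linear fit of the -5 dB to -35 dB decay range, extrapolated to -60 dB. The impulse response here is synthetic; real measurements would be band-filtered first.

```python
# T30 from an impulse response via Schroeder backward integration.
import numpy as np

def t30_from_impulse_response(ir, fs):
    """Reverberation time T30 (s) from an impulse response sampled at fs Hz."""
    energy = ir.astype(float) ** 2
    edc = np.cumsum(energy[::-1])[::-1]           # Schroeder decay curve
    edc_db = 10.0 * np.log10(edc / edc[0])
    t = np.arange(len(ir)) / fs
    # least-squares line through the -5 dB .. -35 dB portion of the decay
    mask = (edc_db <= -5.0) & (edc_db >= -35.0)
    slope, _ = np.polyfit(t[mask], edc_db[mask], 1)
    return -60.0 / slope                          # extrapolate to -60 dB

fs = 8000
t = np.arange(0, 2.0, 1.0 / fs)
ir = np.exp(-t / 0.25) * np.random.default_rng(2).normal(size=t.size)  # synthetic RIR
print(round(t30_from_impulse_response(ir, fs), 2))   # ~1.7 s for this decay constant
```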

Relevance:

90.00%

Publisher:

Abstract:

Massive parallel robots (MPRs) driven by discrete actuators are force-regulated robots that undergo continuous motions despite being commanded through only a finite number of states. Designing a real-time control for such systems requires fast and efficient methods for solving their inverse static analysis (ISA), which is a challenging problem and the subject of this thesis. In particular, five artificial intelligence methods are proposed to investigate the on-line computation and the generalization error of the ISA problem for a class of MPRs featuring three-state force actuators and one degree of revolute motion.
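As a loose illustration of approximating an ISA with a learned model, the sketch below trains one classifier per actuator to map a pose to a discrete actuator state; the robot model, data and network size are invented, and the thesis itself compares five different AI methods rather than this particular one.

```python
# Learn a mapping from pose to discrete actuator states on synthetic data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.multioutput import MultiOutputClassifier

rng = np.random.default_rng(3)
n_actuators = 6
states = rng.integers(-1, 2, size=(2000, n_actuators))   # random three-state commands
poses = states @ rng.normal(size=(n_actuators, 3)) + 0.05 * rng.normal(size=(2000, 3))

isa = MultiOutputClassifier(MLPClassifier(hidden_layer_sizes=(64,), max_iter=500))
isa.fit(poses, states)
print(isa.predict(poses[:1]))    # actuator states proposed for the first pose
```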

Relevance:

90.00%

Publisher:

Abstract:

The aSPECT spectrometer was designed to measure the spectrum of the protons from the decay of free neutrons with high precision. From this spectrum, the electron-antineutrino angular correlation coefficient "a" can then be determined with high accuracy. The goal of this experiment is to determine this coefficient with an absolute relative error of less than 0.3%, i.e. well below the current literature value of 5%.

First measurements with the aSPECT spectrometer were carried out at the Heinz Maier-Leibnitz research neutron source in Munich. However, time-dependent instabilities of the measurement background prevented a new determination of "a".

The present work, by contrast, is based on the latest measurements with the aSPECT spectrometer at the Institut Laue-Langevin (ILL) in Grenoble, France. In these measurements, the instabilities of the measurement background could already be reduced considerably. Furthermore, various modifications were made to minimize systematic errors and to ensure a more reliable operation of the experiment. Unfortunately, no usable result could be obtained because of excessive saturation effects of the receiver electronics. Nevertheless, these and further systematic errors could be identified and reduced, or in some cases even eliminated, from which future beam times at aSPECT will profit.

The main part of the present work deals with the analysis and improvement of the systematic errors caused by the electromagnetic field of aSPECT. This resulted in numerous improvements; in particular, the systematic errors due to the electric field could be reduced. The errors caused by the magnetic field could even be minimized to the extent that an improvement of the current literature value of "a" is now possible. In addition, an NMR magnetometer tailored to the experiment was developed in this work and improved to the point that the uncertainties in the characterization of the magnetic field are now reduced to a level at which they are negligible for a determination of "a" with an accuracy of at least 0.3%.

Relevance:

90.00%

Publisher:

Abstract:

In technical design processes in the automotive industry, digital prototypes rapidly gain importance, because they allow for the detection of design errors in early development stages. The technical design process includes the computation of swept volumes for maintainability analysis and clearance checks. The swept volume is very useful, for example, to identify problem areas where a safety distance might not be kept. With the explicit construction of the swept volume, an engineer gets evidence of how the shape of components that come too close has to be modified.

In this thesis a concept for the approximation of the outer boundary of a swept volume is developed. For safety reasons, it is essential that the approximation is conservative, i.e., that the swept volume is completely enclosed by the approximation. On the other hand, one wishes to approximate the swept volume as precisely as possible. In this work, we will show that the one-sided Hausdorff distance is the adequate measure for the error of the approximation when the intended usage is clearance checks, continuous collision detection and maintainability analysis in CAD. We present two implementations that apply the concept and generate a manifold triangle mesh that approximates the outer boundary of a swept volume. Both algorithms are two-phased: a sweeping phase, which generates a conservative voxelization of the swept volume, and the actual mesh generation, which is based on restricted Delaunay refinement. This approach ensures a high precision of the approximation while respecting conservativeness.

The benchmarks for our tests are, among others, real-world scenarios from the automotive industry.

Further, we introduce a method to relate parts of an already computed swept volume boundary to those triangles of the generator that come closest during the sweep. We use this to verify as well as to colorize meshes resulting from our implementations.
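The error measure named above, the one-sided Hausdorff distance, has a direct point-sampled form: the largest distance from any sample of the approximated boundary to the exact boundary. The sketch below computes it with a k-d tree; representing both surfaces as point samples is a simplification of the mesh-based setting in the thesis.

```python
# One-sided Hausdorff distance between two point-sampled surfaces.
import numpy as np
from scipy.spatial import cKDTree

def one_sided_hausdorff(points_a, points_b):
    """max over a in A of the distance from a to the nearest point of B."""
    dists, _ = cKDTree(points_b).query(points_a)
    return float(dists.max())

rng = np.random.default_rng(4)
exact = rng.random((5000, 3))                  # samples of the exact boundary
approx = exact + 0.01 * rng.random((5000, 3))  # samples of a conservative approximation
print(one_sided_hausdorff(approx, exact))      # <= 0.01*sqrt(3) for this offset
```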