879 results for 3D printing, steel bars, calibration of design values, correlation


Relevance: 100.00%

Abstract:

This thesis addresses some of the issues that, at the state of the art, prevent P300-based brain-computer interface (BCI) systems from moving out of research laboratories and into end users' homes. An innovative asynchronous classifier was defined and validated. It relies on the introduction of a set of thresholds into the classifier; these thresholds were assessed from the distributions of the score values for target stimuli, non-target stimuli and epochs of voluntary no-control. With the asynchronous classifier, a P300-based BCI system can adapt its speed to the current state of the user and can automatically suspend control when the user diverts attention from the stimulation interface. Since EEG signals are non-stationary and show inherent variability, making long-term use of BCI possible requires tracking changes in ongoing EEG activity and adapting the BCI model parameters accordingly. To this end, the asynchronous classifier was subsequently improved by introducing a self-calibration algorithm for the continuous and unsupervised recalibration of the subjective control parameters. Finally, an index for the online monitoring of EEG quality was defined and validated in order to detect potential problems and system failures. The thesis ends with the description of a translational work involving end users (people with amyotrophic lateral sclerosis, ALS). Following a user-centered design approach, the phases of the design, development and validation of an innovative assistive device are described. The proposed assistive technology (AT) has been specifically designed to meet the needs of people with ALS during the different phases of the disease (i.e. the degree of motor impairment). Indeed, the AT can be accessed with several input devices, either conventional (mouse, touchscreen) or alternative (switches, head tracker), up to a P300-based BCI.
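
The abstract does not give the exact decision rule, but a threshold-based asynchronous selection of this kind can be sketched as follows; the two thresholds, their names and the margin test are illustrative assumptions, not the thesis's actual classifier.

```python
import numpy as np

def asynchronous_decision(scores, t_target, t_nocontrol):
    """Decide after one stimulation sequence, given one classifier score
    per selectable item (higher = more target-like).

    t_target    : minimum winning score required to accept a selection
    t_nocontrol : minimum margin over the runner-up; below it, the epoch
                  is treated as voluntary no-control and the BCI abstains
    Returns the index of the selected item, or None to suspend control.
    """
    order = np.argsort(scores)[::-1]            # best candidate first
    best, runner_up = scores[order[0]], scores[order[1]]
    if best >= t_target and best - runner_up >= t_nocontrol:
        return int(order[0])                    # confident selection
    return None                                 # suspend control, keep monitoring

# Example: a clear winner is selected; a flat score profile is rejected.
print(asynchronous_decision(np.array([0.1, 0.9, 0.2]), 0.5, 0.3))   # -> 1
print(asynchronous_decision(np.array([0.4, 0.45, 0.4]), 0.5, 0.3))  # -> None
```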

Relevance: 100.00%

Abstract:

Thermal effects are rapidly gaining importance in nanometer heterogeneous integrated systems. Increased power density, coupled with the spatio-temporal variability of chip workload, causes lateral and vertical temperature non-uniformities (variations) in the chip structure. The assumption of a uniform temperature for a large circuit leads to inaccurate determination of key design parameters. To improve design quality, we need precise estimation of temperature at a detailed spatial resolution, which is very computationally intensive. Consequently, thermal analysis of designs needs to be done at multiple levels of granularity. To further investigate the chip/package thermal analysis flow, we exploit the Intel Single-Chip Cloud Computer (SCC) and propose a methodology for the calibration of the SCC's on-die temperature sensors. We also develop an infrastructure for online monitoring of the SCC temperature sensor readings and of the SCC power consumption. With the thermal simulation tool in hand, we propose MiMAPT, an approach for analyzing delay, power and temperature in digital integrated circuits. MiMAPT integrates seamlessly into industrial front-end and back-end chip design flows, and it accounts for temperature non-uniformities and self-heating while performing analysis. Furthermore, we extend the temperature-variation-aware analysis of designs to 3D MPSoCs with Wide-I/O DRAM. We reduce the DRAM refresh power by considering the lateral and vertical temperature variations in the 3D structure and adapting the per-DRAM-bank refresh period accordingly. We develop an advanced virtual platform which models the performance, power and thermal behavior of a 3D-integrated MPSoC with Wide-I/O DRAMs in detail. Moving towards real-world multi-core heterogeneous SoC designs, a reconfigurable heterogeneous platform (ZYNQ) is exploited to further study the performance and energy efficiency of various CPU-accelerator data-sharing methods in heterogeneous hardware architectures. A complete hardware accelerator featuring clusters of OpenRISC CPUs with dynamic address-remapping capability is built and verified on real hardware.
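
The per-bank refresh adaptation can be illustrated with a small sketch; the 64 ms base period is the common JEDEC figure, the halve-per-10 °C scaling is a widely used retention rule of thumb, and the thresholds are illustrative assumptions rather than the exact policy evaluated in the thesis.

```python
# A minimal sketch of temperature-aware, per-bank DRAM refresh: the base
# refresh period is halved for hot banks, following the rule of thumb that
# retention time roughly halves for every ~10 °C of temperature increase.

BASE_PERIOD_MS = 64.0   # nominal refresh period at or below T_REF
T_REF_C = 85.0          # reference temperature for the nominal period

def refresh_period_ms(bank_temp_c: float) -> float:
    """Return the refresh period for one DRAM bank at the given temperature."""
    if bank_temp_c <= T_REF_C:
        return BASE_PERIOD_MS
    # Halve the period for every 10 °C above the reference temperature.
    return BASE_PERIOD_MS / (2.0 ** ((bank_temp_c - T_REF_C) / 10.0))

# Example: a cool bank can be refreshed far less often than a hot one,
# which is where the refresh-power saving in a 3D stack comes from.
for t in (70.0, 85.0, 95.0):
    print(f"{t:5.1f} C -> {refresh_period_ms(t):5.1f} ms")
```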

Relevance: 100.00%

Abstract:

In recent years, attention to energy efficiency in historic buildings has grown, as several research projects have taken place across Europe. The attempt to combine the need to preserve the buildings, their value and their character with the need to reduce energy consumption and improve indoor comfort brings together two points of view that are usually in contradiction: building engineering and conservation institutions. The results are surprising, because common ground is emerging, even though the respective requirements still need to be balanced. From these experiences it is clear that the building physicist must still answer many questions regarding correct assessment: the energy consumption of this class of buildings, the effectiveness of the measures that could be adopted, and much more. This thesis contributes to answering these questions by developing a procedure to analyse historic buildings. The procedure provides a guideline for the energy audit of historic buildings, including the experimental activities needed to deal with the uncertainty in estimating the energy balance. It offers a procedure to simulate the energy balance of a building with a validated dynamic model, together with a calibration procedure to increase the accuracy of the model. An approach to designing energy efficiency measures through an optimization that considers different aspects is also presented. The whole process is applied to a real case study to give the reader a practical understanding.
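
A standard way to quantify the accuracy gained by such a calibration procedure is to score the dynamic model against measured consumption with NMBE and CV(RMSE). A minimal sketch follows, with simulated monthly readings as illustrative data; the commonly cited monthly acceptance limits of |NMBE| <= 5% and CV(RMSE) <= 15% are given for orientation, not as the thesis's criteria.

```python
import numpy as np

def calibration_metrics(measured, simulated):
    """Return NMBE and CV(RMSE) in percent for a calibrated building model."""
    measured = np.asarray(measured, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    n = measured.size
    mean = measured.mean()
    nmbe = 100.0 * (simulated - measured).sum() / ((n - 1) * mean)
    cvrmse = 100.0 * np.sqrt(((simulated - measured) ** 2).sum() / (n - 1)) / mean
    return nmbe, cvrmse

# Example with twelve monthly energy readings (kWh, simulated data).
measured  = [3200, 3000, 2600, 1900, 1200, 800, 700, 750, 1100, 1800, 2500, 3100]
simulated = [3350, 2900, 2700, 1850, 1250, 820, 680, 790, 1050, 1900, 2450, 3200]
print("NMBE = %.1f %%, CV(RMSE) = %.1f %%" % calibration_metrics(measured, simulated))
```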

Relevance: 100.00%

Abstract:

The interaction between aerosols and sunlight plays an important role in the radiative balance of Earth's atmosphere. This interaction is quantified by measuring the removal (extinction), redistribution (scattering) and transformation into heat (absorption) of light by the aerosols, i.e. their optical properties. Knowledge of these properties is crucial for our understanding of the atmospheric system.

Light absorption by aerosols is a major contributor to the direct and indirect effects on our climate system, and an accurate and sensitive measurement method is crucial to further our understanding. A home-built photoacoustic sensor (PAS), measuring at a wavelength of 532 nm, was fully characterized and its functionality validated for measurements of absorbing aerosols. The optical absorption cross-sections of absorbing polystyrene latex spheres, to be used as a standard for aerosol absorption measurements, were measured and compared to literature values. Additionally, a calibration method using absorbing aerosol of known complex refractive index was presented.

A new approach to retrieve the effective broadband refractive indices (n_broad,eff) of aerosol particles with a white-light aerosol spectrometer (WELAS) optical particle counter (OPC) was developed. Using a tandem differential mobility analyzer (DMA)-OPC system, the n_broad,eff are obtained for both laboratory and field applications. This method was tested in the laboratory using substances with a wide range of optical properties, and it was used in ambient measurements to retrieve the n_broad,eff of biomass burning aerosols during a nationwide burning event in Israel. The retrieved effective broadband refractive indices for laboratory-generated scattering aerosols, ammonium sulfate (AS), glutaric acid (GA) and sodium chloride, were all within 4% of literature values. For absorbing substances, nigrosine and various mixtures of nigrosine with AS and GA were measured, as well as a lightly absorbing substance, Suwannee River fulvic acid (SRFA). For the ambient measurements, the calibration curves generated with this method were used to follow the optical evolution of biomass burning (BB) aerosols. A decrease in the overall aerosol absorption and scattering was found for aged aerosols during the day after the fires, compared to the smoldering phase of the fires.

The connection between the light extinction of aerosols, their chemical composition and their hygroscopicity was studied for particles with different degrees of absorption. The extinction cross-section (σext) at 532 nm for different mobility diameters was measured at 80% and 90% relative humidity (RH), and at RH < 10%. The ratio of the humidified to the dry aerosols, fRHext(%RH, Dry), is presented. For purely scattering aerosols, fRHext(%RH, Dry) is inversely proportional to size; this size dependence was suppressed for lightly absorbing ones. In addition, the validity of the mixing rules for water-soluble absorbing aerosols was explored. The difference between the derived and calculated real parts of the complex RIs was less than 5.3% for all substances, wavelengths and RHs. The imaginary parts of the retrieved and calculated RIs were in good agreement with each other, and well within the measurement errors of the retrieval from pulsed CRD spectroscopy measurements. Finally, a core-shell structure model was used to explore the differences between the models for substances with low growth factors under these hydration conditions. It was found that at 80% RH and for size parameters less than 2.5, there is less than a 5% difference between the extinction efficiencies calculated with the two models. This difference is within the measurement errors; hence, there is no significant difference between the models in this case. However, for greater size parameters the difference can be up to 10%. For 90% RH, the differences below a size parameter of 2.5 were up to 7%.

Finally, the fully characterized PAS, together with a cavity ring-down spectrometer (CRD), was used to study the optical properties of soot and secondary organic aerosol (SOA) during the SOOT-11 project in the AIDA chamber in Karlsruhe, Germany. The fresh fractal-like soot particles were allowed to coagulate for 28 hours before being coated stepwise with SOA. The single scattering albedo of fresh fractal-like soot was measured to be 0.2 (±0.03); after allowing the soot to coagulate for 28 hours and coating it with SOA, it increased to 0.71 (±0.01). An absorption enhancement of the coated soot of up to 1.71 (±0.03) times relative to the non-coated coagulated soot was directly measured with the PAS. Monodisperse measurements of SOA, and of soot coated with SOA, were performed to derive the complex refractive index (m) of both aerosols. A complex refractive index of m = 1.471 (±0.008) + i0.0 (±0.002) was retrieved for the SOA-αO3. For the compact coagulated soot, a preliminary complex refractive index of m = 2.04 (+0.21/−0.14) + i0.34 (+0.18/−0.06) with a coating thickness of 10 nm (+4/−6) was retrieved.

These detailed properties can be used by modelers to decrease uncertainties in assessing the climatic impacts of the different species and to improve weather forecasting.
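
Two quantities used throughout this abstract are easy to make concrete: the humidification enhancement factor fRHext and the size parameter that separates the regimes where the homogeneous and core-shell models agree or diverge. A minimal sketch, with illustrative values:

```python
import numpy as np

def f_rh_ext(sigma_ext_wet, sigma_ext_dry):
    """Extinction enhancement factor fRHext = sigma_ext(wet) / sigma_ext(dry)."""
    return sigma_ext_wet / sigma_ext_dry

def size_parameter(diameter_nm, wavelength_nm=532.0):
    """Dimensionless size parameter x = pi * D / lambda."""
    return np.pi * diameter_nm / wavelength_nm

# Example: an extinction cross-section that doubles at 80% RH gives f = 2,
# and a 400 nm particle at 532 nm has x ~ 2.4, i.e. below the x = 2.5 regime
# where the homogeneous and core-shell models agreed to within ~5%.
print(f_rh_ext(2.0e-13, 1.0e-13))          # -> 2.0
print(round(size_parameter(400.0), 2))     # -> 2.36
```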

Relevance: 100.00%

Abstract:

Atmospheric aerosol particles affect humans and the environment in many ways. A precise characterization of the particles helps to understand their effects and to assess the consequences. Particles can be characterized by their size, their shape and their chemical composition. Laser ablation mass spectrometry makes it possible to determine the size and the chemical composition of individual aerosol particles. In this work, the SPLAT (Single Particle Laser Ablation Time-of-flight mass spectrometer) was further developed for the improved analysis of atmospheric aerosol particles in particular. The aerosol inlet was optimized to transfer the widest possible particle size range (80 nm - 3 µm) into the SPLAT and to focus it into a narrow beam. A new description of the relationship between particle size and particle velocity in vacuum was found. The alignment of the inlet was automated using stepper motors. The optical detection of the particles was improved so that particles smaller than 100 nm can be detected. Building on the optical detection and the automatic tilting of the inlet, a new method for characterizing the particle beam was developed. The control electronics of the SPLAT were improved so that the maximum analysis frequency is limited only by the ablation laser, which can ablate at a rate of at most about 10 Hz. By optimizing the vacuum system, the ion loss in the mass spectrometer was reduced by a factor of 4.

Besides the hardware development of the SPLAT, a large part of this work consisted of designing and implementing a software solution for analysing the raw data acquired with the SPLAT. CRISP (Concise Retrieval of Information from Single Particles) is a software package based on IGOR PRO (Wavemetrics, USA) that allows efficient evaluation of the single-particle raw data. CRISP contains a newly developed algorithm for the automatic mass calibration of every single mass spectrum, including the suppression of noise and of problems with signals that exhibit intense tailing. CRISP provides methods for the automatic classification of the particles; implemented are k-means, fuzzy-c-means and a form of hierarchical clustering based on a minimum spanning tree. CRISP offers the possibility to pre-process the data so that the automatic classification of the particles runs faster and the results are of higher quality. In addition, CRISP can easily sort particles according to predefined criteria. The data structure and infrastructure underlying CRISP were designed with maintenance and extensibility in mind.

During this work, the SPLAT was successfully deployed in several campaigns, and the capabilities of CRISP were demonstrated on the acquired data sets.

The SPLAT is now able to operate efficiently in the field for the characterization of atmospheric aerosol, while CRISP enables a fast and targeted evaluation of the data.
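
CRISP itself is an IGOR PRO package, but its k-means classification step can be sketched in Python with scikit-learn; the array shapes and the unit-sum normalization below are illustrative assumptions, not the SPLAT data format.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy stand-in for single-particle spectra: rows are particles, columns are
# m/z bins (shapes are illustrative only).
rng = np.random.default_rng(0)
spectra = rng.random((500, 300))

# Normalize each spectrum to unit total signal so clustering compares
# spectral patterns rather than absolute ion counts.
spectra /= spectra.sum(axis=1, keepdims=True)

# Partition the particles into k classes, analogous to CRISP's k-means option.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(spectra)
labels = kmeans.labels_              # class assignment of each particle
centroids = kmeans.cluster_centers_  # mean spectrum of each class
print(np.bincount(labels))           # particles per class
```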

Relevance: 100.00%

Abstract:

Kinematics is a fundamental tool for inferring the dynamical structure of galaxies and understanding their formation and evolution. Spectroscopic observations of gas emission lines are often used to derive rotation curves and velocity dispersions. It is, however, difficult to disentangle these two quantities in data of low spatial resolution because of beam smearing. In this thesis, we present 3D-Barolo, a new software package to derive the gas kinematics of disk galaxies from emission-line data-cubes. The code builds tilted-ring models in the 3D observational space and compares them with the actual data-cubes. 3D-Barolo works with data at a wide range of spatial resolutions without being affected by instrumental biases. We use 3D-Barolo to derive rotation curves and velocity dispersions of several galaxies in both the local and the high-redshift Universe. We run our code on HI observations of nearby galaxies and compare our results with traditional 2D approaches. We show that a 3D approach to the derivation of the gas kinematics is to be preferred to a 2D approach whenever a galaxy is resolved with fewer than about 20 elements across the disk. We moreover analyze a sample of galaxies at z~1, observed in the H-alpha line with the KMOS/VLT spectrograph. Our 3D modeling reveals that the kinematics of these high-z systems is comparable to that of local disk galaxies, with steeply rising rotation curves followed by a flat part and H-alpha velocity dispersions of 15-40 km/s over the whole disks. This evidence suggests that disk galaxies were already fully settled about 7-8 billion years ago. In summary, 3D-Barolo is a powerful and robust tool to separate physical from instrumental effects and to derive reliable kinematics. The analysis of large samples of galaxies at different redshifts with 3D-Barolo will provide new insights into how galaxies assemble and evolve throughout cosmic time.
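
The core idea, building a tilted-ring model cube and comparing it directly with the observed cube rather than with a 2D velocity field, can be reduced to a residual computation. The sketch below uses an absolute-residual metric and illustrative array shapes; it is a conceptual simplification, not 3D-Barolo's exact implementation.

```python
import numpy as np

def cube_residual(data_cube, model_cube, mask=None):
    """Sum of absolute residuals between an observed and a model data-cube.

    Both cubes have shape (n_channels, ny, nx); an optional boolean mask
    restricts the comparison to pixels with detected emission. Minimizing
    this residual over the ring parameters (rotation velocity, velocity
    dispersion, inclination, ...) is the essence of a 3D tilted-ring fit,
    and it is this cube-level comparison that makes the result robust to
    beam smearing.
    """
    resid = np.abs(data_cube - model_cube)
    if mask is not None:
        resid = resid[mask]
    return resid.sum()
```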

Relevance: 100.00%

Abstract:

With this dissertation research we investigate intersections between design and marketing and, in this respect, which factors contribute to a product design becoming brand formative. We have developed a Brand Formative Design (BFD) framework, which investigates individual design features in a holistic, comparable, brand-relevant, and consumer-specific context. We discuss what kinds of characteristics contribute to BFD, but also illuminate how they should be applied, and examine:

- A holistic framework leading to Brand Formative Design.
- Identification and assessment of BFD drivers.
- The dissection of products into three Distinctive Design Levels.
- The detection of surprising design preferences.
- The appropriate degree of scheme deviation with evolutionary design.
- Simulated BFD development processes with three different products and the integration of consumers.
- Future-oriented objectification, comparability and assessment of design.
- Recommendations for the management of design in a brand-specific context.

Design is a product feature which contributes significantly to the success of products. However, the development of new design presents challenges. Design can hardly be objectified; many people have an opinion on the attractiveness of new products but cannot formulate their future preferences. Product design is widely developed based on intuition, which makes design difficult to manage. Here the concept of Brand Formative Design provides a framework that helps to structure, objectify, develop and assess new evolutionary design in brand-relevant and future-relevant contexts, while integrating consumers and their preferences without restricting creativity too much.

Relevance: 100.00%

Abstract:

Quantitative sensory tests are widely used in human research to evaluate the effect of analgesics and explore altered pain mechanisms, such as central sensitization. In order to apply these tests in clinical practice, knowledge of reference values is essential. The aim of this study was to determine the reference values of pain thresholds for mechanical and thermal stimuli, as well as withdrawal time for the cold pressor test in 300 pain-free subjects. Pain detection and pain tolerance thresholds to pressure, heat and cold were determined at three body sites: (1) lower back, (2) suprascapular region and (3) second toe (for pressure) or the lateral aspect of the leg (for heat and cold). The influences of gender, age, height, weight, body-mass index (BMI), body side of testing, depression, anxiety, catastrophizing and parameters of Short-Form 36 (SF-36) were analyzed by multiple regressions. Quantile regressions were performed to define the 5th, 10th and 25th percentiles as reference values for pain hypersensitivity and the 75th, 90th and 95th percentiles as reference values for pain hyposensitivity. Gender, age and/or the interaction of age with gender were the only variables that consistently affected the pain measures. Women were more pain sensitive than men. However, the influence of gender decreased with increasing age. In conclusion, normative values of parameters related to pressure, heat and cold pain stimuli were determined. Reference values have to be stratified by body region, gender and age. The determination of these reference values will now allow the clinical application of the tests for detecting abnormal pain reactions in individual patients.
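
The percentile reference values described here come from quantile regression, which can be reproduced with statsmodels; the covariates match the study design, but the data below are simulated and the variable names are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy data standing in for the study sample: pressure pain thresholds (kPa)
# with age and gender as covariates (values are simulated, for illustration).
rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "age": rng.uniform(18, 80, n),
    "female": rng.integers(0, 2, n),
})
df["ppt"] = 400 + 2.0 * df["age"] - 80 * df["female"] + rng.normal(0, 60, n)

# Quantile regression for the 5th percentile: a patient whose threshold falls
# below this curve would be flagged as hypersensitive for their age and gender;
# the 95th percentile (q=0.95) gives the hyposensitivity cut-off.
model = smf.quantreg("ppt ~ age + female", df).fit(q=0.05)
print(model.params)
```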

Relevance: 100.00%

Abstract:

Bite mark analysis offers the opportunity to identify the biter based on the individual characteristics of the dentition. Normally the main focus is on analysing bite mark injuries on human bodies, but bite marks in food may also play an important role in the forensic investigation of a crime. This study presents a comparison of simulated bite marks in different kinds of food with the dentitions of the presumed biters. Bite marks were produced by six adults in slices of buttered bread, apples, different kinds of Swiss chocolate and Swiss cheese. The influence of time lapse on the bite marks in food, under room-temperature conditions, was also examined. For the documentation of the bite marks and of the biters' dentitions, 3D optical surface scanning technology was used. The comparison was performed using two different software packages: the ATOS modelling and analysing software and the 3D Studio Max animation software. The ATOS software enables an automatic computation of the deviation between two meshes. In the present study, the bite marks were compared with the dentitions, as were the meshes of each bite mark recorded at the different stages of time lapse. In the 3D Studio Max software, the act of biting was animated to compare the dentitions with the bite mark. The examined foods recorded the individual characteristics of the dentitions very well. In all cases the biter could be identified, and the dentitions of the other presumed biters could be excluded. The influence of time lapse on the food depends on the kind of food and is shown in the diagrams. However, the identification of the biter could still be performed after a period of time, based on the recorded individual characteristics of the dentitions.
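
The mesh-to-mesh deviation that ATOS computes automatically boils down to nearest-neighbour distances between two scanned surfaces. A minimal sketch with SciPy follows; the point clouds are simulated, for illustration only.

```python
import numpy as np
from scipy.spatial import cKDTree

def mesh_deviation(vertices_a, vertices_b):
    """Nearest-neighbour deviation from surface A to surface B.

    vertices_a, vertices_b : (N, 3) arrays of surface points, e.g. exported
    from a 3D surface scan. Returns the per-point distances, whose mean and
    maximum summarize how well a dentition matches a bite mark.
    """
    tree = cKDTree(vertices_b)
    distances, _ = tree.query(vertices_a)
    return distances

# Example with two noisy copies of the same random surface patch.
rng = np.random.default_rng(42)
a = rng.random((1000, 3))
b = a + rng.normal(0.0, 0.001, a.shape)
d = mesh_deviation(a, b)
print(f"mean deviation {d.mean():.4f}, max deviation {d.max():.4f}")
```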

Relevance: 100.00%

Abstract:

The accurate co-alignment of the transmitter to the receiver of the BepiColombo Laser Altimeter is a challenging task, for which an original alignment concept had to be developed. We present here the design, construction and testing of a large collimator facility built to fulfil the tight alignment requirements. We describe in detail the solution found to attenuate the high energy of the instrument's laser transmitter with an original beam-splitting pentaprism group. We list the different steps of the calibration of the alignment facility and estimate the errors made at each of these steps. Finally, we show that the current facility is ready for the alignment of the flight instrument; its angular accuracy is 23 μrad.

Relevance: 100.00%

Abstract:

The three-dimensional documentation of footwear and tyre impressions in snow offers the opportunity to capture finer detail for identification than present photographic methods do. Up to now, different casting methods have been used for this purpose, and casting footwear impressions in snow has always been a difficult assignment. This work demonstrates that the non-destructive method of 3D optical surface scanning is suitable for the three-dimensional documentation of impressions in snow. The new method delivers more detailed results of higher accuracy than the conventional casting techniques. The results obtained with this easy-to-use, mobile 3D optical surface scanner were very satisfactory under different meteorological and snow conditions. The method is also suitable for impressions in soil, sand and other materials. In addition to the side-by-side comparison, the automatic comparison of the 3D models and the computation of the deviations and of the accuracy of the data simplify the examination and deliver objective, reliable results. The results can be visualized efficiently, and data exchange between investigating authorities at a national or international level can be achieved easily with electronic data carriers.

Relevance: 100.00%

Abstract:

The examination of traffic accidents is daily routine in forensic medicine. An important question in the analysis of the victims of traffic accidents, for example in collisions between motor vehicles and pedestrians or cyclists, is the impact situation. Apart from the forensic medical examination (external examination and autopsy), three-dimensional technologies and methods are gaining importance in forensic investigations. Besides post-mortem multi-slice computed tomography (MSCT) and magnetic resonance imaging (MRI) for the documentation and analysis of internal findings, highly precise 3D surface scanning is employed for the documentation of the external body findings and of injury-inflicting instruments. The correlation of the injuries of the body to the injury-inflicting object and to the accident mechanism is of great importance. The applied methods include the documentation of the external and internal body, of the involved vehicles and of the inflicting tools, as well as the analysis of the acquired data. The body surface and the accident vehicles with their damage were digitized by 3D surface scanning. For the internal findings of the body, post-mortem MSCT and MRI were used. The analysis included the processing of the obtained data into 3D models, the determination of the driving direction of the vehicle, the correlation of injuries to the vehicle damage, the geometric determination of the impact situation and the evaluation of further findings of the accident. In the following article, the benefits of 3D documentation and of computer-assisted, drawn-to-scale 3D comparisons of the relevant injuries with the damage to the vehicle in the analysis of the course of an accident, especially with regard to the impact situation, are demonstrated in two examined cases.

Relevance: 100.00%

Abstract:

A recent article in this journal (Ioannidis JP (2005) Why most published research findings are false. PLoS Med 2: e124) argued that more than half of published research findings in the medical literature are false. In this commentary, we examine the structure of that argument and show that it has three basic components:

1) An assumption that the prior probability of most hypotheses explored in medical research is below 50%.
2) Dichotomization of P-values at the 0.05 level and introduction of a "bias" factor (produced by significance-seeking), the combination of which severely weakens the evidence provided by every design.
3) Use of Bayes theorem to show that, in the face of weak evidence, hypotheses with low prior probabilities cannot have posterior probabilities over 50%.

Thus, the claim is based on a priori assumptions that most tested hypotheses are likely to be false, and the inferential model used then makes it impossible for evidence from any study to overcome this handicap. We focus largely on step (2), explaining how the combination of dichotomization and "bias" dilutes experimental evidence, and showing how this dilution leads inevitably to the stated conclusion. We also demonstrate a fallacy in another important component of the argument: that papers in "hot" fields are more likely to produce false findings. We agree with the paper's conclusions and recommendations that many medical research findings are less definitive than readers suspect, that P-values are widely misinterpreted, that bias of various forms is widespread, that multiple approaches are needed to prevent the literature from being systematically biased, and that more data on the prevalence of false claims are needed. But calculating the unreliability of the medical research literature, in whole or in part, requires more empirical evidence and different inferential models than were used. The claim that "most research findings are false for most research designs and for most fields" must be considered as yet unproven.
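
To make the Bayesian step (3) concrete, here is a minimal sketch of the posterior calculation; the priors, power and significance level are illustrative numbers in the spirit of the argument, not figures taken from either paper.

```python
def posterior_true(prior, power=0.8, alpha=0.05):
    """P(hypothesis true | 'significant' result) by Bayes' theorem,
    treating the dichotomized result as the only evidence."""
    return power * prior / (power * prior + alpha * (1 - prior))

# With an even prior, a significant result is highly convincing:
print(round(posterior_true(0.50), 2))                        # ~0.94
# With the low priors of step (1), it is much less so:
print(round(posterior_true(0.10), 2))                        # ~0.64
# And once step (2) dilutes the evidence (bias pushing the effective
# power and false-positive rate together), the posterior stays well
# below 50% no matter what an individual study shows:
print(round(posterior_true(0.10, power=0.2, alpha=0.2), 2))  # ~0.10
```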

Relevance: 100.00%

Abstract:

This paper describes the use of model-based geostatistics for choosing the optimal set of sampling locations, collectively called the design, for a geostatistical analysis. Two types of design situations are considered. These are retrospective design, which concerns the addition of sampling locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing optimal positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model parameter values are unknown. The results show that in this situation a wide range of inter-point distances should be included in the design, and the widely used regular design is therefore not the optimal choice.
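
As a point of reference for the prospective design problem, a minimal greedy sketch follows: it adds sampling locations one at a time so as to minimize the average simple-kriging prediction variance over a grid, under a known exponential covariance. This is a deliberate simplification; unlike the Bayesian criterion proposed in the paper, it treats the covariance parameters as known, which is exactly the assumption that favours regular, spread-out designs.

```python
import numpy as np

def kriging_variance(candidates, design, sigma2=1.0, phi=0.3):
    """Simple-kriging prediction variance at candidate points, using an
    exponential covariance C(h) = sigma2 * exp(-h / phi), known mean."""
    def cov(a, b):
        h = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        return sigma2 * np.exp(-h / phi)
    K = cov(design, design)
    k = cov(candidates, design)
    w = np.linalg.solve(K, k.T)            # kriging weights
    return sigma2 - np.sum(k.T * w, axis=0)

# Greedy retrospective design: starting from an existing design, repeatedly
# add the candidate whose inclusion most reduces the average prediction
# variance over a grid of prediction locations.
rng = np.random.default_rng(7)
grid = np.array([(x, y) for x in np.linspace(0, 1, 11)
                         for y in np.linspace(0, 1, 11)])
design = rng.random((5, 2))                # existing sampling locations
candidates = rng.random((40, 2))           # possible new locations

for _ in range(3):                         # add three new points
    scores = [kriging_variance(grid, np.vstack([design, c])).mean()
              for c in candidates[:, None, :]]
    best = int(np.argmin(scores))
    design = np.vstack([design, candidates[best]])
    candidates = np.delete(candidates, best, axis=0)
print(design)
```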