936 results for Target Field Method


Relevance: 30.00%

Abstract:

Embedded siloxane polymer waveguides have shown promising results for use in optical backplanes. They exhibit high temperature stability and low optical absorption, and they can be fabricated with common processing techniques. A challenging aspect of this technology is out-of-plane coupling of the waveguides. A multi-software approach to modeling an optical vertical interconnect (via) is proposed: the beam propagation method is used to generate varied modal field distributions, which are then propagated through a via model using the angular spectrum propagation technique. Simulation results show average losses between 2.5 and 4.5 dB for different initial input conditions. Certain configurations show losses of less than 3 dB, and it is shown that for an input/output pair of vias, the average loss per via may be lower than the targeted 3 dB.
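As a quick illustration of how such losses combine (a sketch, not the paper's model), optical insertion loss in dB and the per-via average for a cascaded input/output pair can be computed as follows; the transmission values are hypothetical:

```python
import math

def loss_db(p_in, p_out):
    """Insertion loss in dB from input and output optical power."""
    return -10.0 * math.log10(p_out / p_in)

# A via that transmits 50% of the incident power loses ~3 dB:
single = loss_db(1.0, 0.5)

# For an input/output pair of vias in series, powers multiply, so
# dB losses add; the per-via average is half the cascade's total:
pair_total = loss_db(1.0, 0.5 * 0.6)   # two vias, 50% and 60% transmission
per_via = pair_total / 2.0
```

With these illustrative numbers the pair averages about 2.6 dB per via, below the 3 dB target even though one via alone loses 3 dB.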

Relevance: 30.00%

Abstract:

The Pacaya volcanic complex is part of the Central American volcanic arc, which is associated with the subduction of the Cocos plate beneath the Caribbean plate. Located 30 km south of Guatemala City, Pacaya sits on the southern rim of the Amatitlán Caldera. It is the largest post-caldera volcano and has been one of Central America's most active volcanoes over the last 500 years. Between 400 and 2000 years B.P., Pacaya experienced a major collapse that left a horseshoe-shaped scarp still visible today. In recent years, several smaller collapses associated with the activity of the volcano (in 1961 and 2010) have affected its northwestern flank; these are likely induced by local and regional stress changes. The similar orientation of dry and volcanic fissures and the distribution of new vents suggest reactivation of the pre-existing stress configuration responsible for the old collapse. This paper presents the first stability analysis of the Pacaya volcanic flank. The inputs to the geological and geotechnical models were defined from stratigraphical, lithological, and structural data and from material properties obtained by field survey and laboratory tests. Based on their mechanical characteristics, three lithotechnical units were defined: Lava, Lava-Breccia, and Breccia-Lava. The Hoek-Brown failure criterion was applied to each lithotechnical unit, and the rock-mass friction angle, apparent cohesion, and strength and deformation characteristics were computed over a specified stress range. The stability of the volcano was then evaluated in two-dimensional analyses using the limit equilibrium method (LEM, ROCSCIENCE) and the finite element method (FEM, PHASE 2 7.0). The stability analysis focused mainly on the modern Pacaya edifice built inside the collapse amphitheatre of "Old Pacaya".
Volcanic instability was assessed through the variability of the safety factor in deterministic, sensitivity, and probabilistic analyses, considering gravitational instability and external forces such as magma pressure and seismicity as potential triggering mechanisms of lateral collapse. The preliminary results provide two insights: first, the least stable sector is the south-western flank of the volcano; second, even the lowest safety factor indicates that the edifice is stable under gravity alone, so an external triggering mechanism represents the likely destabilizing factor.
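The paper's limit-equilibrium analysis is a full 2-D computation, but the idea of a factor of safety can be sketched with the classic infinite-slope Mohr-Coulomb model; all parameter values below are hypothetical, not the paper's:

```python
import math

def infinite_slope_fs(c_kpa, phi_deg, gamma_knm3, depth_m, beta_deg, u_kpa=0.0):
    """Factor of safety for a planar (infinite-slope) failure surface with
    Mohr-Coulomb strength: FS = resisting shear stress / driving shear stress."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    sigma_n = gamma_knm3 * depth_m * math.cos(beta) ** 2            # normal stress, kPa
    tau = gamma_knm3 * depth_m * math.sin(beta) * math.cos(beta)    # shear stress, kPa
    return (c_kpa + (sigma_n - u_kpa) * math.tan(phi)) / tau

# Hypothetical lava-breccia unit: c' = 50 kPa, phi = 35 deg, unit weight
# 25 kN/m^3, slip surface 20 m deep on a 30 deg slope:
fs_dry = infinite_slope_fs(50.0, 35.0, 25.0, 20.0, 30.0)
# Pore/magma pressure acts like the external triggers discussed above:
fs_pressurized = infinite_slope_fs(50.0, 35.0, 25.0, 20.0, 30.0, u_kpa=100.0)
```

FS > 1 indicates stability under gravity alone; raising the pressure term lowers FS, mirroring the abstract's conclusion that external forcing is the likely destabilizing factor.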

Relevance: 30.00%

Abstract:

Determining how an exhaust system will perform acoustically before a prototype muffler is built can save the designer substantial time and resources. To use the available simulation tools effectively, it is important to understand which tool best suits the intended analysis and how typical elements of an exhaust system affect muffler performance. This thesis presents an in-depth look at the available tools and their most beneficial uses. A full parametric study of typical muffler elements was conducted using the finite element method (FEM) and correlated with experimental results. The thesis lays out the groundwork for accurately predicting free-field sound pressure levels for an exhaust system with the engine properties included. The accuracy of the model depends heavily on a correct temperature profile as well as on accurate source properties; these factors are discussed in detail and methods for determining them are presented. The secondary effects of mean flow, which affect both acoustic wave propagation and flow noise generation, are discussed, together with effective ways of predicting them. Experimental models are tested on a flow rig that showcases these phenomena.
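As a minimal example of the kind of muffler-element behavior such tools predict, the textbook plane-wave transmission loss of a simple expansion chamber (an illustrative formula, not the thesis's FEM model) can be evaluated directly:

```python
import math

def expansion_chamber_tl(area_ratio, length_m, freq_hz, c=343.0):
    """Plane-wave transmission loss (dB) of a simple expansion chamber.
    area_ratio m = chamber cross-section / pipe cross-section; valid below
    the chamber's first higher-order-mode cut-on frequency."""
    k = 2.0 * math.pi * freq_hz / c            # acoustic wavenumber
    m = area_ratio
    return 10.0 * math.log10(
        1.0 + 0.25 * (m - 1.0 / m) ** 2 * math.sin(k * length_m) ** 2)

# TL peaks where k*L = pi/2 (quarter-wave condition) and vanishes where
# k*L is a multiple of pi. Hypothetical chamber: area ratio 9, 0.3 m long.
tl_peak = expansion_chamber_tl(9.0, 0.3, 343.0 / (4 * 0.3))    # k*L = pi/2
tl_trough = expansion_chamber_tl(9.0, 0.3, 343.0 / (2 * 0.3))  # k*L = pi
```

The temperature dependence the thesis emphasizes enters through the speed of sound `c`, which shifts every peak and trough of this curve.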

Relevance: 30.00%

Abstract:

An invisibility cloak is a device that hides a target by shielding it from incident radiation. This intriguing device has attracted much attention since it was first implemented at microwave frequencies in 2006. However, shortcomings of existing cloak designs prevent them from being widely applied in practice. In this dissertation, we try to remove or alleviate three constraints on practical application: lossy cloaking media, high implementation complexity, and the small size of hidden objects relative to the incident wavelength. To facilitate cloak design and experimental characterization, several devices and techniques for measuring the complex permittivity of dielectric materials at microwave frequencies are developed. In particular, a unique parallel-plate waveguide chamber has been set up to automatically map the electromagnetic (EM) field distribution for wave propagation through resonator arrays and cloaking structures. The total scattering cross section of the cloaking structures was derived from the scattered field measured with this apparatus. To overcome the adverse effects of lossy cloaking media, microwave cloaks composed of identical dielectric resonators made of low-loss ceramic materials are designed and implemented. The effective permeability dispersion was provided by tailoring the dielectric resonator filling fractions. The cloak performance was verified by full-wave simulation of the true multi-resonator structures and by experimental measurements of fabricated prototypes. To reduce the implementation complexity caused by the use of metamaterials, we propose 2-D cylindrical and 3-D spherical cloaks built from multilayer coatings of ordinary dielectric material (εr > 1). A genetic algorithm was employed to optimize the dielectric profiles of the cloaking shells for minimum scattering cross section of the cloaked targets.
The designed cloaks can easily be scaled to various operating frequencies. Simulation results show that the multilayer cylindrical cloak substantially outperforms a similarly sized metamaterial cloak designed with transformation-optics-based reduced parameters. For the designed spherical cloak, the simulated scattering pattern shows that the total scattering cross section is greatly reduced; in addition, scattering in specific directions can be significantly reduced. It is also shown that the cloaking efficiency for larger targets can be improved by employing lossy materials in the shell. Finally, we propose hiding a target inside a waveguide structure filled only with epsilon-near-zero materials, which are easy to implement in practice. The cloaking efficiency of this method, which was found to increase for larger targets, has been confirmed both theoretically and by simulations.
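A genetic algorithm over layer permittivities can be sketched as below. The cost function here is a toy stand-in for the scattering cross section actually minimized in the dissertation, and every number (layer count, population size, target profile) is illustrative:

```python
import random

def evolve(cost, n_layers=4, pop=30, gens=60, eps_min=1.0, eps_max=10.0):
    """Minimal genetic algorithm: each individual is a list of relative
    permittivities (eps_r > 1) for the layers of a cloaking shell."""
    rnd = random.Random(0)                     # fixed seed for repeatability
    popn = [[rnd.uniform(eps_min, eps_max) for _ in range(n_layers)]
            for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=cost)                    # rank by scattering cost
        elite = popn[:pop // 3]                # keep the best third (elitism)
        children = []
        while len(elite) + len(children) < pop:
            a, b = rnd.sample(elite, 2)
            cut = rnd.randrange(1, n_layers)   # one-point crossover
            child = a[:cut] + b[cut:]
            i = rnd.randrange(n_layers)        # mutate one layer
            child[i] = min(eps_max, max(eps_min, child[i] + rnd.gauss(0.0, 0.5)))
            children.append(child)
        popn = elite + children
    return min(popn, key=cost)

# Toy cost minimized by a smoothly graded profile (not an EM computation):
target = [2.0, 4.0, 6.0, 8.0]
toy_cost = lambda eps: sum((e - t) ** 2 for e, t in zip(eps, target))
best = evolve(toy_cost)
```

In the real design the cost evaluation would be a full scattering calculation for the candidate shell, which is exactly why a derivative-free optimizer like a GA is attractive here.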

Relevance: 30.00%

Abstract:

The aim of this study was to investigate the effect of single-pulse transcranial magnetic stimulation (TMS) on the triggering of saccades. The right frontal eye field was stimulated during modified gap and overlap paradigms in which the lateral visual target was flashed for 80 ms. To examine possible facilitating or inhibitory effects on saccade triggering, three stimulation intervals were chosen: simultaneously with target onset, during target presentation, and after target end. Stimulation applied simultaneously with target onset significantly decreased the latency of contralateral saccades in the gap paradigm but not in the overlap paradigm. Stimulation after target end significantly increased saccade latency for both sides in the gap paradigm and for the contralateral side in the overlap paradigm. Stimulation during presentation had no effect in either paradigm. The results show that, depending on the timing of stimulation and the paradigm tested, either facilitation or inhibition of saccade triggering can be achieved. The results are discussed in the context of two probable TMS effects: direct interference with the frontal eye field on the one hand, and remote interference with the superior colliculus on the other.

Relevance: 30.00%

Abstract:

All optical systems that operate in or through the atmosphere suffer from turbulence-induced image blur. Both military and civilian surveillance, gun-sighting, and target identification systems are interested in terrestrial imaging over very long horizontal paths, but atmospheric turbulence can blur the resulting images beyond usefulness. My dissertation explores the performance of a multi-frame blind deconvolution technique applied under anisoplanatic conditions for both Gaussian and Poisson noise model assumptions. The technique is evaluated for reconstructing images of scenes corrupted by turbulence in long horizontal-path imaging scenarios and compared to speckle imaging techniques. Performance is evaluated via the reconstruction of a common object from three sets of simulated turbulence-degraded imagery representing low, moderate, and severe turbulence conditions, each set consisting of 1000 simulated images. The mean-square-error (MSE) performance of the estimator is evaluated as a function of the number of input images and the number of Zernike polynomial terms used to characterize the point spread function. I compare the MSE performance of speckle imaging methods and a maximum-likelihood multi-frame blind deconvolution (MFBD) method applied to long-path horizontal imaging scenarios, with both methods reconstructing a scene from simulated imagery featuring anisoplanatic turbulence-induced aberrations. The comparison shows that speckle imaging techniques reduce the MSE by 46, 42, and 47 percent on average for the low, moderate, and severe cases, respectively, using 15 input frames under daytime conditions and moderate frame rates; the MFBD method provides 40, 29, and 36 percent improvements in MSE on average under the same conditions.
The comparison is repeated under low-light conditions (less than 100 photons per pixel), where improvements of 39, 29, and 27 percent are obtained using speckle imaging methods with 25 input frames, and 38, 34, and 33 percent for the MFBD method with 150 input frames. The MFBD estimator is also applied to three sets of field data and the results are presented. Finally, a combined bispectrum-MFBD hybrid estimator is proposed and investigated; it consistently provides a lower MSE and a smaller variance in the estimate under all three simulated turbulence conditions.
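The percent-improvement figures quoted above have a simple definition: MSE of the reconstruction relative to MSE of the raw degraded imagery, both against the known truth. A minimal sketch with made-up pixel values:

```python
def mse(a, b):
    """Mean-square error between two equally sized images (flat lists)."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def mse_improvement_pct(truth, degraded, reconstructed):
    """Percent MSE reduction achieved by a reconstruction, relative to
    the raw turbulence-degraded imagery."""
    before = mse(truth, degraded)
    after = mse(truth, reconstructed)
    return 100.0 * (before - after) / before

# Hypothetical 4-pixel scene: truth, a blurred observation, and a restoration.
truth = [0.0, 1.0, 1.0, 0.0]
degraded = [0.3, 0.6, 0.7, 0.4]
restored = [0.1, 0.9, 0.95, 0.1]
gain = mse_improvement_pct(truth, degraded, restored)
```

In the simulations above, this quantity is averaged over each set of 1000 frames; only in simulation is `truth` available, which is why the estimators are benchmarked on synthetic imagery before being applied to field data.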

Relevance: 30.00%

Abstract:

This paper presents a comparison of principal component (PC) regression and regularized expectation maximization (RegEM) for reconstructing European summer and winter surface air temperature over the past millennium. Reconstruction is performed within a surrogate climate using the National Center for Atmospheric Research (NCAR) Climate System Model (CSM) 1.4 and the climate model ECHO-G 4, assuming different white and red noise scenarios to define the distortion of the pseudoproxy series. We show how such sensitivity tests yield valuable a priori information that provides a basis for improving real-world proxy reconstructions. Our results emphasize the need to carefully test and evaluate reconstruction techniques with respect to the temporal resolution and spatial scale they are applied to. Furthermore, we demonstrate that uncertainties inherent to the predictand and predictor data have to be taken into account more rigorously. The comparison of the two statistical techniques, in the specific experimental setting presented here, indicates that RegEM achieves more skilful results because low-frequency variability is better preserved. We further detect seasonal differences in reconstruction skill at the continental scale; e.g., the target temperature average is reconstructed more adequately for summer than for winter. For the specific predictor network given in this paper, both techniques underestimate the target temperature variations to an increasing extent as more noise is added to the signal, albeit less so for RegEM than for PC regression. We conclude that climate field reconstruction techniques can be improved and need further optimization in future applications.

Relevance: 30.00%

Abstract:

A multilocus sequence typing (MLST) scheme was established and evaluated for Mycoplasma hyopneumoniae, the etiologic agent of enzootic pneumonia in swine, with the aim of defining strains. Putative target genes were selected by genome sequence comparisons. Out of 12 housekeeping genes chosen and experimentally validated, the 7 genes efp, metG, pgiB, recA, adk, rpoB, and tpiA were finally used to establish the MLST scheme. Their usefulness was assessed individually and in combination using a set of well-defined field samples and strains of M. hyopneumoniae. A reduction to the three targets showing the highest variation (adk, rpoB, and tpiA) was possible, resulting in the same number of sequence types as the seven targets. The established MLST approach was compared with the recently described typing method using the serine-rich repeat motif-encoding region of the p146 gene. The two methods were coherent, but MLST gave a slightly higher resolution. Farms recognized to be affected by enzootic pneumonia were always associated with a single M. hyopneumoniae clone, which in most cases differed from farm to farm; however, farms in close geographic or operational contact showed identical clones as defined by MLST typing. Population analysis showed that recombination occurs in M. hyopneumoniae and that strains are very diverse, with only limited clonality observed. Elaborate classical MLST schemes using multiple targets might therefore be of limited value for M. hyopneumoniae. In contrast, MLST typing using the three genes adk, rpoB, and tpiA seems sufficient for epidemiological investigations by direct amplification of the target genes from lysates of clinical material without prior cultivation.
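The core of an MLST scheme — numbering the distinct alleles of each gene and combining an isolate's allele numbers into a sequence type (ST) — can be sketched as follows; the sequences and farm names are invented for illustration:

```python
def assign_sequence_types(sequences_by_isolate):
    """Number distinct alleles per gene in order of first appearance,
    then combine each isolate's allele numbers into an ST profile."""
    genes = ["adk", "rpoB", "tpiA"]          # the reduced three-gene scheme
    allele_ids = {g: {} for g in genes}      # sequence -> allele number
    profiles = {}
    for isolate, seqs in sequences_by_isolate.items():
        profile = []
        for g in genes:
            ids = allele_ids[g]
            # New allele sequences get the next free number for that gene:
            profile.append(ids.setdefault(seqs[g], len(ids) + 1))
        profiles[isolate] = tuple(profile)
    return profiles

# Hypothetical mini data set: farms A and B share a clone, farm C differs.
data = {
    "farmA": {"adk": "ATGAA", "rpoB": "GGTCC", "tpiA": "TTACG"},
    "farmB": {"adk": "ATGAA", "rpoB": "GGTCC", "tpiA": "TTACG"},
    "farmC": {"adk": "ATGCA", "rpoB": "GGTCC", "tpiA": "TTAAG"},
}
profiles = assign_sequence_types(data)
```

Identical profiles indicate the same clone (as for the epidemiologically linked farms above), while any allele difference yields a distinct ST. Real schemes use curated allele databases rather than first-appearance numbering.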

Relevance: 30.00%

Abstract:

Target difficulty is often argued to increase performance. While this association is well established in experimental research, empirical evidence from field research is rather mixed. We attempt to explain this inconsistency by analyzing the importance of intra-year target revisions, which are especially prevalent in real-world field settings. Using survey and archival data from 97 firms, we find that firms with more challenging business unit targets revise targets more often, consistent with asymmetric, downward target revisions. Results further show that the degree to which targets are revised during a period has a negative effect on firm performance, as the anticipation of revision weakens the business unit management's performance incentives. Additionally, we find that using targets predominantly for either decision-making or control influences the overall performance effects of target revisions. Our findings may partially explain the mixed field-study evidence regarding the effects of target difficulty.

Relevance: 30.00%

Abstract:

The influence of the immediate prestimulus EEG microstate (a sub-second epoch of stable topography/map landscape) on the map landscape of visually evoked 47-channel event-related potential (ERP) microstates was examined using the frequent, non-target stimuli of a cognitive paradigm (12 volunteers). For the two most frequent prestimulus microstate classes (oriented left anterior-right posterior and right anterior-left posterior), ERP map series were selectively averaged. The post-stimulus grand-average ERP map series was segmented into microstates; 10 were found. The centroid locations of the positive and negative map areas were extracted as landscape descriptors. Significant differences (MANOVAs and t-tests) between the two prestimulus classes were found in four of the ten ERP microstates. The relative orientation of the two ERP microstate classes was the same as prestimulus in some ERP microstates, but reversed in others. Thus, the brain electric microstate at stimulus arrival influences the landscape of the post-stimulus ERP maps and therefore information processing; prestimulus microstate effects differed between post-stimulus ERP microstates.
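The centroid descriptors used here can be computed directly from channel values and electrode positions; a minimal sketch with a hypothetical 4-channel map (anterior-positive, posterior-negative):

```python
def field_centroids(values, positions):
    """Amplitude-weighted centroids of the positive and the negative map
    areas of a multichannel potential map. `values` are channel readings
    (e.g. microvolts), `positions` the (x, y) electrode coordinates."""
    def centroid(sign):
        pts = [(v, p) for v, p in zip(values, positions) if sign * v > 0]
        w = sum(abs(v) for v, _ in pts)
        x = sum(abs(v) * p[0] for v, p in pts) / w
        y = sum(abs(v) * p[1] for v, p in pts) / w
        return (x, y)
    return centroid(+1), centroid(-1)

# Hypothetical map: two anterior channels positive, two posterior negative.
vals = [2.0, 1.0, -1.0, -2.0]
pos = [(0.0, 1.0), (1.0, 1.0), (0.0, -1.0), (1.0, -1.0)]
pos_centroid, neg_centroid = field_centroids(vals, pos)
```

The line connecting the two centroids summarizes the map's orientation, which is how "left anterior-right posterior" versus "right anterior-left posterior" landscapes can be compared statistically.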

Relevance: 30.00%

Abstract:

The aim of this study was to examine the effects of aging and target eccentricity on a visual search task comprising 30 images of everyday life projected into a hemisphere, realizing a ±90° visual field. The task, performed binocularly, allowed participants to freely move their eyes to scan the images for an appearing target or distractor stimulus (presented at 10°, 30°, and 50° eccentricity). The distractor stimulus required no response, while the target stimulus had to be acknowledged by pressing the response button. One hundred and seventeen healthy subjects (mean age = 49.63 years, SD = 17.40 years, age range 20–78 years) were studied. The results show that target detection performance decreases with age as well as with increasing eccentricity, especially for older subjects. Reaction time also increases with age and eccentricity but, in contrast to target detection, shows no interaction between age and eccentricity. Eye movement analysis showed that younger subjects exhibited a passive search strategy while older subjects exhibited an active search strategy, probably to compensate for their reduced peripheral detection performance.

Relevance: 30.00%

Abstract:

Identifying and comparing different steady states is an important task in clinical decision making, where data from disparate sources, comprising diverse patient status information, have to be interpreted. An expressive representation is key to comparing such results. In this contribution we suggest a criterion for calculating a context-sensitive value based on variance analysis and discuss its advantages and limitations with reference to clinical data obtained during anesthesia. Different plasma target levels of the anesthetic propofol were preset to reach and maintain clinically desirable steady-state conditions with target controlled infusion (TCI). At the same time, systolic blood pressure was monitored, depth of anesthesia was recorded using the bispectral index (BIS), and propofol plasma concentrations were determined in venous blood samples. The presented analysis of variance (ANOVA) is used to quantify how accurately steady states can be monitored and compared using the three methods of measurement.
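A one-way ANOVA reduces to comparing between-group and within-group variance; a minimal pure-Python sketch, with invented systolic pressure readings at three hypothetical propofol target levels:

```python
def one_way_anova_f(groups):
    """F statistic of a one-way ANOVA: between-group mean square over
    within-group mean square, for k groups of possibly unequal size."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical systolic pressures (mmHg) at three propofol target levels:
levels = [
    [118.0, 121.0, 119.0, 122.0],
    [111.0, 114.0, 112.0, 113.0],
    [104.0, 106.0, 103.0, 107.0],
]
f_stat = one_way_anova_f(levels)
```

A large F means the steady states are well separated relative to the scatter within each state, which is the intuition behind using variance analysis as the comparison criterion here.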

Relevance: 30.00%

Abstract:

We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression that is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the maximum entropy method. We present a realistic test of our method in the context of the nonperturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements on the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. Finally, we reinvestigate quenched lattice QCD correlators from a previous study and provide an improved potential estimation at T ≈ 2.33 T_C.
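The inference problem addressed here is the inversion of the standard relation between a Euclidean correlator and its spectral function; written for illustration with a commonly used finite-temperature kernel (the cited work uses the kernel appropriate to the correlators it studies):

```latex
D(\tau) \;=\; \int_{0}^{\infty} d\omega \, K(\tau,\omega)\, \rho(\omega),
\qquad
K(\tau,\omega) \;=\; \frac{\cosh\!\left[\omega\,(\tau - \beta/2)\right]}{\sinh\!\left(\omega\,\beta/2\right)},
\qquad \beta = 1/T .
```

Since only O(10–100) noisy data points D(τ) constrain the continuous function ρ(ω), the inversion is ill-posed; the Bayesian prior P[ρ|I] supplies the regularization, and the improved prior described above replaces the Shannon-Jaynes entropy in that role.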

Relevance: 30.00%

Abstract:

Objective: Identification of the ventrointermediate thalamic nucleus (Vim) for image-based targeting in deep brain stimulation (DBS) remains challenging even in modern 3T high-field MRI. To evaluate the usefulness and reliability of analyzing connectivity with the cerebellum using Q-ball calculation, we performed a retrospective analysis. Method: Five patients who underwent bilateral implantation of Vim electrodes for treatment of essential tremor between 2011 and 2012 received additional preoperative Q-ball imaging. Targeting was performed according to atlas coordinates and standard MRI. In addition, we retrospectively identified the Vim by analyzing the connectivity of the thalamus with the dentate nucleus. The exact position of the active stimulation contact in the postoperative CT was correlated with the Vim as identified by Q-ball calculation. Results: Localization of the Vim by analysis of the connectivity between thalamus and cerebellum was successful in all 5 patients on both sides. The average position of the active contacts was 14.6 mm (SD 1.24) lateral, 5.37 mm (SD 0.094) posterior, and 2.21 mm (SD 0.69) cranial of MC. The cranial portion of the dentato-rubro-thalamic tract was localized an average of 3.38 mm (SD 1.57) lateral and 1.5 mm (SD 1.22) posterior of the active contact. Conclusions: Connectivity analysis by Q-ball calculation provided direct visualization of the Vim in all cases. Our preliminary results suggest that the target determined by connectivity analysis is valid and could be used in addition to, or even instead of, atlas-based targeting. Larger prospective studies are needed to determine the robustness of this method in providing refined information for the neurosurgical treatment of tremor.

Relevance: 30.00%

Abstract:

Over the last century, several mathematical models have been developed to calculate blood alcohol concentration (BAC) from the amount of ingested ethanol and vice versa. The most common one in the forensic sciences is Widmark's equation. A drinking experiment with 10 volunteers was performed with a target BAC of 1.2 g/kg, estimated using Widmark's equation together with Watson's factor. Ethanol concentrations in the blood were measured using headspace gas chromatography/flame ionization detection and, additionally, with an alcohol dehydrogenase (ADH)-based method. In a healthy 75-year-old man, a distinct discrepancy between the intended and the measured blood ethanol concentration was observed: a BAC of 1.83 g/kg was measured and the man showed signs of intoxication. A possible explanation for the discrepancy is the reduced total body water content of older people. The incident shows that caution is advised when applying these mathematical models to aged subjects, as calculated results may diverge from the behavior of the biological system.
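Widmark's equation in its simplest form relates ingested ethanol mass A, body mass m, and the distribution factor r via A = c · m · r. A sketch follows; the r and β values are textbook defaults (not the study's individually estimated parameters), and absorption kinetics are neglected:

```python
def widmark_dose_g(target_bac_gkg, body_mass_kg, r_factor):
    """Ethanol dose (g) needed to reach a target BAC (g/kg):
    Widmark's A = c * m * r, ignoring absorption and elimination."""
    return target_bac_gkg * body_mass_kg * r_factor

def widmark_bac_gkg(dose_g, body_mass_kg, r_factor, beta_per_h=0.15, hours=0.0):
    """BAC after `hours`, assuming linear elimination at beta g/kg/h."""
    c0 = dose_g / (body_mass_kg * r_factor)
    return max(0.0, c0 - beta_per_h * hours)

# Hypothetical 80 kg man with the classic male r = 0.7, target 1.2 g/kg:
dose = widmark_dose_g(1.2, 80.0, 0.7)                      # grams of ethanol
bac_2h = widmark_bac_gkg(dose, 80.0, 0.7, hours=2.0)       # after 2 hours
```

The case above illustrates the pitfall: an age-related reduction in total body water effectively lowers r, so a dose computed with a population-average r overshoots the target BAC in elderly subjects. Watson's formula addresses this by estimating total body water from age, height, and weight.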