947 results for Augmented-wave Method


Relevance:

30.00%

Publisher:

Abstract:

The Finite Difference Time Domain (FDTD) method and software are applied to obtain the waves diffracted by right-angle wedges under modulated Gaussian plane-wave illumination, and the Fast Fourier Transform (FFT) is used to obtain wideband diffraction coefficients in the illuminated (lit) region. Theta and phi polarizations in the 3-dimensional case, and TM and TE polarizations in the 2-dimensional case, are considered for the soft and hard diffraction coefficients, respectively. The FDTD results for a perfect electric conductor (PEC) wedge are compared with asymptotic expressions from the Uniform Theory of Diffraction (UTD). The PEC wedges are then extended to homogeneous conducting and dielectric building materials to obtain diffraction coefficients that are not available analytically under practical conditions.
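As a rough sketch of this kind of frequency-domain post-processing (not the paper's exact workflow), the snippet below deconvolves a recorded diffracted-field time series with the incident modulated Gaussian pulse to obtain a wideband ratio. The time step, pulse parameters, and the recorded fields are placeholders.

```python
# Hypothetical post-processing sketch: recover a wideband transfer function
# (standing in for a diffraction coefficient) by deconvolving an FDTD-recorded
# diffracted field with the incident modulated Gaussian pulse.
import numpy as np

def modulated_gaussian(t, f0, tau, t0):
    """Modulated Gaussian pulse used as the FDTD excitation."""
    return np.cos(2 * np.pi * f0 * (t - t0)) * np.exp(-((t - t0) / tau) ** 2)

dt = 1e-11                                  # assumed FDTD time step (s)
t = np.arange(0, 20e-9, dt)
f0, tau, t0 = 3e9, 0.5e-9, 2e-9

e_inc = modulated_gaussian(t, f0, tau, t0)                   # incident field at reference point
e_dif = 0.3 * modulated_gaussian(t, f0, tau, t0 + 1.5e-9)    # placeholder for recorded diffracted field

freq = np.fft.rfftfreq(t.size, dt)
E_inc = np.fft.rfft(e_inc)
E_dif = np.fft.rfft(e_dif)

# Keep only the band where the excitation carries energy, to avoid dividing by ~0.
band = np.abs(E_inc) > 0.05 * np.abs(E_inc).max()
D = np.zeros_like(E_dif)
D[band] = E_dif[band] / E_inc[band]   # wideband ratio; a spreading-factor and
                                      # phase-reference correction would follow
```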

Relevance:

30.00%

Publisher:

Abstract:

Electromagnetic waves in a suburban environment encounter multiple obstructions that shadow the signal. These waves are scattered and random in polarization. They take multiple paths that add as vectors at the portable device. Buildings have vertical and horizontal edges, and diffraction from edges has polarization-dependent characteristics. In a practical case, a signal transmitted from a vertically polarized high antenna will result in a significant fraction of the total power appearing in the horizontal polarization at street level. Signal reception can be improved whenever there is a probability of receiving the signal in at least two independent ways or branches. The Finite-Difference Time-Domain (FDTD) method was applied to obtain the two- and three-dimensional dyadic diffraction coefficients (soft and hard) of right-angle perfect electric conductor (PEC) wedges illuminated by a plane wave. The FDTD results were in good agreement with the asymptotic solutions obtained using the Uniform Theory of Diffraction (UTD). Further, the PEC wedge was replaced with a material wedge and the corresponding dyadic diffraction coefficient was obtained.
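For context, a minimal sketch of Keller's GTD diffraction coefficient for a PEC wedge, the non-uniform precursor that UTD refines with transition functions near the shadow and reflection boundaries. This is a textbook expression included only for illustration, not the FDTD-derived coefficients of the study; the frequency and angles are assumptions. A right-angle wedge corresponds to the wedge parameter n = 3/2.

```python
import numpy as np

def keller_wedge_D(k, phi, phi_inc, n=1.5, beta0=np.pi / 2, soft=True):
    """Keller GTD diffraction coefficient for a PEC wedge (exterior angle n*pi).

    n = 1.5 corresponds to a right-angle (90-degree interior) wedge.
    soft=True -> soft (TM) coefficient, soft=False -> hard (TE).
    The expression diverges at the shadow/reflection boundaries, where the
    UTD transition functions would be required.
    """
    pref = np.exp(-1j * np.pi / 4) * np.sin(np.pi / n) / (
        n * np.sqrt(2 * np.pi * k) * np.sin(beta0))
    t1 = 1.0 / (np.cos(np.pi / n) - np.cos((phi - phi_inc) / n))
    t2 = 1.0 / (np.cos(np.pi / n) - np.cos((phi + phi_inc) / n))
    return pref * (t1 - t2) if soft else pref * (t1 + t2)

# Example: 3 GHz, incidence at 60 degrees, sweep of observation angles.
k = 2 * np.pi * 3e9 / 3e8
phi = np.deg2rad(np.linspace(1, 269, 500))
Ds = keller_wedge_D(k, phi, np.deg2rad(60), soft=True)
```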

Relevance:

30.00%

Publisher:

Abstract:

Shadows and illumination play an important role in generating a realistic scene in computer graphics. Most Augmented Reality (AR) systems track markers placed in a real scene and retrieve their position and orientation to serve as a frame of reference for added computer-generated content, thereby producing an augmented scene. Realistic depiction of augmented content with coherent visual cues is a desired goal in many AR applications. However, rendering an augmented scene with realistic illumination is a complex task. Many existing approaches rely on a non-automated pre-processing phase to retrieve illumination parameters from the scene. Other techniques rely on specific markers that contain light probes to estimate environment lighting. This study aims to design a method for creating AR applications with coherent illumination and shadows, using a textured cuboid marker that does not require a training phase to provide lighting information. Such a marker is easily found in common environments: most product packaging satisfies these characteristics. Thus, we propose a way to estimate a directional light configuration using multiple texture tracking to render AR scenes in a realistic fashion. We also propose a novel feature descriptor that is used to perform multiple texture tracking. Our descriptor, named the discrete descriptor, is an extension of the binary descriptor and outperforms current state-of-the-art methods in speed while maintaining their accuracy.
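A toy sketch of the light-estimation idea under a Lambertian assumption (not the paper's algorithm; the face normals and brightness values are made up): given the tracked orientation of several lit faces of the cuboid and the mean brightness of each face, solve a small linear system for the directional light.

```python
import numpy as np

# Tracked outward normals of three visible cuboid faces (world frame) and the
# mean brightness ratio of each face (observed intensity / reference albedo).
normals = np.array([[0.0, 0.0, 1.0],    # top
                    [1.0, 0.0, 0.0],    # right
                    [0.0, -1.0, 0.0]])  # front
brightness = np.array([0.9, 0.55, 0.35])

# Lambertian assumption for lit faces: b_i ~= n_i . s, with s = intensity * light_dir.
# With more tracked faces (or an ambient term) this becomes an overdetermined
# least-squares problem; three lit faces give an exactly determined system.
s, *_ = np.linalg.lstsq(normals, brightness, rcond=None)

light_dir = s / np.linalg.norm(s)        # estimated light direction
light_intensity = np.linalg.norm(s)      # relative directional intensity
print(light_dir, light_intensity)
```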

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents and discusses the results of ambient seismic noise correlation for two different environments: an intraplate setting and the Mid-Atlantic Ridge. The coda wave interferometry method has also been tested on the intraplate data. Ambient noise correlation is a method that allows one to retrieve the structural response between two receivers from ambient noise records, as if one of the stations were a virtual source. It has been widely used in seismology to image the subsurface and to monitor structural changes, mostly associated with volcanic eruptions and large earthquakes. In the intraplate study, we were able to detect localized structural changes related to a small earthquake swarm, whose main event is mR 3.7, in north-eastern Brazil. We also showed that 1-bit normalization and spectral whitening result in the loss of waveform details, and that the phase auto-correlation, which is amplitude unbiased, seems to be more sensitive and robust for our analysis of a small earthquake swarm. The analysis of 6 months of data using cross-correlations detects clear medium changes soon after the main event, while the auto-correlations detect changes essentially after 1 month. This could be explained by fluid pressure redistribution, which can be initiated by hydromechanical changes and by pathways opened to shallower depth levels by later earthquakes. In the Mid-Atlantic Ridge study, we investigate structural changes associated with an mb 4.9 earthquake in the region of the Saint Paul transform fault. The data were recorded by a single broadband seismic station located less than 200 km from the Mid-Atlantic Ridge. The results of the phase auto-correlation for a 5-month period show a strong co-seismic medium change followed by a relatively fast post-seismic recovery. This medium change is likely related to the damage caused by the earthquake's ground shaking. The healing process (filling of the new cracks), which lasted 60 days, can be decomposed into two phases: a fast recovery (70% in ~30 days) in the early post-seismic stage and a relatively slow recovery later (30% in ~30 days). In the coda wave interferometry study, we monitor temporal changes of the subsurface caused by the small intraplate earthquake swarm mentioned previously. The method was first validated with synthetic data, with which we were able to detect a change of 2.5% in the source position and a 15% decrease in the number of scatterers. Then, from the real data, we observed a rapid decorrelation of the seismic coda after the mR 3.7 event, indicating a rapid earthquake-induced change of the subsurface in the fault region.
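One commonly used amplitude-unbiased formulation of the correlation step is the phase cross-correlation built from the analytic signal; the sketch below illustrates that formulation, not necessarily the exact implementation used in the thesis, and the trace and lag range are placeholders.

```python
import numpy as np
from scipy.signal import hilbert

def phase_cross_correlation(u1, u2, max_lag, nu=1.0):
    """Amplitude-unbiased phase cross-correlation of two equally sampled traces.

    max_lag is the maximum lag in samples; the result lies in [-1, 1].
    """
    p1 = np.exp(1j * np.angle(hilbert(u1)))   # unit-amplitude instantaneous phases
    p2 = np.exp(1j * np.angle(hilbert(u2)))
    lags = np.arange(-max_lag, max_lag + 1)
    pcc = np.empty(lags.size)
    for i, lag in enumerate(lags):
        if lag >= 0:
            a, b = p1[lag:], p2[:p2.size - lag]
        else:
            a, b = p1[:lag], p2[-lag:]
        pcc[i] = np.mean(np.abs(a + b) ** nu - np.abs(a - b) ** nu) / 2.0
    return lags, pcc

# Auto-correlation variant (single-station virtual reflection response):
# lags, pac = phase_cross_correlation(trace, trace, max_lag=2000)
```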

Relevance:

30.00%

Publisher:

Abstract:

In this study, we developed and improved the numerical mode matching (NMM) method, which has previously been shown to be a fast and robust semi-analytical solver for investigating the propagation of electromagnetic (EM) waves in an isotropic layered medium. The applicable models, such as a cylindrical waveguide, an optical fiber, or a borehole in an earth formation, are generally modeled as an axisymmetric structure, namely an orthogonal-plano-cylindrically layered (OPCL) medium consisting of materials stratified planarly and layered concentrically in the orthogonal directions.

In this report, several important improvements have been made to extend this efficient solver to the anisotropic OPCL medium. The formulas for anisotropic media with three different diagonal elements in the cylindrical coordinate system are derived to expand its application to more general materials. The perfectly matched layer (PML) is incorporated along the radial direction as an absorbing boundary condition (ABC) to make the NMM method more accurate and efficient for wave diffusion problems in unbounded media and applicable to scattering problems with lossless media. We manipulate the weak form of Maxwell's equations and impose the correct boundary conditions at the cylindrical axis to resolve the singularity problem that previous work had ignored. The spectral element method (SEM) is introduced to compute the eigenmodes more efficiently, with higher accuracy and fewer unknowns, achieving a faster mode matching procedure between different horizontal layers. We also prove the relationship between the fields for opposite mode indices under different types of excitation, which reduces the computational time by half. The formulas for computing the EM fields excited by an electric or magnetic dipole located at any position with an arbitrary orientation are derived, and the excitation is generalized to line and surface current sources, which extends the application of the NMM to simulations of controlled-source electromagnetic techniques. Numerical simulations have demonstrated the efficiency and accuracy of this method.
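To make the eigenmode step concrete, here is a deliberately simplified 1-D sketch: it computes the propagating modes of a planar layered permittivity profile with a finite-difference discretization and a dense eigensolver. The thesis instead uses a spectral-element discretization with a radial PML; the profile, dimensions, and PEC end walls below are illustrative assumptions only.

```python
import numpy as np
from scipy.linalg import eigh

# TE modes of a planar layered profile eps_r(z):
#   d^2 e/dz^2 + k0^2 eps_r(z) e = beta^2 e,
# discretized with second-order finite differences and PEC walls at both ends.
k0 = 2 * np.pi / 1.0          # free-space wavenumber for unit wavelength
nz, Lz = 400, 5.0
dz = Lz / (nz + 1)
z = np.linspace(dz, Lz - dz, nz)

eps_r = np.where(np.abs(z - Lz / 2) < 1.0, 4.0, 1.0)   # a high-index slab

main = -2.0 * np.ones(nz) / dz**2 + k0**2 * eps_r
off = np.ones(nz - 1) / dz**2
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

beta2, modes = eigh(A)                 # eigenvalues beta^2 and mode profiles
guided = beta2 > k0**2                 # modes with effective index above 1
print("number of guided modes:", guided.sum())
```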

Finally, the improved NMM method is used to efficiently compute the electromagnetic response of an induction tool to orthogonal transverse hydraulic fractures in open or cased boreholes in hydrocarbon exploration. The hydraulic fracture is modeled as a slim circular disk that is symmetric with respect to the borehole axis and filled with electrically conductive or magnetic proppant. The NMM solver is first validated by comparing the normalized secondary field with experimental measurements and a commercial software package. We then quantitatively analyze the sensitivity of the induction response to fracture parameters such as length and the conductivity and permeability of the filling proppant, in order to evaluate the effectiveness of the induction logging tool for fracture detection and mapping. Casings with different thicknesses, conductivities, and permeabilities are modeled together with the fractures to investigate their effect on fracture detection. The results reveal that, although the attenuation of the electromagnetic field through the casing is significant, the normalized secondary field is not weakened at low frequencies, so the induction tool remains applicable for fracture detection. A hybrid approach combining the NMM method with a BCGS-FFT-based integral equation solver has been proposed to efficiently simulate open or cased boreholes with tilted fractures, which constitute a non-axisymmetric model.

Relevance:

30.00%

Publisher:

Abstract:

The paper presents an investigation of the thermodynamics of the air flow in the air chamber of oscillating water column wave energy converters, in which the oscillating water surface in the water column pressurizes or depressurizes the air in the chamber. To study the thermodynamics and the compressibility of the air in the chamber, a method is developed in this research: the power take-off is replaced with an accepted semi-empirical relationship between the air flow rate and the oscillating water column chamber pressure, and the thermodynamic process is simplified as an isentropic process. This facilitates the use of a direct expression for the work done on the power take-off by the flowing air and the derivation of a single differential equation that defines the thermodynamic process occurring inside the air chamber. By solving the differential equation, the chamber pressure can be obtained if the interior water surface motion is known, or the chamber volume (and thus the interior water surface motion) if the chamber pressure is known. As a result, the effects of air compressibility can be studied. Examples given in the paper demonstrate the compressibility and its effects on the power losses for large oscillating water column devices.
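A toy version of this chamber model is sketched below: the interior water-surface motion is prescribed, the air is treated as isentropic, and a linear (Wells-type) flow-pressure relation stands in for the semi-empirical power take-off. All parameter values and the specific PTO relation are illustrative assumptions, not the paper's.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Isentropic air chamber with prescribed water-surface motion and a linear PTO
# relation Q = Lam * (p - p_atm). Illustrative parameters only.
gamma, p_atm, rho_atm = 1.4, 101325.0, 1.225
V0, Aw, a, omega = 100.0, 80.0, 0.5, 2 * np.pi / 8.0   # m^3, m^2, m, rad/s
Lam = 0.01                                              # PTO coefficient (m^3/s/Pa)

V = lambda t: V0 - Aw * a * np.sin(omega * t)           # chamber air volume
dVdt = lambda t: -Aw * a * omega * np.cos(omega * t)

def drho_dt(t, y):
    rho = y[0]
    p = p_atm * (rho / rho_atm) ** gamma                # isentropic relation
    Q = Lam * (p - p_atm)                               # flow out (+) or in (-)
    rho_flow = rho if Q > 0 else rho_atm                # exhale chamber air / inhale ambient air
    return [(-rho_flow * Q - rho * dVdt(t)) / V(t)]     # mass balance d(rho V)/dt = -rho* Q

sol = solve_ivp(drho_dt, (0, 60), [rho_atm], max_step=0.05)
p = p_atm * (sol.y[0] / rho_atm) ** gamma
pto_power = Lam * (p - p_atm) ** 2                      # instantaneous work rate on the PTO (W)
```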

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a study on the numerical simulation of the primary wave energy conversion in oscillating water column (OWC) wave energy converters (WECs). The proposed numerical approach consists of three major components: (i) a potential flow analysis for the conventional hydrodynamic parameters, such as added mass, damping coefficients, restoring force coefficients and wave excitations; (ii) a thermodynamic analysis of the air in the air chamber, under the assumptions of given power take-off characteristics and an isentropic air flow process, in which air compressibility and its effects are included; and (iii) a time-domain analysis combining the linear potential flow and the thermodynamics of the air flow in the chamber, in which the hydrodynamics and the thermodynamics/aerodynamics are coupled through the force generated by the pressurized and depressurized air in the chamber, which in turn affects the motions of the structure and the internal water surface. As an example, the newly developed approach has been applied to a fixed OWC device. Comparisons of the measured data and the simulation results show that the new method is capable of accurately predicting the performance of OWC devices.
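A schematic form of such a coupled time-domain model is written out below purely as a sketch; the symbols (inertia M, infinite-frequency added mass A&#8734;, retardation kernel K, restoring coefficient C, water-plane area S_w) are generic and the signs and force terms depend on conventions, so this should not be read as the paper's exact formulation.

```latex
% Sketch of a coupled time-domain OWC model: motion driven by wave excitation
% and chamber pressure, pressure governed by the isentropic air model and the
% assumed power take-off relation Q_PTO(p).
\begin{align}
  (M + A_\infty)\,\ddot{x}(t)
    + \int_0^{t} K(t-\tau)\,\dot{x}(\tau)\,\mathrm{d}\tau
    + C\,x(t)
    &= F_{\mathrm{exc}}(t) + S_w\,\bigl(p(t)-p_{\mathrm{atm}}\bigr), \\
  \frac{\mathrm{d}}{\mathrm{d}t}\bigl[\rho(t)\,V(t)\bigr]
    &= -\rho^{*}\,Q_{\mathrm{PTO}}\bigl(p(t)\bigr),
  \qquad
  p = p_{\mathrm{atm}}\Bigl(\tfrac{\rho}{\rho_{\mathrm{atm}}}\Bigr)^{\gamma}.
\end{align}
```

Here the chamber volume V(t) follows from the internal water-surface motion, so the two equations are advanced together in time.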

Relevance:

30.00%

Publisher:

Abstract:

Surveys can collect important data that inform policy decisions and drive social science research. Large government surveys collect information from the U.S. population on a wide range of topics, including demographics, education, employment, and lifestyle. Analysis of survey data presents unique challenges. In particular, one needs to account for missing data, for complex sampling designs, and for measurement error. Conceptually, a survey organization could spend lots of resources getting high-quality responses from a simple random sample, resulting in survey data that are easy to analyze. However, this scenario often is not realistic. To address these practical issues, survey organizations can leverage the information available from other sources of data. For example, in longitudinal studies that suffer from attrition, they can use the information from refreshment samples to correct for potential attrition bias. They can use information from known marginal distributions or survey design to improve inferences. They can use information from gold standard sources to correct for measurement error.

This thesis presents novel approaches to combining information from multiple sources that address the three problems described above.

The first method addresses nonignorable unit nonresponse and attrition in a panel survey with a refreshment sample. Panel surveys typically suffer from attrition, which can lead to biased inference when basing analysis only on cases that complete all waves of the panel. Unfortunately, the panel data alone cannot inform the extent of the bias due to attrition, so analysts must make strong and untestable assumptions about the missing data mechanism. Many panel studies also include refreshment samples, which are data collected from a random sample of new individuals during some later wave of the panel. Refreshment samples offer information that can be utilized to correct for biases induced by nonignorable attrition while reducing reliance on strong assumptions about the attrition process. To date, these bias correction methods have not dealt with two key practical issues in panel studies: unit nonresponse in the initial wave of the panel and in the refreshment sample itself. As we illustrate, nonignorable unit nonresponse can significantly compromise the analyst's ability to use the refreshment samples for attrition bias correction. Thus, it is crucial for analysts to assess how sensitive their inferences---corrected for panel attrition---are to different assumptions about the nature of the unit nonresponse. We present an approach that facilitates such sensitivity analyses, both for suspected nonignorable unit nonresponse in the initial wave and in the refreshment sample. We illustrate the approach using simulation studies and an analysis of data from the 2007-2008 Associated Press/Yahoo News election panel study.
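A toy simulation (not the thesis's model; all distributions and coefficients are made up) illustrates why a refreshment sample is informative: when attrition depends on the unseen wave-2 outcome, the complete-case mean is biased, while the refreshment sample recovers the wave-2 margin that any attrition model must reproduce.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 20000

# Two-wave panel: Y1 observed for everyone at wave 1, Y2 subject to attrition.
y1 = rng.normal(0, 1, n)
y2 = 0.6 * y1 + rng.normal(0, 0.8, n)

# Nonignorable attrition: dropping out depends on the (unseen) wave-2 outcome.
p_respond = 1 / (1 + np.exp(-(0.3 + 1.5 * y2)))
respond = rng.random(n) < p_respond

print("true E[Y2]          :", y2.mean().round(3))
print("complete cases only :", y2[respond].mean().round(3))     # biased upward

# A refreshment sample observes Y2 for new random individuals at wave 2,
# so it recovers the wave-2 margin the attrition model must match.
refresh = 0.6 * rng.normal(0, 1, 3000) + rng.normal(0, 0.8, 3000)
print("refreshment sample  :", refresh.mean().round(3))          # ~ unbiased
```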

The second method incorporates informative prior beliefs about marginal probabilities into Bayesian latent class models for categorical data. The basic idea is to append synthetic observations to the original data such that (i) the empirical distributions of the desired margins match those of the prior beliefs, and (ii) the values of the remaining variables are left missing. The degree of prior uncertainty is controlled by the number of augmented records. Posterior inferences can be obtained via typical MCMC algorithms for latent class models, tailored to deal efficiently with the missing values in the concatenated data. We illustrate the approach using a variety of simulations based on data from the American Community Survey, including an example of how augmented records can be used to fit latent class models to data from stratified samples.
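To make the augmentation step concrete, here is a minimal sketch under illustrative assumptions (variable names, category labels, and prior values are made up; the latent class MCMC step is only indicated in a comment).

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Original categorical survey data (illustrative variables).
data = pd.DataFrame({
    "education": rng.choice(["HS", "BA", "Grad"], size=1000, p=[0.5, 0.35, 0.15]),
    "employment": rng.choice(["employed", "unemployed"], size=1000, p=[0.9, 0.1]),
})

# Prior belief about the marginal distribution of education, e.g. from a census table.
prior_margin = {"HS": 0.45, "BA": 0.40, "Grad": 0.15}
n_aug = 500   # number of augmented records: larger => stronger prior

aug = pd.DataFrame({
    "education": rng.choice(list(prior_margin), size=n_aug, p=list(prior_margin.values())),
    "employment": np.nan,   # all other variables left missing in the synthetic rows
})

combined = pd.concat([data, aug], ignore_index=True)
# A latent class model would now be fit to `combined` with an MCMC sampler that
# treats the missing entries of the augmented rows as ordinary missing data.
```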

The third method leverages the information from a gold standard survey to model reporting error. Survey data are subject to reporting error when respondents misunderstand the question or accidentally select the wrong response. Sometimes survey respondents knowingly select the wrong response, for example, by reporting a higher level of education than they actually have attained. We present an approach that allows an analyst to model reporting error by incorporating information from a gold standard survey. The analyst can specify various reporting error models and assess how sensitive their conclusions are to different assumptions about the reporting error process. We illustrate the approach using simulations based on data from the 1993 National Survey of College Graduates. We use the method to impute error-corrected educational attainments in the 2010 American Community Survey using the 2010 National Survey of College Graduates as the gold standard survey.
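As a toy illustration of the reporting-error idea (not the thesis's Bayesian model; the variable names and error rates are invented), a misclassification table estimated from a gold-standard survey can be combined with a prior over the true values to impute error-corrected responses.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
levels = ["HS", "BA", "Grad"]

# Gold-standard survey: both the true and the reported education are known,
# so the reporting-error (misclassification) probabilities can be estimated.
gold = pd.DataFrame({"true": rng.choice(levels, 5000, p=[0.5, 0.35, 0.15])})
overstate = rng.random(5000) < 0.08            # assume 8% report one level higher
gold["reported"] = np.where(
    overstate & (gold["true"] == "HS"), "BA",
    np.where(overstate & (gold["true"] == "BA"), "Grad", gold["true"]))

# P(reported | true), estimated row-wise from the gold standard.
error_model = pd.crosstab(gold["true"], gold["reported"], normalize="index")
print(error_model.round(3))

# In the main survey only `reported` is observed; combining this matrix with a
# prior over the true values via Bayes' rule gives P(true | reported), from
# which error-corrected values can be imputed.
```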

Relevance:

30.00%

Publisher:

Abstract:

Reliability has emerged as a critical design constraint, especially in memories. Designers go to great lengths to guarantee fault-free operation of the underlying silicon by adopting redundancy-based techniques, which essentially try to detect and correct every single error. However, such techniques come at the cost of large area, power and performance overheads, which has led many researchers to doubt their efficiency, especially for error-resilient systems where 100% accuracy is not always required. In this paper, we present an alternative method focused on confining the output error induced by reliability issues. Focusing on memory faults, rather than correcting every single error, the proposed method exploits the statistical characteristics of the target application and replaces any erroneous data with the best available estimate of that data. To realize the proposed method, a RISC processor is augmented with custom instructions and special-purpose functional units. We apply the method to the enhanced processor by studying the statistical characteristics of the various algorithms involved in a popular multimedia application. Our experimental results show that, in contrast to state-of-the-art fault tolerance approaches, we are able to reduce runtime and area overhead by 71.3% and 83.3%, respectively.
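A software caricature of the error-confinement idea (the paper realizes this in hardware with custom instructions and functional units): when a memory read is flagged as faulty, substitute a statistically derived estimate instead of attempting exact correction. The fallback policy and data below are illustrative.

```python
import numpy as np

def confine_errors(samples, fault_mask):
    """Replace flagged samples with a best-available estimate.

    samples    : 1-D array of data read from memory (e.g. a pixel/audio stream)
    fault_mask : boolean array, True where a read is known or suspected faulty
    """
    out = samples.astype(float).copy()
    good_mean = out[~fault_mask].mean()          # statistical fallback estimate
    last_good = good_mean
    for i in range(out.size):
        if fault_mask[i]:
            out[i] = last_good                   # best available estimate
        else:
            last_good = out[i]
    return out

# Example: a smooth signal with ~1% of reads corrupted by memory faults.
x = np.sin(np.linspace(0, 6 * np.pi, 1000)) * 100
faults = np.random.default_rng(1).random(x.size) < 0.01
x_faulty = np.where(faults, 9999.0, x)           # corrupted reads
x_confined = confine_errors(x_faulty, faults)
```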

Relevance:

30.00%

Publisher:

Abstract:

The ultrasonic non-destructive testing of components may encounter considerable difficulties in interpreting some inspection results, mainly in anisotropic crystalline structures. A numerical method for the simulation of elastic wave propagation in homogeneous, elastically anisotropic media, based on the general finite element approach, is used to aid this interpretation. The successful modeling of the elastic field associated with NDE relies on the generation of a realistic pulsed ultrasonic wave, which is launched from a piezoelectric transducer into the material under inspection. The values of the elastic constants are information of great interest, enabling the application of analytical model equations to problems of small and medium complexity, as well as numerical analysis programs such as finite element and/or boundary element codes. The aim of this work is the comparison between the numerical solution for an ultrasonic wave, obtained from a transient excitation pulse that can be specified by either a force or a displacement variation across the aperture of the transducer, and the results of an experiment performed on an aluminum block at the IEN Ultrasonic Laboratory. The wave propagation can be simulated using the characteristics of the material used in the experimental evaluation together with the associated boundary conditions, and from these results the comparison can be made.
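A minimal sketch of the kind of transient excitation described above: a Gaussian-modulated tone burst, a common stand-in for the pulse launched by a piezoelectric transducer. The centre frequency and bandwidth are assumptions, not the values used in the IEN experiment.

```python
import numpy as np
from scipy.signal import gausspulse

fc = 5e6                                 # assumed 5 MHz centre frequency
fs = 100e6                               # sampling rate for the time axis
t = np.arange(-2e-6, 2e-6, 1 / fs)
pulse = gausspulse(t, fc=fc, bw=0.6)     # fractional bandwidth 0.6

# `pulse` would be applied as the force (or displacement) time history
# distributed across the nodes under the transducer aperture in the FE model.
```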

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, wind wave prediction and analysis in the southern Caspian Sea are surveyed. Because of the great importance and wide application of this subject in reducing loss of life and financial damage in marine activities, such as monitoring marine pollution, designing marine structures, shipping, fishing, the offshore industry and tourism, it has received considerable attention. For wave prediction, this study uses Caspian Sea topography data extracted from the Caspian Sea hydrography map of the Iran Armed Forces Geographical Organization and 10-meter wind field data extracted from the GTS synoptic data transmitted by regional centers to the Forecasting Center of the Iran Meteorological Organization; for wave analysis, it uses the 20012 waves recorded by the oil company's buoy located 28 kilometers off the Neka shore. The results of this research are as follows. Because of the disagreement between the SMB method's predictions for the Caspian Sea and the wave data from the Anzali and Neka buoys, the SMB method is not able to predict wave characteristics in the southern Caspian Sea. Because of the relatively good agreement between the WAM model output for the Caspian Sea and the wave data from the Anzali buoy, the WAM model is able to predict wave characteristics in the southern Caspian Sea with relatively high accuracy. The extreme wave height distribution function fitted to the southern Caspian Sea wave data is obtained by determining the free parameters of the Poisson-Gumbel function through the method of moments; these parameters are A=2.41 and B=0.33. The maximum relative error between the 4-year return value of significant wave height estimated by this function and the wave data from the Neka buoy is about 35%. The 100-year return value of the southern Caspian Sea significant wave height is about 4.97 meters. The maximum relative error between the 4-year return value estimated by the peaks-over-threshold statistical model and the wave data from the Neka buoy is about 2.28%. The parametric relation fitted to the southern Caspian Sea frequency spectra is obtained by determining the free parameters of the Strekalov, Massel and Krylov et al. multi-peak spectra through a mathematical method; these parameters are A=2.9, B=26.26, C=0.0016, m=0.19 and n=3.69. The maximum relative error between the calculated free parameters of the southern Caspian Sea multi-peak spectrum and the free parameters of the double-peaked spectrum proposed by Massel and Strekalov for experimental data from the Caspian Sea is about 36.1% in the energetic part of the spectrum and about 74% in the high-frequency part. The peaks-over-threshold wave rose of the southern Caspian Sea shows that the highest occurrence probability corresponds to waves with heights of 2-2.5 meters. The error sources in the statistical analysis are mainly due to: 1) the wave data missing over a 2-year period because of the battery discharge of the Neka buoy; and 2) the 15% deviation of the annual mean significant wave height in a single year from the long-period average value, caused by the lack of adequate measurements of oceanic waves. The error sources in the spectral analysis are mainly due to the above-mentioned items and to the low accuracy of the proposed free parameters of the double-peaked spectrum for the experimental data from the Caspian Sea.
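For orientation only, a generic peaks-over-threshold return-value calculation is sketched below. It is not the thesis's Poisson-Gumbel parametrization (A=2.41, B=0.33), and the record is synthetic; it merely shows the type of extreme-value computation behind the quoted return values.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)
hs = rng.gumbel(loc=1.0, scale=0.4, size=8 * 365 * 8)   # synthetic 3-hourly Hs record, 8 years

threshold = np.quantile(hs, 0.98)
excesses = hs[hs > threshold] - threshold
lam = excesses.size / 8.0                     # mean number of exceedances per year

c, loc, scale = genpareto.fit(excesses, floc=0.0)
T = 100.0                                      # return period in years
h_T = threshold + genpareto.ppf(1.0 - 1.0 / (lam * T), c, loc=0.0, scale=scale)
print(f"{T:.0f}-year significant wave height estimate: {h_T:.2f} m")
```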

Relevance:

30.00%

Publisher:

Abstract:

This work presents the analysis of wave and turbulence measurements collected at a tidal energy site. A new method is introduced to produce more consistent and rigorous estimates of the power spectral densities of the velocity fluctuations. An analytical function is further proposed to fit the observed spectra and could serve as input to numerical models predicting power production and structural loading on tidal turbines. Another new approach is developed to correct for the effect of Doppler noise on the high-frequency power spectral densities. The analysis of velocity time series combining wave and turbulent contributions demonstrates that the turbulent motions are coherent throughout the water column, rendering the wave coherence-based separation methods inapplicable to our dataset. To avoid this problem, an alternative approach relying on the pressure data collected by the ADCP is introduced and shows appreciable improvement in the wave-turbulence separation.
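The two post-processing steps mentioned above are sketched below in a simplified form (illustrative only; the velocity record, sampling rate, and flat-noise-floor assumption are not those of the study): estimate the velocity spectrum with Welch averaging, then subtract a Doppler-noise floor estimated from the high-frequency tail.

```python
import numpy as np
from scipy.signal import welch

fs = 4.0                                   # assumed ADCP sampling frequency (Hz)
rng = np.random.default_rng(3)
u = rng.standard_normal(int(fs * 3600))    # placeholder for a 1-hour along-beam velocity record

f, Suu = welch(u, fs=fs, nperseg=1024, detrend="linear")

# Doppler noise appears as an approximately flat floor at high frequencies.
noise_floor = np.median(Suu[f > 0.8 * f.max()])
Suu_corrected = np.clip(Suu - noise_floor, 0.0, None)
```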

Relevance:

30.00%

Publisher:

Abstract:

This dissertation research project uses the Euromaidan protests in Ukraine to inform and shape a theory of augmented dissent, helping to explain the complex ways in which protest participants, guided by political, social, and cultural contexts, engage in dissent augmented by ICTs in a reality where the physical and the digital are used in concert. The purpose of this research is to conceptualize the use and perception of ICTs in protest activity using the communicative affordances framework. Through a mixed-method research approach involving interviews with protest participants, as well as qualitative and thematic analysis of online content from the social media pages of several key Euromaidan protest communities, the project examines the role ICTs played in the information and media landscape during the Euromaidan protest. The findings of the online content analysis were used to inform the questions for the 59 semi-structured, open-ended interviews with Euromaidan protest participants in Ukraine and abroad. The research findings provide in-depth insight into how ICTs were used and perceived by protest participants, and into their role as vehicles for information and civic media content. The study employs the theoretical framework of social media affordances to interpret the data gathered during the interviews and content analysis, in order to better understand how digital media augmented citizens' protest activity by affording them new possibilities for dissent, and how they made meaning of that protest activity as augmented by ICTs. The findings contribute towards shaping a theory of digitally augmented dissent that conceptualizes the complex relationship between citizens and ICTs during protest activity as an affordance-driven one, where online and offline tools and activity merge into a unified dissent space and extend or augment the possibilities for action in interesting and sometimes unexpected ways. Such a conceptual model could inform broader theories about civic participation and digital activism in the post-Soviet world and beyond, as ICTs become an inseparable part of civic life.

Relevance:

30.00%

Publisher:

Abstract:

Direct sampling methods are increasingly being used to solve the inverse medium scattering problem of estimating the shape of the scattering object. A simple direct method using one incident wave and multiple measurements was proposed by Ito, Jin and Zou. In this report, we performed analytic and numerical studies of the direct sampling method. The method was found to be effective in general; however, the investigation exposed a few exceptions. Analytic solutions in different situations were studied to verify the viability of the method, while numerical tests were used to validate its effectiveness.
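A sketch of the indicator functional as it is commonly stated for the 2-D Helmholtz case: the normalized inner product, over the measurement curve, between the scattered field and the fundamental solution centred at each sampling point. The measurement geometry, wavenumber, and scattered-field data below are placeholders, not the report's test cases.

```python
import numpy as np
from scipy.special import hankel1

k = 2 * np.pi                                    # wavenumber (assumed)
n_rec = 64
theta = 2 * np.pi * np.arange(n_rec) / n_rec
receivers = 5.0 * np.column_stack([np.cos(theta), np.sin(theta)])  # measurement circle
u_s = 0.1 * np.exp(1j * k * receivers[:, 0])     # placeholder scattered field on the circle

xs = np.linspace(-2, 2, 121)
indicator = np.zeros((xs.size, xs.size))
for i, zx in enumerate(xs):
    for j, zy in enumerate(xs):
        r = np.hypot(receivers[:, 0] - zx, receivers[:, 1] - zy)
        G = 0.25j * hankel1(0, k * r)            # 2-D fundamental solution at sampling point z
        num = np.abs(np.vdot(G, u_s))            # |<u_s, G(., z)>| on the measurement circle
        indicator[j, i] = num / (np.linalg.norm(u_s) * np.linalg.norm(G))
# Large values of `indicator` flag sampling points inside or near the scatterer.
```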