943 results for Additive Fertigung


Relevance: 10.00%

Abstract:

Traditional speech enhancement methods optimise signal-level criteria such as signal-to-noise ratio, but these approaches are sub-optimal for noise-robust speech recognition. Likelihood-maximising (LIMA) frameworks are an alternative that optimise parameters of enhancement algorithms based on state sequences generated for utterances with known transcriptions. Previous reports of LIMA frameworks have shown significant promise for improving speech recognition accuracies under additive background noise for a range of speech enhancement techniques. In this paper we discuss the drawbacks of the LIMA approach when multiple layers of acoustic mismatch are present – namely background noise and speaker accent. Experimentation using LIMA-based Mel-filterbank noise subtraction on American and Australian English in-car speech databases supports this discussion, demonstrating that inferior speech recognition performance occurs when a second layer of mismatch is seen during evaluation.
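
Illustrative sketch, not the authors' code: a generic Mel-filterbank spectral subtraction front-end in Python/NumPy. The per-band over-subtraction weights `alpha` are the kind of enhancement parameters a LIMA-style framework would tune against recogniser likelihoods; all function and parameter names here are assumptions.

```python
import numpy as np

def mel_filterbank(n_mels, n_fft, sr):
    """Triangular Mel filterbank (rows: bands, cols: FFT bins)."""
    hz_to_mel = lambda f: 2595.0 * np.log10(1.0 + f / 700.0)
    mel_to_hz = lambda m: 700.0 * (10.0 ** (m / 2595.0) - 1.0)
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for m in range(1, n_mels + 1):
        l, c, r = bins[m - 1], bins[m], bins[m + 1]
        fb[m - 1, l:c] = (np.arange(l, c) - l) / max(c - l, 1)   # rising edge
        fb[m - 1, c:r] = (r - np.arange(c, r)) / max(r - c, 1)   # falling edge
    return fb

def mel_noise_subtraction(power_spec, noise_est, alpha, floor=0.01):
    """
    Subtract a weighted noise estimate in the Mel domain.
    power_spec, noise_est : (frames, bands) Mel-domain power.
    alpha                 : per-band over-subtraction weights -- the sort of
                            parameters a LIMA-style framework would optimise
                            against recogniser state sequences for utterances
                            with known transcriptions.
    """
    cleaned = power_spec - alpha * noise_est
    return np.maximum(cleaned, floor * power_spec)   # apply a spectral floor
```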

Relevance: 10.00%

Abstract:

This thesis deals with the problem of the instantaneous frequency (IF) estimation of sinusoidal signals. This topic plays a significant role in signal processing and communications. Depending on the type of the signal, two major approaches are considered. For IF estimation of single-tone or digitally-modulated sinusoidal signals (such as frequency shift keying signals) the approach of digital phase-locked loops (DPLLs) is considered, and this is Part-I of this thesis. For FM signals the approach of time-frequency analysis is considered, and this is Part-II of the thesis. In Part-I we have utilized sinusoidal DPLLs with a non-uniform sampling scheme, as this type is widely used in communication systems. The digital tanlock loop (DTL) has introduced significant advantages over other existing DPLLs. In the last 10 years many efforts have been made to improve DTL performance. However, this loop and all of its modifications utilize a Hilbert transformer (HT) to produce a signal-independent 90-degree phase-shifted version of the input signal. The Hilbert transformer can be realized approximately using a finite impulse response (FIR) digital filter. This realization introduces further complexity in the loop, in addition to approximations and frequency limitations on the input signal. We have tried to avoid the practical difficulties associated with the conventional tanlock scheme while keeping its advantages. A time delay is utilized in the tanlock scheme of the DTL to produce a signal-dependent phase shift. This gave rise to the time-delay digital tanlock loop (TDTL). Fixed point theorems are used to analyze the behavior of the new loop. As such, the TDTL combines the two major approaches in DPLLs: the non-linear approach of the sinusoidal DPLL based on fixed point analysis, and the linear tanlock approach based on arctan phase detection. The TDTL preserves the main advantages of the DTL despite its reduced structure. An application of the TDTL in FSK demodulation is also considered. This idea of replacing the HT by a time delay may be of interest in other signal processing systems. Hence we have analyzed and compared the behaviors of the HT and the time delay in the presence of additive Gaussian noise. Based on the above analysis, the behavior of the first- and second-order TDTLs has been analyzed in additive Gaussian noise. Since DPLLs need time for locking, they are normally not efficient in tracking the continuously changing frequencies of non-stationary signals, i.e. signals with time-varying spectra. Non-stationary signals are of importance in synthetic and real-life applications. An example is the frequency-modulated (FM) signals widely used in communication systems. Part-II of this thesis is dedicated to the IF estimation of non-stationary signals. For such signals the classical spectral techniques break down, due to the time-varying nature of their spectra, and more advanced techniques should be utilized. For the purpose of instantaneous frequency estimation of non-stationary signals there are two major approaches: parametric and non-parametric. We chose the non-parametric approach, which is based on time-frequency analysis. This approach is computationally less expensive and more effective in dealing with multicomponent signals, which are the main aim of this part of the thesis. A time-frequency distribution (TFD) of a signal is a two-dimensional transformation of the signal to the time-frequency domain. Multicomponent signals can be identified by multiple energy peaks in the time-frequency domain.
Many real-life and synthetic signals are of multicomponent nature and there is little in the literature concerning IF estimation of such signals. This is why we have concentrated on multicomponent signals in Part-II. An adaptive algorithm for IF estimation using the quadratic time-frequency distributions has been analyzed. A class of time-frequency distributions that are more suitable for this purpose has been proposed. The kernels of this class are time-only or one-dimensional, rather than the time-lag (two-dimensional) kernels. Hence this class has been named the T-class. If the parameters of these TFDs are properly chosen, they are more efficient than the existing fixed-kernel TFDs in terms of resolution (energy concentration around the IF) and artifact reduction. The T-distributions have been used in the adaptive IF algorithm and proved to be efficient in tracking rapidly changing frequencies. They also enable direct amplitude estimation for the components of a multicomponent signal.
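
As a concrete illustration of the Part-II approach (IF estimation as the energy peak of a time-frequency distribution), the following sketch uses a plain spectrogram rather than the adaptive T-class distributions proposed in the thesis; function names and the chirp test signal are ours, not the author's.

```python
import numpy as np
from scipy.signal import stft

def if_from_tfd(x, fs, nperseg=256):
    """Estimate the instantaneous frequency as the peak of a simple TFD
    (spectrogram) in each time slice.  The thesis proposes adaptive,
    time-only (T-class) kernels instead of this fixed window, but the
    peak-picking step is the same."""
    f, t, Z = stft(x, fs=fs, nperseg=nperseg)
    tfd = np.abs(Z) ** 2                      # time-frequency energy
    return t, f[np.argmax(tfd, axis=0)]       # peak frequency per slice

# Example: linear FM (chirp) sweeping 50 -> 200 Hz over 1 s at fs = 1 kHz
fs = 1000.0
tt = np.arange(0.0, 1.0, 1.0 / fs)
x = np.cos(2 * np.pi * (50 * tt + 75 * tt ** 2))   # IF = 50 + 150*t Hz
t_est, if_est = if_from_tfd(x, fs)
```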

Relevance: 10.00%

Abstract:

Balimau Putih [an Indonesian cultivar tolerant to rice tungro bacilliform virus (RTBV)] was crossed with IR64 (an RTBV-susceptible variety) to produce the three filial generations F1, F2 and F3. Agroinoculation was used to introduce RTBV into the test plants. RTBV tolerance was assessed from the RTBV level in plants, determined by analysis of the coat protein using enzyme-linked immunosorbent assay. The level of RTBV in cv. Balimau Putih was significantly lower than that of IR64 and the susceptible control, Taichung Native 1. Mean RTBV levels of the F1, F2 and F3 populations were comparable with one another and with the average of the parents. Results indicate that there was no dominance and that additive gene action may control the expression of tolerance to RTBV. Tolerance based on the level of RTBV coat protein was highly heritable (0.67) as estimated using the mean values of F3 lines, suggesting that selection for tolerance to RTBV can be performed in the early selfing generations using the technique employed in this study. The RTBV level had a negative correlation with plant height, but a positive relationship with disease index value.
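
For readers unfamiliar with the quoted heritability figure, the general narrow-sense definition is given below; the abstract does not state the exact estimator applied to the F3 line means, so this is the standard textbook form rather than the study's specific computation.

```latex
% Narrow-sense heritability: the share of phenotypic variance that is
% additive-genetic.  With no dominance (as the F1-F3 comparisons above
% suggest), the dominance term \sigma^2_D is taken as negligible.
h^2 \;=\; \frac{\sigma^2_A}{\sigma^2_P}
    \;=\; \frac{\sigma^2_A}{\sigma^2_A + \sigma^2_D + \sigma^2_E}
    \;\approx\; 0.67 \ \text{(reported estimate)}
```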

Relevance: 10.00%

Abstract:

Advances in safety research—trying to improve the collective understanding of motor vehicle crash causation—rest upon the pursuit of numerous lines of inquiry. The research community has focused on analytical methods development (negative binomial specifications, simultaneous equations, etc.), on better experimental designs (before-after studies, comparison sites, etc.), on improving exposure measures, and on model specification improvements (additive terms, non-linear relations, etc.). One might think of different lines of inquiry in terms of ‘low-lying fruit’—areas of inquiry that might provide significant improvements in understanding crash causation. It is the contention of this research that omitted variable bias caused by the exclusion of important variables is an important line of inquiry in safety research. In particular, spatially related variables are often difficult to collect and omitted from crash models—but they offer significant ability to better understand contributing factors to crashes. This study—believed to represent a unique contribution to the safety literature—develops and examines the role of a sizeable set of spatial variables in intersection crash occurrence. In addition to commonly considered traffic and geometric variables, the examined spatial factors include local influences of weather, sun glare, proximity to drinking establishments, and proximity to schools. The results indicate that inclusion of these factors results in a significant improvement in model explanatory power, and the results also generally agree with expectation. The research illuminates the importance of spatial variables in safety research and also the negative consequences of their omission.
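
A minimal sketch of the kind of model under discussion, assuming a negative binomial crash-frequency specification with traffic exposure plus spatial covariates; the variable names, synthetic data and coefficient values are hypothetical, not the study's.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500  # hypothetical intersections

# Synthetic covariates: traffic exposure plus spatially derived factors
df = pd.DataFrame({
    "log_aadt":       np.log(rng.uniform(2e3, 4e4, n)),   # traffic volume
    "rain_days":      rng.poisson(30, n),                  # local weather
    "glare_hours":    rng.uniform(0, 120, n),              # sun-glare exposure
    "dist_bar_km":    rng.exponential(1.5, n),             # proximity to bars
    "dist_school_km": rng.exponential(2.0, n),             # proximity to schools
})
X = sm.add_constant(df)
mu = np.exp(-6 + 0.8 * df.log_aadt + 0.01 * df.rain_days - 0.1 * df.dist_bar_km)
y = rng.negative_binomial(2, 2.0 / (2.0 + mu))             # synthetic crash counts

# Negative binomial GLM -- dropping the spatial columns from X is the kind of
# omitted-variable problem the paper is concerned with.
nb = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print(nb.summary())
```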

Relevance: 10.00%

Abstract:

Background: Many studies have illustrated that ambient air pollution negatively impacts on health. However, little evidence is available for the effects of air pollution on cardiovascular mortality (CVM) in Tianjin, China. Also, no study has examined which stratum length for the time-stratified case–crossover analysis gives estimates that most closely match the estimates from time series analysis. Objectives: The purpose of this study was to estimate the effects of air pollutants on CVM in Tianjin, China, and to compare time-stratified case–crossover and time series analyses. Method: A time-stratified case–crossover analysis and a generalized additive model (time series) were applied to examine the impact of air pollution on CVM from 2005 to 2007. Four time-stratified case–crossover analyses were used by varying the stratum length (calendar month, 28, 21 or 14 days). Jackknifing was used to compare the methods. Residual analysis was used to check whether the models fitted well. Results: Both case–crossover and time series analyses show that air pollutants (PM10, SO2 and NO2) were positively associated with CVM. The estimates from the time-stratified case–crossover analyses varied greatly with changing stratum length. The estimates from the time series analyses varied slightly with changing degrees of freedom per year for time. The residuals from the time series analyses had less autocorrelation than those from the case–crossover analyses, indicating a better fit. Conclusion: Air pollution was associated with an increased risk of CVM in Tianjin, China. Time series analyses performed better than the time-stratified case–crossover analyses in terms of residual checking.
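
To make the design comparison concrete, the sketch below constructs case and referent days for a time-stratified case–crossover analysis: referents share the case's day of week within the same stratum, and the stratum length (calendar month here) is the choice the paper varies. Column names and the use of pandas are assumptions; the resulting strata would then be fitted with conditional logistic regression.

```python
import pandas as pd

def time_stratified_referents(event_dates, exposure, stratum="M"):
    """
    Build case/referent rows for a time-stratified case-crossover design.
    event_dates : sequence of event (death) dates.
    exposure    : Series of daily exposure (e.g. PM10), indexed by date.
    stratum     : pandas period alias; "M" = calendar month.  Fixed-length
                  strata (28/21/14 days) would replace the to_period call.
    Referents are all days in the same stratum sharing the case's weekday.
    """
    rows = []
    for i, d in enumerate(pd.to_datetime(event_dates)):
        stratum_days = exposure.index[
            (exposure.index.to_period(stratum) == d.to_period(stratum))
            & (exposure.index.dayofweek == d.dayofweek)
        ]
        for ref in stratum_days:
            rows.append({"set_id": i, "case": int(ref == d),
                         "exposure": exposure.loc[ref]})
    return pd.DataFrame(rows)   # feed to a conditional logistic model
```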

Relevance: 10.00%

Abstract:

Background: A number of studies have examined the relationship between high ambient temperature and mortality. Recently, concern has arisen about whether this relationship is modified by socio-demographic factors. However, data for this type of study are relatively scarce in subtropical/tropical regions where people are well accustomed to warm temperatures. Objective: To investigate whether the relationship between daily mean temperature and daily all-cause mortality is modified by age, gender and socio-economic status (SES) in Brisbane, Australia. Methods: We obtained daily mean temperature and all-cause mortality data for Brisbane, Australia during 1996–2004. A generalised additive model was fitted to assess the percentage increase in all deaths with every one-degree increment above the threshold temperature. Different age, gender and SES groups were included in the model as categorical variables and their modification effects were estimated separately. Results: A total of 53,316 non-external deaths were included during the study period. There was a clear increasing trend with age in the harmful effect of high temperature on mortality. The effect estimate among women was more than 20 times that among men. We did not find an SES effect on the percentage increase associated with temperature. Conclusions: The effects of high temperature on all deaths were modified by age and gender, but not by SES, in Brisbane, Australia.
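
A minimal sketch, assuming a Poisson regression with a threshold temperature term and an age-group interaction, of how the 'percentage increase per degree above the threshold' and its modification by age can be estimated; the threshold value, variable names and synthetic data are illustrative, not the study's model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical daily data: mean temperature, death counts, age-group flag
rng = np.random.default_rng(1)
n = 3000
df = pd.DataFrame({
    "temp": rng.normal(21, 5, n),
    "elderly": rng.integers(0, 2, n),   # 1 = deaths among the oldest group
})
THRESHOLD = 24.0                        # assumed threshold temperature (deg C)
df["excess"] = np.clip(df.temp - THRESHOLD, 0, None)
lam = np.exp(2.0 + 0.03 * df.excess + 0.05 * df.excess * df.elderly)
df["deaths"] = rng.poisson(lam)

# Poisson regression: 100*(exp(beta)-1) is the % increase in deaths per
# degree above the threshold; the interaction term is the age modification.
m = smf.glm("deaths ~ excess * elderly", data=df,
            family=sm.families.Poisson()).fit()
pct_per_degree = 100 * (np.exp(m.params["excess"]) - 1)
```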

Relevance: 10.00%

Abstract:

Background: There has been a significant increase in the availability of online programs for alcohol problems. A systematic review of the research evidence underpinning these programs is timely. Objectives: Our objective was to review the efficacy of online interventions for alcohol misuse. Systematic searches of Medline, PsycINFO, Web of Science, and Scopus were conducted for English abstracts (excluding dissertations) published from 1998 onward. Search terms were: (1) Internet, Web*; (2) online, computer*; (3) alcohol*; and (4) effect*, trial*, random* (where * denotes a wildcard). Forward and backward searches from identified papers were also conducted. Articles were included if (1) the primary intervention was delivered and accessed via the Internet, (2) the intervention focused on moderating or stopping alcohol consumption, and (3) the study was a randomized controlled trial of an alcohol-related screen, assessment, or intervention. Results: The literature search initially yielded 31 randomized controlled trials (RCTs), 17 of which met the inclusion criteria. Of these 17 studies, 12 (70.6%) were conducted with university students, and 11 (64.7%) specifically focused on at-risk, heavy, or binge drinkers. Sample sizes ranged from 40 to 3216 (median 261), with 12 (70.6%) studies predominantly involving brief personalized feedback interventions. Using published data, effect sizes could be extracted from 8 of the 17 studies. In relation to alcohol units per week or month, and based on 5 RCTs from which such a measure could be extracted, differential effect sizes to post-treatment ranged from 0.02 to 0.81 (mean 0.42, median 0.54). Pre-post effect sizes for brief personalized feedback interventions ranged from 0.02 to 0.81, and in 2 multi-session modularized interventions a pre-post effect size of 0.56 was obtained in both. Pre-post differential effect sizes for peak blood alcohol concentrations (BAC) ranged from 0.22 to 0.88, with a mean effect size of 0.66. Conclusions: The available evidence suggests that users can benefit from online alcohol interventions and that this approach could be particularly useful for groups less likely to access traditional alcohol-related services, such as women, young people, and at-risk users. However, caution should be exercised given the limited number of studies allowing extraction of effect sizes, the heterogeneity of outcome measures and follow-up periods, and the large proportion of student-based studies. More extensive RCTs in community samples are required to better understand the efficacy of specific online alcohol approaches, program dosage, the additive effect of telephone or face-to-face interventions, and effective strategies for their dissemination and marketing.
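
For readers unfamiliar with how such effect sizes are typically derived from published group statistics, a generic standardized mean difference (Cohen's d) calculation is sketched below; it is illustrative only and not the review's exact extraction procedure.

```python
import math

def cohens_d(mean_tx, sd_tx, n_tx, mean_ctrl, sd_ctrl, n_ctrl):
    """Standardized mean difference with a pooled SD, as commonly used
    when extracting effect sizes from published group statistics."""
    pooled_sd = math.sqrt(((n_tx - 1) * sd_tx ** 2 + (n_ctrl - 1) * sd_ctrl ** 2)
                          / (n_tx + n_ctrl - 2))
    return (mean_ctrl - mean_tx) / pooled_sd

# Example with made-up numbers: intervention group drinks 12.1 units/week
# (SD 8.0, n = 130) versus control 15.4 (SD 8.5, n = 125) at post-treatment.
d = cohens_d(12.1, 8.0, 130, 15.4, 8.5, 125)
```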

Relevance: 10.00%

Abstract:

Objective: To assess the efficacy of maternal betamethasone for improving preterm lung function in the presence of inflammation induced by amniotic fluid ureaplasma colonization. Study design: Ewes bearing single fetuses were randomized to receive an intra-amniotic injection of Ureaplasma parvum (serovar 6; 2×10^7 colony-forming units) or vehicle at 86±2 days of pregnancy (mean±SD; term is 150 d), followed by maternal intramuscular betamethasone (0.5 mg/kg) or saline, either 2 or 7 days before delivery of lambs at 123±1 d. Results: Amniotic fluid IL-8 was elevated by ureaplasmas (p=0.049) but unaffected by betamethasone. Lung inflammation induced by ureaplasmas was not affected by betamethasone. Lung compliance was increased by ureaplasma colonization (p=0.009) and betamethasone (p=0.042), and the effects were additive. Lung surfactant was increased by ureaplasma colonization (p<0.001) and by betamethasone given 7 days (p=0.001), but not 2 days, before delivery. Conclusion: Inflammation improves preterm lung function due to increases in surfactant. Antenatal corticosteroids further augment lung function, through an apparently independent mechanism.

Relevance: 10.00%

Abstract:

We review the literature on the combined effect of asbestos exposure and smoking on lung cancer, and explore a Bayesian approach to assess evidence of interaction. Previous approaches have focussed on separate tests for an additive or multiplicative relation. We extend these approaches by exploring the strength of evidence for either relation using methods that allow the data to choose between the two models. We then compare the different approaches.
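
A minimal sketch, under assumed priors and made-up counts, of one way to let the data choose between additive and multiplicative relations: compute a crude Monte Carlo marginal likelihood for each relation and take their ratio as a Bayes factor. This is not the paper's model; all numbers and names are hypothetical.

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(2)

# Hypothetical lung-cancer counts: (cases, persons) by exposure cell
cells = {"neither": (20, 10000), "asbestos": (30, 5000),
         "smoking": (150, 10000), "both": (90, 2000)}

def marginal_likelihood(model, n_draws=200_000):
    """Crude Monte Carlo marginal likelihood.  Priors on the baseline risk
    and the two single-exposure risks are uniform; the joint-exposure risk
    is then fixed by the assumed relation (additive excess risks or
    multiplicative relative risks)."""
    r0 = rng.uniform(0, 0.05, n_draws)          # baseline risk
    ra = rng.uniform(0, 0.05, n_draws) + r0     # asbestos-only risk
    rs = rng.uniform(0, 0.05, n_draws) + r0     # smoking-only risk
    if model == "additive":
        rb = ra + rs - r0                       # excess risks add
    else:
        rb = ra * rs / r0                       # relative risks multiply
    rb = np.clip(rb, 1e-9, 1 - 1e-9)
    like = (binom.pmf(cells["neither"][0], cells["neither"][1], r0)
            * binom.pmf(cells["asbestos"][0], cells["asbestos"][1], ra)
            * binom.pmf(cells["smoking"][0], cells["smoking"][1], rs)
            * binom.pmf(cells["both"][0], cells["both"][1], rb))
    return like.mean()

bayes_factor = marginal_likelihood("additive") / marginal_likelihood("multiplicative")
```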

Relevance: 10.00%

Abstract:

We investigate the behavior of the empirical minimization algorithm using various methods. We first analyze it by comparing the empirical (random) structure with the original structure on the class, either in an additive sense, via the uniform law of large numbers, or in a multiplicative sense, using isomorphic coordinate projections. We then show that a direct analysis of the empirical minimization algorithm yields a significantly better bound, and that the estimates we obtain are essentially sharp. The method of proof we use is based on Talagrand’s concentration inequality for empirical processes.
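
To fix notation for the comparison described above (standard empirical-process notation, assumed rather than quoted from the paper):

```latex
% Empirical minimization over a class F, with sample X_1, ..., X_n:
\hat{f} = \operatorname*{arg\,min}_{f \in F} P_n f,
\qquad P_n f = \frac{1}{n}\sum_{i=1}^{n} f(X_i), \qquad P f = \mathbb{E}\, f(X).

% Additive comparison of the empirical and original structures (uniform LLN):
\sup_{f \in F} \, \lvert P_n f - P f \rvert \le \varepsilon .

% Multiplicative (isomorphic) comparison: for all f in F,
(1-\varepsilon)\, P f \;\le\; P_n f \;\le\; (1+\varepsilon)\, P f .
```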

Relevance: 10.00%

Abstract:

We seek numerical methods for second‐order stochastic differential equations that reproduce the stationary density accurately for all values of damping. A complete analysis is possible for scalar linear second‐order equations (damped harmonic oscillators with additive noise), where the statistics are Gaussian and can be calculated exactly in the continuous‐time and discrete‐time cases. A matrix equation is given for the stationary variances and correlation for methods using one Gaussian random variable per timestep. The only Runge–Kutta method with a nonsingular tableau matrix that gives the exact steady state density for all values of damping is the implicit midpoint rule. Numerical experiments, comparing the implicit midpoint rule with Heun and leapfrog methods on nonlinear equations with additive or multiplicative noise, produce behavior similar to the linear case.
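
A minimal sketch of the implicit midpoint rule applied to the damped harmonic oscillator with additive noise; because the drift is linear, each implicit stage reduces to one 2×2 linear solve, and the sample variances can be checked against the exact Gaussian stationary values. Parameter values and names are illustrative.

```python
import numpy as np

def implicit_midpoint_oscillator(gamma, omega2, sigma, h, n_steps, rng):
    """Implicit midpoint rule for the damped oscillator with additive noise:
        dx = v dt,   dv = (-gamma*v - omega2*x) dt + sigma dW.
    For this linear system the implicit stage is solved exactly by a
    single 2x2 linear solve per step."""
    A = np.array([[0.0, 1.0], [-omega2, -gamma]])
    b = np.array([0.0, sigma])
    M_left = np.eye(2) - 0.5 * h * A
    M_right = np.eye(2) + 0.5 * h * A
    step = np.linalg.solve(M_left, M_right)    # one-step linear map
    noise = np.linalg.solve(M_left, b)         # noise injection vector
    y = np.zeros((n_steps + 1, 2))
    for n in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(h))
        y[n + 1] = step @ y[n] + noise * dW
    return y

# Exact stationary variances for this linear oscillator:
# Var[x] = sigma^2 / (2*gamma*omega2),  Var[v] = sigma^2 / (2*gamma).
rng = np.random.default_rng(3)
path = implicit_midpoint_oscillator(gamma=0.5, omega2=4.0, sigma=1.0,
                                    h=0.1, n_steps=100_000, rng=rng)
print(path[1000:, 0].var(), 1.0 / (2 * 0.5 * 4.0))   # sample vs exact Var[x]
```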

Relevance: 10.00%

Abstract:

This research-in-progress paper reports preliminary findings of a study that is designed to identify the characteristics of an expert in the discipline of Information Systems (IS). The paper delivers a formative research model to depict the characteristics of an expert with three additive constructs, using concepts derived from psychology, knowledge management and social-behaviour research. The paper then explores the formation and application of ‘expertise’ using four investigative questions in the context of System Evaluations. Data have been gathered from 220 respondents representing three medium-sized companies in India using the SAP Enterprise Resource Planning system. The paper summarizes planned data analyses for construct validation, model testing and model application. A validated construct of IS expertise will have a wide range of implications for research and practice.

Relevance: 10.00%

Abstract:

In this paper we pursue the task of aligning an ensemble of images in an unsupervised manner. This task has been commonly referred to as “congealing” in the literature. A form of congealing, using a least-squares criterion, has recently been demonstrated to have desirable properties over conventional congealing. Least-squares congealing can be viewed as an extension of the Lucas & Kanade (LK) image alignment algorithm. It is well understood that the alignment performance of the LK algorithm, when aligning a single image with another, is theoretically and empirically equivalent for additive and compositional warps. In this paper we: (i) demonstrate that this equivalence does not hold for the extended case of congealing, (ii) characterize the inherent drawbacks associated with least-squares congealing when dealing with large numbers of images, and (iii) propose a novel method for circumventing these limitations through the application of an inverse-compositional strategy that maintains the attractive properties of the original method while being able to handle very large numbers of images.
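
A minimal sketch of the inverse-compositional strategy for a single pair of images, restricted to a pure-translation warp (the restriction and all names are ours, not the paper's): the template gradients, steepest-descent images and Hessian are computed once, which is the property that makes the strategy attractive when congealing very large numbers of images.

```python
import numpy as np
from scipy.ndimage import shift

def ic_lk_translation(image, template, n_iters=50):
    """Inverse-compositional Lucas-Kanade for a pure-translation warp.
    The template gradient, steepest-descent images and Hessian are
    precomputed once and reused at every iteration."""
    gy, gx = np.gradient(template)                     # template gradients
    sd = np.stack([gx.ravel(), gy.ravel()], axis=1)    # steepest-descent images
    H = sd.T @ sd                                      # fixed 2x2 Hessian
    p = np.zeros(2)                                    # warp parameters (tx, ty)
    for _ in range(n_iters):
        warped = shift(image, (-p[1], -p[0]), order=1) # image under current warp
        err = (warped - template).ravel()
        dp = np.linalg.solve(H, sd.T @ err)
        p -= dp                                        # inverse-compositional update
        if np.linalg.norm(dp) < 1e-4:
            break
    return p
```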