283 results for linear predictive coding (LPC)


Relevance: 20.00%

Abstract:

In this paper, we present the outcomes of a project exploring the use of Field Programmable Gate Arrays (FPGAs) as co-processors for scientific computation. We designed a custom circuit for the pipelined solving of multiple tri-diagonal linear systems. The design is well suited to applications that require many independent tri-diagonal system solves, such as finite difference methods for solving PDEs or applications using cubic spline interpolation. The selected solver algorithm was the Tri-Diagonal Matrix Algorithm (TDMA, or Thomas Algorithm). Our solver supports user-specified precision through the use of a custom floating-point VHDL library supporting addition, subtraction, multiplication and division. The variable-precision TDMA solver was tested for correctness in simulation mode. The TDMA pipeline was tested successfully in hardware using a simplified solver model. The details of the implementation, the limitations, and future work are also discussed.
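
As a point of reference, here is a minimal software sketch of the Thomas Algorithm named in the abstract, written in plain C++ rather than VHDL purely to show the forward-elimination/back-substitution structure that the pipelined hardware implements; the function and vector names are ours.

```cpp
#include <vector>

// Solve a tri-diagonal system A x = d with the Thomas Algorithm (TDMA).
// a = sub-diagonal (a[0] unused), b = main diagonal, c = super-diagonal
// (c[n-1] unused), d = right-hand side. Returns the solution vector x.
std::vector<double> thomas_solve(std::vector<double> a, std::vector<double> b,
                                 std::vector<double> c, std::vector<double> d) {
    const std::size_t n = b.size();
    // Forward elimination: remove the sub-diagonal.
    for (std::size_t i = 1; i < n; ++i) {
        const double m = a[i] / b[i - 1];
        b[i] -= m * c[i - 1];
        d[i] -= m * d[i - 1];
    }
    // Back substitution.
    std::vector<double> x(n);
    x[n - 1] = d[n - 1] / b[n - 1];
    for (std::size_t i = n - 1; i-- > 0;) {
        x[i] = (d[i] - c[i] * x[i + 1]) / b[i];
    }
    return x;
}
```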

Relevance: 20.00%

Abstract:

Background: The dramatic increases in home Internet access and available online health information, with limited control over information quality, highlight the necessity of exploring decision-making processes in accessing and using online information, specifically in relation to children, who do not make their own health decisions. Objectives: To understand the processes explaining parents' decisions to use online health information for child health care. Methods: Parents (N = 391) completed an initial questionnaire assessing the theory of planned behaviour constructs of attitude, subjective norm, and perceived behavioural control, as well as perceived risk, group norm, and additional demographic factors. Two months later, 187 parents completed a follow-up questionnaire assessing their decisions to use online information for their child's health care, specifically to (1) diagnose and/or treat their child's suspected medical condition/illness and (2) increase understanding about a diagnosis or treatment recommended by a health professional. Results: Hierarchical multiple regression showed that, for both behaviours, attitude, subjective norm, perceived behavioural control, (lower) perceived risk, group norm, and (non-)medical background were the significant predictors of intention. For both behaviours, intention was the sole significant predictor of parents' use of online child health information. The findings explain 77% of the variance in parents' intention to treat/diagnose a child health problem and 74% of the variance in their intention to increase their understanding about child health concerns. Conclusions: Understanding the socio-cognitive processes that guide parents' use of online information for child health care is important given the increase in Internet usage and the sometimes questionable quality of health information provided online. The findings highlight parents' thirst for information; there is an urgent need for health professionals to provide parents with evidence-based child health websites, in addition to educating the general population on how to evaluate the quality of online health information.

Relevance: 20.00%

Abstract:

Distributed space-time coding (DSTC) exploits the concepts of cooperative diversity and space-time coding to offer a powerful, bandwidth-efficient solution with improved diversity. In this paper, we evaluate the performance of DSTC with the slotted amplify-and-forward (SAF) protocol. Relay nodes between the source and destination nodes are grouped into two relay clusters based on their respective locations, and these clusters cooperate to transmit the space-time coded signal to the destination node in different time frames. We further extend the proposed Slotted-DSTC to the Slotted DSTC with redundant code (Slotted-DSTC-R) protocol, in which the relay nodes in both relay clusters forward the same space-time coded signal to the destination node to achieve a higher diversity order.
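
For readers unfamiliar with space-time coding, the sketch below shows the classical two-antenna Alamouti space-time block code, the kind of building block that a DSTC scheme distributes across cooperating relays. It is a generic textbook illustration, not the authors' slotted AF protocol, and the names are ours.

```cpp
#include <array>
#include <complex>

using cd = std::complex<double>;

// Alamouti space-time block code: map two symbols (s1, s2) onto two
// antennas (or two cooperating relays) over two time slots.
// block[t][a] = symbol sent in time slot t by antenna/relay a.
std::array<std::array<cd, 2>, 2> alamouti_encode(cd s1, cd s2) {
    std::array<std::array<cd, 2>, 2> block;
    block[0] = {s1, s2};                          // slot 1: send s1 and s2
    block[1] = {-std::conj(s2), std::conj(s1)};   // slot 2: send -s2* and s1*
    return block;
}
```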

Relevance: 20.00%

Abstract:

In 1991, McNabb introduced the concept of mean action time (MAT) as a finite measure of the time required for a diffusive process to effectively reach steady state. Although this concept was initially adopted by others within the Australian and New Zealand applied mathematics community, it appears to have had little use outside this region until very recently, when in 2010 Berezhkovskii and coworkers rediscovered the concept of MAT in their study of morphogen gradient formation. All previous work in this area has been limited to studying single–species differential equations, such as the linear advection–diffusion–reaction equation. Here we generalise the concept of MAT by showing how the theory can be applied to coupled linear processes. We begin by studying coupled ordinary differential equations and extend our approach to coupled partial differential equations. Our new results have broad applications including the analysis of models describing coupled chemical decay and cell differentiation processes, amongst others.
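
For readers meeting the quantity for the first time, the usual formulation of MAT is sketched below (our notation): a normalised transition variable F is treated as a cumulative distribution function in time, and the mean action time is its mean.

```latex
% Normalised transition from the initial state c(x,0) to the steady state c_inf(x):
F(x,t) = 1 - \frac{c(x,t) - c_\infty(x)}{c(x,0) - c_\infty(x)}

% Treating F as a cumulative distribution function in t, the mean action time is
T(x) = \int_0^\infty t \, \frac{\partial F}{\partial t} \, \mathrm{d}t
     = \int_0^\infty \left[ 1 - F(x,t) \right] \mathrm{d}t .
```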

Relevance: 20.00%

Abstract:

Linear adaptive channel equalization using the least mean square (LMS) algorithm and the recursive least-squares (RLS) algorithm is proposed for an innovative multi-user (MU) MIMO-OFDM wireless broadband communications system. The proposed equalization method adaptively compensates for the channel impairments caused by frequency selectivity in the propagation environment. Simulations of the proposed adaptive equalizer are conducted using a training-sequence method to determine optimal performance through a comparative analysis. Results show an improvement of 0.15 in BER (at an SNR of 16 dB) when using adaptive equalization with the RLS algorithm compared to the case in which no equalization is employed. In general, adaptive equalization using the LMS and RLS algorithms was shown to be significantly beneficial for MU-MIMO-OFDM systems.
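
To make the adaptation step concrete, here is a minimal sketch of a complex LMS update for a linear equalizer of the kind described. It is a generic textbook formulation, not the authors' MU-MIMO-OFDM implementation; the step size, tap layout and names are ours.

```cpp
#include <complex>
#include <vector>

using cd = std::complex<double>;

// One complex LMS update for a linear equalizer with weight vector w,
// using the convention y = w^T x and w <- w + mu * e * conj(x).
// x  : current regressor (received samples), same length as w
// d  : known training symbol for this instant
// mu : step size (trade-off between adaptation speed and stability)
// Returns the a-priori error e = d - y.
cd lms_step(std::vector<cd>& w, const std::vector<cd>& x, cd d, double mu) {
    cd y(0.0, 0.0);
    for (std::size_t k = 0; k < w.size(); ++k)
        y += w[k] * x[k];                     // equalizer output
    const cd e = d - y;                       // error against the training symbol
    for (std::size_t k = 0; k < w.size(); ++k)
        w[k] += mu * e * std::conj(x[k]);     // gradient-descent weight update
    return e;
}
```

The RLS algorithm replaces the fixed step size with a recursively updated inverse correlation matrix, trading extra computation per symbol for faster convergence.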

Relevance: 20.00%

Abstract:

Significant wheel-rail dynamic forces occur because of imperfections in the wheels and/or rail. As these forces are transmitted down through the track, one of the key responses is the impact force on the sleepers. Dynamic analysis of nonlinear systems is very complicated and does not lend itself easily to a classical solution of multiple equations. Deducing the behaviour of track components from experimental data is also difficult, because such data is hard to obtain and applies only to the particular conditions of the track being tested. The finite element method can be the best solution to this dilemma. This paper describes a finite element model, built with the software package ANSYS, of various-sized flat defects in the tread of a wheel rolling at a typical speed on heavy-haul track. The paper explores the dynamic response of a prestressed concrete sleeper to these defects.

Relevance: 20.00%

Abstract:

The R statistical environment and language has demonstrated particular strengths for interactive development of statistical algorithms, as well as data modelling and visualisation. Its current implementation has an interpreter at its core, which may result in a performance penalty in comparison to directly executing user algorithms in the native machine code of the host CPU. In contrast, the C++ language has no built-in visualisation capabilities, handling of linear algebra or even basic statistical algorithms; however, user programs are converted to high-performance machine code ahead of execution. A new method avoids possible speed penalties in R by using the Rcpp extension package in conjunction with the Armadillo C++ matrix library. In addition to the inherent performance advantages of compiled code, Armadillo provides an easy-to-use template-based meta-programming framework, allowing several linear algebra operations to be automatically pooled into one, which in turn can lead to further speedups. With the aid of Rcpp and Armadillo, conversion of linear-algebra-centered algorithms from R to C++ becomes straightforward. The converted algorithms retain their overall structure and readability, while maintaining a bidirectional link with the host R environment. Empirical timing comparisons of R and C++ implementations of a Kalman filtering algorithm indicate a speedup of several orders of magnitude.
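
To give a flavour of the kind of code such a conversion produces, here is a minimal Armadillo sketch of a single Kalman filter predict/update step. This is our own illustrative fragment, not the benchmark implementation from the paper; in practice it would be exposed to R through Rcpp/RcppArmadillo.

```cpp
#include <armadillo>

// One Kalman filter iteration for the linear-Gaussian model
//   x_k = F x_{k-1} + w,  w ~ N(0, Q)
//   z_k = H x_k     + v,  v ~ N(0, R)
// The state estimate x and its covariance P are updated in place.
void kalman_step(arma::vec& x, arma::mat& P,
                 const arma::vec& z,
                 const arma::mat& F, const arma::mat& Q,
                 const arma::mat& H, const arma::mat& R) {
    // Predict
    x = F * x;
    P = F * P * F.t() + Q;

    // Update
    const arma::vec y = z - H * x;                 // innovation
    const arma::mat S = H * P * H.t() + R;         // innovation covariance
    const arma::mat K = P * H.t() * arma::inv(S);  // Kalman gain
    x = x + K * y;
    P = (arma::eye(arma::size(P)) - K * H) * P;
}
```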

Relevance: 20.00%

Abstract:

This chapter presents the analytical solution of the two-dimensional linear stretching sheet problem involving a non-Newtonian liquid and suction by (a) invoking the boundary layer approximation and (b) using this result to solve the stretching sheet problem without the boundary layer approximation. The basic boundary layer equations for momentum, which are non-linear partial differential equations, are converted into non-linear ordinary differential equations by means of a similarity transformation. The results reveal a new analytical procedure for solving the boundary layer equations arising in a linear stretching sheet problem involving a non-Newtonian liquid (Walters' liquid B). The present study throws light on the analytical solution of a class of boundary layer equations arising in the stretching sheet problem.
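
For orientation, the classical Newtonian (Crane-type) form of this similarity reduction is shown below; the Walters' liquid B case treated in the chapter introduces additional viscoelastic terms, so this is only a reference point, with notation of our choosing. For a sheet stretching with velocity u_w = c x:

```latex
% Similarity variables for the linear stretching sheet (Newtonian case):
\eta = y\sqrt{\frac{c}{\nu}}, \qquad
u = c x\, f'(\eta), \qquad
v = -\sqrt{c\nu}\, f(\eta)

% Resulting ordinary differential equation and boundary conditions:
f''' + f f'' - (f')^2 = 0, \qquad
f(0) = f_w \ (\text{suction}), \quad f'(0) = 1, \quad f'(\infty) = 0
```

In the impermeable Newtonian case (f_w = 0) this admits the closed-form solution f(eta) = 1 - e^(-eta), illustrating the kind of exact profile sought in such problems.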

Relevance: 20.00%

Abstract:

Informed by current trauma literature, this study explored the relationships between Posttraumatic Growth (PTG; Tedeschi & Calhoun, 1995), Posttraumatic Depreciation (PTD; Cann et al., 2010) and other post-trauma outcomes including well-being, psychological flourishing, and psychological distress. The predictive utility of PTG and PTD was also examined. The sample comprised 104 trauma survivors who completed the Posttraumatic Growth Inventory – 42 (Baker et al., 2008) and several outcome measures. As expected, PTD showed strong linear correlations with well-being, flourishing and distress, and emerged as a significant predictor of scores on such measures. However, PTG showed negligible correlations with well-being, flourishing and distress. This reaffirms that PTG and PTD measure conceptually distinct and independent dimensions of experience, which has implications for therapeutic practice.

Relevance: 20.00%

Abstract:

A routine activity for a sports dietitian is to estimate energy and nutrient intake from an athlete's self-reported food intake. Decisions made by the dietitian when coding a food record are a source of variability in the data. The aim of the present study was to determine the variability in estimation of the daily energy and key nutrient intakes of elite athletes when experienced coders analyzed the same food record using the same database and software package. Seven-day food records from a dietary survey of athletes in the 1996 Australian Olympic team were randomly selected to provide 13 sets of records, each set representing the self-reported food intake of an endurance, team, weight-restricted, and sprint/power athlete. Each set was coded by 3-5 members of Sports Dietitians Australia, making a total of 52 athletes, 53 dietitians, and 1456 athlete-days of data. We estimated within- and between-athlete and dietitian variances for each dietary nutrient using mixed modeling, and we combined the variances to express variability as a coefficient of variation (typical variation as a percent of the mean). Variability in the mean of 7-day estimates of a nutrient was 2- to 3-fold less than that of a single day. The variability contributed by the coder was less than the true athlete variability for a 1-day record but was of similar magnitude for a 7-day record. The most variable nutrients (e.g., vitamin C, vitamin A, cholesterol) had approximately 3-fold more variability than the least variable nutrients (e.g., energy, carbohydrate, magnesium). These athlete and coder variabilities need to be taken into account in dietary assessment of athletes for counseling and research.
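
For reference, the coefficient of variation reported here is simply a standard deviation expressed as a percentage of the mean; with separate athlete and coder variance components from the mixed model it takes a form along the lines of (our notation, not the paper's):

```latex
\mathrm{CV} = \frac{\sqrt{\sigma^{2}_{\text{athlete}} + \sigma^{2}_{\text{coder}}}}{\bar{x}} \times 100\%
```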

Relevance: 20.00%

Abstract:

The well-known difficulties students exhibit when learning to program are often characterised as either difficulties in understanding the problem to be solved or difficulties in devising and coding a computational solution. It would therefore be helpful to understand which of these gives students the greatest trouble. Unit testing is a mainstay of large-scale software development and maintenance. A unit test suite serves not only for acceptance testing, but is also a form of requirements specification, as exemplified by agile programming methodologies in which the tests are developed before the corresponding program code. In order to better understand students' conceptual difficulties with programming, we conducted a series of experiments in which students were required to write both unit tests and program code for non-trivial problems. Their code and tests were then assessed separately for correctness and 'coverage', respectively. The results allowed us to directly compare students' abilities to characterise a computational problem, as a unit test suite, and to develop a corresponding solution, as executable code. Since understanding a problem is a prerequisite to solving it, we expected students' unit testing skills to be a strong predictor of their ability to successfully implement the corresponding program. Instead, however, we found that students' testing abilities lag well behind their coding skills.
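
To illustrate how a unit test suite can act as a requirements specification, here is a deliberately small C++ example of our own (not taken from the study's experiments): the assertions pin down the intended behaviour of a function before it is implemented.

```cpp
#include <cassert>

// Specification by test: these assertions describe what is_leap_year()
// must do before any implementation exists.
bool is_leap_year(int year);  // declaration only; to be implemented

void test_is_leap_year() {
    assert(is_leap_year(2004));    // divisible by 4
    assert(!is_leap_year(1900));   // century years are not leap years...
    assert(is_leap_year(2000));    // ...unless divisible by 400
    assert(!is_leap_year(2023));   // ordinary non-leap year
}

// A candidate implementation that satisfies the specification above.
bool is_leap_year(int year) {
    return (year % 4 == 0 && year % 100 != 0) || (year % 400 == 0);
}

int main() {
    test_is_leap_year();
    return 0;
}
```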

Relevance: 20.00%

Abstract:

To recognize faces in video, face appearances have been widely modeled as piece-wise local linear models which linearly approximate the smooth yet non-linear low dimensional face appearance manifolds. The choice of representations of the local models is crucial. Most of the existing methods learn each local model individually meaning that they only anticipate variations within each class. In this work, we propose to represent local models as Gaussian distributions which are learned simultaneously using the heteroscedastic probabilistic linear discriminant analysis (PLDA). Each gallery video is therefore represented as a collection of such distributions. With the PLDA, not only the within-class variations are estimated during the training, the separability between classes is also maximized leading to an improved discrimination. The heteroscedastic PLDA itself is adapted from the standard PLDA to approximate face appearance manifolds more accurately. Instead of assuming a single global within-class covariance, the heteroscedastic PLDA learns different within-class covariances specific to each local model. In the recognition phase, a probe video is matched against gallery samples through the fusion of point-to-model distances. Experiments on the Honda and MoBo datasets have shown the merit of the proposed method which achieves better performance than the state-of-the-art technique.
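
As a small illustration of the point-to-model matching idea, the fragment below computes a Mahalanobis distance from a probe feature vector to a Gaussian local model (mean and within-class covariance). This is a generic sketch using Armadillo, not the paper's heteroscedastic PLDA scoring, and the names are ours.

```cpp
#include <armadillo>

// Squared Mahalanobis distance from a probe feature vector x to a local
// model described by mean mu and within-class covariance Sigma.
// Smaller values indicate a better match to that local model.
double point_to_model_dist2(const arma::vec& x,
                            const arma::vec& mu,
                            const arma::mat& Sigma) {
    const arma::vec d = x - mu;
    // solve() avoids forming an explicit inverse of Sigma.
    return arma::as_scalar(d.t() * arma::solve(Sigma, d));
}
```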

Relevance: 20.00%

Abstract:

The current study examined the structure of the volunteer functions inventory within a sample of older individuals (N = 187). The career items were replaced with items examining the concept of continuity of work, a potentially more useful and relevant concept for this population. Factor analysis supported a four-factor solution, with values, social and continuity emerging as single factors, and enhancement and protective items loading together on a single factor. Understanding items did not load highly on any factor. The values and continuity functions were the only dimensions to emerge as predictors of intention to volunteer. This research has important implications for understanding the motivation of older adults to engage in contemporary volunteering settings.

Relevance: 20.00%

Abstract:

This paper presents an Image-Based Visual Servo control design for fixed-wing Unmanned Aerial Vehicles tracking locally linear infrastructure in the presence of wind, using a body-fixed imaging sensor. Visual servoing offers improved data collection by posing the tracking task as one of controlling a feature as viewed by the inspection sensor, although it is complicated by the introduction of wind, as aircraft heading and course angle no longer align. In this work it is shown that the effect of wind is to alter the desired line angle required for continuous tracking so that it equals the wind correction angle that would be calculated to set a desired course. A control solution is then sought by linearizing the interaction matrix about the new feature pose, such that the kinematics of the feature can be augmented with the lateral dynamics of the aircraft, from which a state feedback control design is developed. Simulation results are presented comparing no compensation, integral control, and the proposed controller using the wind correction angle, followed by an assessment of the response to atmospheric disturbances in the form of turbulence and wind gusts.
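
For context, the wind correction (crab) angle referred to here is the standard flight-mechanics quantity: for airspeed V_a and a crosswind component V_w,perp perpendicular to the desired course, the heading offset is (our notation)

```latex
\delta_{\mathrm{wca}} = \arcsin\!\left(\frac{V_{w,\perp}}{V_{a}}\right)
```

so that the aircraft's ground track, rather than its heading, lines up with the infrastructure being tracked.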