95 results for least weighted squares


Relevance:

20.00%

Publisher:

Abstract:

Context: The masses previously obtained for the X-ray binary 2S 0921-630 inferred a compact object that was either a high-mass neutron star or a low-mass black hole, but used a previously published value for the rotational broadening (v sin i) with large uncertainties. Aims: We aim to determine an accurate mass for the compact object through an improved measurement of the secondary star's projected equatorial rotational velocity. Methods: We have used UVES echelle spectroscopy to determine the v sin i of the secondary star (V395 Car) in the low-mass X-ray binary 2S 0921-630 by comparison to an artificially broadened spectral-type template star. In addition, we have also measured v sin i from a single high signal-to-noise ratio absorption line profile calculated using the method of Least-Squares Deconvolution (LSD). Results: We determine v sin i to lie between 31.3±0.5 km s⁻¹ and 34.7±0.5 km s⁻¹ (assuming zero and continuum limb darkening, respectively), in disagreement with previous results based on intermediate-resolution spectroscopy obtained with the 3.6 m NTT. Using our revised v sin i value in combination with the secondary star's radial velocity gives a binary mass ratio of 0.281±0.034. Furthermore, assuming a binary inclination angle of 75° gives a compact object mass of 1.37±0.13 M_⊙. Conclusions: We find that using relatively low-resolution spectroscopy can result in systematic uncertainties in the measured v sin i values obtained using standard methods. We suggest the use of LSD as a secondary, reliable check of the results, as LSD allows one to directly discern the shape of the absorption line profile. In the light of the new v sin i measurement, we have revised the compact object's mass downwards, such that it is now compatible with a canonical neutron star mass.
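The last step of an analysis like this combines the mass ratio q and inclination i quoted above with the binary mass function f(M) to give the compact-object mass, M₁ = f(M)(1+q)²/sin³i. A minimal sketch: q and i are taken from the abstract, but the mass-function value is an illustrative assumption (it is not quoted there), chosen only so the result lands near the published mass.

```python
import math

def compact_object_mass(f_m, q, incl_deg):
    """M1 = f(M) * (1 + q)^2 / sin^3(i), with f(M) and M1 in solar masses."""
    return f_m * (1.0 + q) ** 2 / math.sin(math.radians(incl_deg)) ** 3

# q = 0.281 and i = 75 deg come from the abstract; f(M) ~ 0.75 M_sun is an
# illustrative value for this sketch, not a number stated in the paper.
m1 = compact_object_mass(f_m=0.75, q=0.281, incl_deg=75.0)
print(f"M1 ~ {m1:.2f} M_sun")
```

With these inputs the sketch reproduces a mass close to the quoted 1.37 M_⊙, which is simply a consequence of the chosen f(M).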


This paper introduces the application of linear multivariate statistical techniques, including partial least squares (PLS), canonical correlation analysis (CCA) and reduced rank regression (RRR), into the area of Systems Biology. This new approach aims to extract the important proteins embedded in complex signal transduction pathway models. The analysis is performed on a model of intracellular signalling along the janus-associated kinases/signal transducers and activators of transcription (JAK/STAT) and mitogen-activated protein kinase (MAPK) signal transduction pathways in interleukin-6 (IL6) stimulated hepatocytes, which produce signal transducer and activator of transcription 3 (STAT3). A region of redundancy within the MAPK pathway that does not affect the STAT3 transcription was identified using CCA. This is the core finding of this analysis and cannot be obtained by inspecting the model by eye. In addition, RRR was found to isolate terms that do not significantly contribute to changes in protein concentrations, while the application of PLS does not provide such a detailed picture by virtue of its construction. This analysis has a similar objective to conventional model reduction techniques, with the advantage of maintaining the meaning of the states prior to and after the reduction process. A significant model reduction is performed, with a marginal loss in accuracy, offering a more concise model while maintaining the main influencing factors on the STAT3 transcription. The findings offer a deeper understanding of the reaction terms involved, confirm the relevance of several proteins to the production of Acute Phase Proteins and complement existing findings regarding cross-talk between the two signalling pathways.


This paper describes the application of multivariate regression techniques to the Tennessee Eastman benchmark process for modelling and fault detection. Two methods are applied: linear partial least squares (PLS), and a nonlinear variant of this procedure using a radial basis function (RBF) inner relation. The performance of the RBF networks is enhanced through the use of a recently developed training algorithm which uses quasi-Newton optimization to ensure an efficient and parsimonious network; details of this algorithm can be found in this paper. The PLS and PLS/RBF methods are then used to create on-line inferential models of delayed process measurements. As these measurements relate to the final product composition, these models suggest that on-line statistical quality control analysis should be possible for this plant. The generation of 'soft sensors' for these measurements has the further effect of introducing a redundant element into the system, redundancy which can then be used to generate a fault detection and isolation scheme for these sensors. This is achieved by arranging the sensors and models in a manner comparable to the dedicated estimator scheme of Clarke et al. 1975, IEEE Trans. Aero. Elect. Sys., AES-14R, 465-473. The effectiveness of this scheme is demonstrated on a series of simulated sensor and process faults, with full detection and isolation shown to be possible for sensor malfunctions, and detection feasible in the case of process faults. Suggestions for enhancing the diagnostic capacity in the latter case are covered towards the end of the paper.


A ranking method assigns to every weighted directed graph a (weak) ordering of the nodes. In this paper we axiomatize the ranking method that ranks the nodes according to their outflow, using four independent axioms. Besides the well-known axioms of anonymity and positive responsiveness, we introduce outflow monotonicity – meaning that, in a pairwise comparison between two nodes, a node does not do worse if its own outflow does not decrease and the other node's outflow does not increase – and order preservation – meaning that if the pairwise ranking between two nodes is the same in two weighted digraphs, then it is also their pairwise ranking in the 'sum' weighted digraph. The outflow ranking method generalizes the ranking by outdegree for directed graphs, and therefore also generalizes the ranking by Copeland score for tournaments.
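The outflow ranking itself is simple to compute: each node is scored by the total weight of its outgoing arcs, and nodes are ordered by score (ties would share a rank in the weak order). A minimal sketch on a made-up three-node weighted digraph:

```python
# Arc weights of a hypothetical weighted digraph: (tail, head) -> weight.
weights = {
    ("a", "b"): 2.0, ("a", "c"): 1.0,
    ("b", "c"): 0.5,
    ("c", "a"): 1.0,
}

nodes = {n for arc in weights for n in arc}
# Outflow of a node = sum of the weights of its outgoing arcs.
outflow = {n: sum(w for (u, _), w in weights.items() if u == n) for n in nodes}

# Highest outflow first; the outflows here are distinct, so the weak order
# is in fact a strict order.
ranking = sorted(nodes, key=lambda n: -outflow[n])
print(ranking)
```

For an unweighted digraph (all weights 1) the same computation reduces to ranking by outdegree, as the abstract notes.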


We introduce three compact graph states that can be used to perform a measurement-based Toffoli gate. Given a weighted graph of six, seven, or eight qubits, we show that success probabilities of 1/4, 1/2, and 1, respectively, can be achieved. Our study puts a measurement-based version of this important quantum logic gate within the reach of current experiments. As the graphs are setup-independent, they could be realized in a variety of systems, including linear optics and ion traps.


The least-mean-fourth (LMF) algorithm is known for its fast convergence and low steady-state error, especially in sub-Gaussian noise environments. Recent work on normalised versions of the LMF algorithm has further enhanced its stability and performance in both Gaussian and sub-Gaussian noise environments. For example, the recently developed normalised LMF (XE-NLMF) algorithm is normalised by the mixed signal and error powers, and weighted by a fixed mixed-power parameter. Unfortunately, this algorithm depends on the selection of this mixing parameter. In this work, a time-varying mixed-power parameter technique is introduced to overcome this dependency. A convergence analysis, transient analysis, and the steady-state behaviour of the proposed algorithm are derived and verified through simulations. An enhancement in performance is obtained through the use of this technique in two different scenarios. Moreover, the tracking analysis of the proposed algorithm is carried out in the presence of two sources of nonstationarity: (1) carrier frequency offset between transmitter and receiver and (2) random variations in the environment. Close agreement between analysis and simulation results is obtained. The results show that, unlike in the stationary case, the steady-state excess mean-square error is not a monotonically increasing function of the step size. (c) 2007 Elsevier B.V. All rights reserved.
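One common form of a mixed-power-normalised LMF update can be sketched as below. The mixing parameter `lam` is fixed here, i.e. this shows the baseline the paper improves on rather than its time-varying estimate, and the filter length, step size and test signal are all illustrative choices.

```python
import numpy as np

def xe_nlmf(x, d, n_taps=8, mu=0.05, lam=0.5, eps=1e-6):
    """Sketch of an LMF adaptive filter normalised by mixed signal/error powers.

    The fourth-order (e^3) update is the LMF part; the denominator mixes the
    regressor power and the error power through the fixed parameter lam.
    """
    w = np.zeros(n_taps)
    errors = []
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]          # regressor, newest first
        e = d[n] - w @ u                           # a-priori error
        norm = lam * (u @ u) + (1 - lam) * e * e + eps
        w += mu * (e ** 3) * u / norm              # normalised LMF update
        errors.append(e)
    return w, np.array(errors)

# System identification toy problem with sub-Gaussian (uniform) noise.
rng = np.random.default_rng(2)
x = rng.normal(size=2000)
h = np.array([1.0, 0.5, -0.3, 0.2, 0.1, 0.0, 0.0, 0.0])   # unknown channel
d = np.convolve(x, h)[: len(x)] + 0.01 * rng.uniform(-1, 1, size=len(x))

w, e = xe_nlmf(x, d)
print(np.round(w, 2))
```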


This paper proposes a new hierarchical learning structure, namely holistic triple learning (HTL), for extending the binary support vector machine (SVM) to multi-classification problems. For an N-class problem, an HTL constructs a decision tree of bounded depth. A leaf node of the decision tree is allowed to be placed with a holistic triple learning unit whose generalisation abilities are assessed and approved. Meanwhile, the remaining nodes in the decision tree each accommodate a standard binary SVM classifier. The holistic triple classifier is a regression model trained on three classes, whose training algorithm originates from a recently proposed implementation technique, namely the least-squares support vector machine (LS-SVM). A major novelty of the holistic triple classifier is the reduced number of support vectors in the solution. For the resultant HTL-SVM, an upper bound on the generalisation error can be obtained. The time complexity of training the HTL-SVM is analysed, and is shown to be comparable to that of training the one-versus-one (1-vs-1) SVM, particularly on small-scale datasets. Empirical studies show that the proposed HTL-SVM achieves competitive classification accuracy with a reduced number of support vectors compared to the popular 1-vs-1 alternative.
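The LS-SVM building block mentioned above replaces the usual SVM quadratic program with a single linear system in the dual variables. A minimal binary sketch (RBF kernel; the data, kernel width and regularisation constant are illustrative, and the triple-class extension is not reproduced):

```python
import numpy as np

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """LS-SVM training: solve [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / (2 * sigma ** 2))            # RBF kernel matrix
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma             # ridge term from 1/gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                        # bias b, dual weights alpha

def lssvm_predict(X_train, alpha, b, X_new, sigma=1.0):
    sq = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return np.sign(np.exp(-sq / (2 * sigma ** 2)) @ alpha + b)

# Two well-separated 2-D classes with labels -1/+1.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(-1, 0.3, (20, 2)), rng.normal(1, 0.3, (20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)

b, alpha = lssvm_train(X, y)
train_pred = lssvm_predict(X, alpha, b, X)
print(train_pred.astype(int))
```

Note that, unlike the standard SVM, every alpha is generally nonzero here; the reduced support-vector count claimed for the holistic triple classifier is a property of the paper's construction, not of plain LS-SVM.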


Background: This is an update of a previous review (McGuinness 2006). Hypertension and cognitive impairment are prevalent in older people. Hypertension is a direct risk factor for vascular dementia (VaD), and recent studies have suggested that hypertension affects the prevalence of Alzheimer's disease (AD). Does treatment of hypertension therefore prevent cognitive decline?
Objectives: To assess the effects of blood pressure lowering treatments for the prevention of dementia and cognitive decline in patients with hypertension but no history of cerebrovascular disease.
Search strategy: The Specialized Register of the Cochrane Dementia and Cognitive Improvement Group, The Cochrane Library, MEDLINE, EMBASE, PsycINFO, CINAHL and LILACS, as well as many trials databases and grey literature sources, were searched on 13 February 2008 using the terms: hypertens$ OR anti-hypertens$.
Selection criteria: Randomized, double-blind, placebo-controlled trials in which pharmacological or non-pharmacological interventions to lower blood pressure were given for at least six months.
Data collection and analysis: Two independent reviewers assessed trial quality and extracted data. The following outcomes were assessed: incidence of dementia, cognitive change from baseline, blood pressure level, incidence and severity of side effects and quality of life.
Main results: Four trials including 15,936 hypertensive subjects were identified. Average age was 75.4 years. Mean blood pressure at entry across the studies was 171/86 mmHg. The combined result of the four trials reporting incidence of dementia indicated no significant difference between treatment and placebo (236/7767 versus 259/7660, Odds Ratio (OR) = 0.89, 95% CI 0.74, 1.07) and there was considerable heterogeneity between the trials. The combined results from the three trials reporting change in Mini-Mental State Examination (MMSE) did not indicate a benefit from treatment (Weighted Mean Difference (WMD) = 0.42, 95% CI 0.30, 0.53). Both systolic and diastolic blood pressure levels were reduced significantly in the three trials assessing this outcome (WMD = -10.22, 95% CI -10.78, -9.66 for systolic blood pressure; WMD = -4.28, 95% CI -4.58, -3.98 for diastolic blood pressure). Three trials reported adverse effects requiring discontinuation of treatment and the combined results indicated no significant difference (OR = 1.01, 95% CI 0.92, 1.11). When analysed separately, however, patients on placebo in Syst Eur 1997 were more likely to discontinue treatment due to side effects; the converse was true in SHEP 1991. Quality of life data could not be analysed in the four studies. Analysis of the included studies in this review was problematic as many of the control subjects received antihypertensive treatment because their blood pressures exceeded pre-set values. In most cases the study became a comparison between the study drug and a usual antihypertensive regimen.
Authors' conclusions: There is no convincing evidence from the trials identified that blood pressure lowering in late life prevents the development of dementia or cognitive impairment in hypertensive patients with no apparent prior cerebrovascular disease. There were significant problems identified with analysing the data, however, due to the number of patients lost to follow-up and the number of placebo patients who received active treatment. This introduced bias. More robust results may be obtained by conducting a meta-analysis using individual patient data.
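As a worked check of the dementia result quoted above, a crude odds ratio can be computed directly from the aggregate counts (236/7767 treated versus 259/7660 placebo). The review pools per-trial estimates, so this crude aggregate value only approximates the reported pooled OR of 0.89:

```python
# Crude odds ratio from the aggregate 2x2 counts given in the abstract.
events_tx, n_tx = 236, 7767   # dementia events / total, treatment arm
events_pl, n_pl = 259, 7660   # dementia events / total, placebo arm

odds_tx = events_tx / (n_tx - events_tx)
odds_pl = events_pl / (n_pl - events_pl)
crude_or = odds_tx / odds_pl
print(f"crude OR = {crude_or:.3f}")
```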


Background: Postal and electronic questionnaires are widely used for data collection in epidemiological studies but non-response reduces the effective sample size and can introduce bias. Finding ways to increase response to postal and electronic questionnaires would improve the quality of health research. Objectives: To identify effective strategies to increase response to postal and electronic questionnaires. Search strategy: We searched 14 electronic databases to February 2008 and manually searched the reference lists of relevant trials and reviews, and all issues of two journals. We contacted the authors of all trials or reviews to ask about unpublished trials. Where necessary, we also contacted authors to confirm methods of allocation used and to clarify results presented. We assessed the eligibility of each trial using pre-defined criteria. Selection criteria: Randomised controlled trials of methods to increase response to postal or electronic questionnaires. Data collection and analysis: We extracted data on the trial participants, the intervention, the number randomised to intervention and comparison groups and allocation concealment. For each strategy, we estimated pooled odds ratios (OR) and 95% confidence intervals (CI) in a random-effects model. We assessed evidence for selection bias using Egger's weighted regression method, Begg's rank correlation test and funnel plots. We assessed heterogeneity among trial odds ratios using a Chi² test and quantified the degree of inconsistency between trial results using the I² statistic. Main results: Postal: We found 481 eligible trials. The trials evaluated 110 different ways of increasing response to postal questionnaires. We found substantial heterogeneity among trial results in half of the strategies.
The odds of response were at least doubled using monetary incentives (odds ratio 1.87; 95% CI 1.73 to 2.04; heterogeneity P < 0.00001, I² = 84%), recorded delivery (1.76; 95% CI 1.43 to 2.18; P = 0.0001, I² = 71%), a teaser on the envelope (e.g. a comment suggesting to participants that they may benefit if they open it) (3.08; 95% CI 1.27 to 7.44) and a more interesting questionnaire topic (2.00; 95% CI 1.32 to 3.04; P = 0.06, I² = 80%). The odds of response were substantially higher with pre-notification (1.45; 95% CI 1.29 to 1.63; P < 0.00001, I² = 89%), follow-up contact (1.35; 95% CI 1.18 to 1.55; P < 0.00001, I² = 76%), unconditional incentives (1.61; 95% CI 1.36 to 1.89; P < 0.00001, I² = 88%), shorter questionnaires (1.64; 95% CI 1.43 to 1.87; P < 0.00001, I² = 91%), providing a second copy of the questionnaire at follow-up (1.46; 95% CI 1.13 to 1.90; P < 0.00001, I² = 82%), mentioning an obligation to respond (1.61; 95% CI 1.16 to 2.22; P = 0.98, I² = 0%) and university sponsorship (1.32; 95% CI 1.13 to 1.54; P < 0.00001, I² = 83%). The odds of response were also increased with non-monetary incentives (1.15; 95% CI 1.08 to 1.22; P < 0.00001, I² = 79%), personalised questionnaires (1.14; 95% CI 1.07 to 1.22; P < 0.00001, I² = 63%), use of hand-written addresses (1.25; 95% CI 1.08 to 1.45; P = 0.32, I² = 14%), use of stamped return envelopes as opposed to franked return envelopes (1.24; 95% CI 1.14 to 1.35; P < 0.00001, I² = 69%), an assurance of confidentiality (1.33; 95% CI 1.24 to 1.42) and first-class outward mailing (1.11; 95% CI 1.02 to 1.21; P = 0.78, I² = 0%). The odds of response were reduced when the questionnaire included questions of a sensitive nature (0.94; 95% CI 0.88 to 1.00; P = 0.51, I² = 0%). Electronic: We found 32 eligible trials. The trials evaluated 27 different ways of increasing response to electronic questionnaires. We found substantial heterogeneity among trial results in half of the strategies.
The odds of response were increased by more than half using non-monetary incentives (1.72; 95% CI 1.09 to 2.72; heterogeneity P < 0.00001, I² = 95%), shorter e-questionnaires (1.73; 95% CI 1.40 to 2.13; P = 0.08, I² = 68%), including a statement that others had responded (1.52; 95% CI 1.36 to 1.70), and a more interesting topic (1.85; 95% CI 1.52 to 2.26). The odds of response increased by a third using a lottery with immediate notification of results (1.37; 95% CI 1.13 to 1.65), an offer of survey results (1.36; 95% CI 1.15 to 1.61), and using a white background (1.31; 95% CI 1.10 to 1.56). The odds of response were also increased with personalised e-questionnaires (1.24; 95% CI 1.17 to 1.32; P = 0.07, I² = 41%), using a simple header (1.23; 95% CI 1.03 to 1.48), using textual representation of response categories (1.19; 95% CI 1.05 to 1.36), and giving a deadline (1.18; 95% CI 1.03 to 1.34). The odds of response tripled when a picture was included in an e-mail (3.05; 95% CI 1.84 to 5.06; P = 0.27, I² = 19%). The odds of response were reduced when "Survey" was mentioned in the e-mail subject line (0.81; 95% CI 0.67 to 0.97; P = 0.33, I² = 0%), and when the e-mail included a male signature (0.55; 95% CI 0.38 to 0.80; P = 0.96, I² = 0%). Authors' conclusions: Health researchers using postal and electronic questionnaires can increase response using the strategies shown to be effective in this systematic review. Copyright © 2009 The Cochrane Collaboration. Published by John Wiley & Sons, Ltd.




The conventional radial basis function (RBF) network optimization methods, such as orthogonal least squares or the two-stage selection, can produce a sparse network with satisfactory generalization capability. However, the RBF width, as a nonlinear parameter in the network, is not easy to determine. In the aforementioned methods, the width is always pre-determined, either by trial-and-error, or generated randomly. Furthermore, all hidden nodes share the same RBF width. This will inevitably reduce the network performance, and more RBF centres may then be needed to meet a desired modelling specification. In this paper we investigate a new two-stage construction algorithm for RBF networks. It utilizes the particle swarm optimization method to search for the optimal RBF centres and their associated widths. Although the new method needs more computation than conventional approaches, it can greatly reduce the model size and improve model generalization performance. The effectiveness of the proposed technique is confirmed by two numerical simulation examples.
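The core idea above (let a particle encode the RBF centres and widths jointly, score each particle by the least-squares fit of the resulting network, and move the swarm toward the best-scoring parameters) can be sketched as follows. The target function, network size, and all PSO constants are illustrative choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy 1-D modelling problem.
x = np.linspace(-3, 3, 80)
y = np.sin(x)

def rbf_error(params):
    """Fit linear output weights by least squares, return the RMS error.

    params = [c1, w1, c2, w2, c3, w3]: centres and widths of 3 Gaussians.
    """
    c, w = params[0::2], np.abs(params[1::2]) + 1e-3
    Phi = np.exp(-((x[:, None] - c[None, :]) ** 2) / (2 * w[None, :] ** 2))
    theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return float(np.sqrt(np.mean((Phi @ theta - y) ** 2)))

# Minimal global-best PSO over the centre/width vector.
n_particles, dim, iters = 20, 6, 60
pos = rng.uniform(-3, 3, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([rbf_error(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([rbf_error(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print(f"best RMS error: {rbf_error(gbest):.4f}")
```

Because the output weights are solved exactly for every candidate, the swarm only has to search the nonlinear parameters (centres and widths), which mirrors the two-stage construction described in the abstract.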


It is convenient and effective to solve nonlinear problems with a model that has a linear-in-the-parameters (LITP) structure. However, the nonlinear parameters (e.g. the width of a Gaussian function) of each model term need to be pre-determined, either from expert experience or through exhaustive search. An alternative approach is to optimize them by a gradient-based technique (e.g. Newton's method). Unfortunately, all of these methods still require a large amount of computation. Recently, the extreme learning machine (ELM) has shown its advantages in terms of fast learning from data, but the sparsity of the constructed model cannot be guaranteed. This paper proposes a novel algorithm for the automatic construction of a nonlinear system model based on the extreme learning machine. This is achieved by effectively integrating the ELM and leave-one-out (LOO) cross-validation with our two-stage stepwise construction procedure [1]. The main objective is to improve the compactness and generalization capability of the model constructed by the ELM method. Numerical analysis shows that the proposed algorithm involves only about half the computation of the orthogonal least squares (OLS) based method. Simulation examples are included to confirm the efficacy and superiority of the proposed technique.
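The two ingredients named above combine naturally: an ELM draws the hidden-layer parameters at random and solves the output weights by least squares, and because the model is linear in those weights the LOO error is available in closed form via the PRESS formula. The sketch below uses LOO only to pick the number of hidden nodes on made-up data; the paper's two-stage stepwise construction is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy regression data standing in for a nonlinear system.
X = rng.uniform(-1, 1, (120, 2))
y = np.sin(3 * X[:, 0]) * X[:, 1] + 0.05 * rng.normal(size=120)

def elm_loo(n_hidden):
    """Random-hidden-layer ELM; return the closed-form LOO (PRESS) MSE."""
    W = rng.normal(size=(2, n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                      # random hidden-layer outputs
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    hat = H @ np.linalg.pinv(H)                 # hat matrix H (H^T H)^+ H^T
    e_loo = (y - H @ beta) / (1 - np.diag(hat)) # exact LOO residuals (PRESS)
    return float(np.mean(e_loo ** 2))

press = {m: elm_loo(m) for m in (5, 10, 20, 40)}
best = min(press, key=press.get)
print(f"selected {best} hidden nodes, LOO MSE {press[best]:.4f}")
```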


We present extensive spectroscopic time-series observations of the multiperiodic, rapidly rotating delta Scuti star tau Pegasi. Information about the oscillations is contained within the patterns of line-profile variation of the star's blended absorption-line spectrum. We introduce the new technique of Doppler deconvolution, with which we extract these patterns by modeling the intrinsic stellar spectrum and the broadening functions for each spectrum in the time series. Frequencies and modes of oscillation are identified from the variations using the technique of Fourier-Doppler imaging and a two-dimensional least-squares cleaning algorithm. We find a rich mode spectrum with degrees up to l = 20 and with frequencies below about 35 cycles day⁻¹. The modes with the largest amplitudes have frequencies that lie within a narrow band. We conclude that the observed spectrum can be explained if the modes of tau Peg propagate in the prograde direction with l ≈ |m| and with frequencies that are approximately equal in the corotating frame of the star. We discuss the implications of these results for the prospect of delta Scuti seismology.