978 results for multivariate null intercepts model


Relevance:

30.00%

Publisher:

Abstract:

Binning and truncation of data are common in data analysis and machine learning. This paper addresses the problem of fitting mixture densities to multivariate binned and truncated data. The EM approach proposed by McLachlan and Jones (Biometrics, 44: 2, 571-578, 1988) for the univariate case is generalized to multivariate measurements. The multivariate solution requires the evaluation of multidimensional integrals over each bin at each iteration of the EM procedure. Naive implementation of the procedure can lead to computationally inefficient results. To reduce the computational cost a number of straightforward numerical techniques are proposed. Results on simulated data indicate that the proposed methods can achieve significant computational gains with no loss in the accuracy of the final parameter estimates. Furthermore, experimental results suggest that with a sufficient number of bins and data points it is possible to estimate the true underlying density almost as well as if the data were not binned. The paper concludes with a brief description of an application of this approach to diagnosis of iron deficiency anemia, in the context of binned and truncated bivariate measurements of volume and hemoglobin concentration from an individual's red blood cells.
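The univariate EM of McLachlan and Jones is compact enough to sketch: the E-step distributes each bin's count across components in proportion to the components' bin masses, and the M-step uses closed-form truncated-normal moments (exactly the quantities that become multidimensional integrals in the multivariate case). A minimal sketch assuming Gaussian components and no truncation; the initialisation and iteration count are illustrative, not the authors' implementation:

```python
import numpy as np
from scipy.stats import norm

def em_binned_gmm(edges, counts, K=2, iters=300):
    """EM for a K-component Gaussian mixture observed only as bin counts.
    edges: bin boundaries, length B+1; counts: observations per bin, length B."""
    a, b = edges[:-1], edges[1:]
    mids = 0.5 * (a + b)
    rep = np.repeat(mids, counts.astype(int))
    mu = np.quantile(rep, (np.arange(K) + 0.5) / K)   # spread initial means
    sig = np.full(K, rep.std() / K)
    pi = np.full(K, 1.0 / K)
    for _ in range(iters):
        al = (a[:, None] - mu) / sig
        be = (b[:, None] - mu) / sig
        Z = np.clip(norm.cdf(be) - norm.cdf(al), 1e-12, None)  # bin masses, (B, K)
        r = pi * Z
        r /= r.sum(axis=1, keepdims=True)             # responsibilities per bin
        w = r * counts[:, None]                       # expected counts per bin/component
        Nk = w.sum(axis=0)
        ratio = (norm.pdf(al) - norm.pdf(be)) / Z
        m1 = mu + sig * ratio                         # truncated mean within each bin
        v = sig**2 * (1 + (al * norm.pdf(al) - be * norm.pdf(be)) / Z - ratio**2)
        mu = (w * m1).sum(axis=0) / Nk                # M-step updates
        sig = np.sqrt((w * (v + (m1 - mu)**2)).sum(axis=0) / Nk)
        pi = Nk / Nk.sum()
    return pi, mu, sig
```

The closed-form within-bin mean and variance used in the M-step are what the multivariate generalisation must replace with numerical integrals over each bin.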


We are concerned with providing more empirical evidence on forecast failure, developing forecast models, and examining the impact of events such as audit reports. A joint consideration of classic financial ratios and relevant external indicators leads us to build a basic prediction model focused on non-financial Galician SMEs. Explanatory variables are relevant financial indicators from the viewpoint of the financial logic and financial failure theory. The paper explores three mathematical models: discriminant analysis, Logit, and linear multivariate regression. We conclude that, even though both offer high explanatory and predictive abilities, the Logit and MDA models should be used and interpreted jointly.
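Of the three techniques, the Logit model is the simplest to sketch with a Newton-Raphson fit; the synthetic "ratio" data below are placeholders, not the Galician sample:

```python
import numpy as np

def fit_logit(X, y, iters=50):
    """Logistic regression by Newton-Raphson; returns coefficients (intercept first)."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(X1.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X1 @ beta))
        W = p * (1 - p)                                # IRLS weights
        H = X1.T @ (X1 * W[:, None]) + 1e-8 * np.eye(X1.shape[1])  # tiny ridge for stability
        beta += np.linalg.solve(H, X1.T @ (y - p))
    return beta
```

The fitted coefficients can then be read as marginal effects on the log-odds of failure, which is what makes Logit attractive to interpret alongside MDA.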


In health-related research it is common to have multiple outcomes of interest in a single study. These outcomes are often analysed separately, ignoring the correlation between them. One would expect that a multivariate approach would be a more efficient alternative to individual analyses of each outcome. Surprisingly, this is not always the case. In this article we discuss different settings of linear models and compare the multivariate and univariate approaches. We show that for linear regression models, the estimates of the regression parameters associated with covariates that are shared across the outcomes are the same for the multivariate and univariate models, while for outcome-specific covariates the multivariate model performs better in terms of efficiency.
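The claim about shared covariates can be checked numerically: when all outcomes share one design matrix, joint multivariate least squares and separate per-outcome regressions give identical coefficient estimates, even with correlated errors. A sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])   # covariates shared by both outcomes
B_true = np.array([[1.0, -0.5], [2.0, 0.3], [0.0, 1.2]])
E = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=n)  # correlated errors
Y = X @ B_true + E

# multivariate (joint) least squares: one solve for both outcomes at once
B_joint = np.linalg.lstsq(X, Y, rcond=None)[0]
# univariate: each outcome regressed separately
B_sep = np.column_stack([np.linalg.lstsq(X, Y[:, j], rcond=None)[0] for j in range(2)])

print(np.allclose(B_joint, B_sep))  # prints: True
```

The efficiency gain the abstract mentions appears only once the outcomes have different covariates, where a joint (GLS/SUR-type) fit exploits the error correlation.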


Dissertation submitted in fulfilment of the degree of Master in Informatics Engineering


This paper offers a new approach to estimating time-varying covariance matrices in the framework of the diagonal-vech version of the multivariate GARCH(1,1) model. Our method is numerically feasible for large-scale problems, produces positive semidefinite conditional covariance matrices, and does not impose unrealistic a priori restrictions. We provide an empirical application in the context of international stock markets, comparing the new estimator with a number of existing ones.
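The diagonal-vech recursion is element-wise: h_ij,t = c_ij + a_ij e_i,t-1 e_j,t-1 + b_ij h_ij,t-1. A minimal sketch follows; it uses scalar a and b coefficients, one simple restriction under which positive semidefiniteness is automatic, which is precisely the kind of a priori restriction the paper's estimator is designed to avoid:

```python
import numpy as np

def vech_garch_cov(eps, C, A, B, H0):
    """Diagonal-vech GARCH(1,1): h_ij,t = c_ij + a_ij*e_i,t-1*e_j,t-1 + b_ij*h_ij,t-1,
    written as an element-wise (Hadamard) recursion on the full covariance matrix."""
    H = [H0]
    for t in range(1, len(eps)):
        e = eps[t - 1][:, None]
        H.append(C + A * (e @ e.T) + B * H[-1])   # element-wise products
    return np.array(H)

rng = np.random.default_rng(0)
eps = rng.normal(size=(500, 2))                   # placeholder return innovations
C = np.array([[0.05, 0.01], [0.01, 0.05]])
A = np.full((2, 2), 0.08)                         # scalar a for every element
B = np.full((2, 2), 0.90)                         # scalar b for every element
H = vech_garch_cov(eps, C, A, B, H0=np.cov(eps.T))
```

With unrestricted element-wise A and B the recursion can leave the positive semidefinite cone, which is the practical problem motivating the paper.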


Univariate statistical control charts, such as the Shewhart chart, do not satisfy the requirements for process monitoring on a high-volume automated fuel cell manufacturing line, because of the number of variables that require monitoring. On such a high-volume process, the risk of elevated false alarms can present problems if univariate methods are used. Multivariate statistical methods are discussed as an alternative for process monitoring and control. The research presented is conducted on a manufacturing line which evaluates the performance of a fuel cell; the line has three stages of production assembly that contribute to the final end-product performance. Product performance is assessed by power and energy measurements taken at various time points throughout discharge testing of the fuel cell. The multivariate techniques identified in the literature review are evaluated using individual and batch observations. Multivariate control charts based on Hotelling's T² are compared to other multivariate methods, such as Principal Component Analysis (PCA); the latter was identified as the most suitable method. Control charts, such as scores, T² and DModX charts, are constructed from the PCA model. Diagnostic procedures using contribution plots, for out-of-control points detected by these charts, are also discussed; these plots enable the investigator to perform root-cause analysis. Multivariate batch techniques are compared to the individual observations typically seen on continuous processes. Recommendations for introducing multivariate techniques appropriate for most high-volume processes are also covered.
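A sketch of the T² statistic computed from a PCA model, with a large-sample chi-square control limit (assumed here for simplicity; F-based limits are also common in practice):

```python
import numpy as np
from scipy.stats import chi2

def hotelling_t2_scores(X, k=2, alpha=0.01):
    """T^2 monitoring statistic from the first k principal components of X."""
    Xc = X - X.mean(axis=0)
    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:k].T                 # projections onto the first k PCs
    lam = (s[:k] ** 2) / (len(X) - 1)      # variance of each retained component
    t2 = (scores ** 2 / lam).sum(axis=1)   # sum of standardised squared scores
    limit = chi2.ppf(1 - alpha, df=k)      # large-sample control limit
    return t2, limit
```

Points with `t2 > limit` would be investigated with contribution plots to find which original variables drove the signal.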


Background: Several researchers seek methods for the selection of homogeneous groups of animals in experimental studies, a fact justified because homogeneity is an indispensable prerequisite for randomization of treatments. The lack of robust methods that comply with statistical and biological principles is the reason why researchers use empirical or subjective methods, influencing their results. Objective: To develop a multivariate statistical model for the selection of a homogeneous group of animals for experimental research and to develop a computational package implementing it. Methods: The set of echocardiographic data of 115 male Wistar rats with supravalvular aortic stenosis (AoS) was used as an example of model development. Initially, the data were standardized, and became dimensionless. Then, the variance matrix of the set was submitted to principal components analysis (PCA), aiming at reducing the parametric space and retaining the relevant variability. That technique established a new Cartesian system into which the animals were allocated, and finally the confidence region (ellipsoid) was built for the profile of the animals' homogeneous responses. The animals located inside the ellipsoid were considered as belonging to the homogeneous batch; those outside the ellipsoid were considered spurious. Results: The PCA established eight descriptive axes that captured 88.71% of the accumulated variance of the data set. The allocation of the animals in the new system and the construction of the confidence region identified six spurious animals, leaving a homogeneous batch of 109 animals. Conclusion: The biometric criterion presented proved to be effective, because it considers the animal as a whole, analyzing all measured parameters jointly, in addition to having a small discard rate.
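The selection pipeline in the abstract (standardise, PCA, confidence ellipsoid, discard points outside) can be sketched as follows; the retained-variance fraction and confidence level are illustrative, not the authors' values:

```python
import numpy as np
from scipy.stats import chi2

def homogeneous_batch(X, var_kept=0.90, conf=0.95):
    """Standardise, reduce by PCA, and keep subjects inside the PCA-space
    confidence ellipsoid; those outside are treated as spurious."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    _, s, Vt = np.linalg.svd(Z, full_matrices=False)
    var = s**2 / (s**2).sum()
    k = int(np.searchsorted(np.cumsum(var), var_kept)) + 1  # components kept
    scores = Z @ Vt[:k].T
    lam = s[:k]**2 / (len(X) - 1)
    d2 = (scores**2 / lam).sum(axis=1)        # squared Mahalanobis distance in PC space
    return d2 <= chi2.ppf(conf, df=k)         # True = inside the ellipsoid
```

The returned boolean mask plays the role of the "homogeneous batch"; a gross outlier lands outside the ellipsoid and is discarded.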


We propose an alternative approach to obtaining a permanent equilibrium exchange rate (PEER), based on an unobserved components (UC) model. This approach offers a number of advantages over the conventional cointegration-based PEER. Firstly, we do not rely on the prerequisite that cointegration has to be found between the real exchange rate and macroeconomic fundamentals to obtain non-spurious long-run relationships and the PEER. Secondly, the impact that the permanent and transitory components of the macroeconomic fundamentals have on the real exchange rate can be modelled separately in the UC model. This is important for variables where the long and short-run effects may drive the real exchange rate in opposite directions, such as the relative government expenditure ratio. We also demonstrate that our proposed exchange rate models have good out-of-sample forecasting properties. Our approach would be a useful technique for central banks to estimate the equilibrium exchange rate and to forecast the long-run movements of the exchange rate.
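The permanent component in a UC model is typically extracted with the Kalman filter. A minimal local-level sketch for a single series with known variances (far simpler than the paper's multivariate PEER setup, and purely illustrative):

```python
import numpy as np

def local_level_filter(y, q, r):
    """Kalman filter for the local-level UC model:
       y_t = mu_t + e_t,   mu_t = mu_{t-1} + w_t,
    with e ~ N(0, r) transitory and w ~ N(0, q) permanent shocks.
    Returns the filtered permanent component mu_t."""
    mu, P = y[0], r
    out = []
    for obs in y:
        P = P + q                     # predict: state variance grows by q
        K = P / (P + r)               # Kalman gain
        mu = mu + K * (obs - mu)      # update with the new observation
        P = (1 - K) * P
        out.append(mu)
    return np.array(out)
```

The filtered `mu` is the model's estimate of the permanent (equilibrium) path; the residual `y - mu` plays the role of the transitory component.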


Real-world objects are often endowed with features that violate Gestalt principles. In our experiment, we examined the neural correlates of binding under conflict conditions in terms of the binding-by-synchronization hypothesis. We presented an ambiguous stimulus ("diamond illusion") to 12 observers. The display consisted of four oblique gratings drifting within circular apertures. Its interpretation fluctuates between bound ("diamond") and unbound (component gratings) percepts. To model a situation in which Gestalt-driven analysis contradicts the perceptually explicit bound interpretation, we modified the original diamond (OD) stimulus by speeding up one grating. Using OD and modified diamond (MD) stimuli, we managed to dissociate the neural correlates of Gestalt-related (OD vs. MD) and perception-related (bound vs. unbound) factors. Their interaction was expected to reveal the neural networks synchronized specifically in the conflict situation. The synchronization topography of EEG was analyzed with the multivariate S-estimator technique. We found that good Gestalt (OD vs. MD) was associated with a higher posterior synchronization in the beta-gamma band. The effect of perception manifested itself as reciprocal modulations over the posterior and anterior regions (theta/beta-gamma bands). Specifically, higher posterior and lower anterior synchronization supported the bound percept, and the opposite was true for the unbound percept. The interaction showed that binding under challenging perceptual conditions is sustained by enhanced parietal synchronization. We argue that this distributed pattern of synchronization relates to the processes of multistage integration ranging from early grouping operations in the visual areas to maintaining representations in the frontal networks of sensory memory.
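The multivariate S-estimator used for the synchronization topography can be sketched compactly. The formula below, an entropy of the normalized eigenvalue spectrum of the channel correlation matrix, is one published formulation and is an assumption about the exact variant used, not taken from this abstract:

```python
import numpy as np

def s_estimator(X):
    """Multivariate S-estimator of synchronization for an (n_samples, M) array:
    S = 1 + sum(l' * log l') / log M, where l' are the eigenvalues of the
    M x M channel correlation matrix divided by M.
    Close to 0 for independent channels, close to 1 for fully synchronized ones."""
    M = X.shape[1]
    lam = np.linalg.eigvalsh(np.corrcoef(X.T)) / M
    lam = lam[lam > 1e-12]                      # convention: 0 * log 0 = 0
    return 1 + (lam * np.log(lam)).sum() / np.log(M)
```

In an EEG application X would hold band-filtered signals from M electrodes within a sliding window, giving one synchronization value per window.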


This paper proposes a contemporaneous-threshold multivariate smooth transition autoregressive (C-MSTAR) model in which the regime weights depend on the ex ante probabilities that latent regime-specific variables exceed certain threshold values. A key feature of the model is that the transition function depends on all the parameters of the model as well as on the data. Since the mixing weights are also a function of the regime-specific innovation covariance matrix, the model can account for contemporaneous regime-specific co-movements of the variables. The stability and distributional properties of the proposed model are discussed, as well as issues of estimation, testing and forecasting. The practical usefulness of the C-MSTAR model is illustrated by examining the relationship between US stock prices and interest rates.
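A drastically simplified univariate caricature of the key idea: the regime weight at time t is the ex ante probability that a latent regime variable exceeds its threshold. In the actual C-MSTAR model the transition function depends on all model parameters, including the regime-specific innovation covariance; the AR coefficients and threshold below are arbitrary placeholders:

```python
import numpy as np
from scipy.stats import norm

def cmstar_sketch(y_prev, c=0.0, s=1.0,
                  phi1=(0.2, 0.9), phi2=(-0.1, 0.3)):
    """One-step conditional mean of a two-regime smooth-transition AR(1),
    with the regime weight computed as an ex ante exceedance probability."""
    w = norm.cdf((y_prev - c) / s)        # P(latent regime variable > threshold c)
    m1 = phi1[0] + phi1[1] * y_prev       # regime-1 conditional mean
    m2 = phi2[0] + phi2[1] * y_prev       # regime-2 conditional mean
    return w * m1 + (1 - w) * m2, w
```

Because the weight is a smooth probability rather than a hard indicator, the model nests both sharp-threshold and near-linear behaviour depending on the scale s.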


Background Demand for home care services has increased considerably, along with the growing complexity of cases and variability among resources and providers. Designing services that guarantee co-ordination and integration for providers and levels of care is of paramount importance. The aim of this study is to determine the effectiveness of a new case-management based, home care delivery model which has been implemented in Andalusia (Spain). Methods Quasi-experimental, controlled, non-randomised, multi-centre study on the population receiving home care services, comparing the outcomes of the new model, which included nurse-led case management, versus the conventional one. Primary endpoints: functional status, satisfaction and use of healthcare resources. Secondary endpoints: recruitment and caregiver burden, mortality, institutionalisation, quality of life and family function. Analyses were performed at base-line and at two, six and twelve months. A bivariate analysis was conducted with the Student's t-test, Mann-Whitney's U and the chi-squared test. Kaplan-Meier and log-rank tests were performed to compare survival and institutionalisation. A multivariate analysis was performed to pinpoint factors that impact on improvement of functional ability. Results Base-line differences in functional capacity, significantly lower in the intervention group (RR: 1.52 95%CI: 1.05–2.21; p = 0.0016), disappeared at six months (RR: 1.31 95%CI: 0.87–1.98; p = 0.178). At six months, caregiver burden showed a slight reduction in the intervention group, whereas it increased notably in the control group (base-line Zarit Test: 57.06 95%CI: 54.77–59.34 vs. 60.50 95%CI: 53.63–67.37; p = 0.264) (Zarit Test at six months: 53.79 95%CI: 49.67–57.92 vs. 66.26 95%CI: 60.66–71.86; p = 0.002). Patients in the intervention group received more physiotherapy (7.92 95%CI: 5.22–10.62 vs. 3.24 95%CI: 1.37–5.310; p = 0.0001) and, on average, required fewer home care visits (9.40 95%CI: 7.89–10.92 vs. 11.30 95%CI: 9.10–14.54). No differences were found in terms of frequency of visits to A&E or hospital re-admissions. Furthermore, patients in the control group perceived higher levels of satisfaction (16.88 95%CI: 16.32–17.43; range: 0–21, vs. 14.65 95%CI: 13.61–15.68; p = 0.001). Conclusion A home care service model that includes nurse-led case management streamlines access to healthcare services and resources, while impacting positively on patients' functional ability and caregiver burden, with increased levels of satisfaction.


BACKGROUND: Physicians need a specific risk-stratification tool to facilitate safe and cost-effective approaches to the management of patients with cancer and acute pulmonary embolism (PE). The objective of this study was to develop a simple risk score for predicting 30-day mortality in patients with PE and cancer by using measures readily obtained at the time of PE diagnosis. METHODS: Investigators randomly allocated 1,556 consecutive patients with cancer and acute PE from the international multicenter Registro Informatizado de la Enfermedad TromboEmbólica to derivation (67%) and internal validation (33%) samples. The external validation cohort for this study consisted of 261 patients with cancer and acute PE. Investigators compared 30-day all-cause mortality and nonfatal adverse medical outcomes across the derivation and two validation samples. RESULTS: In the derivation sample, multivariable analyses produced the risk score, which contained six variables: age > 80 years, heart rate ≥ 110/min, systolic BP < 100 mm Hg, body weight < 60 kg, recent immobility, and presence of metastases. In the internal validation cohort (n = 508), the 22.2% of patients (113 of 508) classified as low risk by the prognostic model had a 30-day mortality of 4.4% (95% CI, 0.6%-8.2%) compared with 29.9% (95% CI, 25.4%-34.4%) in the high-risk group. In the external validation cohort, the 18% of patients (47 of 261) classified as low risk by the prognostic model had a 30-day mortality of 0%, compared with 19.6% (95% CI, 14.3%-25.0%) in the high-risk group. CONCLUSIONS: The developed clinical prediction rule accurately identifies low-risk patients with cancer and acute PE.
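The six predictors translate directly into a checklist. The abstract reports the variables but not their point weights or the low-risk cut-off, so equal weights and "low risk = no criteria present" below are illustrative assumptions only, not the published rule:

```python
def pe_cancer_risk_score(age, heart_rate, systolic_bp, weight_kg,
                         recent_immobility, metastases):
    """Count how many of the six predictors from the abstract are present.
    ASSUMPTION: one point per criterion and low risk = 0 points; the
    actual point weights and cut-off are not given in the abstract."""
    criteria = [
        age > 80,
        heart_rate >= 110,        # beats/min
        systolic_bp < 100,        # mm Hg
        weight_kg < 60,
        bool(recent_immobility),
        bool(metastases),
    ]
    score = sum(criteria)
    return score, ("low" if score == 0 else "high")
```

All six inputs are available at the time of PE diagnosis, which is the practical point of the rule.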


Purpose: Given the advances of gene therapy studies to cure RPE65-derived Leber Congenital Amaurosis (LCA) (phase I clinical trials) and the heterogeneity of the targeted patients both genetically and phenotypically, it is of prime importance to examine the rescue efficiency of gene transfer in different mutant contexts. Indeed, half of these mutations are missense mutations, leading to potential residual RPE65 activity. Consequently, we wanted to evaluate the effect on retinal activity and cone survival of lentivirus-mediated gene therapy in the R91W knock-in mouse model expressing the mutant Rpe65R91W gene (Samardzija et al. 2008), a mutation found in LCA patients. Notably, we investigated whether the therapeutic window is prolonged in comparison to null mutations. Methods: An HIV-1-derived lentiviral vector (LV) expressing either the GFP or the mouse Rpe65 cDNA under the control of a 0.8 kb fragment of the human Rpe65 promoter (R0.8) was produced by transient transfection of 293T cells. LV-R0.8-RPE65 or GFP was injected into 5-day-old (P5) or 1-month-old R91W mice. Functional rescue was assessed by ERG (1 and 4 months post-injection) and pupillary light response (PLR) recordings, and cone survival by histological analysis. Results: Increased light sensitivity was detected by scotopic ERG in animals injected with LV-R0.8-RPE65 at both P5 and 1 month compared to GFP-treated animals or untreated mice. PLR was also improved in some eyes, and histological analysis of cone markers showed that the density of cones reached the wild-type level in the region of wild-type RPE65 delivery after treatment at P5. However, the rescue effect of the injection at 1 month was limited and attained 60% of the wild-type level, but still more cones were observed in the treated area than in 1-month-old untreated Rpe65R91W mice.
Conclusions: We were able to show that lentivirus-mediated Rpe65 gene transfer not only increases retinal activity of the Rpe65R91W mouse and survival of cones after treatment at P5 but also after treatment at 1 month. Even though the treatment at 1 month is more limited (60% of the wild-type level) than treatment at P5, the amount of cone markers is increased compared to the proportion found at 1 month of age in untreated animals. These results contrast with the lack of cone rescue by treatment at 1 month of age in Rpe65-/- mice (Bemelmans et al., 2006). Thus patients carrying the R91W mutation might benefit from a prolonged therapeutic window.


We consider the application of normal-theory methods to the estimation and testing of a general type of multivariate regression models with errors-in-variables, in the case where various data sets are merged into a single analysis and the observable variables deviate possibly from normality. The various samples to be merged can differ on the set of observable variables available. We show that there is a convenient way to parameterize the model so that, despite the possible non-normality of the data, normal-theory methods yield correct inferences for the parameters of interest and for the goodness-of-fit test. The theory described encompasses both the functional and structural model cases, and can be implemented using standard software for structural equations models, such as LISREL, EQS, LISCOMP, among others. An illustration with Monte Carlo data is presented.


Standard methods for the analysis of linear latent variable models often rely on the assumption that the vector of observed variables is normally distributed. This normality assumption (NA) plays a crucial role in assessing optimality of estimates, in computing standard errors, and in designing an asymptotic chi-square goodness-of-fit test. The asymptotic validity of NA inferences when the data deviates from normality has been called asymptotic robustness. In the present paper we extend previous work on asymptotic robustness to a general context of multi-sample analysis of linear latent variable models, with a latent component of the model allowed to be fixed across (hypothetical) sample replications, and with the asymptotic covariance matrix of the sample moments not necessarily finite. We will show that, under certain conditions, the matrix $\Gamma$ of asymptotic variances of the analyzed sample moments can be substituted by a matrix $\Omega$ that is a function only of the cross-product moments of the observed variables. The main advantage of this is that inferences based on $\Omega$ are readily available in standard software for covariance structure analysis, and do not require computing sample fourth-order moments. An illustration with simulated data in the context of regression with errors in variables will be presented.