140 results for conditional least squares
Abstract:
This paper presents a method for estimating the thrust model parameters of uninhabited airborne systems using specific flight tests. Particular tests are proposed to simplify the estimation. The proposed estimation method is based on three steps. The first step uses a regression model in which the thrust is assumed constant. This allows us to obtain biased initial estimates of the aerodynamic coefficients of the surge model. In the second step, a robust nonlinear state estimator is implemented using the initial parameter estimates, and the model is augmented by considering the thrust as a random walk. In the third step, the thrust estimate obtained by the observer is used to fit a polynomial model in terms of the propeller advance ratio. We consider a numerical example based on Monte Carlo simulations to quantify the sampling properties of the proposed estimator under realistic flight conditions.
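The three-step procedure can be summarized in a short numerical sketch. The following Python/NumPy fragment is a minimal illustration only, assuming a hypothetical quadratic-drag surge model m*du/dt = T - Xu*u - Xuu*u^2; the variable names and noise settings are invented, and a plain extended Kalman filter stands in for the paper's robust nonlinear state estimator.

    import numpy as np

    def step1_constant_thrust_ols(u, dt, m):
        # Step 1: OLS with the thrust treated as an unknown constant T0.
        # Regressing m * du/dt on [1, -u, -u^2] gives biased initial
        # estimates of [T0, Xu, Xuu].
        dudt = np.gradient(u, dt)
        Phi = np.column_stack([np.ones_like(u), -u, -u**2])
        theta, *_ = np.linalg.lstsq(Phi, m * dudt, rcond=None)
        return theta

    def step2_ekf_random_walk_thrust(u_meas, dt, m, Xu, Xuu, q_T=1.0, r=0.05):
        # Step 2: estimator on the augmented state x = [u, T], with the
        # thrust T modelled as a random walk (EKF used here for brevity).
        x = np.array([u_meas[0], 0.0])
        P = np.eye(2)
        Q = np.diag([1e-4, q_T * dt])
        H = np.array([[1.0, 0.0]])          # only airspeed is measured
        T_hist = []
        for y in u_meas:
            u0, T0 = x
            F = np.array([[1.0 + dt * (-Xu - 2.0 * Xuu * u0) / m, dt / m],
                          [0.0, 1.0]])
            x = np.array([u0 + dt * (T0 - Xu * u0 - Xuu * u0**2) / m, T0])
            P = F @ P @ F.T + Q
            S = (H @ P @ H.T + r).item()
            K = (P @ H.T) / S
            x = x + (K * (y - x[0])).ravel()
            P = (np.eye(2) - K @ H) @ P
            T_hist.append(x[1])
        return np.array(T_hist)

    def step3_fit_thrust_polynomial(T_hat, J, deg=2):
        # Step 3: polynomial thrust model in the propeller advance ratio J.
        return np.polyfit(J, T_hat, deg)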
Abstract:
Person re-identification is particularly challenging due to significant appearance changes across separate camera views. In order to re-identify people, a representative human signature should effectively handle differences in illumination, pose and camera parameters. While general appearance-based methods are modelled in Euclidean spaces, it has been argued that some applications in image and video analysis are better modelled via non-Euclidean manifold geometry. To this end, recent approaches represent images as covariance matrices, and interpret such matrices as points on Riemannian manifolds. As direct classification on such manifolds can be difficult, in this paper we propose to represent each manifold point as a vector of similarities to class representers, via a recently introduced form of Bregman matrix divergence known as the Stein divergence. This is followed by using a discriminative mapping of similarity vectors for final classification. The use of similarity vectors is in contrast to the traditional approach of embedding manifolds into tangent spaces, which can suffer from representing the manifold structure inaccurately. Comparative evaluations on benchmark ETHZ and iLIDS datasets for the person re-identification task show that the proposed approach obtains better performance than recent techniques such as Histogram Plus Epitome, Partial Least Squares, and Symmetry-Driven Accumulation of Local Features.
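As a concrete illustration of the pipeline, a minimal Python sketch of the covariance descriptor, the Stein divergence S(X, Y) = log det((X + Y)/2) - (1/2) log det(XY), and the resulting similarity vector is given below. The feature layout, the regularization term and the Gaussian similarity kernel are assumptions, and the discriminative mapping used for final classification is omitted.

    import numpy as np

    def covariance_descriptor(features):
        # features: (n_pixels, d) array of per-pixel features (e.g. colour
        # and gradients); returns a d x d SPD matrix (a manifold point).
        return np.cov(features, rowvar=False) + 1e-6 * np.eye(features.shape[1])

    def stein_divergence(X, Y):
        # S(X, Y) = log det((X + Y) / 2) - 0.5 * log det(X Y)
        _, ld_mid = np.linalg.slogdet((X + Y) / 2.0)
        _, ld_x = np.linalg.slogdet(X)
        _, ld_y = np.linalg.slogdet(Y)
        return ld_mid - 0.5 * (ld_x + ld_y)

    def similarity_vector(X, representers, sigma=1.0):
        # Represent manifold point X by its similarities to the class
        # representers; a discriminative classifier is trained on these.
        return np.array([np.exp(-stein_divergence(X, R) / sigma)
                         for R in representers])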
Abstract:
This thesis investigates the question “where were the auditors in asset securitizations?”, a criticism levelled at the audit profession before and after the onset of the global financial crisis (GFC). Asset securitizations increase audit complexity and audit risks, which are expected to increase audit fees. Using US bank holding company data from 2003 to 2009, this study examines the association between asset securitization risks and audit fees, and how it changed during the global financial crisis. The main test is based on an ordinary least squares (OLS) model adapted from the Fields et al. (2004) bank audit fee model. I employ a principal components analysis to address high correlations among asset securitization risks. Individual securitization risks are also tested separately. A suite of sensitivity tests indicates that the results are robust; these include model alterations, sample variations, further controls in the tests, and correcting for the securitizer self-selection problem. A partial least squares (PLS) path modelling methodology is introduced as a separate test, which allows for high intercorrelations, self-selection correction, and sequential order hypotheses in one simultaneous model. The PLS results are consistent with the main results. The study finds significant and positive associations between securitization risks and audit fees. After the commencement of the global financial crisis in 2007, bank failures led to an increased focus on the role of audits in relation to asset securitization risks; I therefore expected auditors to become more sensitive to bank asset securitization risks after the commencement of the crisis. I find that auditors appear to focus on different aspects of asset securitization risks during the crisis and that they appear to charge banks a GFC premium. Overall, the results support the view that auditors consider asset securitization risks and market changes, and adjust their audit effort and risk considerations accordingly.
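As a rough sketch of such a main test (not the thesis's exact specification), the fragment below combines principal components of the correlated securitization risk proxies with an OLS audit fee regression; all column names and the number of retained components are hypothetical.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    RISK_COLS = ["sec_risk_1", "sec_risk_2", "sec_risk_3"]   # hypothetical
    CONTROL_COLS = ["log_assets", "capital_ratio", "big_n"]  # hypothetical

    def audit_fee_ols(df, n_components=2):
        # Collapse the highly correlated risk proxies into a few principal
        # components, then regress log audit fees on components + controls.
        Z = StandardScaler().fit_transform(df[RISK_COLS])
        pcs = PCA(n_components=n_components).fit_transform(Z)
        X = df[CONTROL_COLS].reset_index(drop=True).copy()
        for i in range(n_components):
            X[f"risk_pc{i + 1}"] = pcs[:, i]
        return sm.OLS(np.log(df["audit_fee"].values), sm.add_constant(X)).fit()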
Aligning off-balance sheet risk, on-balance sheet risk and audit fees: a PLS path modelling analysis
Abstract:
This study focuses on using the partial least squares (PLS) path modelling technique in archival auditing research by replicating the data and research questions from prior bank audit fee studies. PLS path modelling allows for inter-correlations among audit fee determinants by establishing latent constructs and multiple relationship paths in one simultaneous PLS path model. Endogeneity concerns about auditor choice can also be addressed with PLS path modelling. With a sample of US bank holding companies for the period 2003-2009, we examine the associations among on-balance sheet financial risks, off-balance sheet risks and audit fees, and also address the pervasive client size effect and the effect of the self-selection of auditors. The results endorse the dominant effect of size on audit fees, both directly and indirectly via its impact on other audit fee determinants. After simultaneously allowing for the self-selection of auditors, we still find audit fee premiums for Big N auditors, the second most important factor in audit fee determination. On-balance-sheet financial risk measures in terms of capital adequacy, loan composition, earnings and asset quality performance have positive impacts on audit fees. After allowing for the positive influence of on-balance sheet financial risks and entity size on off-balance sheet risk, the off-balance sheet risk measure, SECRISK, is still positively associated with bank audit fees, both before and after the onset of the financial crisis. The consistency of these results with the prior literature provides supporting evidence for, and enhances confidence in, the application of this new research technique in archival accounting studies.
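To make the mechanics concrete, below is a bare-bones Python sketch of the iterative PLS path modelling algorithm (mode A outer estimation with the centroid inner weighting scheme). It is a pedagogical sketch with assumed inputs, not the estimator used in the study, and it omits bootstrapping, the self-selection correction and significance testing.

    import numpy as np

    def standardize(x):
        return (x - x.mean()) / x.std()

    def plspm_scores(blocks, inner, n_iter=100, tol=1e-6):
        # blocks[j]: (n, k_j) matrix of standardized indicators for latent
        # construct j; inner[j][k] == 1 if constructs j and k are connected
        # in the structural model (in either direction).
        scores = [standardize(B.sum(axis=1)) for B in blocks]
        for _ in range(n_iter):
            old = [s.copy() for s in scores]
            proxies = []
            for j in range(len(blocks)):
                z = np.zeros(len(scores[j]))
                for k in range(len(blocks)):
                    if inner[j][k]:
                        sign = np.sign(np.corrcoef(scores[j], scores[k])[0, 1])
                        z += sign * scores[k]     # centroid scheme
                proxies.append(standardize(z))
            # mode A outer step: weights proportional to corr(indicator, proxy)
            scores = [standardize(B @ (B.T @ z)) for B, z in zip(blocks, proxies)]
            if max(np.abs(s - o).max() for s, o in zip(scores, old)) < tol:
                break
        return scores

Path coefficients (e.g. size to off-balance sheet risk to audit fees) then follow from OLS regressions among the latent scores.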
Abstract:
Healthcare organizations in all OECD countries have continued to undergo change, and these changes have been found to have a negative effect on the work engagement of nursing staff. While the extent to which nursing staff dealt with these changes has been documented in the literature, little is known about how they utilized their personal resources to deal with the consequences of these changes. This study addresses this gap by integrating the Job Demands-Resources theoretical perspective with Positive Psychology, in particular psychological capital (PsyCap). PsyCap is operationalized as a source of personal resources. Data were collected from 401 nurses in Australia, and analyses were undertaken using Partial Least Squares modelling and moderation analysis. Two types of change to nursing work were identified. The first comprised changes to the nursing work environment, including increased administrative workload and amount of work, which resulted in more job demands and job resources. The second related to reductions in training and management support, which resulted in fewer job demands. Nurses with more job demands utilized more job resources to address these increasing demands. We found PsyCap to be a crucial source of personal resources that moderates the negative effects of job demands and role stress. PsyCap and job resources were both critical in enhancing the work engagement of nurses as they encountered changes to nursing work. These findings provide empirical support for a positive psychological perspective on understanding nursing engagement.
Abstract:
This paper presents new schemes for recursive estimation of the state transition probabilities of hidden Markov models (HMMs) via extended least squares (ELS) and recursive state prediction error (RSPE) methods. Local convergence analysis for the proposed RSPE algorithm is carried out using the ordinary differential equation (ODE) approach developed for the more familiar recursive output prediction error (RPE) methods. The presented scheme converges and is relatively well conditioned compared with the ...
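The exact ELS/RSPE recursions are beyond the scope of an abstract, but the flavour can be conveyed with a simplified stand-in: an HMM filter combined with a stochastic-approximation update of the transition matrix driven by the one-step prediction error. Everything below (known emission probabilities B, the step size, the projection step) is an assumption of the sketch, not the paper's algorithm.

    import numpy as np

    def project_rows_to_simplex(A):
        # Keep A a valid stochastic matrix after each update.
        A = np.clip(A, 1e-8, None)
        return A / A.sum(axis=1, keepdims=True)

    def recursive_transition_estimate(ys, B, A0, gamma=0.01):
        # ys: observation indices; B[state, obs]: known emission probs;
        # A0: initial guess for the transition matrix.
        A = A0.copy()
        pi = np.full(A.shape[0], 1.0 / A.shape[0])
        for y in ys:
            pred = A.T @ pi                  # one-step state prediction
            post = B[:, y] * pred
            post /= post.sum()               # filtered (a posteriori) state
            A += gamma * np.outer(pi, post - pred)   # prediction-error step
            A = project_rows_to_simplex(A)
            pi = post
        return A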
Abstract:
In this paper, new online adaptive hidden Markov model (HMM) state estimation schemes are developed, based on extended least squares (ELS) concepts and recursive prediction error (RPE) methods. The best of the new schemes exploit the idempotent nature of Markov chains and work with a least squares prediction error index, using a posteriori estimates, which is more suited to Markov models than the indices traditionally used in the identification of linear systems.
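The a posteriori (filtered) state estimate underlying such schemes is the standard recursive HMM filter; a minimal sketch, assuming a known transition matrix A and emission matrix B, is:

    import numpy as np

    def hmm_filter(ys, A, B, pi0):
        # At each step the a priori prediction A.T @ pi is corrected by
        # the observation likelihood B[:, y], giving the a posteriori
        # estimate on which a prediction error index can be built.
        pi = pi0.copy()
        history = []
        for y in ys:
            pi = B[:, y] * (A.T @ pi)
            pi /= pi.sum()
            history.append(pi.copy())
        return np.array(history)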
Abstract:
This work deals with estimators for predicting when parametric roll resonance is going to occur in surface vessels. The roll angle of the vessel is modeled as a second-order linear oscillatory system with unknown parameters. Several algorithms are used to estimate the parameters and eigenvalues of the system based on data gathered experimentally on a 1:45 scale model of a tanker. Based on the estimated eigenvalues, the system predicts whether or not parametric roll will occur. A prediction accuracy of 100% is achieved for regular waves, and up to 87.5% for irregular waves.
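A minimal sketch of the parameter and eigenvalue estimation step, assuming a damped second-order model phi'' = -a1*phi' - a0*phi and central-difference derivatives (a real implementation would filter the signals and could use a recursive estimator instead), is:

    import numpy as np

    def estimate_roll_model(phi, dt):
        # Least squares fit of phi'' = -a1 * phi' - a0 * phi to roll-angle
        # samples, then the eigenvalues of the equivalent state matrix.
        dphi = np.gradient(phi, dt)
        ddphi = np.gradient(dphi, dt)
        Phi = np.column_stack([-dphi, -phi])
        (a1, a0), *_ = np.linalg.lstsq(Phi, ddphi, rcond=None)
        eig = np.linalg.eigvals(np.array([[0.0, 1.0], [-a0, -a1]]))
        return a0, a1, eig

    # Hypothetical detection rule: flag imminent parametric roll when the
    # estimated eigenvalues indicate weak or negative effective damping,
    # e.g. alarm = eig.real.max() > -epsilon for some margin epsilon.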
Abstract:
The ambiguity acceptance test is an important quality control procedure in high precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, the method for determining their threshold is still not well understood. Currently, the threshold is determined with either the empirical approach or the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. Hence, the key to the threshold determination problem is how to determine the threshold efficiently and in a reasonable way. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach via a modelling procedure and an approximation procedure. The modelling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modelling error and approximation error are analysed with simulated data to avoid nuisance biases and the impact of an unrealistic stochastic model. The results indicate the proposed method can greatly simplify the FF-approach without introducing significant modelling error. The threshold function method makes fixed failure rate threshold determination feasible for real-time applications.
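A minimal sketch of the two procedures, with an invented rational model and made-up placeholder training data standing in for the offline fixed failure rate simulations, is:

    import numpy as np
    from scipy.optimize import curve_fit

    def threshold_function(p_s, a0, a1, b1):
        # Hypothetical rational model mapping the success rate p_s to the
        # FF-difference test threshold.
        return (a0 + a1 * p_s) / (1.0 + b1 * p_s)

    # Modelling procedure: fit the rational model to (success rate,
    # threshold) pairs obtained offline (placeholders here).
    p_s = np.linspace(0.5, 0.999, 50)
    thr = 1.0 / (1.0 + 5.0 * (1.0 - p_s))       # placeholder data only
    coef, _ = curve_fit(threshold_function, p_s, thr)

    # Approximation procedure: at run time, evaluate the fitted model at
    # the easy-to-calculate integer bootstrapping success rate instead of
    # running an expensive FF simulation.
    mu = threshold_function(0.98, *coef)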
Abstract:
Ambiguity validation, an important procedure in integer ambiguity resolution, tests the correctness of the fixed integer ambiguities of phase measurements before they are used in positioning computations. Most existing investigations of ambiguity validation focus on the test statistic. How to determine the threshold more reasonably is less well understood, although it is one of the most important topics in ambiguity validation. Currently, there are two threshold determination methods in the ambiguity validation procedure: the empirical approach and the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis. The fixed failure rate approach has a rigorous probability theory basis, but it employs a more complicated procedure. This paper focuses on how to determine the threshold easily and reasonably. Both the FF-ratio test and the FF-difference test are investigated in this research, and extensive simulation results show that the FF-difference test can achieve comparable or even better performance than the well-known FF-ratio test. Another benefit of adopting the FF-difference test is that its threshold can be expressed as a function of the integer least-squares (ILS) success rate with a specified failure rate tolerance. Thus, a new threshold determination method, named the threshold function for the FF-difference test, is proposed. The threshold function method preserves the fixed failure rate characteristic and is also easy to apply. The performance of the threshold function is validated with simulated data. The validation results show that with the threshold function method, the impact of the modelling error on the failure rate is less than 0.08%. Overall, the threshold function for the FF-difference test is a very promising threshold validation method, and it makes the FF-approach applicable to real-time GNSS positioning applications.
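A sketch of the resulting acceptance decision, assuming sorted squared-norm residuals of the candidate integer solutions and the fitted threshold function from the preceding entry, is:

    import numpy as np

    def ff_difference_test(residuals_sq, threshold):
        # Accept the best integer candidate when the gap between the
        # second-best and best quadratic-form residuals is large enough.
        residuals_sq = np.sort(np.asarray(residuals_sq))
        return (residuals_sq[1] - residuals_sq[0]) >= threshold

    # e.g. accept = ff_difference_test(r2, threshold_function(p_boot, *coef))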
Abstract:
Diagnosis of articular cartilage pathology in the early disease stages using current clinical diagnostic imaging modalities is challenging, particularly because there is often no visible change in the tissue surface and matrix content, such as proteoglycans (PG). In this study, we propose the use of near infrared (NIR) spectroscopy to spatially map PG content in articular cartilage. The relationship between NIR spectra and reference data (PG content) obtained from histology of normal and artificially induced PG-depleted cartilage samples was investigated using principal component (PC) and partial least squares (PLS) regression analyses. A significant correlation was obtained between the two data sets (R² = 91.40%, p < 0.0001). The resulting model was used to predict PG content from spectra acquired from a whole joint sample, which was then employed to spatially map this component of cartilage across the intact sample. We conclude that NIR spectroscopy is a feasible tool for evaluating cartilage constituents and mapping their distribution across the mammalian joint.
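A minimal chemometrics sketch of the PLS calibration step, with assumed array names and an assumed number of latent components, is:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    def fit_pg_model(spectra, pg, n_components=5):
        # spectra: (n_samples, n_wavelengths) NIR data; pg: histology-
        # derived proteoglycan reference values.
        pls = PLSRegression(n_components=n_components)
        pred = cross_val_predict(pls, spectra, pg, cv=10)
        r2 = np.corrcoef(pred.ravel(), pg)[0, 1] ** 2
        pls.fit(spectra, pg)
        return pls, r2

    # pls.predict(new_spectra) can then be evaluated point by point over
    # the joint surface to spatially map predicted PG content.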
Abstract:
Phenols are well-known noxious compounds that are often found in various water sources. A novel analytical method has been researched and developed based on the properties of hemin–graphene hybrid nanosheets (H–GNs). These nanosheets were synthesized using a wet-chemical method and have peroxidase-like activity. In the presence of H2O2, the nanosheets are efficient catalysts for the oxidation of the substrate, 4-aminoantipyrine (4-AP), and the phenols. The products of this oxidation reaction are colored quinone-imines. Importantly, these products enabled the differentiation of three common phenols (pyrocatechol, resorcin and hydroquinone) with the use of a novel spectroscopic method, which was developed for the simultaneous determination of the above three analytes. This spectroscopic method produced linear calibrations for the pyrocatechol (0.4–4.0 mg L⁻¹), resorcin (0.2–2.0 mg L⁻¹) and hydroquinone (0.8–8.0 mg L⁻¹) analytes. In addition, kinetic and spectral data obtained from the formation of the colored products were used to establish multivariate calibrations for the prediction of the three phenol analytes in various kinds of water; partial least squares (PLS), principal component regression (PCR) and artificial neural network (ANN) models were used, and the PLS model performed best.
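A sketch of how the PLS and PCR calibrations might be compared on such kinetic-spectral data (the ANN model is omitted; array names and component counts are assumptions) is:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    def compare_calibrations(X, Y, n_components=6):
        # X: (samples, wavelength x time) kinetic-spectral matrix;
        # Y: (samples, 3) concentrations of pyrocatechol, resorcin and
        # hydroquinone.
        pls = PLSRegression(n_components=n_components)
        pcr = make_pipeline(PCA(n_components=n_components), LinearRegression())
        return {
            "PLS": cross_val_score(pls, X, Y, cv=5, scoring="r2").mean(),
            "PCR": cross_val_score(pcr, X, Y, cv=5, scoring="r2").mean(),
        }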