159 results for SQUARES


Relevance: 10.00%

Abstract:

This study focuses on using the partial least squares (PLS) path modelling technique in archival auditing research by replicating the data and research questions from prior bank audit fee studies. PLS path modelling allows for inter-correlations among audit fee determinants by establishing latent constructs and multiple relationship paths in one simultaneous PLS path model. Endogeneity concerns about auditor choice can also be addressed with PLS path modelling. With a sample of US bank holding companies for the period 2003-2009, we examine the associations among on-balance sheet financial risks, off-balance sheet risks and audit fees, and also address the pervasive client size effect and the effect of the self-selection of auditors. The results endorse the dominant effect of size on audit fees, both directly and indirectly via its impact on other audit fee determinants. By simultaneously considering the self-selection of auditors, we still find audit fee premiums for Big N auditors, which is the second most important factor in audit fee determination. On-balance-sheet financial risk measures in terms of capital adequacy, loan composition, earnings and asset quality performance have positive impacts on audit fees. After allowing for the positive influence of on-balance sheet financial risks and entity size on off-balance sheet risk, the off-balance sheet risk measure, SECRISK, is still positively associated with bank audit fees, both before and after the onset of the financial crisis. The consistency of the results with prior literature provides supporting evidence and enhances confidence in the application of this new research technique in archival accounting studies.
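The core mechanics of PLS path modelling (forming latent construct scores as weighted composites of their indicators, then estimating structural paths between the scores) can be sketched on synthetic data. The constructs, indicator counts and coefficients below are hypothetical stand-ins, not the study's data or its full algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # hypothetical "banks"

# Two made-up constructs: CLIENT SIZE (3 indicators), AUDIT FEE (2 indicators).
size_latent = rng.normal(size=n)
X_size = np.column_stack([size_latent + 0.3 * rng.normal(size=n) for _ in range(3)])
fee_latent = 0.8 * size_latent + 0.6 * rng.normal(size=n)
X_fee = np.column_stack([fee_latent + 0.3 * rng.normal(size=n) for _ in range(2)])

def standardize(v):
    return (v - v.mean()) / v.std()

Z_size = np.apply_along_axis(standardize, 0, X_size)
Z_fee = np.apply_along_axis(standardize, 0, X_fee)

# Initial scores with equal outer weights, then one mode-A weight update:
# outer weights are correlations between indicators and the opposite score.
s_size = standardize(Z_size.sum(axis=1))
s_fee = standardize(Z_fee.sum(axis=1))
w_size = Z_size.T @ s_fee / n
w_fee = Z_fee.T @ s_size / n
s_size = standardize(Z_size @ w_size)
s_fee = standardize(Z_fee @ w_fee)

# Structural (inner) path: regression of the fee score on the size score.
path = float(s_size @ s_fee / n)
print(f"size -> fee path coefficient: {path:.2f}")
```

A full PLS path run would iterate the outer and inner steps to convergence; one pass is enough here to show where the latent scores and the path coefficient come from.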

Relevance: 10.00%

Abstract:

Healthcare organizations in all OECD countries have continued to undergo change, and these changes have been found to have a negative effect on the work engagement of nursing staff. While the extent to which nursing staff dealt with these changes has been documented in the literature, little is known about how they utilized their personal resources to deal with the consequences. This study addresses this gap by integrating the Job Demands-Resources theoretical perspective with Positive Psychology, in particular psychological capital (PsyCap), operationalized as a source of personal resources. Data were collected from 401 nurses in Australia and analysed using partial least squares modelling and moderation analysis. Two types of change to nursing work were identified. The first was an increase in changes to the nursing work environment, including a growing administrative workload and amount of work, which resulted in more job demands and more job resources. The second related to reductions in training and management support, which resulted in fewer job demands. Nurses facing more job demands utilized more job resources to address them. We found PsyCap to be a crucial source of personal resources, moderating the negative effects of job demands and role stress. PsyCap and job resources were both critical in enhancing the work engagement of nurses as they encountered changes to nursing work. These findings provide empirical support for a positive psychological perspective on understanding nursing engagement.
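Moderation analysis of the kind described (testing whether PsyCap buffers the effect of job demands) amounts to adding a product term to a regression. A minimal sketch on synthetic data, assuming standardized predictors and an invented buffering effect:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 401  # matches the study's sample size; the data here are synthetic

demands = rng.normal(size=n)
psycap = rng.normal(size=n)
# Invented outcome: demands raise stress, PsyCap buffers that effect
# (the negative interaction term encodes the buffering).
stress = (0.5 * demands - 0.3 * psycap
          - 0.4 * demands * psycap
          + rng.normal(scale=0.5, size=n))

# Moderation test: regress the outcome on both predictors plus their
# product; a non-zero product coefficient indicates moderation.
X = np.column_stack([np.ones(n), demands, psycap, demands * psycap])
beta, *_ = np.linalg.lstsq(X, stress, rcond=None)
print("interaction coefficient:", round(float(beta[3]), 2))
```

In practice the coefficient would be accompanied by a significance test and a simple-slopes plot; the point here is only where the "moderating effect" sits in the model.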

Relevance: 10.00%

Abstract:

Traditional towns of the Kathmandu Valley boast a fine provision of public space in their neighbourhoods. Historically, a hierarchy of public space was distributed over the entire town, with each neighbourhood centered around more or less spacious public squares. However, the rapid growth of these towns over the past decades has resulted in haphazard development of new urban areas with little provision of public space. Recent studies indicate that the loss of public space is a major consequence of the uncontrolled urban growth of the Kathmandu Valley and its new neighbourhoods. This paper reviews the current urban growth of the Kathmandu Valley and its impact on the development of public space in new neighbourhoods. A preliminary analysis of three new neighbourhoods as case studies shows that the formation and utilization of neighbourhood public space differ fundamentally from those found in the traditional city cores. The following key issues are identified: a) governance and regulation have struggled to control rapid urban growth; b) the current pattern of neighbourhood formation differs from that of traditional neighbourhoods owing to the changes accompanying rapid urban development; c) public spaces have been compromised, in both quantity and quality, in planned and unplanned new neighbourhoods alike; d) the changing provision of public space has contributed to its changing use and meaning; and e) the changing demographic composition, society and lifestyle have had a direct impact on the declining use of public space. Moreover, the management of public spaces remains a major challenge because of their changing nature and changing governance. The current transformation of public space does not appear to be conducive to neighbourhood social life and has adversely changed the social environment of the new neighbourhoods.

Relevance: 10.00%

Abstract:

This review article discusses form-based planning and analyses in detail the following books: Stephen Marshall (2012) Urban Coding and Planning (Routledge, New York, USA, 272 pp. ISBN 1135689202); Emily Talen (2012) City Rules: How Regulations Affect Urban Form (Island Press, Washington DC, USA, 254 pp. ISBN 9781597266925); Richard Tomlinson (2012) Australia’s Unintended Cities: The Impact of Housing on Urban Development (CSIRO Publishing, Collingwood, Australia, 194 pp. ISBN 9780643103771). The history of the city has been written and rewritten many times: the seminal works of Benevolo (1980) and Mumford (1989) reconstruct how settlements, particularly their urban form, have changed over centuries. Rowe and Koetter (1978), Kostof (1991, 1992), Krier (2003), and Rossi and Eisenman (1982) address instead the components that shape the urban environment: the architect can aggregate and manipulate squares, streets, parks and public buildings to control urban design. Generally these studies aim to reveal the secret of the traditional city in contraposition to the contemporary townscape characterized by planning and zoning, which are generally regarded as problematic and sterile (Woodward, 2013). The ‘secret rules’ that have shaped our cities have a bearing on the relationship of spaces, mixed uses, public environments and walkability (Walters, 2011)...

Relevance: 10.00%

Abstract:

This paper presents new schemes for recursive estimation of the state transition probabilities for hidden Markov models (HMM's) via extended least squares (ELS) and recursive state prediction error (RSPE) methods. Local convergence analysis for the proposed RSPE algorithm is shown using the ordinary differential equation (ODE) approach developed for the more familiar recursive output prediction error (RPE) methods. The presented scheme converges and is relatively well conditioned compared with the ...
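The recursive machinery underlying ELS and prediction-error schemes is the classic recursive least squares update. A generic sketch on a toy linear model (this is the standard RLS recursion, not the paper's HMM-specific algorithm):

```python
import numpy as np

def rls_update(theta, P, x, y, lam=1.0):
    """One recursive least-squares step: update the parameter estimate
    theta and covariance P with regressor x and observation y;
    lam is an optional forgetting factor."""
    x = x.reshape(-1, 1)
    K = P @ x / (lam + float(x.T @ P @ x))          # gain vector
    err = y - float(x.T @ theta.reshape(-1, 1))     # prediction error
    theta = theta + K.flatten() * err
    P = (P - K @ x.T @ P) / lam
    return theta, P

# Identify a 2-parameter linear model y = 1.5*u1 - 0.7*u2 + noise.
rng = np.random.default_rng(2)
true_theta = np.array([1.5, -0.7])
theta = np.zeros(2)
P = 1e3 * np.eye(2)                                 # large initial uncertainty
for _ in range(500):
    x = rng.normal(size=2)
    y = float(x @ true_theta) + 0.1 * rng.normal()
    theta, P = rls_update(theta, P, x, y)
print("estimated parameters:", np.round(theta, 2))
```

The HMM schemes in the two abstracts replace the linear regressor with quantities built from the Markov-chain structure, but the update-gain-and-covariance pattern is the same.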

Relevance: 10.00%

Abstract:

In this paper new online adaptive hidden Markov model (HMM) state estimation schemes are developed, based on extended least squares (ELS) concepts and recursive prediction error (RPE) methods. The best of the new schemes exploit the idempotent nature of Markov chains and work with a least squares prediction error index, using a posteriori estimates, more suited to Markov models than the indices traditionally used in the identification of linear systems.

Relevance: 10.00%

Abstract:

This work deals with estimators for predicting when parametric roll resonance is going to occur in surface vessels. The roll angle of the vessel is modeled as a second-order linear oscillatory system with unknown parameters. Several algorithms are used to estimate the parameters and eigenvalues of the system based on data gathered experimentally on a 1:45 scale model of a tanker. Based on the estimated eigenvalues, the system predicts whether or not parametric roll occurred. A prediction accuracy of 100% is achieved for regular waves, and up to 87.5% for irregular waves.
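One common way to build such an estimator is to fit sampled roll-angle data with a second-order autoregression and read the system eigenvalues off its characteristic polynomial. A sketch on simulated data (the natural frequency, damping and sampling values are illustrative, not the tanker model's):

```python
import numpy as np

# Simulate a damped second-order oscillation
# phi'' + 2*zeta*wn*phi' + wn^2*phi = 0 as a stand-in for roll data.
wn, zeta, dt = 2.0, 0.05, 0.05
t = np.arange(0, 40, dt)
wd = wn * np.sqrt(1 - zeta**2)                      # damped frequency
phi = np.exp(-zeta * wn * t) * np.cos(wd * t)
phi += 0.001 * np.random.default_rng(3).normal(size=t.size)  # sensor noise

# Sampled second-order dynamics obey phi_k = a1*phi_{k-1} + a2*phi_{k-2}:
# estimate a1, a2 by least squares, then recover the eigenvalues.
X = np.column_stack([phi[1:-1], phi[:-2]])
a, *_ = np.linalg.lstsq(X, phi[2:], rcond=None)
z = np.roots([1.0, -a[0], -a[1]])                   # discrete-time eigenvalues
s = np.log(z) / dt                                  # continuous-time eigenvalues
wn_hat = float(np.abs(s[0]))
zeta_hat = float(-s[0].real / wn_hat)
print(f"estimated wn={wn_hat:.2f}, zeta={zeta_hat:.3f}")
```

A parametric-roll detector would then flag eigenvalues whose frequency sits near twice the wave-encounter frequency with insufficient damping; that decision logic is omitted here.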

Relevance: 10.00%

Abstract:

The ambiguity acceptance test is an important quality control procedure in high precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, their threshold determination is still not well understood. Currently, the threshold is determined with either the empirical approach or the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. Hence, the key to the threshold determination problem is how to determine the threshold efficiently and in a reasonable way. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach through a modelling procedure and an approximation procedure. The modelling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modelling error and approximation error are analysed with simulation data to avoid nuisance biases and the impact of an unrealistic stochastic model. The results indicate that the proposed method can greatly simplify the FF-approach without introducing significant modelling error, making the fixed failure rate threshold determination method feasible for real-time applications.
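The integer bootstrapping success rate used in the approximation step has a well-known closed form, P = prod_i (2*Phi(1/(2*sigma_i)) - 1), where the sigma_i are the conditional standard deviations of the (decorrelated) ambiguities and Phi is the standard normal CDF. A minimal stdlib implementation (the example sigma values are invented):

```python
import math

def ib_success_rate(cond_stds):
    """Integer bootstrapping success rate from the conditional standard
    deviations of the decorrelated ambiguities:
        P = prod_i (2*Phi(1/(2*sigma_i)) - 1),
    with Phi the standard normal CDF, computed via math.erf."""
    p = 1.0
    for s in cond_stds:
        phi = 0.5 * (1.0 + math.erf(1.0 / (2.0 * s) / math.sqrt(2.0)))
        p *= 2.0 * phi - 1.0
    return p

# Smaller conditional std-devs give a higher success rate.
print(round(ib_success_rate([0.05, 0.08, 0.10]), 4))
print(round(ib_success_rate([0.3, 0.4, 0.5]), 4))
```

This easy product is what makes the IB rate attractive as a stand-in for the ILS success rate, which has no comparable closed form.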

Relevance: 10.00%

Abstract:

Ambiguity validation, an important procedure in integer ambiguity resolution, tests the correctness of the fixed integer ambiguities of phase measurements before they are used in the positioning computation. Most existing investigations of ambiguity validation focus on the test statistic; how to determine the threshold reasonably is less well understood, although it is one of the most important topics in ambiguity validation. Currently, there are two threshold determination methods in the ambiguity validation procedure: the empirical approach and the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis; the fixed failure rate approach has a rigorous probability-theory basis but employs a more complicated procedure. This paper focuses on how to determine the threshold easily and reasonably. Both the FF-ratio test and the FF-difference test are investigated, and extensive simulation results show that the FF-difference test can achieve comparable or even better performance than the well-known FF-ratio test. Another benefit of adopting the FF-difference test is that its threshold can be expressed as a function of the integer least-squares (ILS) success rate with a specified failure rate tolerance. Thus, a new threshold determination method, named the threshold function for the FF-difference test, is proposed. The threshold function method preserves the fixed failure rate characteristic and is also easy to apply. Its performance is validated with simulated data; the validation results show that with the threshold function method, the impact of the modelling error on the failure rate is less than 0.08%. Overall, the threshold function for the FF-difference test is a very promising threshold determination method and makes the FF-approach applicable to real-time GNSS positioning applications.

Relevance: 10.00%

Abstract:

Diagnosis of articular cartilage pathology in the early disease stages using current clinical diagnostic imaging modalities is challenging, particularly because there is often no visible change in the tissue surface or in matrix content, such as proteoglycans (PG). In this study, we propose the use of near infrared (NIR) spectroscopy to spatially map PG content in articular cartilage. The relationship between NIR spectra and reference data (PG content) obtained from histology of normal and artificially PG-depleted cartilage samples was investigated using principal component (PC) and partial least squares (PLS) regression analyses. Significant correlation was obtained between the two data sets (R2 = 91.40%, p < 0.0001). The resulting calibration was used to predict PG content from spectra acquired from a whole joint sample; this was then employed to spatially map this component of cartilage across the intact sample. We conclude that NIR spectroscopy is a feasible tool for evaluating cartilage constituents and mapping their distribution across the mammalian joint.
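The PLS regression step can be illustrated with a minimal NIPALS-style PLS1 fit on synthetic "spectra". This is a generic sketch of the technique, not the study's calibration: the sample count, band shape and noise level are invented:

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """Minimal NIPALS PLS1: regression coefficients for a single
    response y against (centered) predictor matrix X."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    Xc, yc = X.copy(), y.copy()
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc                    # weight vector toward y
        w /= np.linalg.norm(w)
        t = Xc @ w                       # scores
        p = Xc.T @ t / (t @ t)           # X loadings
        c = yc @ t / (t @ t)             # y loading
        Xc = Xc - np.outer(t, p)         # deflate
        yc = yc - c * t
        W.append(w); P.append(p); q.append(c)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)   # coefficients for centered X

rng = np.random.default_rng(4)
# Synthetic spectra: 60 samples x 50 wavelengths driven by one latent
# concentration (a stand-in for PG content) via one absorption band.
conc = rng.uniform(0, 1, 60)
profile = np.exp(-0.5 * ((np.arange(50) - 25) / 5.0) ** 2)
X = np.outer(conc, profile) + 0.01 * rng.normal(size=(60, 50))
y = conc

B = pls1_fit(X, y, n_comp=2)
y_hat = (X - X.mean(axis=0)) @ B + y.mean()
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"calibration R^2 = {r2:.3f}")
```

A real calibration would of course be validated on held-out spectra rather than reported as a fit statistic alone.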

Relevance: 10.00%

Abstract:

Phenols are well-known noxious compounds that are often found in various water sources. A novel analytical method was researched and developed based on the properties of hemin–graphene hybrid nanosheets (H–GNs). These nanosheets were synthesized using a wet-chemical method and have peroxidase-like activity. In the presence of H2O2, the nanosheets efficiently catalyze the oxidation of the substrate, 4-aminoantipyrine (4-AP), and the phenols; the products of this oxidation reaction are colored quinone-imines. Importantly, these products enabled the differentiation of three common phenols – pyrocatechol, resorcin and hydroquinone – with a novel spectroscopic method developed for the simultaneous determination of the three analytes. This method produced linear calibrations for the pyrocatechol (0.4–4.0 mg L−1), resorcin (0.2–2.0 mg L−1) and hydroquinone (0.8–8.0 mg L−1) analytes. In addition, kinetic and spectral data obtained from the formation of the colored products were used to establish multivariate calibrations for the prediction of the three phenol analytes in various kinds of water; partial least squares (PLS), principal component regression (PCR) and artificial neural network (ANN) models were compared, and the PLS model performed best.
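The simultaneous-determination idea rests on the additivity of the component signals. Stripped of the kinetic dimension and the PLS/PCR/ANN machinery, it reduces to solving a linear system of pure-component responses, as in this sketch (the band shapes, wavelength grid and concentrations are invented):

```python
import numpy as np

rng = np.random.default_rng(5)
x = np.arange(40)  # hypothetical wavelength index

def band(mu, sig):
    """Gaussian absorption band, purely illustrative."""
    return np.exp(-0.5 * ((x - mu) / sig) ** 2)

# Made-up pure-component spectra for the three phenols.
S = np.column_stack([band(10, 4),   # pyrocatechol
                     band(20, 4),   # resorcin(ol)
                     band(30, 4)])  # hydroquinone

true_c = np.array([2.0, 1.0, 4.0])  # mg/L, within the stated linear ranges
mixture = S @ true_c + 0.01 * rng.normal(size=x.size)

# Classical least-squares calibration: additive signals mean the
# mixture spectrum is S @ c, so recover c by solving S c = mixture.
c_hat, *_ = np.linalg.lstsq(S, mixture, rcond=None)
print("estimated concentrations:", np.round(c_hat, 2))
```

PLS and PCR earn their keep when the pure-component spectra are unknown or the response is nonlinear; this direct solution only shows why overlapping bands can still be resolved simultaneously.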

Relevance: 10.00%

Abstract:

Interactions between the anti-carcinogens bendamustine (BDM) and dexamethasone (DXM) with bovine serum albumin (BSA) were investigated with the use of fluorescence and UV–vis spectroscopies under pseudo-physiological conditions (Tris–HCl buffer, pH 7.4). A static mechanism was responsible for the fluorescence quenching during the interactions; the binding formation constant of the BSA–BDM complex and the binding number were 5.14 × 10^5 L mol−1 and 1.0, respectively. Spectroscopic data on the formation of the BDM–BSA complex were interpreted with the use of multivariate curve resolution – alternating least squares (MCR–ALS), which supported the complex formation. BSA samples treated with site markers (warfarin – site I; ibuprofen – site II) were reacted separately with BDM and DXM; while both anti-carcinogens bound to site I, the binding constants suggested that DXM formed the more stable complex. Relative concentration profiles and the fluorescence spectra associated with BDM, DXM and BSA were recovered simultaneously from the full fluorescence excitation–emission data with the use of the parallel factor analysis (PARAFAC) method. The results confirmed that on addition of DXM to the BDM–BSA complex, the BDM was replaced, the DXM–BSA complex formed and free BDM was released. This finding may have consequences for the transport of these drugs during anti-cancer treatment.