844 results for partial least square


Relevance:

80.00%

Publisher:

Abstract:

Photochemistry has made significant contributions to our understanding of many important natural processes as well as to scientific discoveries in the man-made world. The measurements from such studies are often complex and may require advanced data interpretation with the use of multivariate or chemometrics methods. In general, such methods have been applied successfully for data display, classification, multivariate curve resolution and prediction in analytical chemistry, environmental chemistry, engineering, medical research and industry. However, in photochemistry, applications of such multivariate approaches have been less frequent, although a variety of methods have been used, especially in spectroscopic photochemical applications. The methods include Principal Component Analysis (PCA; data display), Partial Least Squares (PLS; prediction), Artificial Neural Networks (ANN; prediction) and several models for multivariate curve resolution related to Parallel Factor Analysis (PARAFAC; decomposition of complex responses). Applications of such methods are discussed in this overview, and typical examples include photodegradation of herbicides, prediction of antibiotics in human fluids (fluorescence spectroscopy), non-destructive in-line and on-line monitoring (near infrared spectroscopy) and fast time-resolution of spectroscopic signals from photochemical reactions. It is also quite clear from the literature that the scope of spectroscopic photochemistry has been enhanced by the application of chemometrics. To highlight and encourage further applications of chemometrics in photochemistry, several additional chemometrics approaches are discussed using data collected by the authors. The use of a PCA biplot is illustrated with an analysis of a matrix containing data on the performance of photocatalysts developed for water splitting and hydrogen production. In addition, the applications of Multi-Criteria Decision Making (MCDM) ranking methods and Fuzzy Clustering are demonstrated with an analysis of a water quality data matrix. Other examples include the application of simultaneous kinetic spectroscopic methods for prediction of pesticides, and the use of a response fingerprinting approach for classification of medicinal preparations. In general, the overview endeavours to emphasise the advantages of chemometric interpretation of multivariate photochemical data; an Appendix of references and summaries of the common and less usual chemometrics methods noted in this work is provided.
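To make the PCA biplot idea above concrete, here is a minimal Python sketch (scikit-learn and matplotlib) applied to a hypothetical photocatalyst performance matrix; the criteria names and data are illustrative assumptions, not the authors' dataset.

```python
# Minimal PCA biplot sketch on a hypothetical photocatalyst performance matrix.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Rows: photocatalysts; columns: illustrative performance criteria.
criteria = ["H2_yield", "band_gap", "surface_area", "stability"]
X = rng.normal(size=(12, len(criteria)))

Xs = StandardScaler().fit_transform(X)     # column autoscaling, as usual in chemometrics
pca = PCA(n_components=2).fit(Xs)
scores = pca.transform(Xs)                 # object (catalyst) coordinates
loadings = pca.components_.T               # variable (criterion) directions

fig, ax = plt.subplots()
ax.scatter(scores[:, 0], scores[:, 1], label="catalysts")
for (lx, ly), name in zip(loadings, criteria):
    ax.arrow(0, 0, lx * 2, ly * 2, head_width=0.05)   # loading vectors (scaled for display)
    ax.annotate(name, (lx * 2, ly * 2))
ax.set_xlabel(f"PC1 ({pca.explained_variance_ratio_[0]:.0%})")
ax.set_ylabel(f"PC2 ({pca.explained_variance_ratio_[1]:.0%})")
plt.show()
```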

Relevance:

80.00%

Publisher:

Abstract:

This paper focuses on information sharing with key suppliers and seeks to explore the factors that might influence its extent and depth. We also investigate how information sharing affects a company's performance with regard to resource usage, output, and flexibility. Drawing on transaction cost and contingency theories, several factors, namely environmental uncertainty, demand uncertainty, dependency, and product life cycle stage, are proposed to explain the level of information shared with key suppliers. We develop a model in which information sharing mediates the relationship between the (contingent) factors and company performance. A mail survey was used to collect data from Finnish and Swedish companies, and Partial Least Squares analysis was performed separately for each country (n=119, n=102). There was consistent evidence that environmental uncertainty, demand uncertainty and supplier/buyer dependency had explanatory power, whereas no significance was found for the relationship between product life cycle stage and information sharing. The results also confirm previous studies by providing support for a positive relationship between information sharing and performance, with output performance found to be the most strongly related.
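Purely as an illustration of the structural part of such a PLS analysis, the Python sketch below forms unit-weighted composites from hypothetical survey indicators and estimates the mediation paths by ordinary least squares; genuine PLS-SEM re-estimates the indicator weights iteratively, so this is only a crude stand-in, and all variable names and data are assumptions.

```python
# Rough stand-in for the structural part of a PLS path model:
# unit-weighted composites plus OLS for the mediation paths.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
# Hypothetical survey indicators (three per construct), n = 119 firms.
df = pd.DataFrame(rng.normal(size=(119, 9)),
                  columns=["eu1", "eu2", "eu3",    # environmental uncertainty
                           "dep1", "dep2", "dep3", # supplier/buyer dependency
                           "is1", "is2", "is3"])   # information sharing

def composite(frame, cols):
    """Unit-weighted composite: mean of standardized indicators."""
    z = (frame[cols] - frame[cols].mean()) / frame[cols].std()
    return z.mean(axis=1)

env_unc = composite(df, ["eu1", "eu2", "eu3"])
depend = composite(df, ["dep1", "dep2", "dep3"])
sharing = composite(df, ["is1", "is2", "is3"])
# Hypothetical output-performance score, loosely driven by sharing.
output = 0.5 * sharing + rng.normal(scale=0.8, size=len(df))

# Antecedents -> information sharing
m1 = sm.OLS(sharing, sm.add_constant(
        pd.DataFrame({"env_unc": env_unc, "depend": depend}))).fit()
# Information sharing -> output performance
m2 = sm.OLS(output, sm.add_constant(sharing.to_frame("sharing"))).fit()
print(m1.params.round(2))
print("sharing -> output R^2:", round(m2.rsquared, 2))
```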

Relevance:

80.00%

Publisher:

Abstract:

Computer resource allocation represents a significant challenge, particularly for multiprocessor systems, in which shared computing resources must be allocated among co-running processes and threads. While efficient resource allocation results in a highly efficient and stable overall multiprocessor system and good individual thread performance, poor resource allocation causes significant performance bottlenecks even on systems with abundant computing resources. This thesis proposes a cache-aware adaptive closed-loop scheduling framework as an efficient resource allocation strategy for this highly dynamic resource management problem, which requires instant estimation of highly uncertain and unpredictable resource patterns. Many approaches to this problem have been developed, but neither the dynamic nature nor the time-varying and uncertain characteristics of the resource allocation problem are well considered. These approaches employ either static or dynamic optimization methods, or advanced scheduling algorithms such as the Proportional Fair (PFair) scheduling algorithm. Some of the approaches that do consider the dynamic nature of multiprocessor systems apply only a basic closed-loop system and hence fail to take the time-varying behaviour and uncertainty of the system into account. Further research into multiprocessor resource allocation is therefore required. Our closed-loop cache-aware adaptive scheduling framework takes resource availability and resource usage patterns into account by measuring time-varying factors such as cache miss counts, stalls and instruction counts. More specifically, the cache usage pattern of a thread is identified using the QR recursive least squares (RLS) algorithm and cache miss count time-series statistics. For the identified cache resource dynamics, the framework enforces instruction fairness for the threads. Fairness, in the context of this research, is defined as resource allocation equity, which reduces co-runner thread dependence in a shared resource environment. In this way, instruction count degradation due to shared cache resource conflicts is overcome. The framework contributes to the research field in two major and three minor respects. The two major contributions lead to the cache-aware scheduling system. The first major contribution is the development of the execution fairness algorithm, which mitigates the co-runner cache impact on thread performance. The second is the development of the relevant mathematical models, such as the thread execution pattern and cache access pattern models, which formulate the execution fairness algorithm in terms of mathematical quantities. Following the development of the cache-aware scheduling system, our adaptive self-tuning control framework is constructed to add an adaptive closed-loop aspect to the cache-aware scheduling system. This control framework consists of two main components: the parameter estimator and the controller design module. The first minor contribution is the development of the parameter estimator: the QR recursive least squares (RLS) algorithm is applied within our closed-loop cache-aware adaptive scheduling framework to estimate the highly uncertain and time-varying cache resource patterns of threads.
The second minor contribution is the design of the controller design module: an algebraic controller design algorithm, pole placement, is utilized to design the relevant controller, which is able to provide the desired time-varying control action. The adaptive self-tuning control framework and the cache-aware scheduling system together constitute our final framework, the closed-loop cache-aware adaptive scheduling framework. The third minor contribution is the validation of the framework's effectiveness in overcoming co-runner cache dependency. Time-series statistical counters were developed for the M-Sim multi-core simulator, and the theoretical findings and mathematical formulations were implemented as MATLAB m-file code. In this way, the overall framework was tested and the experimental outcomes analysed. Based on these outcomes, it is concluded that our closed-loop cache-aware adaptive scheduling framework successfully drives the co-runner cache-dependent thread instruction count towards the co-runner-independent instruction count, with an error margin of up to 25% when the cache is highly utilized. In addition, the thread cache access pattern is estimated with 75% accuracy.
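As a hedged illustration of the recursive estimation step described above (the thesis uses a QR-factorised RLS; the plain covariance-form, forgetting-factor RLS below is a simplified sketch on an assumed AR(2) model of a thread's cache-miss counts):

```python
# Simplified recursive least squares (RLS) with a forgetting factor.
# The thesis uses a QR-RLS variant; this covariance-form sketch only
# illustrates the recursive estimation idea on an assumed AR(2) model
# of a thread's cache-miss time series.
import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    """One RLS step: regressor phi, new measurement y, forgetting factor lam."""
    phi = phi.reshape(-1, 1)
    k = P @ phi / (lam + phi.T @ P @ phi)        # gain vector
    err = y - float(phi.T @ theta)               # one-step prediction error
    theta = theta + k * err                      # parameter update
    P = (P - k @ phi.T @ P) / lam                # covariance update
    return theta, P

rng = np.random.default_rng(2)
true_a = np.array([0.6, 0.3])                    # assumed AR(2) coefficients
miss = np.zeros(500)
for t in range(2, 500):                          # synthetic cache-miss counts
    miss[t] = true_a @ miss[t-2:t][::-1] + rng.normal(scale=0.1)

theta = np.zeros((2, 1))
P = np.eye(2) * 1e3
for t in range(2, 500):
    phi = miss[t-2:t][::-1]                      # [miss[t-1], miss[t-2]]
    theta, P = rls_update(theta, P, phi, miss[t])
print("estimated AR coefficients:", theta.ravel())
```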

Relevance:

80.00%

Publisher:

Abstract:

Differential pulse stripping voltammetry (DPSV) was applied to the determination of three herbicides: ametryn, cyanatryn, and dimethametryn. Their voltammograms were found to overlap strongly, making it difficult to determine these compounds individually in their mixtures. With the aid of the chemometrics methods classical least squares (CLS), principal component regression (PCR) and partial least squares (PLS), voltammogram resolution and quantitative analysis of synthetic mixtures of the three compounds were successfully performed. The proposed method was also applied to the analysis of some real samples, with satisfactory results.
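A hedged sketch of this kind of multivariate calibration, in Python with scikit-learn: synthetic, strongly overlapping peaks stand in for the DPSV signals of the three herbicides, and a PLS model predicts the individual concentrations. Peak positions, widths and concentration ranges are illustrative assumptions, not the paper's data.

```python
# PLS calibration for three analytes with strongly overlapping voltammograms
# (synthetic Gaussian peaks standing in for DPSV signals).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
potential = np.linspace(-1.2, -0.6, 200)          # assumed potential axis (V)
centers = [-1.00, -0.95, -0.90]                   # heavily overlapping peak potentials

def voltammogram(conc):
    """Sum of Gaussian peaks, one per analyte, plus noise."""
    signal = sum(c * np.exp(-(potential - mu) ** 2 / (2 * 0.03 ** 2))
                 for c, mu in zip(conc, centers))
    return signal + rng.normal(scale=0.01, size=potential.size)

C = rng.uniform(0.1, 1.0, size=(60, 3))           # synthetic concentration matrix
X = np.array([voltammogram(c) for c in C])        # synthetic mixture signals

X_tr, X_te, C_tr, C_te = train_test_split(X, C, test_size=0.25, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, C_tr)
print("R^2 on held-out mixtures:", pls.score(X_te, C_te))
```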

Relevance:

80.00%

Publisher:

Abstract:

This research is one of several ongoing studies conducted within the IT Professional Services (ITPS) research programme at Queensland University of Technology (QUT). In 2003, ITPS introduced the IS-Impact model, a model for measuring information systems success from the viewpoint of multiple stakeholders. The model, along with its instrument, is robust, simple, yet generalisable, and yields results that are comparable across time, stakeholders, different systems and system contexts. The IS-Impact model is defined as “a measure at a point in time, of the stream of net benefits from the Information System (IS), to date and anticipated, as perceived by all key-user-groups”. The model comprises four dimensions: ‘Individual Impact’, ‘Organizational Impact’, ‘Information Quality’ and ‘System Quality’. The two Impact dimensions measure the up-to-date impact of the evaluated system, while the remaining two Quality dimensions act as proxies for probable future impacts (Gable, Sedera & Chan, 2008). To fulfil the goal of ITPS, “to develop the most widely employed model”, this research re-validates and extends the IS-Impact model in a new context. This method/context-extension research aims to test the generalisability of the model by addressing known limitations of the model. One of these limitations relates to the extent of the model's external validity. In order to gain wide acceptance, a model should be consistent and work well in different contexts. The IS-Impact model, however, was only validated in the Australian context, with packaged software chosen as the IS under study. Thus, this study is concerned with whether the model can be applied in a different context. Aiming for a robust and standardised measurement model that can be used across different contexts, this research re-validates and extends the IS-Impact model and its instrument to public sector organisations in Malaysia. The overarching research question (managerial question) of this research is “How can public sector organisations in Malaysia measure the impact of information systems systematically and effectively?” With two main objectives, the managerial question is broken down into two specific research questions. The first research question addresses the applicability (relevance) of the dimensions and measures of the IS-Impact model in the Malaysian context, as well as the completeness of the model in the new context. Initially, this research assumes that the dimensions and measures of the IS-Impact model are sufficient for the new context. However, some IS researchers suggest that the selection of measures needs to be done purposely for different contextual settings (DeLone & McLean, 1992; Rai, Lang & Welker, 2002). Thus, the first research question is as follows: “Is the IS-Impact model complete for measuring the impact of IS in Malaysian public sector organisations?” [RQ1]. The IS-Impact model is a multidimensional model that consists of four dimensions or constructs. Each dimension is represented by formative measures or indicators. Formative measures are known as composite variables because these measures make up, or form, the construct, or, in this case, the dimension in the IS-Impact model. These formative measures define different aspects of the dimension; thus, a measurement model of this kind needs to be tested not just on the structural relationships between the constructs but also on the validity of each measure.
In a previous study, the IS-Impact model was validated using formative validation techniques, as proposed in the literature (i.e., Diamantopoulos and Winklhofer, 2001; Diamantopoulos and Siguaw, 2006; Petter, Straub and Rai, 2007). However, there is potential for improving the validation testing of the model by adding more criterion or dependent variables, which includes identifying a consequence of the IS-Impact construct for the purpose of validation. Moreover, a different approach is employed in this research, whereby the validity of the model is tested using the Partial Least Squares (PLS) method, a component-based structural equation modelling (SEM) technique. Thus, the second research question addresses the construct validation of the IS-Impact model: “Is the IS-Impact model valid as a multidimensional formative construct?” [RQ2]. This study employs two rounds of surveys, each with a different and specific aim. The first is qualitative and exploratory, aiming to investigate the applicability and sufficiency of the IS-Impact dimensions and measures in the new context. This survey was conducted in a state government in Malaysia. A total of 77 valid responses were received, yielding 278 impact statements. The results from the qualitative analysis demonstrate the applicability of most of the IS-Impact measures. The analysis also reveals a significant new measure that emerged from the context; this new measure was added as one of the System Quality measures. The second survey is a quantitative survey that aims to operationalise the measures identified from the qualitative analysis and rigorously validate the model. This survey was conducted in four state governments (including the state government involved in the first survey). A total of 254 valid responses were used in the data analysis. Data were analysed using structural equation modelling techniques, following the guidelines for formative construct validation, to test the validity and reliability of the constructs in the model. This study is the first to extend the complete IS-Impact model to a new context that differs in terms of nationality, language and the type of information system (IS). The main contribution of this research is to present a comprehensive, up-to-date IS-Impact model, which has been validated in the new context. The study has accomplished its purpose of testing the generalisability of the IS-Impact model and continuing IS evaluation research by extending it to the Malaysian context. A further contribution is a validated Malaysian-language IS-Impact measurement instrument. It is hoped that the validated Malaysian IS-Impact instrument will encourage related IS research in Malaysia, and that the demonstrated model validity and generalisability will encourage a cumulative tradition of research previously not possible.
The study entailed several methodological improvements on prior work, including: (1) new criterion measures for the overall IS-Impact construct, employed in ‘identification through measurement relations’; (2) a stronger, multi-item ‘Satisfaction’ construct, employed in ‘identification through structural relations’; (3) an alternative version of the main survey instrument in which items are randomized (rather than blocked) for comparison with the main survey data, in view of possible common method variance (no significant differences between the two survey instruments were observed); (4) demonstration of a validation process for formative indexes of a multidimensional, second-order construct (existing examples mostly involve unidimensional constructs); (5) testing for the presence of suppressor effects that influence the significance of some measures and dimensions in the model; and (6) demonstration of the effect of an imbalanced number of measures within a construct on the contribution of each dimension in a multidimensional model.
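For readers unfamiliar with formative-indicator validation, the Python sketch below illustrates two checks commonly recommended in the guidelines cited above, collinearity via variance inflation factors and bootstrapped indicator weights, on hypothetical indicator data; the construct and column names are assumptions, not the study's survey items.

```python
# Two standard checks for formative indicators: collinearity (VIF) and
# bootstrapped indicator weights, on hypothetical data.
import numpy as np
import pandas as pd
from statsmodels.api import OLS, add_constant
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(4)
n = 254
X = pd.DataFrame(rng.normal(size=(n, 4)),
                 columns=["sq1", "sq2", "sq3", "sq4"])   # e.g. System Quality indicators
y = X.values @ np.array([0.4, 0.3, 0.2, 0.1]) + rng.normal(scale=0.5, size=n)

# 1) Collinearity check: VIFs well below ~3-5 suggest the formative
#    indicators are not redundant.
exog = add_constant(X).values
vifs = [variance_inflation_factor(exog, i + 1) for i in range(X.shape[1])]
print(dict(zip(X.columns, np.round(vifs, 2))))

# 2) Bootstrapped indicator weights (regression of a criterion on indicators).
boot = []
for _ in range(500):
    idx = rng.integers(0, n, n)
    boot.append(OLS(y[idx], add_constant(X.values[idx])).fit().params[1:])
lo, hi = np.percentile(np.array(boot), [2.5, 97.5], axis=0)
print(pd.DataFrame({"weight_2.5%": lo, "weight_97.5%": hi}, index=X.columns))
```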

Relevance:

80.00%

Publisher:

Abstract:

This paper presents a solution to the problem of estimating the monotonous tendency of a slowly varying oscillating system. A recursive Prony Analysis (PA) scheme is developed that obtains a dynamic model whose parameters are identified using the forgetting factor recursive least squares (FFRLS) method. A box threshold principle is proposed to separate the dominant components, resulting in an accurate estimation of the trend of oscillating systems. The performance of the proposed PA is evaluated using real-time measurements in the presence of random noise and vibration effects. Moreover, the proposed method is used to estimate the monotonous tendency of deck displacement to assist in the safe landing of an unmanned aerial vehicle (UAV). It is shown that the proposed method can estimate the instantaneous mean deck displacement satisfactorily, making it well suited for integration into ship-UAV approach and landing guidance systems.
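A batch (non-recursive) sketch of the Prony idea in Python is shown below: a linear-prediction model is fitted by least squares to a noisy oscillating signal, and its characteristic roots give the damped modes, the dominant one approximating the slow trend. The signal, model order and mode-selection rule are illustrative assumptions; the paper's recursive FFRLS scheme and box threshold principle are not reproduced here.

```python
# Batch Prony-style analysis: fit a linear prediction model by least squares,
# extract modes from the roots of the characteristic polynomial, and keep the
# dominant mode as a crude estimate of the slow trend.
import numpy as np

dt = 0.05
t = np.arange(0, 20, dt)
rng = np.random.default_rng(5)
# Assumed deck-like signal: slow drift plus a damped oscillation plus noise.
x = 0.5 * np.exp(0.02 * t) + 0.8 * np.exp(-0.1 * t) * np.cos(2 * np.pi * 0.8 * t)
x = x + rng.normal(scale=0.02, size=t.size)

p = 6                                            # linear prediction order (assumed)
# Least squares problem: x[n] = a1*x[n-1] + ... + ap*x[n-p]
A = np.column_stack([x[p - k - 1:-k - 1] for k in range(p)])
b = x[p:]
a = np.linalg.lstsq(A, b, rcond=None)[0]

roots = np.roots(np.r_[1.0, -a]).astype(complex)  # characteristic roots z_i
s = np.log(roots) / dt                            # continuous-time modes s_i
# Keep the largest-magnitude root as a crude stand-in for the paper's
# box-threshold separation of dominant components.
dom = s[np.argmax(np.abs(roots))]
print("dominant mode: growth rate %.3f 1/s, frequency %.2f Hz"
      % (dom.real, abs(dom.imag) / (2 * np.pi)))
```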

Relevance:

80.00%

Publisher:

Abstract:

Objective: The aim of this study was to demonstrate the potential of near-infrared (NIR) spectroscopy for categorizing cartilage degeneration induced in animal models. Method: Three models of osteoarthritic degeneration were induced in laboratory rats via one of the following methods: (i) meniscectomy (MSX); (ii) anterior cruciate ligament transection (ACLT); or (iii) intra-articular injection of mono-iodoacetate (1 mg) (MIA), in the right knee joint, with 12 rats per model group. After 8 weeks, the animals were sacrificed and tibial knee joints were collected. A custom-made near-infrared (NIR) probe of diameter 5 mm was placed on the cartilage surface and spectral data were acquired from each specimen in the wavenumber range 4000–12500 cm−1. Following spectral data acquisition, the specimens were fixed and Safranin-O staining was performed to assess disease severity based on the Mankin scoring system. Using multivariate statistical analysis based on principal component analysis and partial least squares regression, the spectral data were then related to the Mankin scores of the samples tested. Results: Mild to severe degenerative cartilage changes were observed in the subject animals. The ACLT models showed mild, the MSX models moderate, and the MIA models severe cartilage degenerative changes, both morphologically and histologically. Our results demonstrate that NIR spectroscopic information is capable of separating the cartilage samples into different groups according to the severity of degeneration, with the NIR data correlating significantly with the Mankin scores (R2 = 88.85%). Conclusion: We conclude that NIR spectroscopy is a viable tool for evaluating articular cartilage health and physical properties such as change in thickness with degeneration.
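As a hedged sketch of the PLS regression step described above, the Python code below regresses simulated NIR-like spectra against Mankin-style scores and reports a cross-validated R2; the spectra and scores are simulated assumptions, not the study's measurements.

```python
# PLS regression relating NIR-like spectra to Mankin-style scores,
# with cross-validated R^2 (synthetic data for illustration only).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score, KFold

rng = np.random.default_rng(6)
wavenumbers = np.linspace(4000, 12500, 300)
n_samples = 36                                   # e.g. 12 rats x 3 models

scores = rng.uniform(0, 14, n_samples)           # Mankin-style scores (0-14)
# Spectra: broad baseline plus a score-dependent absorption band plus noise.
band = np.exp(-(wavenumbers - 5200) ** 2 / (2 * 150 ** 2))
spectra = (1.0 + 0.02 * scores[:, None] * band
           + rng.normal(scale=0.005, size=(n_samples, wavenumbers.size)))

pls = PLSRegression(n_components=4)
cv = KFold(n_splits=6, shuffle=True, random_state=0)
r2 = cross_val_score(pls, spectra, scores, cv=cv, scoring="r2")
print("cross-validated R^2:", r2.mean().round(2))
```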

Relevance:

80.00%

Publisher:

Abstract:

The discovery of protein variation is an important strategy in disease diagnosis within the biological sciences. The current benchmark for elucidating information from multiple biological variables is the so-called “omics” disciplines of the biological sciences. Such variability is uncovered by the implementation of multivariable data mining techniques, which fall into two primary categories: machine learning strategies and statistics-based approaches. Typically, proteomic studies can produce hundreds or thousands of variables, p, per observation, n, depending on the analytical platform or method employed to generate the data. Many classification methods are limited by an n≪p constraint and, as such, require pre-treatment to reduce the dimensionality prior to classification. Recently, machine learning techniques have gained popularity in the field for their ability to successfully classify unknown samples. One limitation of such methods is the lack of a functional model allowing meaningful interpretation of results in terms of the features used for classification. This is a problem that might be solved using a statistical model-based approach in which not only is the importance of the individual proteins explicit, but they are also combined into a readily interpretable classification rule, without relying on a black-box approach. Here we incorporate the statistical dimension reduction techniques Partial Least Squares (PLS) and Principal Components Analysis (PCA), followed by both statistical and machine learning classification methods, and compare them to a popular machine learning technique, Support Vector Machines (SVM). Both PLS and SVM demonstrate strong utility for proteomic classification problems.
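The comparison described above can be sketched, under assumptions, with scikit-learn pipelines on synthetic n≪p data: PCA- or PLS-based dimension reduction feeding a simple statistical classifier, versus a linear SVM on the raw variables. The data and parameter choices are illustrative, not the study's proteomic measurements.

```python
# n << p classification sketch: PCA- and PLS-based pipelines vs. a linear SVM.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Synthetic proteomic-like data: 60 samples, 1000 variables, 2 classes.
X, y = make_classification(n_samples=60, n_features=1000, n_informative=20,
                           random_state=0)

class PLSScores(PLSRegression):
    """Use PLS purely as a supervised dimension reducer (X-scores only)."""
    def fit_transform(self, X, y=None):
        return self.fit(X, y).transform(X)

models = {
    "PCA + LDA": make_pipeline(StandardScaler(), PCA(n_components=10),
                               LinearDiscriminantAnalysis()),
    "PLS + LDA": make_pipeline(StandardScaler(), PLSScores(n_components=5),
                               LinearDiscriminantAnalysis()),
    "linear SVM": make_pipeline(StandardScaler(), SVC(kernel="linear")),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy {acc.mean():.2f}")
```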

Relevance:

80.00%

Publisher:

Abstract:

This study is motivated by, and proceeds from, a central interest in the importance of evaluating IS service quality, and adopts the IS ZOT SERVQUAL instrument (Kettinger & Lee, 2005) as its core theory base. The study conceptualises IS service quality as a multidimensional formative construct and seeks to answer the main research question: “Is the IS service quality construct valid as a 1st-order formative, 2nd-order formative multidimensional construct?” Additionally, with the aim of validating the IS service quality construct within its nomological net, as in prior service marketing work, Satisfaction was hypothesised as its immediate consequence. With the goal of testing the above research question, IS service quality and Satisfaction were operationalised in a quantitative survey instrument. Partial least squares (PLS) analysis, employing 219 valid responses, largely evidenced the validity of IS service quality as a multidimensional formative construct. The nomological validity of the IS service quality construct was also evidenced by demonstrating that 55% of the variance in Satisfaction was explained by the multidimensional formative IS service quality construct.
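As a loose illustration of the explained-variance figure reported above, the sketch below bootstraps the R2 of Satisfaction regressed on a single service-quality composite, using simulated scores tuned so that roughly half the variance is explained; the real analysis estimates a 2nd-order formative construct with PLS, so this is only a simplified stand-in.

```python
# Bootstrapped confidence interval for the R^2 of Satisfaction explained by a
# single service-quality composite (simulated data, simplified stand-in).
import numpy as np

rng = np.random.default_rng(7)
n = 219
quality = rng.normal(size=n)                       # composite service-quality score
satisfaction = 0.74 * quality + rng.normal(scale=0.67, size=n)

def r_squared(x, y):
    # For simple linear regression, R^2 equals the squared Pearson correlation.
    return np.corrcoef(x, y)[0, 1] ** 2

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(r_squared(quality[idx], satisfaction[idx]))
print("R^2 point estimate:", round(r_squared(quality, satisfaction), 2))
print("95% bootstrap CI:", np.round(np.percentile(boot, [2.5, 97.5]), 2))
```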

Relevance:

80.00%

Publisher:

Abstract:

This study proceeds from a central interest in the importance of systematically evaluating operational large-scale integrated information systems (IS) in organisations. The study is conducted within the IS-Impact Research Track at Queensland University of Technology (QUT). The goal of the IS-Impact Track is "to develop the most widely employed model for benchmarking information systems in organizations for the joint benefit of both research and practice" (Gable et al., 2009). The track espouses programmatic research based on the principles of incrementalism, tenacity, holism and generalisability, pursued through replication and extension research strategies. Track efforts have yielded the bicameral IS-Impact measurement model: the ‘impact’ half includes the Organisational-Impact and Individual-Impact dimensions, and the ‘quality’ half includes the System-Quality and Information-Quality dimensions. Akin to Gregor’s (2006) analytic theory, the IS-Impact model is conceptualised as a formative, multidimensional index and is defined as "a measure at a point in time, of the stream of net benefits from the IS, to date and anticipated, as perceived by all key-user-groups" (Gable et al., 2008, p. 381). The study adopts the IS-Impact model (Gable et al., 2008) as its core theory base. Prior work within the IS-Impact track has been consciously constrained to financial IS for their homogeneity. This study adopts a context-extension strategy (Berthon et al., 2002) with the aim "to further validate and extend the IS-Impact measurement model in a new context - i.e. a different IS - Human Resources (HR)". The overarching research question is: "How can the impacts of large-scale integrated HR applications be effectively and efficiently benchmarked?" This managerial question (Cooper & Emory, 1995) decomposes into two more specific research questions in the new HR context: (RQ1) "Is the IS-Impact model complete?" and (RQ2) "Is the IS-Impact model valid as a 1st-order formative, 2nd-order formative multidimensional construct?" The study adhered to the two-phase approach of Gable et al. (2008) to hypothesise and validate a measurement model. The initial ‘exploratory phase’ employed a zero-base qualitative approach to re-instantiate the IS-Impact model in the HR context. The subsequent ‘confirmatory phase’ sought to validate the resultant hypothesised measurement model against newly gathered quantitative data. The unit of analysis for the study is the application ‘ALESCO’, an integrated large-scale HR application implemented at Queensland University of Technology (QUT), a large Australian university (with approximately 40,000 students and 5,000 staff). Target respondents in both study phases were the ALESCO key-user-groups: strategic, management, operational and technical users, who directly use ALESCO or its outputs. An open-ended, qualitative survey was employed in the exploratory phase, with the objective of exploring the completeness and applicability of the IS-Impact model’s dimensions and measures in the new context, and of conceptualising any resultant model changes to be operationalised in the confirmatory phase.
Responses from 134 ALESCO users to the main survey question, "What do you consider have been the impacts of the ALESCO (HR) system in your division/department since its implementation?", were decomposed into 425 ‘impact citations’. Citation mapping using a deductive (top-down) content analysis approach instantiated all dimensions and measures of the IS-Impact model, evidencing its content validity in the new context. Seeking to probe additional (perhaps negative) impacts, the survey included the further open question "In your opinion, what can be done better to improve the ALESCO (HR) system?" Responses to this question decomposed into a further 107 citations, which in the main did not map to IS-Impact but rather coalesced around the concept of IS-Support. Drawing deductively from relevant literature, and working inductively from the unmapped citations, the new ‘IS-Support’ construct, comprising the four formative dimensions (i) training, (ii) documentation, (iii) assistance, and (iv) authorisation (each having reflective measures), was defined as: "a measure at a point in time, of the support, the [HR] information system key-user groups receive to increase their capabilities in utilising the system." Thus, a further goal of the study became validation of the IS-Support construct, suggesting the research question (RQ3): "Is IS-Support valid as a 1st-order reflective, 2nd-order formative multidimensional construct?" With the aim of validating IS-Impact within its nomological net (identification through structural relations), as in prior work, Satisfaction was hypothesised as its immediate consequence. The IS-Support construct, having derived from a question intended to probe IS impacts, was also hypothesised as an antecedent of Satisfaction, thereby suggesting the research question (RQ4): "What is the relative contribution of IS-Impact and IS-Support to Satisfaction?" With the goal of testing the above research questions, IS-Impact, IS-Support and Satisfaction were operationalised in a quantitative survey instrument. Partial least squares (PLS) structural equation modelling, employing 221 valid responses, largely evidenced the validity of the commencing IS-Impact model in the HR context. IS-Support, too, was validated as operationalised (including 11 reflective measures of its 4 formative dimensions). IS-Support alone explained 36% of the variance in Satisfaction, IS-Impact alone 70%, and in combination the two explained 71%, with virtually all of the influence of IS-Support subsumed by IS-Impact. Key contributions of the study to research include: (1) validation of IS-Impact in the HR context, (2) validation of a newly conceptualised IS-Support construct as an important antecedent of Satisfaction, and (3) demonstration of the redundancy of IS-Support when gauging IS-Impact. The study also makes valuable contributions to practice, the research track and the sponsoring organisation.
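The relative-contribution finding above (36% vs 70% vs 71%) can be illustrated, under assumptions, by comparing the R2 of nested regressions of a Satisfaction score on IS-Support and IS-Impact composites; the sketch below uses simulated composite scores, not the study's PLS estimates.

```python
# Comparing explained variance of Satisfaction from IS-Support alone,
# IS-Impact alone, and both together (simulated composite scores; the study
# itself estimates these relationships with PLS structural equation modelling).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 221
is_impact = rng.normal(size=n)
# IS-Support simulated as partly overlapping with IS-Impact.
is_support = 0.7 * is_impact + 0.7 * rng.normal(size=n)
satisfaction = 0.8 * is_impact + 0.1 * is_support + rng.normal(scale=0.5, size=n)

def r2(y, *predictors):
    X = sm.add_constant(np.column_stack(predictors))
    return sm.OLS(y, X).fit().rsquared

print("IS-Support alone:", round(r2(satisfaction, is_support), 2))
print("IS-Impact alone: ", round(r2(satisfaction, is_impact), 2))
print("Both:            ", round(r2(satisfaction, is_support, is_impact), 2))
```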

Relevance:

80.00%

Publisher:

Abstract:

There is a need for an accurate real-time quantitative system that would enhance decision-making in the treatment of osteoarthritis. To achieve this objective, significant research is required to enable articular cartilage properties to be measured and categorized for health and functionality without the need for laboratory tests involving biopsies for pathological evaluation. Such a system would provide access to the internal condition of the cartilage matrix and thus extend the vision-based arthroscopy that is currently used beyond the subjective evaluation of surgeons. The system must be able to non-destructively probe the entire thickness of the cartilage and its immediate subchondral bone layer. In this thesis, near infrared spectroscopy is investigated for this purpose. The aim is to relate the structure and load-bearing properties of the cartilage matrix to the near infrared absorption spectrum and to establish functional relationships that provide objective, quantitative and repeatable categorization of cartilage condition outside the area of visible degradation in a joint. Based on results from traditional mechanical testing, their innovative interpretation and their relationship with spectroscopic data, new parameters were developed. These were then evaluated for their consistency in discriminating between healthy, viable and degraded cartilage. The mechanical and physico-chemical properties were related to specific regions of the near infrared absorption spectrum identified as part of the research conducted for this thesis. The relationships between the tissue’s near infrared spectral response and the new parameters were modelled using multivariate statistical techniques based on partial least squares regression (PLSR). With significantly high levels of statistical correlation, the modelled relationships were demonstrated to possess considerable potential for predicting the properties of unknown tissue samples in a quick and non-destructive manner. In order to adapt near infrared spectroscopy for clinical applications, a balance between probe diameter and the number of active transmit-receive optic fibres must be found. This was achieved in the course of this research, resulting in an optimal probe configuration that could be adapted for joint tissue evaluation. Furthermore, as a proof of concept, a protocol for obtaining the new parameters from the near infrared absorption spectra of cartilage was developed, implemented in graphical user interface (GUI)-based software, and used to assess cartilage-on-bone samples in vitro. This conceptual implementation has demonstrated, in part through the individual parametric relationships with the near infrared absorption spectrum, the capacity of the proposed system to facilitate real-time, non-destructive evaluation of cartilage matrix integrity. In summary, the potential of near infrared spectroscopy for evaluating the articular cartilage and bone laminate has been demonstrated in this thesis. The approach could have spin-offs for other soft tissues and organs of the body. It builds on the earlier work of the group at QUT, enhancing the near infrared component of the ongoing research on developing a tool for cartilage evaluation that goes beyond visual and subjective methods.
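A hedged sketch of this kind of PLSR modelling pipeline: Savitzky-Golay first-derivative preprocessing of simulated NIR-like spectra followed by PLS regression against an assumed load-bearing property, using scipy and scikit-learn. The preprocessing choice, property values and spectra are assumptions, not the thesis's protocol.

```python
# Spectral preprocessing (Savitzky-Golay first derivative) followed by PLS
# regression against a mechanical property (simulated data for illustration).
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(9)
wn = np.linspace(4000, 12500, 400)
n = 50
stiffness = rng.uniform(0.5, 2.0, n)             # assumed load-bearing property (MPa)
band = np.exp(-(wn - 7000) ** 2 / (2 * 300 ** 2))
spectra = (1.0 + 0.05 * stiffness[:, None] * band
           + np.linspace(0, 0.1, wn.size)        # sloping baseline
           + rng.normal(scale=0.004, size=(n, wn.size)))

# First-derivative preprocessing to suppress baseline offsets and slopes.
d1 = savgol_filter(spectra, window_length=15, polyorder=2, deriv=1, axis=1)

X_tr, X_te, y_tr, y_te = train_test_split(d1, stiffness, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
print("held-out R^2:", round(pls.score(X_te, y_te), 2))
```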