44 results for Multivariate measurement model
in CentAUR: Central Archive, University of Reading - UK
Abstract:
It has long been supposed that preference judgments between sets of to-be-considered possibilities are made by first winnowing the options down to the most promising-looking alternatives, which form smaller "consideration sets" (Howard, 1963; Wright & Barbour, 1977). In preference choices with more than two options, it is standard to assume that a "consideration set", based upon some simple criterion, is established to reduce the options available. Inferential judgments, in contrast, have more frequently been investigated in situations in which only two possibilities need to be considered (e.g., which of these two cities is the larger?). Proponents of the "fast and frugal" approach to decision-making suggest that such judgments are also made on the basis of limited, simple criteria. For example, if only one of two cities is recognized and the task is to judge which city has the larger population, the recognition heuristic states that the recognized city should be selected. A multinomial processing tree model is outlined which provides the basis for estimating the extent to which recognition is used as a criterion in establishing a consideration set for inferential judgments between three possible options.
Abstract:
Background: Robot-mediated therapies offer entirely new approaches to neurorehabilitation. In this paper we present the results obtained from trialling the GENTLE/S neurorehabilitation system, assessed using the upper limb section of the Fugl-Meyer (FM) outcome measure. Methods: We describe the design of our clinical trial and its results, analysed using a novel statistical approach based on a multivariate analytical model. This paper provides the rationale for using multivariate models in robot-mediated clinical trials and draws conclusions from the clinical data gathered during the GENTLE/S study. Results: The FM outcome measures recorded during the baseline (8 sessions), robot-mediated therapy (9 sessions) and sling-suspension (9 sessions) phases were analysed using a multiple regression model. The results indicate positive but modest recovery trends favouring both interventions used in the GENTLE/S clinical trial. The modest recovery shown occurred at a time late after stroke when changes are not clinically anticipated. Conclusion: This study has applied a new method for analysing clinical data obtained from rehabilitation robotics studies. Although the data obtained during the clinical trial are multivariate, multipoint and progressive in nature, the multiple regression model used showed great potential for drawing conclusions from this study. An important conclusion to draw from this paper is that the intervention and control phases both caused changes over a period of 9 sessions in comparison to the baseline. This might indicate that the use of new, challenging and motivational therapies can influence the outcome of therapies at a point when clinical changes are not expected. Further work is required to investigate the effects arising from early intervention, longer exposure and intensity of the therapies.
Finally, more function-oriented robot-mediated therapies or sling-suspension therapies are needed to clarify the effects resulting from each intervention for stroke recovery.
Abstract:
Internal risk management models of the kind popularized by J. P. Morgan are now used widely by the world’s most sophisticated financial institutions as a means of measuring risk. Using the returns on three of the most popular futures contracts on the London International Financial Futures Exchange, in this paper we investigate the possibility of using multivariate generalized autoregressive conditional heteroscedasticity (GARCH) models for the calculation of minimum capital risk requirements (MCRRs). We propose a method for the estimation of the value at risk of a portfolio based on a multivariate GARCH model. We find that the consideration of the correlation between the contracts can lead to more accurate, and therefore more appropriate, MCRRs compared with the values obtained from a univariate approach to the problem.
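The abstract's core idea can be illustrated with a minimal simulation, not the paper's estimation procedure: generate returns from a bivariate constant-conditional-correlation GARCH(1,1) process and read off a portfolio Value-at-Risk from the simulated distribution. All parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 20000
omega, alpha, beta = 0.05, 0.08, 0.90   # GARCH(1,1) parameters (assumed)
rho = 0.6                                # constant conditional correlation (assumed)
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))

h = np.full(2, omega / (1 - alpha - beta))   # start at the unconditional variance
returns = np.empty((T, 2))
for t in range(T):
    z = L @ rng.standard_normal(2)           # correlated standard shocks
    returns[t] = np.sqrt(h) * z
    h = omega + alpha * returns[t] ** 2 + beta * h

# Equally weighted portfolio; 99% VaR taken as the empirical 1st-percentile loss.
port = returns.mean(axis=1)
var_99 = -np.percentile(port, 1)
print(f"99% portfolio VaR: {var_99:.3f}")
```

The multivariate route captures the cross-contract correlation directly in the simulated portfolio distribution, which is the mechanism the paper credits for producing more accurate MCRRs than a univariate treatment of the portfolio return.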
Abstract:
Interest in Enterprise Architecture (EA) has been increasing during the last few years. EA has been found to be a crucial aspect of business survival, and thus the success of EA implementation is also crucial. The current literature lacks a tool for measuring the success of EA implementation. In this paper, a tentative model for measuring success is presented and empirically validated in the EA context. Results show that the success of EA implementation can be measured indirectly by measuring the achievement of the objectives set for the implementation. Results also imply that achieving individuals' objectives does not necessarily mean that the organisation's objectives are achieved. The presented Success Measurement Model can be used as a basis for developing measurement metrics.
Abstract:
This article applies FIMIX-PLS segmentation methodology to detect and explore unanticipated reactions to organisational strategy among stakeholder segments. For many large organisations today, the tendency to apply a “one-size-fits-all” strategy to members of a stakeholder population, commonly driven by a desire for simplicity, efficiency and fairness, may actually result in unanticipated consequences amongst specific subgroups within the target population. This study argues that it is critical for organisations to understand the varying and potentially harmful effects of strategic actions across differing, and previously unidentified, segments within a stakeholder population. The case of a European revenue service that currently focuses its strategic actions on building trust and compliant behaviour amongst taxpayers is used as the context for this study. FIMIX-PLS analysis is applied to a sample of 501 individual taxpayers, while a novel PLS-based approach for assessing measurement model invariance that can be applied to both reflective and formative measures is also introduced for the purpose of multi-group comparisons. The findings suggest that individual taxpayers can be split into two equal-sized segments with highly differentiated characteristics and reactions to organisational strategy and communications. Compliant behaviour in the first segment (n = 223), labelled “relationships centred on trust,” is mainly driven through positive service experiences and judgements of competence, while judgements of benevolence lead to the unanticipated reaction of increasing distrust among this group. Conversely, compliant behaviour in the second segment (n = 278), labelled “relationships centred on distrust,” is driven by the reduction of fear and scepticism towards the revenue service, which is achieved through signalling benevolence, reduced enforcement and the lower incidence of negative stories. 
In this segment, the use of enforcement has the unanticipated and counterproductive effect of ultimately reducing compliant behaviour.
Abstract:
This paper uses a regime-switching approach to determine whether prices in the US stock, direct real estate and indirect real estate markets are driven by the presence of speculative bubbles. The results show significant evidence of the existence of periodically partially collapsing speculative bubbles in all three markets. A multivariate bubble model is then developed and implemented to evaluate whether the stock and real estate bubbles spill over into REITs. The underlying stock market bubble is found to be a stronger influence on the securitised real estate market bubble than that of the property market. Furthermore, the findings suggest a transmission of speculative bubbles from the direct real estate to the stock market, although this link is not present for the returns themselves.
Abstract:
The theta-logistic is a widely used generalisation of the logistic model of regulated biological processes, used in particular to model population regulation. The parameter theta gives the shape of the relationship between per-capita population growth rate and population size. Estimation of theta from population counts is, however, subject to bias, particularly when there are measurement errors. Here we identify factors disposing towards accurate estimation of theta by simulation of populations regulated according to the theta-logistic model. Factors investigated were measurement error, environmental perturbation and length of time series. Large measurement errors bias estimates of theta towards zero. Where estimated theta is close to zero, the estimated annual return rate may help resolve whether this is due to bias. Environmental perturbations help yield unbiased estimates of theta. Where environmental perturbations are large, estimates of theta are likely to be reliable even when measurement errors are also large. By contrast, where the environment is relatively constant, unbiased estimates of theta can only be obtained if populations are counted precisely. Our results have practical conclusions for the design of long-term population surveys. Estimation of the precision of population counts would be valuable, and could be achieved in practice by repeating counts in at least some years. Increasing the length of time series beyond 10 or 20 years yields only small benefits. If populations are measured with appropriate accuracy, given the level of environmental perturbation, unbiased estimates can be obtained from relatively short censuses. These conclusions are optimistic for estimation of theta. (C) 2008 Elsevier B.V. All rights reserved.
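The simulation design described above can be sketched in a few lines: a population regulated by the theta-logistic model, with environmental noise on the growth rate and multiplicative measurement error on the observed counts. Parameter values are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)
r, K, theta = 0.5, 1000.0, 1.0        # growth rate, carrying capacity, shape (assumed)
sigma_env, sigma_obs = 0.1, 0.2       # environmental and measurement noise (assumed)
T = 30                                # length of the simulated time series

N = np.empty(T)
N[0] = 500.0
for t in range(T - 1):
    # Per-capita growth declines towards zero as (N/K)**theta approaches 1.
    growth = r * (1.0 - (N[t] / K) ** theta)
    N[t + 1] = N[t] * np.exp(growth + sigma_env * rng.standard_normal())

# Observed counts carry multiplicative (log-normal) measurement error; it is
# this error that biases estimates of theta towards zero in the study.
counts = N * np.exp(sigma_obs * rng.standard_normal(T))
print(counts.round(1))
```

Fitting theta to `counts` rather than to the true trajectory `N`, and varying `sigma_env`, `sigma_obs` and `T`, reproduces the kind of bias experiment the abstract describes.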
Abstract:
The relationship between acrylamide and its precursors, namely, free asparagine and reducing sugars, was studied in cakes made from potato flake, wholemeal wheat, and wholemeal rye, cooked at 180 °C for 5 to 60 min. Between 5 and 20 min, major losses of asparagine, water, and total reducing sugars were accompanied by large increases in acrylamide, which reached a maximum in all three products between 25 and 30 min, followed by a slow linear reduction. Acrylamide formation did not occur to a large degree until the moisture contents of the cakes fell below 5%. Linear relationships were observed for acrylamide formation with the residual levels of asparagine and reducing sugars for all three food materials.
Abstract:
Mesospheric temperature inversions are well established observed phenomena, yet their properties remain the subject of ongoing research. Comparisons between Rayleigh-scatter lidar temperature measurements obtained by the University of Western Ontario's Purple Crow Lidar (42.9°N, 81.4°W) and the Canadian Middle Atmosphere Model are used to quantify the statistics of inversions. In both model and measurements, inversions occur most frequently in the winter and exhibit an average amplitude of ∼10 K. The model exhibits virtually no inversions in the summer, while the measurements show a strongly reduced frequency of occurrence with an amplitude about half that in the winter. A simple theory of mesospheric inversions based on wave saturation is developed, with no adjustable parameters. It predicts that the environmental lapse rate must be less than half the adiabatic lapse rate for an inversion to form, and it predicts the ratio of the inversion amplitude and thickness as a function of environmental lapse rate. Comparison of this prediction to the actual amplitude/thickness ratio using the lidar measurements shows good agreement between theory and measurements.
Abstract:
The concentrations of sulfate, black carbon (BC) and other aerosols in the Arctic are characterized by high values in late winter and spring (so-called Arctic Haze) and low values in summer. Models have long been struggling to capture this seasonality and especially the high concentrations associated with Arctic Haze. In this study, we evaluate sulfate and BC concentrations from eleven different models driven with the same emission inventory against a comprehensive pan-Arctic measurement data set over a time period of 2 years (2008–2009). The set of models consisted of one Lagrangian particle dispersion model, four chemistry transport models (CTMs), one atmospheric chemistry-weather forecast model and five chemistry climate models (CCMs), of which two were nudged to meteorological analyses and three were running freely. The measurement data set consisted of surface measurements of equivalent BC (eBC) from five stations (Alert, Barrow, Pallas, Tiksi and Zeppelin), elemental carbon (EC) from Station Nord and Alert and aircraft measurements of refractory BC (rBC) from six different campaigns. We find that the models generally captured the measured eBC or rBC and sulfate concentrations quite well, compared to previous comparisons. However, the aerosol seasonality at the surface is still too weak in most models. Concentrations of eBC and sulfate averaged over three surface sites are underestimated in winter/spring in all but one model (model means for January–March underestimated by 59 and 37 % for BC and sulfate, respectively), whereas concentrations in summer are overestimated in the model mean (by 88 and 44 % for July–September), but with overestimates as well as underestimates present in individual models. The most pronounced eBC underestimates, not included in the above multi-site average, are found for the station Tiksi in Siberia where the measured annual mean eBC concentration is 3 times higher than the average annual mean for all other stations. 
This suggests an underestimate of BC sources in Russia in the emission inventory used. Based on the campaign data, biomass burning was identified as another cause of the modeling problems. For sulfate, very large differences were found in the model ensemble, with an apparent anti-correlation between modeled surface concentrations and total atmospheric columns. There is a strong correlation between observed sulfate and eBC concentrations with consistent sulfate/eBC slopes found for all Arctic stations, indicating that the sources contributing to sulfate and BC are similar throughout the Arctic and that the aerosols are internally mixed and undergo similar removal. However, only three models reproduced this finding, whereas sulfate and BC are weakly correlated in the other models. Overall, no class of models (e.g., CTMs, CCMs) performed better than the others and differences are independent of model resolution.
Abstract:
This contribution closes this special issue of Hydrology and Earth System Sciences concerning the assessment of nitrogen dynamics in catchments across Europe within a semi-distributed Integrated Nitrogen model for multiple source assessment in Catchments (INCA). New developments in the understanding of the factors and processes determining the concentrations and loads of nitrogen are outlined. The ability of the INCA model to simulate the hydrological and nitrogen dynamics of different European ecosystems is assessed, and the results of the first scenario analyses investigating the impacts of deposition, climatic and land-use change on the nitrogen dynamics are summarised. Consideration is given to how well the model has performed as a generic tool for describing the nitrogen dynamics of European ecosystems across Arctic, Maritime, Continental and Mediterranean climates, to its role in new research initiatives and to future research requirements.
Abstract:
Ecological risk assessments must increasingly consider the effects of chemical mixtures on the environment as anthropogenic pollution continues to grow in complexity. Yet testing every possible mixture combination is impractical and unfeasible; thus, there is an urgent need for models that can accurately predict mixture toxicity from single-compound data. Currently, two models are frequently used to predict mixture toxicity from single-compound data: concentration addition and independent action (IA). The accuracy of the predictions generated by these models is currently debated and needs to be resolved before their use in risk assessments can be fully justified. The present study addresses this issue by determining whether the IA model adequately described the toxicity of binary mixtures of five pesticides and other environmental contaminants (cadmium, chlorpyrifos, diuron, nickel, and prochloraz), each with dissimilar modes of action, on the reproduction of the nematode Caenorhabditis elegans. In three out of 10 cases, the IA model failed to describe mixture toxicity adequately, with significant synergy or antagonism being observed. In a further three cases, there was an indication of synergy, antagonism, and effect-level-dependent deviations, respectively, but these were not statistically significant. The extent of the significant deviations that were found varied, but all were such that the predicted percentage effect seen on reproductive output would have been wrong by 18 to 35% (i.e., the effect concentration expected to cause a 50% effect led to an 85% effect). The presence of such a high number and variety of deviations has important implications for the use of existing mixture toxicity models for risk assessments, especially where all or part of the deviation is synergistic.
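The independent action (IA) model referred to above predicts the combined effect of dissimilarly acting toxicants from their single-compound effects via E(mix) = 1 - (1 - E_A)(1 - E_B). A minimal sketch (the function name is ours, not the study's):

```python
def independent_action(*single_effects):
    """Predicted fractional effect (0..1) of a mixture under independent action.

    Each argument is the fractional effect that a component would cause on
    its own at its concentration in the mixture.
    """
    survival = 1.0
    for e in single_effects:
        survival *= (1.0 - e)      # independent "survival" probabilities multiply
    return 1.0 - survival

# Two compounds each causing a 50% effect alone are predicted to cause a
# 75% effect in combination; observed departures from such predictions are
# what the study classifies as synergy or antagonism.
print(independent_action(0.5, 0.5))  # → 0.75
```

Synergy corresponds to an observed effect larger than this prediction, antagonism to a smaller one, which is how deviations such as the cited 50%-expected/85%-observed case are identified.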
Abstract:
The potential of the τ-ω model for retrieving the volumetric moisture content of bare and vegetated soil from dual-polarisation passive microwave data acquired at single and multiple angles is tested. Measurement error and several additional sources of uncertainty will affect the theoretical retrieval accuracy. These include uncertainty in the soil temperature, in the vegetation structure and consequently its microwave single-scattering albedo, and in the soil microwave emissivity arising from its roughness. To test the effects of these uncertainties for simple homogeneous scenes, we attempt to retrieve soil moisture from a number of simulated microwave brightness temperature datasets generated using the τ-ω model. The uncertainties for each influence are estimated and applied to curves generated for typical scenarios, and an inverse model is used to retrieve the soil moisture content, vegetation optical depth and soil temperature. The effect of each influence on the theoretical soil moisture retrieval limit is explored, the likelihood of each sensor configuration meeting user requirements is assessed, and the most effective means of improving moisture retrieval are indicated.
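The standard zeroth-order τ-ω forward model used to generate such simulated brightness temperatures can be sketched as follows. Symbols: tau = vegetation optical depth, omega = single-scattering albedo, e_s = rough-soil emissivity, T_s and T_c = soil and canopy temperatures, theta = incidence angle; the parameter values in the example call are illustrative only, not taken from the paper.

```python
import math

def tau_omega_tb(e_s, tau, omega, T_s, T_c, theta_deg):
    """Brightness temperature (K) for one polarisation under the tau-omega model."""
    gamma = math.exp(-tau / math.cos(math.radians(theta_deg)))  # canopy transmissivity
    soil_term = e_s * gamma * T_s                               # soil emission attenuated by canopy
    veg_term = (1.0 - omega) * (1.0 - gamma) * T_c              # direct canopy emission
    # Canopy emission reflected by the soil and re-attenuated by the canopy:
    reflected = (1.0 - omega) * (1.0 - gamma) * (1.0 - e_s) * gamma * T_c
    return soil_term + veg_term + reflected

# Wetter soil -> lower emissivity -> lower brightness temperature, which is
# the sensitivity that moisture retrieval exploits.
print(tau_omega_tb(e_s=0.85, tau=0.2, omega=0.05, T_s=295.0, T_c=293.0, theta_deg=40.0))
```

Retrieval then inverts this forward model: given dual-polarisation observations at one or more angles, solve for the soil moisture (via e_s), tau and temperature that best reproduce the measured brightness temperatures, with the uncertainties listed in the abstract propagated through the inversion.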