928 results for "Residual-based tests"
Abstract:
The biomisation method is used to reconstruct Latin American vegetation at 6000 ± 500 and 18,000 ± 1000 radiocarbon years before present (14C yr BP) from pollen data. Tests using modern pollen data from 381 samples derived from 287 locations broadly reproduce the potential natural vegetation. The strong temperature gradient associated with the Andes is recorded by a transition from high-altitude cool grass/shrubland and cool mixed forest, to mid-altitude cool temperate rain forest, to tropical dry, seasonal and rain forest at low altitudes. Reconstructed biomes from a number of sites do not match the potential vegetation owing to local factors such as human impact, methodological artefacts and the mechanisms of pollen representativity of the parent vegetation.
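The biomisation method rests on affinity scores: pollen taxa are assigned to plant functional types and hence to biomes, and each sample is allocated to the biome with the highest score. Below is a minimal sketch of that scoring step, assuming Prentice-style square-root scoring; the taxon lists and the 0.5% threshold are illustrative assumptions, not the paper's actual scheme.

```python
import numpy as np

# Illustrative biomization scoring: each biome is characterized by a set of
# pollen taxa, and a sample's affinity for a biome is the sum of square-root-
# transformed pollen percentages (above a small threshold) of those taxa.
# Taxon assignments and the 0.5% threshold are assumptions for illustration.
BIOME_TAXA = {
    "tropical rain forest": ["Moraceae", "Melastomataceae"],
    "cool grass/shrubland": ["Poaceae", "Asteraceae"],
}

def biome_affinities(pollen_pct, threshold=0.5):
    """pollen_pct: dict mapping taxon name -> pollen percentage in a sample."""
    scores = {}
    for biome, taxa in BIOME_TAXA.items():
        scores[biome] = sum(
            np.sqrt(max(pollen_pct.get(t, 0.0) - threshold, 0.0)) for t in taxa
        )
    return scores

sample = {"Poaceae": 42.0, "Asteraceae": 11.0, "Moraceae": 3.0}
scores = biome_affinities(sample)
print(max(scores, key=scores.get))  # the biome with the highest affinity wins
```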
Abstract:
We consider tests of forecast encompassing for probability forecasts, for both quadratic and logarithmic scoring rules. We propose test statistics for the null of forecast encompassing, present the limiting distributions of the test statistics, and investigate the impact of estimating the forecasting models' parameters on these distributions. The small-sample performance of the tests is investigated, in terms of both the number of forecasts and the model estimation sample size. We show the usefulness of the tests for the evaluation of recession probability forecasts from logit models with different leading indicators as explanatory variables, and for evaluating survey-based probability forecasts.
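The abstract does not reproduce the statistics themselves; the sketch below shows one standard regression-based formulation of forecast encompassing under the quadratic (Brier) scoring rule, with simulated forecasts standing in for real data. It is not necessarily the authors' exact statistic.

```python
import numpy as np
import statsmodels.api as sm

# Regression-based check of forecast encompassing under a quadratic (Brier)
# scoring rule: forecast p1 encompasses p2 if the optimal combination
# (1 - lam) * p1 + lam * p2 puts zero weight on p2.  This is one common
# formulation, not necessarily the statistic proposed in the paper.
rng = np.random.default_rng(0)
n = 200
p1 = rng.uniform(0.1, 0.9, n)                          # probability forecasts, model 1
p2 = np.clip(p1 + rng.normal(0, 0.1, n), 0.01, 0.99)   # competing forecasts, model 2
y = (rng.uniform(size=n) < p1).astype(float)           # binary outcomes (recession or not)

# Regress the forecast error of model 1 on the forecast difference; under the
# null that p1 encompasses p2 the slope lam is zero.
X = sm.add_constant(p2 - p1)
fit = sm.OLS(y - p1, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})
print(fit.params, fit.pvalues)  # small slope p-value: evidence against encompassing
```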
Abstract:
Tests for business cycle asymmetries are developed for Markov-switching autoregressive models. The tests of deepness, steepness, and sharpness are Wald statistics, which have standard asymptotics. For the standard two-regime model of expansions and contractions, deepness is shown to imply sharpness (and vice versa), whereas the process is always nonsteep. Two- and three-state models of U.S. GNP growth are used to illustrate the approach, along with models of U.S. investment and consumption growth. The robustness of the tests to model misspecification, and the effects of regime-dependent heteroscedasticity, are investigated.
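The deepness, steepness, and sharpness restrictions are specific functions of the fitted Markov-switching parameters; a generic delta-method Wald test of a restriction g(theta) = 0 has the form sketched below, with placeholder estimates, covariance, and Jacobian standing in for a fitted model.

```python
import numpy as np
from scipy import stats

# Generic Wald test of a (possibly nonlinear) restriction g(theta) = 0 via the
# delta method: W = g' (G V G')^{-1} g ~ chi2(q) under the null.  The restriction
# below is a placeholder; the deepness/steepness/sharpness restrictions are
# particular functions of the Markov-switching model's parameters.
def wald_test(theta_hat, V, g, G):
    gval = g(theta_hat)                          # restriction at the estimates
    W = gval @ np.linalg.solve(G @ V @ G.T, gval)
    return W, stats.chi2.sf(W, df=len(gval))     # statistic and p-value

theta_hat = np.array([1.2, -0.8])                # illustrative regime means
V = np.array([[0.04, 0.01], [0.01, 0.09]])       # illustrative covariance matrix
g = lambda th: np.array([th[0] + th[1]])         # e.g. a symmetry restriction
G = np.array([[1.0, 1.0]])                       # Jacobian of g at theta_hat
print(wald_test(theta_hat, V, g, G))
```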
Abstract:
Techniques are proposed for evaluating forecast probabilities of events. The tools are especially useful when, as with the Survey of Professional Forecasters (SPF) expected probability distributions of inflation, the method of construction cannot be drawn on in evaluating the forecasts. The tests of efficiency and conditional efficiency are applied to the forecast probabilities of events of interest derived from the SPF distributions, and supplement a whole-density evaluation of the SPF distributions based on the probability integral transform approach.
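As a sketch of the probability integral transform (PIT) evaluation mentioned at the end: PIT values should be i.i.d. uniform when the forecast densities are correct. Gaussian forecast densities and simulated outcomes are assumed here in place of the SPF data.

```python
import numpy as np
from scipy import stats

# Whole-density evaluation via the probability integral transform (PIT):
# z_t = F_t(y_t) should be i.i.d. uniform on [0, 1] if the forecast densities
# are correct.  The forecast CDFs and outcomes below are simulated placeholders.
rng = np.random.default_rng(1)
mu, sigma = rng.normal(2.0, 0.5, 100), 1.0       # forecast densities N(mu_t, 1)
y = rng.normal(mu, sigma)                        # realized inflation outcomes
z = stats.norm.cdf(y, loc=mu, scale=sigma)       # PIT values

# Uniformity check (Kolmogorov-Smirnov); independence could additionally be
# checked by testing autocorrelation of z and of |z - 0.5|.
print(stats.kstest(z, "uniform"))
```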
Abstract:
Aim: This paper documents reconstructions of the vegetation patterns in Australia, Southeast Asia and the Pacific (SEAPAC region) in the mid-Holocene and at the last glacial maximum (LGM).
Methods: Vegetation patterns were reconstructed from pollen data using an objective biomization scheme based on plant functional types. The biomization scheme was first tested using 535 modern pollen samples from 377 sites, and then applied unchanged to fossil pollen samples dating to 6000 ± 500 or 18,000 ± 1000 14C yr BP.
Results: 1. Tests using surface pollen sample sites showed that the biomization scheme is capable of reproducing the modern broad-scale patterns of vegetation distribution. The north–south gradient in temperature, reflected in transitions from cool evergreen needleleaf forest in the extreme south through temperate rain forest or wet sclerophyll forest (WSFW) and into tropical forests, is well reconstructed. The transitions from xerophytic through sclerophyll woodlands and open forests to closed-canopy forests, which reflect the gradient in plant-available moisture from the continental interior towards the coast, are reconstructed with less geographical precision, but the broad-scale pattern nevertheless emerges. 2. Differences between the modern and mid-Holocene vegetation patterns in mainland Australia are comparatively small and reflect changes in moisture availability rather than temperature. In south-eastern Australia some sites show a shift towards more moisture-stressed vegetation in the mid-Holocene, with xerophytic woods/scrub and temperate sclerophyll woodland and shrubland at sites characterized today by WSFW or warm-temperate rain forest (WTRF). However, sites in the Snowy Mountains, on the Southern Tablelands and east of the Great Dividing Range had more moisture-demanding vegetation in the mid-Holocene than today. South-western Australia was slightly drier than today, and the single site in north-western Australia also shows conditions drier than today in the mid-Holocene. Changes in the tropics are also comparatively small, but the presence of WTRF and tropical deciduous broadleaf forest and woodland in the mid-Holocene, at sites occupied today by cool-temperate rain forest, indicates warmer conditions. 3. Expansion of xerophytic vegetation in the south and of tropical deciduous broadleaf forest and woodland in the north indicates drier conditions across mainland Australia at the LGM. None of these changes is informative about the degree of cooling. However, the evidence from the tropics, showing lowering of the treeline and forest belts, indicates that conditions were between 1 and 9 °C (depending on elevation) colder. The encroachment of tropical deciduous broadleaf forest and woodland into lowland evergreen broadleaf forest implies greater aridity.
Main conclusions: This study provides the first continental-scale reconstruction of mid-Holocene and LGM vegetation patterns from Australia, Southeast Asia and the Pacific (SEAPAC region) using an objective biomization scheme. These data will provide a benchmark for evaluation of palaeoclimate simulations within the framework of the Palaeoclimate Modelling Intercomparison Project.
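A hypothetical sketch of the first testing step described under Methods: scoring how often the biome predicted from a modern surface sample matches the mapped potential vegetation. The biome codes and labels below are illustrative placeholders, not the scheme's actual categories.

```python
import numpy as np

# Agreement check for a biomization scheme against modern surface samples:
# the fraction of sites at which the predicted biome matches the mapped
# potential vegetation.  Biome codes are illustrative placeholders.
predicted = np.array(["TRFO", "WSFW", "XERO", "WSFW", "TRFO"])
observed  = np.array(["TRFO", "WSFW", "XERO", "CTRF", "TRFO"])
agreement = np.mean(predicted == observed)
print(f"sites correctly assigned: {agreement:.0%}")
```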
Abstract:
This paper demonstrates that the use of GARCH-type models for the calculation of minimum capital risk requirements (MCRRs) may lead to the production of inaccurate and therefore inefficient capital requirements. We show that this inaccuracy stems from the fact that GARCH models typically overstate the degree of persistence in return volatility. A simple modification to the model is found to improve the accuracy of MCRR estimates in both back- and out-of-sample tests. Given that internal risk management models are currently in widespread use in some parts of the world (most notably the USA), and will soon be permitted for EC banks and investment firms, we believe that our paper should serve as a valuable caution to risk management practitioners who are using, or intend to use, this popular class of models.
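In a GARCH(1,1), volatility persistence is commonly summarised as alpha + beta, and the overstatement the paper describes corresponds to this sum sitting very close to one. A sketch using the `arch` package on simulated returns (not the paper's data, and not its modified model):

```python
import numpy as np
from arch import arch_model

# Persistence of a GARCH(1,1) is alpha + beta: values near one imply volatility
# shocks decay very slowly, inflating multi-day capital risk estimates.
# Simulated fat-tailed returns stand in for market data here.
rng = np.random.default_rng(2)
returns = rng.standard_t(df=6, size=1000)        # percent-scale daily returns

res = arch_model(returns, vol="GARCH", p=1, q=1).fit(disp="off")
persistence = res.params["alpha[1]"] + res.params["beta[1]"]
print(f"estimated persistence (alpha + beta): {persistence:.3f}")
```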
Abstract:
This paper proposes and implements a new methodology for forecasting time series, based on bicorrelations and cross-bicorrelations. It is shown that the forecasting technique arises as a natural extension of, and as a complement to, existing univariate and multivariate non-linearity tests. The formulations are essentially modified autoregressive or vector autoregressive models, respectively, which can be estimated using ordinary least squares. The techniques are applied to a set of high-frequency exchange rate returns, and their out-of-sample forecasting performance is compared to that of other time series models.
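A minimal sketch of the univariate version under stated assumptions: an autoregression augmented with lagged cross-products x_{t-r} x_{t-s}, estimated by ordinary least squares. The lag choices and the data are illustrative, not the paper's specification.

```python
import numpy as np

# Bicorrelation-augmented autoregression: alongside the usual AR lags, include
# lagged cross-product regressors x_{t-r} * x_{t-s}, and estimate by OLS.
rng = np.random.default_rng(3)
x = rng.normal(size=500)                         # stand-in for exchange rate returns

p, pairs = 2, [(1, 2)]                           # AR lags and cross-product lag pairs
m = max(p, max(max(r, s) for r, s in pairs))
rows = []
for t in range(m, len(x)):
    lags = [x[t - k] for k in range(1, p + 1)]
    cross = [x[t - r] * x[t - s] for r, s in pairs]
    rows.append([1.0] + lags + cross)
X, y = np.array(rows), x[m:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # OLS estimates
print(beta)                                      # const, AR(1), AR(2), x_{t-1}x_{t-2}
```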
Abstract:
This review is an output of the International Life Sciences Institute (ILSI) Europe Marker Initiative, which aims to identify evidence-based criteria for selecting adequate measures of nutrient effects on health through comprehensive literature review. Experts in cognitive and nutrition sciences examined the applicability of these proposed criteria to the field of cognition, with respect to the various cognitive domains usually assessed to reflect brain or neurological function. This review covers cognitive domains important in the assessment of neuronal integrity and function, commonly used tests and their state of validation, and the application of these measures to studies of nutrition and nutritional intervention trials. The aim is to identify domain-specific cognitive tests that are sensitive to nutrient interventions, and to provide guidance on applying selection criteria to choose the most suitable tests for proposed nutritional intervention studies with cognitive outcomes. The material in this review serves as a background and guidance document for nutritionists, neuropsychologists, psychiatrists, and neurologists interested in assessing mental health in terms of cognitive test performance, and for scientists intending to test the effects of food or food components on cognitive function.
Abstract:
This paper presents some important issues concerning the misidentification of human interlocutors in text-based communication during practical Turing tests. The study presents transcripts in which human judges succumbed to the confederate effect, misidentifying hidden human foils as machines, and an attempt is made to assess the reasons why. The practical Turing tests in question were held on 23 June 2012 at Bletchley Park, England. A selection of actual full transcripts from the tests is shown and an analysis is given in each case. As a result of these tests, conclusions are drawn about the sorts of interrogator strategies that can lead to erroneous identifications. Such results also serve to indicate conversational directions to avoid for machine designers who wish to create a conversational entity that performs well on the Turing test.
Abstract:
Purpose: The research objective of this study is to understand how institutional changes to the EU regulatory landscape may affect corresponding institutionalized operational practices within financial organizations. Design/methodology/approach: The study adopts an Investment Management System as its case and investigates different implementations of this system within eight financial organizations, predominantly focused on investment banking and asset management activities within capital markets. At the systems vendor site, senior systems consultants and client relationship managers were interviewed. Within the financial organizations, compliance, risk and systems experts were interviewed. Findings: The study empirically tests modes of institutional change. Displacement and Layering were found to be the most prevalent modes. However, the study highlights how the outcomes of Displacement and Drift may be similar in effect, as both modes may cause compliance gaps. The research highlights how changes in regulations may create gaps in systems and processes which, in the short term, need to be plugged by manual processes. Practical implications: Vendors' ability to manage institutional change caused by Drift, Displacement, Layering and Conversion, and to efficiently and quickly translate institutional variables into structured systems, has the power to ease the pain and cost of compliance, as well as reducing the risk of breaches by reducing the need for interim manual systems. Originality/value: The study makes a contribution by applying recent theoretical concepts of institutional change to the topic of regulatory change, and uses this analysis to provide insight into the effects of this new environment.
Abstract:
This paper investigates the feasibility of using approximate Bayesian computation (ABC) to calibrate and evaluate complex individual-based models (IBMs). As ABC evolves, various versions are emerging, but here we only explore the most accessible version, rejection-ABC. Rejection-ABC involves running models a large number of times, with parameters drawn randomly from their prior distributions, and then retaining the simulations closest to the observations. Although well-established in some fields, whether ABC will work with ecological IBMs is still uncertain. Rejection-ABC was applied to an existing 14-parameter earthworm energy budget IBM for which the available data consist of body mass growth and cocoon production in four experiments. ABC was able to narrow the posterior distributions of seven parameters, estimating credible intervals for each. ABC’s accepted values produced slightly better fits than literature values do. The accuracy of the analysis was assessed using cross-validation and coverage, currently the best available tests. Of the seven unnarrowed parameters, ABC revealed that three were correlated with other parameters, while the remaining four were found to be not estimable given the data available. It is often desirable to compare models to see whether all component modules are necessary. Here we used ABC model selection to compare the full model with a simplified version which removed the earthworm’s movement and much of the energy budget. We are able to show that inclusion of the energy budget is necessary for a good fit to the data. We show how our methodology can inform future modelling cycles, and briefly discuss how more advanced versions of ABC may be applicable to IBMs. We conclude that ABC has the potential to represent uncertainty in model structure, parameters and predictions, and to embed the often complex process of optimizing an IBM’s structure and parameters within an established statistical framework, thereby making the process more transparent and objective.
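The rejection-ABC loop described above is short enough to sketch directly. A one-parameter toy growth model stands in for the 14-parameter earthworm IBM; the prior, tolerance fraction, and data are all illustrative assumptions.

```python
import numpy as np

# Minimal rejection-ABC: draw parameters from the prior, simulate, and keep the
# draws whose simulated outputs are closest to the observations.
rng = np.random.default_rng(4)

def simulate(theta):                             # toy saturating-growth "model"
    t = np.arange(10)
    return 0.5 * np.exp(theta * t) / (1 + 0.1 * np.exp(theta * t))

observed = simulate(0.3) + rng.normal(0, 0.02, 10)   # synthetic "field data"

n_sims, keep_frac = 100_000, 0.001
thetas = rng.uniform(0.0, 1.0, n_sims)               # prior: U(0, 1)
dists = np.array([np.linalg.norm(simulate(th) - observed) for th in thetas])
accepted = thetas[np.argsort(dists)[: int(n_sims * keep_frac)]]
print(np.percentile(accepted, [2.5, 50, 97.5]))      # approximate credible interval
```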
Abstract:
The aim of this study was to assess and improve the accuracy of biotransfer models for organic pollutants (PCBs, PCDD/Fs, PBDEs, PFCAs, and pesticides) into cow's milk and beef used in human exposure assessment. Metabolic rate in cattle is known to be a key parameter for this biotransfer; however, few experimental data and no simulation methods are currently available. In this research, metabolic rate was estimated using existing QSAR biodegradation models of microorganisms (BioWIN) and fish (EPI-HL and IFS-HL). This simulated metabolic rate was then incorporated into the mechanistic cattle biotransfer models (RAIDAR, ACC-HUMAN, OMEGA, and CKow). The goodness-of-fit tests showed that the RAIDAR, ACC-HUMAN, and OMEGA model performances were significantly improved using either of the QSARs when comparing the new model outputs to observed data. The CKow model is the only one that separates the processes in the gut and liver; it showed the lowest residual error of all the models tested when the BioWIN model was used to represent the ruminant metabolic process in the gut and the two fish QSARs were used to represent the metabolic process in the liver. Our testing included EUSES and CalTOX, which are KOW-regression models widely used in regulatory assessment. New regressions based on the simulated rates of the two metabolic processes are also proposed as an alternative to KOW-regression models for screening risk assessment. The modified CKow model is more physiologically realistic, but has equivalent usability to existing KOW-regression models for estimating cattle biotransfer of organic pollutants.
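A sketch of the kind of goodness-of-fit comparison described: biotransfer factors span orders of magnitude, so residual error is usually assessed on a log scale. All values and model names below are placeholders, not the study's data.

```python
import numpy as np

# Compare model predictions against observed biotransfer factors using RMSE of
# log10 residuals; the model with the smallest value fits best.  Placeholder data.
observed = np.array([1e-4, 5e-3, 2e-2, 8e-2])        # observed biotransfer factors
models = {
    "model_A": np.array([2e-4, 3e-3, 3e-2, 5e-2]),   # hypothetical predictions
    "model_B": np.array([5e-4, 2e-2, 1e-1, 4e-1]),
}
for name, pred in models.items():
    rmse_log = np.sqrt(np.mean((np.log10(pred) - np.log10(observed)) ** 2))
    print(name, f"log10 RMSE = {rmse_log:.2f}")
```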
Abstract:
More than 70 years ago it was recognised that ionospheric F2-layer critical frequencies [foF2] had a strong relationship to sunspot number. Using historic datasets from the Slough and Washington ionosondes, we evaluate the best statistical fits of foF2 to sunspot numbers (at each Universal Time [UT] separately) in order to search for drifts and abrupt changes in the fit residuals over Solar Cycles 17-21. This test is carried out for the original composite of the Wolf/Zürich/International sunspot number [R], the new "backbone" group sunspot number [RBB], and the proposed "corrected sunspot number" [RC]. Polynomial fits are made both with and without allowance for the white-light facular area, which has been reported as being associated with cycle-to-cycle changes in the sunspot number-foF2 relationship. Over the interval studied here, R, RBB, and RC differ chiefly in their allowance for the "Waldmeier discontinuity" around 1945, the correction factor for which is zero for R, effectively over 20 % for RBB, and explicitly 11.6 % for RC. It is shown that for Solar Cycles 18-21 all three sunspot data sequences perform well, but the fit residuals are lowest and most uniform for RBB. We here use foF2 for those UTs at which R, RBB, and RC all give correlations exceeding 0.99 for intervals both before and after the Waldmeier discontinuity. The error introduced by the Waldmeier discontinuity causes R to underestimate the fitted values based on the foF2 data for 1932-1945, but RBB overestimates them by almost the same factor, implying that the correction for the Waldmeier discontinuity inherent in RBB is too large by a factor of two. Fit residuals are smallest and most uniform for RC, and the ionospheric data support the optimum discontinuity multiplicative correction factor derived from the independent Royal Greenwich Observatory (RGO) sunspot group data for the same interval.
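A sketch of the fitting step: regress foF2 at a fixed UT on sunspot number with a low-order polynomial, then compare fit residuals before and after the candidate 1945 discontinuity. The arrays below are simulated stand-ins for annual-mean ionosonde and sunspot data.

```python
import numpy as np

# Fit foF2 (one UT) to sunspot number and inspect residuals around 1945:
# a step in the mean residual flags a calibration discontinuity in the
# sunspot series.  Simulated placeholder data, not the Slough/Washington records.
rng = np.random.default_rng(5)
years = np.arange(1932, 1963)
R = np.abs(120 * np.sin(0.57 * (years - 1933)) + rng.normal(0, 5, years.size))
foF2 = 5.0 + 0.04 * R - 5e-5 * R**2 + rng.normal(0, 0.1, years.size)

coef = np.polyfit(R, foF2, deg=2)                 # low-order polynomial fit
resid = foF2 - np.polyval(coef, R)
before, after = resid[years <= 1945], resid[years > 1945]
print(before.mean(), after.mean())                # compare mean residuals across 1945
```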
Abstract:
Background: The method of porosity analysis by water absorption has been carried out by storing specimens in pure water, but this does not exclude the potential plasticising effect of the water, which generates unrealistic porosity values. Objective: The present study evaluated the reliability of this method of porosity analysis in polymethylmethacrylate denture base resins by determining the most satisfactory storage solution (S), in which the plasticising effect is excluded. Materials and methods: Two specimen shapes (rectangular and maxillary denture base) and two denture base resins, water-bath-polymerised (Classico) and microwave-polymerised (Acron MC), were used. Saturated anhydrous calcium chloride solutions (25%, 50%, 75%) and distilled water were used for specimen storage. Sorption isotherms were used to determine S. Porosity factor (PF) and diffusion coefficient (D) were calculated within S and for the groups stored in distilled water. ANOVA and Tukey tests were performed to identify significant differences in the PF results, and the Kruskal-Wallis test and Dunn multiple comparison post hoc test were used for the D results (alpha = 0.05). Results: For the Acron MC denture base shape, PF results were 0.24% (S 50%) and 1.37% (distilled water); for the rectangular shape, PF was 0.35% (S 75%) and 0.19% (distilled water). For the Classico denture base shape, PF results were 0.54% (S 75%) and 1.21% (distilled water); for the rectangular shape, PF was 0.7% (S 50%) and 1.32% (distilled water). PF results were similar in S and distilled water only for the Acron MC rectangular shape (p > 0.05). D results in distilled water were statistically higher than in S for all groups. Conclusions: The results of the study suggest that an adequate storage solution, chosen to exclude the plasticising effect, must be used when measuring porosity by water absorption.
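A sketch of the statistical workflow described, under stated assumptions: one-way ANOVA with Tukey HSD for the porosity factor (PF) and Kruskal-Wallis for the diffusion coefficient (D). The measurements are simulated placeholders, not the study's data.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# One-way ANOVA with Tukey HSD on PF, and Kruskal-Wallis on D (alpha = 0.05).
rng = np.random.default_rng(6)
pf = {
    "S_50": rng.normal(0.24, 0.05, 10),              # placeholder PF measurements
    "S_75": rng.normal(0.35, 0.05, 10),
    "water": rng.normal(1.37, 0.10, 10),
}
print(stats.f_oneway(*pf.values()))                  # global ANOVA across groups

values = np.concatenate(list(pf.values()))
groups = np.repeat(list(pf.keys()), [len(v) for v in pf.values()])
print(pairwise_tukeyhsd(values, groups, alpha=0.05)) # pairwise group comparisons

d = [rng.normal(loc, 0.1, 10) for loc in (1.0, 1.1, 1.6)]
print(stats.kruskal(*d))                             # non-parametric analogue for D
```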
Abstract:
The objective of this work was to study the rheological and thermal properties of film-forming solutions (FFS) based on blends of gelatin and poly(vinyl alcohol) (PVA). The effect of the PVA concentration and plasticizer presence on the flow behavior, viscoelastic properties and thermal properties of the FFS was studied by steady-shear flow and oscillatory experiments, and also by microcalorimetry. The FFS presented Newtonian behavior at 30 °C, and the viscosity was affected neither by the PVA concentration nor by the plasticizer. All FFS presented a phase transition during temperature-scanning tests. It was verified that the PVA affected the viscoelastic properties of the FFS by dilution of the gelatin, a behavior confirmed by microcalorimetric analysis. The behaviors of the storage (G') and loss (G'') moduli as a function of frequency for FFS obtained at 5 °C were typical of physical gels, with G' higher than G''. The strength of the gels was affected by the PVA concentration.
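A sketch of the Newtonian check implied by the steady-shear experiments: for a Newtonian fluid, shear stress is linear in shear rate and the slope is the viscosity. The data points below are illustrative, not the study's measurements.

```python
import numpy as np

# Newtonian behavior check from steady-shear data: fit stress vs. shear rate
# with a straight line; the slope is the (constant) viscosity, and a high R^2
# with near-zero intercept supports Newtonian flow.  Illustrative data.
shear_rate = np.array([1.0, 5.0, 10.0, 50.0, 100.0])     # shear rate, 1/s
stress = np.array([0.012, 0.061, 0.118, 0.605, 1.190])   # shear stress, Pa

viscosity, intercept = np.polyfit(shear_rate, stress, deg=1)
r2 = np.corrcoef(shear_rate, stress)[0, 1] ** 2
print(f"viscosity ~ {viscosity * 1000:.1f} mPa.s, R^2 = {r2:.4f}")
```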