905 results for bootstrap percolation
Abstract:
Citation information: Armstrong RA, Davies LN, Dunne MCM & Gilmartin B. Statistical guidelines for clinical studies of human vision. Ophthalmic Physiol Opt 2011, 31, 123-136. doi: 10.1111/j.1475-1313.2010.00815.x ABSTRACT: Statistical analysis of data can be complex, and different statisticians may disagree as to the correct approach, leading to conflict between authors, editors, and reviewers. The objective of this article is to provide statistical advice for contributors to optometric and ophthalmic journals, to provide advice specifically relevant to clinical studies of human vision, and to recommend statistical analyses that could be used in a variety of circumstances. In submitting an article in which quantitative data are reported, authors should describe clearly the statistical procedures they have used and justify each stage of the analysis. This is especially important if more complex or 'non-standard' analyses have been carried out. The article begins with some general comments on data analysis concerning sample size and 'power', hypothesis testing, parametric and non-parametric variables, 'bootstrap methods', one- and two-tailed testing, and the Bonferroni correction. More specific advice is then given with reference to particular statistical procedures that can be used on a variety of types of data. Where relevant, examples of correct statistical practice are given with reference to recently published articles in the optometric and ophthalmic literature.
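The 'bootstrap methods' mentioned in this abstract can be sketched with a minimal percentile-bootstrap confidence interval for a mean; the sample data, default statistic, and 95% level below are illustrative assumptions, not taken from the article:

```python
import random

def bootstrap_ci(data, stat=lambda xs: sum(xs) / len(xs),
                 n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for a statistic."""
    rng = random.Random(seed)
    n = len(data)
    # Resample with replacement and recompute the statistic each time.
    boot = sorted(stat([rng.choice(data) for _ in range(n)])
                  for _ in range(n_boot))
    lo = boot[int((alpha / 2) * n_boot)]
    hi = boot[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical measurements (illustrative only, not from the article).
sample = [0.10, 0.30, 0.20, 0.40, 0.15, 0.25, 0.35, 0.05, 0.30, 0.20]
low, high = bootstrap_ci(sample)
```

The percentile interval is the simplest bootstrap variant; bias-corrected versions follow the same resampling loop with an adjusted choice of quantiles.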
Abstract:
If, as is widely believed, schizophrenia is characterized by abnormalities of brain functional connectivity, then it seems reasonable to expect that different subtypes of schizophrenia could be discriminated in the same way. However, evidence for differences in functional connectivity between the subtypes of schizophrenia is largely lacking and, where it exists, could be accounted for by clinical differences between the patients (e.g. medication) or by the limitations of the measures used. In this study, we measured EEG functional connectivity in unmedicated male patients diagnosed with either positive- or negative-syndrome schizophrenia and compared them with age- and sex-matched healthy controls. Using new methodology (Medkour et al., 2009) based on partial coherence, brain connectivity plots were constructed for positive- and negative-syndrome patients and controls. Reliable differences in the pattern of functional connectivity were found, with both syndromes showing not only an absence of some of the connections seen in controls but also the presence of connections that the controls did not show. Comparing connectivity graphs using the Hamming distance, the negative-syndrome patients were found to be more distant from the controls than were the positive-syndrome patients. Bootstrap distributions of these distances showed a significant difference in the mean distances, consistent with the observation that a negative-syndrome diagnosis is associated with a more severe form of schizophrenia. We conclude that schizophrenia is characterized by widespread changes in functional connectivity, with negative-syndrome patients showing a more extreme pattern of abnormality than positive-syndrome patients.
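The graph comparison described here, Hamming distance between connectivity graphs, can be sketched as below; the toy 4-node adjacency matrices are hypothetical, not the EEG-derived plots from the study:

```python
def hamming_distance(g1, g2):
    """Edges present in one binary adjacency matrix but not the other,
    counted over the upper triangle (undirected graphs)."""
    n = len(g1)
    return sum(g1[i][j] != g2[i][j]
               for i in range(n) for j in range(i + 1, n))

# Toy 4-node connectivity graphs (hypothetical, for illustration only).
controls = [[0, 1, 1, 0],
            [1, 0, 0, 1],
            [1, 0, 0, 0],
            [0, 1, 0, 0]]
patients = [[0, 1, 0, 0],
            [1, 0, 0, 1],
            [0, 0, 0, 1],
            [0, 1, 1, 0]]
d = hamming_distance(controls, patients)  # one edge absent, one extra
```

A larger distance means more edges differ, capturing both connections the patients lack and connections the controls do not show, exactly the two kinds of difference the abstract reports.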
Abstract:
Many papers claim that a Log Periodic Power Law (LPPL) model fitted to financial market bubbles that precede large market falls or 'crashes' contains parameters that are confined within certain ranges. Further, it is claimed that the underlying model is based on influence percolation and a martingale condition. This paper examines these claims and their validity for capturing large price falls in the Hang Seng stock market index over the period 1970 to 2008. The fitted LPPLs have parameter values within the ranges specified post hoc by Johansen and Sornette (2001) for only seven of the 11 crashes examined. Interestingly, the LPPL fit could have predicted the substantial fall in the Hang Seng index during the recent global downturn. Overall, the mechanism posited as underlying the LPPL model does not in fact underlie it, and the data used to support the fit of the LPPL model to bubbles do so only partially.
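One common parameterisation of the LPPL log-price can be evaluated as below; the parameter values are made up for illustration, chosen only to respect the usual constraints B < 0 and 0 < m < 1 (so the trend accelerates towards the critical time tc), not fitted to any Hang Seng data:

```python
import math

def lppl_log_price(t, tc, A, B, C, m, omega, phi):
    """One common LPPL parameterisation of log-price for t < tc:
    A + B*(tc - t)**m * (1 + C*cos(omega*ln(tc - t) + phi))."""
    dt = tc - t
    return A + B * dt ** m * (1 + C * math.cos(omega * math.log(dt) + phi))

# Illustrative parameters only (B < 0, 0 < m < 1); not estimated values.
params = dict(tc=100.0, A=7.0, B=-0.5, C=0.1, m=0.5, omega=8.0, phi=0.0)
trend = [lppl_log_price(t, **params) for t in range(0, 100, 10)]
```

The super-exponential rise with log-periodic oscillations is what the post-hoc parameter-range claims discussed in the paper are about.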
Abstract:
In efficiency studies using the stochastic frontier approach, the main focus is to explain inefficiency in terms of some exogenous variables and to compute the marginal effects of each of these determinants. Although inefficiency is estimated by its mean conditional on the composed error term (the Jondrow et al., 1982 estimator), the marginal effects are computed from the unconditional mean of inefficiency (Wang, 2002). In this paper we derive the marginal effects based on the Jondrow et al. estimator and use the bootstrap method to compute confidence intervals for the marginal effects.
Abstract:
This paper proposes a semiparametric smooth-coefficient (SPSC) stochastic production frontier model in which regression coefficients are unknown smooth functions of environmental factors (Z). Technical inefficiency is specified in the form of a parametric scaling function which also depends on the Z variables. Thus, in our SPSC model the Z variables affect productivity directly via the technology parameters as well as through inefficiency. A residual-based bootstrap test of the relevance of the environmental factors in the SPSC model is suggested. An empirical application is also used to illustrate the technique.
Abstract:
Zambia and many other countries in Sub-Saharan Africa face the key challenge of sustaining high levels of coverage of AIDS treatment amid prospects of dwindling global resources for HIV/AIDS treatment. Policy debate on HIV/AIDS is increasingly focused on efficiency in the use of available resources. In this chapter, we apply Data Envelopment Analysis (DEA) to estimate short-term technical efficiency of 34 HIV/AIDS treatment facilities in Zambia. The data consist of input variables such as human resources, medical equipment, building space, drugs, medical supplies, and other materials used in providing HIV/AIDS treatment. Two main outputs, namely numbers of ART-years (Anti-Retroviral Therapy-years) and pre-ART-years, are included in the model. Results show a mean technical efficiency score of 83%, with great variability in efficiency scores across the facilities. Scale inefficiency is also shown to be significant. About half of the facilities were on the efficiency frontier. We also construct bootstrap confidence intervals around the efficiency scores.
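In the single-input, single-output special case, constant-returns-to-scale (CCR-type) DEA efficiency reduces to normalising each unit's output/input ratio by the best ratio in the sample; the general multi-input case requires a linear program per unit. A minimal sketch of the special case, with hypothetical facility data rather than the Zambian dataset:

```python
def ccr_efficiency(inputs, outputs):
    """Constant-returns-to-scale DEA efficiency for the single-input,
    single-output case: each unit's output/input ratio divided by the
    best ratio in the sample; frontier units score exactly 1.0."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical facilities: staff-hours as the input, ART-years as the
# output (illustrative numbers, not the Zambian data from the chapter).
staff_hours = [100.0, 80.0, 120.0, 90.0]
art_years = [50.0, 48.0, 48.0, 54.0]
scores = ccr_efficiency(staff_hours, art_years)
```

Bootstrap confidence intervals of the kind the chapter constructs would resample the facilities and recompute these scores many times, following Simar and Wilson's smoothed-bootstrap approach.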
Abstract:
We study phenomenological scaling theories of polymer dynamics in random media, employing the existing scaling theories of polymer chains and percolation statistics. We investigate both the Rouse and the Zimm model for Brownian dynamics and estimate the diffusion constant of the center-of-mass of the chain in such disordered media. For the internal dynamics of the chain, we estimate the dynamic exponents. We propose a similar scaling theory for the reptation dynamics of the chain in the framework of Flory theory for the disordered medium. The modifications in the case of correlated disorder are also discussed.
Abstract:
The influence of IT investment on hospital efficiency and quality is of great interest to healthcare executives as well as insurers. Few studies have examined how IT investments influence both efficiency and quality, or whether there is an optimal IT investment level that influences both in the desired direction. Decision makers in healthcare wonder if there are tradeoffs between their pursuit of hospital operational efficiency and quality. Our study, involving a two-stage double bootstrap DEA analysis of 187 US hospitals over two years, found direct effects of IT investment upon service quality and a moderating effect of quality upon operational efficiency. Further, our findings indicate a U-shaped relationship between IT investments and operational efficiency, suggesting that IT investments have diminishing returns beyond a certain point.
Abstract:
We are the first to examine the market reaction to 13 announcement dates related to IFRS 9 for over 5400 European listed firms. We find an overall positive reaction to the introduction of IFRS 9. The regulation is particularly beneficial to shareholders of firms in countries with weaker rule of law and a smaller divergence between local GAAP and IAS 39. Bootstrap simulations rule out the possibility that sampling error or data mining is driving our findings. Our main findings are also robust to confounding events and to the extent of media coverage for each event. These results suggest that investors perceive the new regulation as shareholder-wealth enhancing and support the view that stronger comparability across accounting standards of European firms benefits international investors and outweighs the costs of poorer firm-specific information.
Abstract:
We report an empirical analysis of long-range dependence in the returns of eight stock market indices, using Rescaled Range Analysis (RRA) to estimate the Hurst exponent. Monte Carlo and bootstrap simulations are used to construct critical values for the null hypothesis of no long-range dependence. The issue of disentangling short-range and long-range dependence is examined. Pre-filtering by fitting a (short-range) autoregressive model eliminates part of the long-range dependence when the latter is present, while failure to pre-filter leaves open the possibility of conflating short-range and long-range dependence. There is strong evidence of long-range dependence for the small central European Czech stock market index PX-glob, and weaker evidence for two smaller western European stock market indices, MSE (Spain) and SWX (Switzerland). There is little or no evidence of long-range dependence for the other five indices, including those with the largest capitalizations among those considered, DJIA (US) and FTSE350 (UK). These results are generally consistent with prior expectations concerning the relative efficiency of the stock markets examined.
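Rescaled Range Analysis can be sketched as follows: compute the R/S statistic over windows of increasing size and estimate the Hurst exponent H as the log-log slope. The window sizes and the white-noise sanity check below are illustrative assumptions, not the paper's procedure in detail:

```python
import math
import random

def rescaled_range(window):
    """R/S statistic: range of the cumulative mean-adjusted series
    divided by its (population) standard deviation."""
    n = len(window)
    mean = sum(window) / n
    cum, cums, sq = 0.0, [], 0.0
    for x in window:
        cum += x - mean
        cums.append(cum)
        sq += (x - mean) ** 2
    return (max(cums) - min(cums)) / math.sqrt(sq / n)

def hurst_exponent(series, window_sizes):
    """Estimate H as the OLS slope of log(mean R/S) against log(n)."""
    xs, ys = [], []
    for n in window_sizes:
        rs = [rescaled_range(series[i:i + n])
              for i in range(0, len(series) - n + 1, n)]
        xs.append(math.log(n))
        ys.append(math.log(sum(rs) / len(rs)))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return num / sum((x - mx) ** 2 for x in xs)

# Sanity check on white noise, whose true Hurst exponent is 0.5
# (small-sample R/S estimates are known to be biased slightly upward).
rng = random.Random(42)
noise = [rng.gauss(0.0, 1.0) for _ in range(2048)]
H = hurst_exponent(noise, [16, 32, 64, 128, 256])
```

H above 0.5 on a return series suggests persistence (long-range dependence); the Monte Carlo and bootstrap critical values described in the abstract are needed because the raw estimator is biased in small samples.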
Abstract:
* This work was supported by the Russian Foundation for Basic Research (RFBR), grants 07-01-00331-a and 08-01-00944-a.
Abstract:
Since the development of large-scale power grid interconnections and power markets, research on available transfer capability (ATC) has attracted great attention. The challenge of accurately assessing ATC originates from the numerous uncertainties in the electricity generation, transmission, distribution and utilization sectors. Power system uncertainties can be described mainly as two types: randomness and fuzziness. However, the traditional transmission reliability margin (TRM) approach considers only randomness. Based on credibility theory, this paper first builds models of generators, transmission lines and loads according to their features of both randomness and fuzziness. A random fuzzy simulation is then applied, and a novel method is proposed for ATC assessment in which both randomness and fuzziness are considered. The bootstrap method and a multi-core parallel computing technique are introduced to enhance processing speed. Simulations on the IEEE 30-bus system and a real-life system located in Northwest China verify the viability of the models and the proposed method.
Abstract:
2000 Mathematics Subject Classification: 33C90, 62E99
Abstract:
2010 Mathematics Subject Classification: 62J99.
Abstract:
Performance analysis has become a vital part of management practice in the banking industry. There are numerous applications using DEA models to estimate efficiency in banking, and most of them assume that inputs and outputs are known with absolute precision. Here, we propose new Fuzzy-DEA α-level models to assess the underlying uncertainty. Further, bootstrap truncated regressions with fixed factors are used to measure the impact of each model on the efficiency scores and to identify the most relevant contextual variables for efficiency. The proposed models are demonstrated in an application to Mozambican banks. Findings reveal that fuzziness is predominant over randomness in interpreting the results. In addition, fuzziness can be used by decision-makers to identify missing variables and to help interpret the results. Price of labor, price of capital, and market share were found to be the significant factors in measuring bank efficiency. Managerial implications are addressed.