992 results for HA Statistics


Relevance:

70.00%

Publisher:

Abstract:

Undoubtedly, statistics has become one of the most important subjects in the modern world, where its applications are ubiquitous. The importance of statistics is not limited to statisticians, but extends to non-statisticians who have to use statistics within their own disciplines. Several studies have indicated that most academic departments around the world have realized the importance of statistics to non-specialist students. The number of students enrolled in statistics courses has therefore increased greatly, drawn from a wide variety of disciplines, and research in statistics education has developed considerably in recent years. One important issue is how statistics is best taught to, and learned by, non-specialist students. This issue is shaped by several factors, such as the use of technology, the role of the English language (especially for those whose first language is not English), the effectiveness of statistics teachers and their approach to teaching statistics courses, students' motivation to learn statistics, and the relevance of statistics courses to the main subjects of non-specialist students. Several studies focused on aspects of learning and teaching statistics have been conducted in different countries, particularly Western ones. The situation in Arab countries, especially Saudi Arabia, is different: there is very little research in this area, and what exists does not meet those countries' needs for developing the learning and teaching of statistics to non-specialist students. This research was instituted in order to develop the field of statistics education. The purpose of this mixed-methods study was to generate new insights by investigating how statistics courses are currently taught to non-specialist students in Saudi universities, thereby contributing towards filling the knowledge gap that exists in Saudi Arabia. The study used multiple data collection approaches, including questionnaire surveys of 1053 non-specialist students who had completed at least one statistics course in different colleges of universities in Saudi Arabia. These surveys were followed up with qualitative data collected via semi-structured interviews with 16 teachers of statistics from colleges within all six universities where statistics is taught to non-specialist students in Saudi Arabia's Eastern Region. Because the questionnaire data included several types, different analysis techniques were used. Descriptive statistics were used to identify the demographic characteristics of the participants, and the chi-square test was used to determine associations between variables. Based on the main issues raised in the literature review, the questionnaire items were grouped into five key groups: 1) Effectiveness of Teachers; 2) English Language; 3) Relevance of Course; 4) Student Engagement; 5) Using Technology. Exploratory data analysis was used to explore these issues in more detail. Furthermore, given the clustering in the data (students within departments, within colleges, within universities), multilevel generalized linear models for dichotomous responses were used to clarify the effects of clustering at those levels.
Factor analysis was conducted to confirm the dimension reduction of the item scales. The data from teachers' interviews were analysed on an individual basis, with responses assigned to one of eight themes that emerged from the data: 1) the lack of students' motivation to learn statistics; 2) students' participation; 3) students' assessment; 4) the effective use of technology; 5) the level of previous mathematical and statistical skills of non-specialist students; 6) the English language ability of non-specialist students; 7) the need for extra time for teaching and learning statistics; and 8) the role of administrators. All the data from students and teachers indicated that the learning and teaching of statistics to non-specialist students in Saudi universities needs to be improved in order to meet those students' needs. The findings suggested a weakness in the use of statistical software in these courses: greater use of technology such as statistical software would allow non-specialist students to consolidate their knowledge. The results also indicated that the English language is considered one of the main challenges in learning and teaching statistics, particularly in institutions where English is not the main language, and that students' weak mathematical skills are another major challenge. Additionally, the results indicated a need to tailor statistics courses to the needs of non-specialist students based on their main subjects, and that statistics teachers need to choose appropriate methods when teaching statistics courses.
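As a minimal illustration of the association tests described above, the sketch below runs a chi-square test of independence in Python with scipy; the contingency table, its labels, and all counts are hypothetical, not data from the study.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows are colleges, columns are
# agreement levels on an attitude item. Counts are illustrative only.
observed = np.array([
    [30, 45, 25],   # College of Science
    [50, 40, 10],   # College of Business
    [20, 35, 45],   # College of Health
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
# A small p-value suggests an association between college and response.
```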

Relevance:

60.00%

Publisher:

Abstract:

A birth-death process is subject to mass annihilation at rate β, with subsequent mass immigration occurring into state j at rate α_j. This structure enables the process to jump from one sector of state space to another (via state 0) with a transition rate independent of population size. First, we highlight the difficulties encountered when using standard techniques to construct both time-dependent and equilibrium probabilities. Then we show how to overcome these analytic difficulties by means of a tool developed in Chen and Renshaw (1990, 1993b); this approach is applicable to many processes whose underlying generator on E\{0} has known probability structure. Here we demonstrate the technique through application to the linear birth-death generator on which an annihilation/immigration process is superimposed.
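A minimal Gillespie-style simulation sketch of such a process (illustrative only: linear birth rate λn and death rate μn, annihilation to state 0 at rate β, immigration out of 0 into state j at rate α_j; all numerical parameter values are assumptions, not from the paper):

```python
import random

def simulate(t_max, lam=1.0, mu=1.2, beta=0.1, alpha={1: 0.5, 5: 0.5}):
    """Simulate a linear birth-death process subject to mass annihilation
    (jump to 0 at rate beta, independent of population size) with mass
    immigration out of state 0 into state j at rate alpha[j]."""
    t, n, path = 0.0, 1, [(0.0, 1)]
    while t < t_max:
        if n == 0:
            rates = {("imm", j): a for j, a in alpha.items()}
        else:
            rates = {("birth", None): lam * n,
                     ("death", None): mu * n,
                     ("annih", None): beta}
        total = sum(rates.values())
        t += random.expovariate(total)          # time to next event
        r, acc = random.uniform(0, total), 0.0
        for (event, j), rate in rates.items():  # pick event by its rate
            acc += rate
            if r <= acc:
                break
        if event == "birth":
            n += 1
        elif event == "death":
            n -= 1
        elif event == "annih":
            n = 0                               # mass annihilation
        else:
            n = j                               # immigration from state 0
        path.append((t, n))
    return path

print(simulate(10.0)[-5:])
```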

Relevance:

60.00%

Publisher:

Abstract:

Attention has recently focussed on stochastic population processes that can undergo total annihilation followed by immigration into state j at rate α_j. The investigation of such models, called Markov branching processes with instantaneous immigration (MBPII), involves the study of existence and recurrence properties. However, results developed to date are generally opaque, and so the primary motivation of this paper is to construct conditions that are far easier to apply in practice. These turn out to be identical to the conditions for positive recurrence, which are very easy to check. We obtain, as a consequence, the surprising result that any MBPII that exists is ergodic, and so must possess an equilibrium distribution. These results are then extended to more general MBPII, and we show how to construct the associated equilibrium distributions.

Relevance:

60.00%

Publisher:

Abstract:

The objective of this paper is to investigate the p-th moment asymptotic stability decay rates for certain finite-dimensional Itô stochastic differential equations. Motivated by some practical examples, the focus of our analysis is a general treatment of decay speeds, which contain the usual exponential and polynomial types as special cases, so as to cover various situations. Sufficient conditions for stochastic differential equations (with or without variable delays) are obtained to ensure their asymptotic properties. Several examples are studied to illustrate our theory.
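For orientation, one standard way to formalise such general decay in this literature (a sketch of the usual formulation, not necessarily the paper's exact definition) is: given a continuous increasing function λ(t) ↑ ∞, the trivial solution is p-th moment asymptotically stable with decay λ(t) of order γ > 0 if

```latex
\limsup_{t \to \infty} \frac{\log \mathbb{E}\,|x(t)|^{p}}{\log \lambda(t)} \;\le\; -\gamma .
```

Taking λ(t) = e^t recovers the usual exponential stability, and λ(t) = 1 + t the polynomial type.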

Relevance:

60.00%

Publisher:

Abstract:

A class of generalized Lévy Laplacians, which contain the ordinary Lévy Laplacian as a special case, is considered. Topics studied include the limit average of the second-order functional derivative with respect to a certain equally dense (uniformly bounded) orthonormal base, and the relations with Kuo's Fourier transform and other infinite-dimensional Laplacians.
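For reference, the ordinary Lévy Laplacian that these operators generalize is usually defined as a Cesàro limit average of second-order derivatives along an orthonormal base {e_n} (standard formulation, assumed here rather than quoted from the paper):

```latex
\Delta_L F(x) \;=\; \lim_{N \to \infty} \frac{1}{N} \sum_{n=1}^{N} \big\langle F''(x)\, e_n,\, e_n \big\rangle .
```

The generalized versions modify this averaging, tying it to an equally dense (uniformly bounded) base as described above.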

Relevance:

60.00%

Publisher:

Abstract:

A new structure is considered, with the special property that an instantaneous reflection barrier is imposed on an ordinary birth-death process. An easily checked criterion for the existence of such Markov processes is first obtained, and the uniqueness criterion is then established. In the non-unique case, all the honest processes are explicitly constructed. Ergodicity properties for these processes are investigated, and it is proved that honest processes are always ergodic, without the need to impose any extra conditions. Equilibrium distributions for all these ergodic processes are established. Several examples are provided to illustrate our results.

Relevance:

60.00%

Publisher:

Abstract:

Review of 'Super Crunchers: How Anything Can Be Predicted' by Ian Ayres, published by John Murray, 2007 (ISBN 0-7195-6463-8).

Relevance:

60.00%

Publisher:

Abstract:

This paper uses a case study approach to consider the effectiveness of the electronic survey as a research tool for measuring the learner voice about experiences of e-learning in a particular institutional case. Two large-scale electronic surveys were carried out for the Student Experience of e-Learning (SEEL) project at the University of Greenwich in 2007 and 2008, funded by the UK Higher Education Academy (HEA). The paper uses this case to argue that, although the electronic web-based survey is a convenient method of quantitative and qualitative data collection that enables higher education institutions swiftly to capture multiple views from large numbers of students regarding experiences of e-learning, it is best combined with other methods of in-depth qualitative data collection for more robust analysis. The advantages and disadvantages of the electronic survey as a research method for capturing student experiences of e-learning are the focus of analysis in this short paper, which reports an overview of large-scale data collection (1,000+ responses) from two electronic surveys administered to students using surveymonkey as a web-based survey tool as part of the SEEL research project. Advantages of web-based electronic survey design include flexibility, ease of design, a high degree of designer control, convenience, low costs, data security, ease of access, and a guarantee of confidentiality combined with the researcher's ability to identify users through email addresses. Disadvantages include the self-selecting nature of web-enabled respondent participation, which tends to skew data collection towards students who respond effectively to email invitations. The relative inadequacy of electronic surveys for capturing in-depth qualitative views of students is discussed with regard to prior recommendations from the JISC-funded Learners' Experiences of e-Learning (LEX) project, in the light of the results from the SEEL in-depth interviews with students. The paper reviews the literature on web-based and email survey design, summing up the relative advantages and disadvantages of electronic surveys as a tool for researching the student experience of e-learning, and concludes with a range of recommendations for designing future electronic surveys to capture the learner voice on e-learning, contributing to evidence-based learning technology research in higher education.

Relevance:

60.00%

Publisher:

Abstract:

Background: The study of myofiber reorganization in the remote zone after myocardial infarction has been performed in 2D. Microstructural reorganization in remodeled hearts, however, can only be fully appreciated by considering myofibers as continuous 3D entities. The aim of this study was therefore to develop a technique for quantitative 3D diffusion CMR tractography of the heart, and to apply this method to quantify fiber architecture in the remote zone of remodeled hearts. Methods: Diffusion tensor CMR of normal human, sheep, and rat hearts, as well as infarcted sheep hearts, was performed ex vivo. Fiber tracts were generated with a fourth-order Runge-Kutta integration technique and classified statistically by the median, mean, maximum, or minimum helix angle (HA) along the tract. An index of tract coherence was derived from the relationship between these HA statistics. Histological validation was performed using phase-contrast microscopy. Results: In normal hearts, the subendocardial and subepicardial myofibers had a positive and negative HA, respectively, forming a symmetric distribution around the midmyocardium. However, in the remote zone of the infarcted hearts, a significant positive shift in HA was observed. The ratio between negative and positive HA variance was reduced from 0.96 ± 0.16 in normal hearts to 0.22 ± 0.08 in the remote zone of the remodeled hearts (p < 0.05). This was confirmed histologically by the reduction of HA in the subepicardium from −52.03° ± 2.94° in normal hearts to −37.48° ± 4.05° in the remote zone of the remodeled hearts (p < 0.05). Conclusions: A significant reorganization of the 3D fiber continuum is observed in the remote zone of remodeled hearts. The positive (rightward) shift in HA in the remote zone is greatest in the subepicardium, but involves all layers of the myocardium. Tractography-based quantification, performed here for the first time in remodeled hearts, may provide a framework for assessing regional changes in the left ventricle following infarction.
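A minimal sketch of the two computational steps named above, fourth-order Runge-Kutta streamline integration and per-tract HA statistics (illustrative Python, not the authors' implementation; the array layout, nearest-neighbour interpolation, and helper names are assumptions):

```python
import numpy as np

def rk4_streamline(field, seed, step=0.5, n_steps=200):
    """Trace a fiber tract through a discretized primary-eigenvector
    field with fourth-order Runge-Kutta integration. `field` is an
    (X, Y, Z, 3) array of unit vectors in voxel coordinates."""
    def v(p):  # nearest-neighbour lookup of the vector at point p
        i, j, k = np.clip(np.round(p).astype(int), 0,
                          np.array(field.shape[:3]) - 1)
        return field[i, j, k]

    pts = [np.asarray(seed, float)]
    for _ in range(n_steps):
        p = pts[-1]
        k1 = v(p)
        k2 = v(p + 0.5 * step * k1)
        k3 = v(p + 0.5 * step * k2)
        k4 = v(p + step * k3)
        pts.append(p + (step / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4))
    return np.array(pts)

def tract_ha_stats(tract, ha):
    """Classify a tract by helix-angle statistics along its points,
    given a helix-angle volume `ha` (degrees) on the same grid."""
    idx = np.clip(np.round(tract).astype(int), 0, np.array(ha.shape) - 1)
    vals = ha[idx[:, 0], idx[:, 1], idx[:, 2]]
    return {"median": np.median(vals), "mean": vals.mean(),
            "max": vals.max(), "min": vals.min()}
```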

Relevance:

60.00%

Publisher:

Abstract:

This PhD thesis contains three main chapters on macro finance, with a focus on the term structure of interest rates and the applications of state-of-the-art Bayesian econometrics. Except for Chapter 1 and Chapter 5, which set out the general introduction and conclusion, each of the chapters can be considered as a standalone piece of work. In Chapter 2, we model and predict the term structure of US interest rates in a data rich environment. We allow the model dimension and parameters to change over time, accounting for model uncertainty and sudden structural changes. The proposed time-varying parameter Nelson-Siegel Dynamic Model Averaging (DMA) predicts yields better than standard benchmarks. DMA performs better since it incorporates more macro-finance information during recessions. The proposed method allows us to estimate plausible real-time term premia, whose countercyclicality weakened during the financial crisis. Chapter 3 investigates global term structure dynamics using a Bayesian hierarchical factor model augmented with macroeconomic fundamentals. More than half of the variation in the bond yields of seven advanced economies is due to global co-movement. Our results suggest that global inflation is the most important factor among global macro fundamentals. Non-fundamental factors are essential in driving global co-movements, and are closely related to sentiment and economic uncertainty. Lastly, we analyze asymmetric spillovers in global bond markets connected to diverging monetary policies. Chapter 4 proposes a no-arbitrage framework of term structure modeling with learning and model uncertainty. The representative agent considers parameter instability, as well as the uncertainty in learning speed and model restrictions. The empirical evidence shows that apart from observational variance, parameter instability is the dominant source of predictive variance when compared with uncertainty in learning speed or model restrictions. When accounting for ambiguity aversion, the out-of-sample predictability of excess returns implied by the learning model can be translated into significant and consistent economic gains over the Expectations Hypothesis benchmark.
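For reference, the (dynamic) Nelson-Siegel model underlying the DMA approach represents the yield at maturity τ through time-varying level, slope, and curvature factors (standard form, not quoted from the thesis):

```latex
y_t(\tau) \;=\; \beta_{1t} \;+\; \beta_{2t}\,\frac{1 - e^{-\lambda_t \tau}}{\lambda_t \tau}
\;+\; \beta_{3t}\!\left(\frac{1 - e^{-\lambda_t \tau}}{\lambda_t \tau} - e^{-\lambda_t \tau}\right).
```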


Relevance:

60.00%

Publisher:

Abstract:

Many exchange rate papers articulate the view that instabilities constitute a major impediment to exchange rate predictability. In this thesis we implement Bayesian and other techniques to account for such instabilities, and examine some of the main obstacles to exchange rate models' predictive ability. We first consider in Chapter 2 a time-varying parameter model in which fluctuations in exchange rates are related to short-term nominal interest rates ensuing from monetary policy rules, such as Taylor rules. Unlike existing exchange rate studies, the parameters of our Taylor rules are allowed to change over time, in light of the widespread evidence of shifts in fundamentals, for example in the aftermath of the Global Financial Crisis. Focusing on quarterly data from the crisis onwards, we detect forecast improvements upon a random walk (RW) benchmark for at least half, and for as many as seven out of 10, of the currencies considered. Results are stronger when we allow the time-varying parameters of the Taylor rules to differ between countries. In Chapter 3 we look closely at the role of time variation in parameters, and of other sources of uncertainty, in hindering exchange rate models' predictive power. We apply a Bayesian setup that incorporates the notion that the relevant set of exchange rate determinants, and their corresponding coefficients, change over time. Using statistical and economic measures of performance, we first find that predictive models which allow for sudden, rather than smooth, changes in the coefficients yield significant forecast improvements and economic gains at horizons beyond 1 month. At shorter horizons, however, our methods fail to forecast better than the RW, and we identify uncertainty in the estimation of the coefficients, and uncertainty about the precise degree of coefficient variability to incorporate in the models, as the main factors obstructing predictive ability. Chapter 4 focuses on the problem of the time-varying predictive ability of economic fundamentals for exchange rates. It uses bootstrap-based methods to uncover the time-specific conditioning information for predicting fluctuations in exchange rates. Employing several metrics for the statistical and economic evaluation of forecasting performance, we find that our approach of pre-selecting and validating fundamentals across bootstrap replications generates more accurate forecasts than the RW. The approach, known as bumping, robustly reveals parsimonious models with out-of-sample predictive power at the 1-month horizon, and outperforms alternative methods, including Bayesian, bagging, and standard forecast combinations. Chapter 5 exploits the predictive content of daily commodity prices for monthly commodity-currency exchange rates. It builds on the idea that the effect of daily commodity price fluctuations on commodity currencies is short-lived, and therefore harder to pin down at low frequencies. Using MIxed DAta Sampling (MIDAS) models, and Bayesian estimation methods to account for time variation in predictive ability, the chapter demonstrates the usefulness of suitably exploiting such short-lived effects to improve exchange rate forecasts. It further shows that the usual low-frequency predictors, such as money supplies and interest rate differentials, typically receive little support from the data at the monthly frequency, whereas MIDAS models featuring daily commodity prices are strongly favoured. The chapter also introduces the random walk Metropolis-Hastings technique as a new tool to estimate MIDAS regressions.
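As background for the Chapter 2 fundamentals, a generic Taylor rule sets the short-term nominal interest rate as a function of inflation and the output gap (textbook form; the thesis lets such coefficients vary over time and across countries):

```latex
i_t \;=\; r^{*} \;+\; \pi_t \;+\; \phi_{\pi}\,(\pi_t - \pi^{*}) \;+\; \phi_{y}\, y_t ,
```

where π_t is inflation, π* the inflation target, y_t the output gap, and r* the equilibrium real rate.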

Relevance:

60.00%

Publisher:

Abstract:

Critical infrastructures are based on complex systems that provide vital services to the nation. The complexity of the interconnected networks, each managed by an individual organisation, could, if not properly secured, offer vulnerabilities that threaten other organisations' systems that depend on their services. This thesis argues that awareness of interdependencies among critical sectors needs to be increased. Managing and securing critical infrastructure is not the isolated responsibility of a government or an individual organisation; there is a need for strong collaboration among critical service providers of public and private organisations in protecting critical information infrastructure. Cyber exercises have been incorporated in national cyber security strategies as part of critical information infrastructure protection. However, organising a cyber exercise involving multiple sectors is challenging due to the diversity of participants' backgrounds, working environments and incident response policies. How well the lessons learned from a cyber exercise can be transferred to the participating organisations remains a looming question. In order to understand the implications of cyber exercises for what participants have learnt, and how this benefits participants' organisations, a Cyber Exercise Post Assessment (CEPA) framework was proposed in this research. The CEPA framework consists of two parts. The first part aims to investigate the lessons learnt by participants from a cyber exercise, using the four levels of the Kirkpatrick Training Model to identify their perceptions of the reaction, learning, behaviour and results of the exercise. The second part investigates the Organisation Cyber Resilience (OCR) of participating sectors. The framework was used to study the impact of the cyber exercise called X Maya in Malaysia. Data collected through interviews with X Maya 5 participants were coded and categorised into four levels according to the Kirkpatrick Training Model, while online surveys were distributed to the ten Critical National Information Infrastructure (CNII) sectors that participated in the exercise. The survey used the C-Suite Executive Checklist developed by the World Economic Forum in 2012. To ensure the suitability of the tool used to investigate OCR, a reliability test conducted on the survey items showed high internal consistency. Finally, individual OCR scores were used to develop the OCR Maturity Model, providing an organisation cyber resilience perspective on the ten CNII sectors.

Relevance:

60.00%

Publisher:

Abstract:

The aim of this thesis is to review and augment the theory and methods of optimal experimental design. In Chapter 1 the scene is set by considering the possible aims of an experimenter prior to an experiment, the statistical methods one might use to achieve those aims, and how experimental design might aid this procedure. It is indicated that, given a criterion for design, a priori optimal design will only be possible in certain instances; otherwise, some form of sequential procedure would seem to be indicated. In Chapter 2 an exact experimental design problem is formulated mathematically and compared with its continuous analogue. Motivation is provided for the solution of this continuous problem, and the remainder of the chapter concerns this problem. A necessary and sufficient condition for the optimality of a design measure is given. Problems which might arise in testing this condition are discussed, in particular with respect to the possible non-differentiability of the criterion function at the design being tested. Several examples are given of optimal designs which may be found analytically and which illustrate the points discussed earlier in the chapter. In Chapter 3 numerical methods for solving the continuous optimal design problem are reviewed. A new algorithm is presented, with illustrations of how it should be used in practice. It is shown that, for reasonably large sample sizes, continuously optimal designs may be approximated well by an exact design. Algorithms for improving the design in situations where this approximation is not satisfactory are reviewed. Chapter 4 discusses sequentially designed experiments, with regard both to the philosophies underlying statistical inference and to the application of its methods. In Chapter 5 we constructively criticise previous suggestions for fully sequential design procedures, and make alternative suggestions along with conjectures as to how these might improve performance. Chapter 6 presents a simulation study whose aim is to investigate the conjectures of Chapter 5; the results provide empirical support for them. In Chapter 7 examples are analysed. These suggest aids to sequential experimentation, by means of a reduction of the dimension of the design space and the possibility of experimenting semi-sequentially. Further examples are considered which stress the importance of the use of prior information in situations of this type. Finally, we consider the design of experiments when semi-sequential experimentation is mandatory because batches of observations must be taken at the same time. In Chapter 8 we examine some of the assumptions which have been made and indicate what may go wrong when these assumptions no longer hold.
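The classical necessary and sufficient condition of the kind referred to in Chapter 2 is the Kiefer-Wolfowitz equivalence theorem, quoted here in its standard D-optimality form as an illustration (the thesis treats design criteria more generally): a design measure ξ* is D-optimal if and only if

```latex
\max_{x \in \mathcal{X}} \; f(x)^{\top} M(\xi^{*})^{-1} f(x) \;=\; p ,
\qquad \text{where } M(\xi) = \int_{\mathcal{X}} f(x)\, f(x)^{\top}\, \xi(\mathrm{d}x)
```

is the information matrix and p is the number of parameters in the linear model E[y] = f(x)ᵀθ.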