Abstract:
Professional coaching is a rapidly expanding field with interdisciplinary roots and broad application. However, despite abundant prescriptive literature, research into the process of coaching, and especially life coaching, is minimal. Similarly, although learning is inherent to the process of coaching, and coaching is increasingly recognised as a means of enhancing teaching and learning, the process of learning in coaching is little understood, and learning theory makes up only a small part of the evidence-based coaching literature. In this grounded theory study of life coaches and their clients, the process of learning in life coaching across a range of coaching models is examined and explained. The findings demonstrate how learning in life coaching emerged as a process of discovering, applying and integrating self-knowledge, which culminated in the development of self. This process occurred through eight key coaching processes shared between coaches and clients and drew on a multitude of learning theories.
Abstract:
This paper presents a stability analysis, based on bifurcation theory, for a distribution static compensator (DSTATCOM) operating in current control mode. Bifurcations delimit the operating zones of nonlinear circuits, so the ability to compute them is of considerable interest for practical design. A control design for the DSTATCOM is proposed, along with a mathematical representation of the DSTATCOM suited to carrying out the bifurcation analysis efficiently. The stability regions in the Thevenin equivalent plane are computed for different power factors at the point of common coupling. In addition, the stability regions in the control gain space, as well as the contour lines for different Floquet multipliers, are computed. Bifurcation analysis demonstrates that the loss of stability in the DSTATCOM is due to the emergence of a Neimark bifurcation. The observations are verified through simulation studies.
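For readers unfamiliar with the method, the sketch below (not the authors' model; the two-state dynamics and parameter values are hypothetical placeholders) shows how Floquet multipliers of a periodically forced circuit can be computed by integrating the variational equations over one forcing period. A complex-conjugate pair of multipliers crossing the unit circle is the signature of a Neimark (Neimark-Sacker) bifurcation of the kind the paper identifies.

```python
# Minimal Floquet-multiplier sketch for a periodically forced system.
# The dynamics f(t, x) is a hypothetical placeholder, NOT the DSTATCOM model.
import numpy as np
from scipy.integrate import solve_ivp

T = 1.0 / 50.0  # forcing period (assumed 50 Hz grid)

def f(t, x):
    # Placeholder nonlinear dynamics; replace with the converter state equations.
    x1, x2 = x
    return [x2, -0.5 * x2 - x1 - x1**3 + 0.3 * np.cos(2 * np.pi * t / T)]

def jac(t, x):
    # Jacobian of f with respect to x, used in the variational equations.
    x1, x2 = x
    return np.array([[0.0, 1.0], [-1.0 - 3.0 * x1**2, -0.5]])

def monodromy(x0, n=2):
    """Integrate state + variational equations over one period T."""
    def rhs(t, y):
        x, Phi = y[:n], y[n:].reshape(n, n)
        return np.concatenate([f(t, x), (jac(t, x) @ Phi).ravel()])
    y0 = np.concatenate([x0, np.eye(n).ravel()])
    sol = solve_ivp(rhs, (0.0, T), y0, rtol=1e-9, atol=1e-9)
    return sol.y[n:, -1].reshape(n, n)

# Floquet multipliers of the trajectory through x0 (for a true analysis, x0
# should lie on the periodic orbit); |multiplier| > 1 signals instability.
multipliers = np.linalg.eigvals(monodromy(np.array([0.1, 0.0])))
print(multipliers, np.abs(multipliers))
```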
Abstract:
Internationally, the collection of reliable data on new and evolving health-care roles is crucial. We describe a protocol for the design and administration of a national census of an emergent health-care role, namely nurse practitioners in Australia, using databases held by regulatory authorities. A questionnaire was developed to obtain data on the role and scope of practice of Australian nurse practitioners. Our tool comprised five sections and a total of 56 questions, using 28 existing items from the National Nursing and Midwifery Labour Force Census and nine items recommended in the Nurse Practitioner Workforce Planning Minimum Data Set. Australian Nurse Registering Authorities (n = 6) distributed the survey on our behalf. This paper outlines our instrument and methods. The survey was administered to 238 authorized Australian nurse practitioners (85% response rate). Rigorous collection of standardized items will ensure health policy is informed by reliable and valid data. We will re-administer the survey 2 years after the first survey to measure change over time.
Abstract:
Focuses on a study that introduced an iterative modeling method combining properties of ordinary least squares (OLS) with hierarchical tree-based regression (HTBR) in transportation engineering. Information on OLS and HTBR; comparisons and contrasts of OLS and HTBR; conclusions.
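One plausible reading of such a hybrid (a sketch under assumptions, not the study's exact algorithm) is to let a regression tree partition the data into homogeneous groups and then fit an OLS model within each leaf:

```python
# Hedged sketch: tree-based partitioning followed by per-leaf OLS fits.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(500, 2))
# Simulated response with a regime change at X[:, 0] == 5.
y = np.where(X[:, 0] < 5, 2.0 * X[:, 1], -1.0 * X[:, 1] + 30) + rng.normal(0, 1, 500)

# Step 1: a shallow regression tree partitions the predictor space.
tree = DecisionTreeRegressor(max_leaf_nodes=4, min_samples_leaf=30).fit(X, y)
leaves = tree.apply(X)  # leaf index for every observation

# Step 2: an OLS model is estimated separately within each leaf.
leaf_models = {leaf: LinearRegression().fit(X[leaves == leaf], y[leaves == leaf])
               for leaf in np.unique(leaves)}

def predict(X_new):
    """Route each observation to its leaf, then apply that leaf's OLS fit."""
    idx = tree.apply(X_new)
    return np.array([leaf_models[i].predict(x.reshape(1, -1))[0]
                     for i, x in zip(idx, X_new)])

print(predict(X[:5]), y[:5])
```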
Abstract:
Measures and theories of information abound, but there are few formalised methods for treating the contextuality that can manifest in different information systems. Quantum theory provides one possible formalism for treating information in context. This paper introduces a quantum-like model of the human mental lexicon and presents a set of recent experimental data suggesting that concept combinations can indeed behave non-separably. There is some reason to believe that the human mental lexicon displays entanglement.
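As a rough illustration of what non-separability means here (the joint probabilities below are invented for illustration, not the paper's data), one can check whether a joint distribution over two words' senses factorises into the product of its marginals:

```python
# Illustrative separability check: if the joint distribution equals the outer
# product of its marginals, the concept combination is separable; a large
# deviation is the kind of non-separability the paper reports.
import numpy as np

# Hypothetical joint distribution over (sense of word A) x (sense of word B).
p_joint = np.array([[0.45, 0.05],
                    [0.05, 0.45]])

p_a = p_joint.sum(axis=1)        # marginal over word A's senses
p_b = p_joint.sum(axis=0)        # marginal over word B's senses
p_product = np.outer(p_a, p_b)   # prediction under separability

deviation = np.abs(p_joint - p_product).max()
print(p_product, deviation)      # deviation ~0.2 >> 0: non-separable
```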
Abstract:
"Qld Business Leaders Hall of Fame" is a research project that includes interviews with eminent Qlders that produced oral history interviews and digital stories about their life/company's achievements. This model was able to test and evaluate the use of oral history and digital storytelling for learning and community heritage purposes. Interviewees include; Sir John and Valmai Pidgeon, Joseph Saragossi, Robert Bryan, Clem Jones, Jim Kennedy, Sr Angela Mary, Castelmaine Perkins, Burns and Philp, Qantas, Don Argus & Steve Irwin.
Abstract:
The need for effective business curricula that meet the needs of the marketplace has created an increase in the adoption of core competency lists identifying appropriate graduate skills. Many organisations and tertiary institutions have individual graduate capability lists including skills deemed essential for success. Skills recognised as ‘critical thinking’ are popular inclusions on core competency and graduate capability lists. While there is literature outlining ‘critical thinking’ frameworks, methods of teaching it, and calls for its integration into business curricula, few studies actually identify quantifiable improvements achieved in this area. This project sought to address the development of ‘critical thinking’ skills in a management degree program by embedding a process for critical thinking within a theory unit undertaken by students early in the program. Focus groups and a student survey were used to identify issues of both content and implementation and to develop a student perspective on their needs in thinking critically. A process utilising a framework of critical thinking was integrated through a workbook of weekly case studies for group analysis, discussions and experiential exercises. The experience included formative and summative assessment. Initial results indicate a greater valuation by students of their experience in the organisation theory unit, better marks for mid-semester essay assignments, and higher evaluations on the university-administered survey of student satisfaction.
Abstract:
There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions attached to each, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings) exist in one of two states: perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows Bernoulli trials with unequal probabilities of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process and indicate how well they statistically approximate it. We also present the theory behind dual-state count models and note why they have become popular for modeling crash data. A simulation experiment is then conducted to demonstrate how crash data give rise to the “excess” zeros frequently observed. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed, and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales, not from an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
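A minimal version of such a simulation (assumptions and parameter values are mine, not the study's) generates counts as Poisson trials, i.e., Bernoulli trials with small, unequal probabilities, and compares the observed share of zeros against a Poisson fit with the same mean:

```python
# Hedged simulation sketch: low exposure plus heterogeneous crash risk yields
# more zeros than a same-mean Poisson predicts, with no dual-state process.
import numpy as np

rng = np.random.default_rng(42)
n_sites, n_trials = 1000, 200  # entities and exposure opportunities per entity
# Heterogeneous per-trial crash probabilities (small and unequal across sites).
p = rng.gamma(shape=0.5, scale=0.002, size=(n_sites, 1))
crashes = rng.binomial(1, np.minimum(p, 1.0), size=(n_sites, n_trials)).sum(axis=1)

lam = crashes.mean()
poisson_zero = np.exp(-lam)              # P(count = 0) under a matched-mean Poisson
observed_zero = (crashes == 0).mean()    # actual share of zero-crash sites
print(f"mean={lam:.2f}  Poisson P(0)={poisson_zero:.2f}  observed P(0)={observed_zero:.2f}")
```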
Abstract:
Statisticians, along with other scientists, have made significant computational advances that enable the estimation of statistical models formerly considered too complex to estimate. The Bayesian inference framework, combined with Markov chain Monte Carlo estimation methods such as the Gibbs sampler, enables the estimation of discrete choice models such as the multinomial logit (MNL) model. MNL models are frequently applied in transportation research to model choice outcomes such as mode, destination, or route choices, or to model categorical outcomes such as crash outcomes. Recent developments allow for the modification of the potentially limiting assumptions of MNL, such as the independence from irrelevant alternatives (IIA) property. However, relatively little transportation-related research has focused on Bayesian MNL models, the tractability of which is of great value to researchers and practitioners alike. This paper addresses MNL model specification issues in the Bayesian framework, such as the value of including prior information on parameters, allowing for nonlinear covariate effects, and extensions to random parameter models, thereby relaxing the usual limiting IIA assumption. This paper also provides an example that demonstrates, using route-choice data, the considerable potential of the Bayesian MNL approach for many transportation applications. The paper concludes with a discussion of the pros and cons of this Bayesian approach and identifies when its application is worthwhile.
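As a hedged sketch of the approach (not the paper's implementation; the data, prior, and proposal scale below are simulated and assumed), a random-walk Metropolis sampler for an MNL posterior can be written in a few lines; practical work would typically use a tuned sampler or an off-the-shelf library:

```python
# Hedged sketch: Bayesian multinomial logit via random-walk Metropolis,
# with a normal prior on the coefficients. Data are simulated.
import numpy as np

rng = np.random.default_rng(1)
N, J, K = 500, 3, 2                             # observations, alternatives, covariates
X = rng.normal(size=(N, J, K))                  # alternative-specific covariates
beta_true = np.array([1.0, -0.5])
util = X @ beta_true
prob = np.exp(util) / np.exp(util).sum(axis=1, keepdims=True)
y = np.array([rng.choice(J, p=p) for p in prob])  # observed choices

def log_post(beta):
    """Log posterior: MNL log-likelihood plus N(0, 10 I) log prior."""
    u = X @ beta
    loglik = (u[np.arange(N), y] - np.log(np.exp(u).sum(axis=1))).sum()
    return loglik - 0.5 * (beta @ beta) / 10.0

beta, lp = np.zeros(K), log_post(np.zeros(K))
draws = []
for it in range(5000):
    prop = beta + rng.normal(scale=0.1, size=K)   # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:      # Metropolis accept/reject
        beta, lp = prop, lp_prop
    draws.append(beta)

draws = np.array(draws[1000:])                    # drop burn-in
print("posterior mean:", draws.mean(axis=0))      # should be near beta_true
```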
Abstract:
This paper argues, somewhat along a Simmelian line, that political theory may produce practical and universal theories like those developed in theoretical physics. The aim is to show that the theory of ‘basic democracy’ may be true by comparing it to Einstein's Special Relativity, specifically on the parameters of symmetry, unification, simplicity, and utility. These parameters are what make a theory in physics: meeting them not only fits with current knowledge but also produces paths towards testing (application). As the theory of ‘basic democracy’ may meet these same parameters, it could settle the debate concerning the definition of democracy. This is argued firstly by discussing what the theory of ‘basic democracy’ is and why it differs from previous work; secondly by explaining the parameters chosen (why these and not others confirm or scuttle theories); and thirdly by comparing how Special Relativity and the theory of ‘basic democracy’ may match the parameters.