Abstract:
There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions that come with each, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states: perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows a Bernoulli trial with unequal probability of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process and indicate how well they statistically approximate it. We also present the theory behind dual-state count models and note why they have become popular for modeling crash data. A simulation experiment is then conducted to demonstrate how the “excess” zeros frequently observed in crash data arise. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed, and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales rather than from an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (for observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
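To make the simulation argument concrete, here is a minimal sketch (not the authors' code; the number of sites, the exposure distribution and the crash rate are hypothetical) of how a pure Poisson data-generating process with low, heterogeneous exposure yields far more zeros than a single-rate Poisson fit would predict, with no "perfectly safe" state anywhere in the model:

```python
# Minimal sketch: "excess" zeros from low exposure, not from a dual-state process.
import numpy as np

rng = np.random.default_rng(42)
n_sites = 5000

# Hypothetical exposure (e.g., traffic volume over a short observation window),
# heavily skewed toward low values.
exposure = rng.gamma(shape=0.2, scale=5.0, size=n_sites)
base_rate = 0.5  # hypothetical crashes per unit of exposure

# Every site is "unsafe" to some degree: counts are pure Poisson given exposure.
counts = rng.poisson(lam=base_rate * exposure)

observed_zero_share = np.mean(counts == 0)
# Zero share implied by a single Poisson with the same overall mean.
single_poisson_zero_share = np.exp(-counts.mean())

print(f"mean count:                 {counts.mean():.3f}")
print(f"observed share of zeros:    {observed_zero_share:.3f}")
print(f"single-Poisson prediction:  {single_poisson_zero_share:.3f}")
```

The gap between the observed and predicted zero shares is what an analyst might misread as evidence of a dual-state process; aggregating over longer time/space scales (raising the exposure) shrinks it.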
Abstract:
Statisticians, along with other scientists, have made significant computational advances that enable the estimation of formerly intractable statistical models. The Bayesian inference framework combined with Markov chain Monte Carlo estimation methods such as the Gibbs sampler enables the estimation of discrete choice models such as the multinomial logit (MNL) model. MNL models are frequently applied in transportation research to model choice outcomes such as mode, destination, or route choices, or to model categorical outcomes such as crash outcomes. Recent developments allow for the modification of the potentially limiting assumptions of MNL such as the independence from irrelevant alternatives (IIA) property. However, relatively little transportation-related research has focused on Bayesian MNL models, the tractability of which is of great value to researchers and practitioners alike. This paper addresses MNL model specification issues in the Bayesian framework, such as the value of including prior information on parameters, allowing for nonlinear covariate effects, and extensions to random parameter models, thus relaxing the usual limiting IIA assumption. This paper also provides an example that demonstrates, using route-choice data, the considerable potential of the Bayesian MNL approach for many transportation applications. The paper concludes with a discussion of the pros and cons of this Bayesian approach and identifies when its application is worthwhile.
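To illustrate how tractable the Bayesian treatment can be, the sketch below estimates MNL coefficients by MCMC on simulated route-choice data. It is an assumption-laden toy, not the paper's model: alternative-specific attributes only, a diffuse normal prior, and plain random-walk Metropolis rather than the Gibbs-style samplers the paper discusses.

```python
# Bayesian MNL sketch: simulated route choices, normal prior, random-walk Metropolis.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_alts, n_feat = 400, 3, 2

# Hypothetical alternative attributes (e.g., travel time and cost per route).
X = rng.normal(size=(n_obs, n_alts, n_feat))
beta_true = np.array([-1.0, -0.5])

# Simulate choices from the true MNL model (Gumbel errors).
utility = X @ beta_true + rng.gumbel(size=(n_obs, n_alts))
y = utility.argmax(axis=1)

def log_lik(beta):
    v = X @ beta                               # systematic utilities, shape (n_obs, n_alts)
    v = v - v.max(axis=1, keepdims=True)       # numerical stability
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return logp[np.arange(n_obs), y].sum()

def log_post(beta):
    log_prior = -0.5 * np.sum(beta**2) / 10.0  # N(0, 10) prior on each coefficient
    return log_prior + log_lik(beta)

beta, lp = np.zeros(n_feat), log_post(np.zeros(n_feat))
draws = []
for _ in range(20000):
    prop = beta + 0.1 * rng.normal(size=n_feat)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject step
        beta, lp = prop, lp_prop
    draws.append(beta.copy())

posterior = np.array(draws[5000:])             # discard burn-in
print("posterior mean:", posterior.mean(axis=0), "true:", beta_true)
```

Swapping the diffuse prior for an informative one, or letting the coefficients vary across decision makers, is the route to the prior-information and random-parameter extensions discussed in the abstract.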
Abstract:
Cultural theory breaks with Modern analysis by rejecting traditional notions of race, gender, class and sexuality. In doing so, alternative frameworks such as Post-Feminism emerge which are useful for thinking about culture, technology and what our interactions with it mean. From a Post-Feminist perspective it can be seen how, in our multi-cultural, post-industrial, digitized world, there is space to move beyond traditional ways of dividing up society such as ‘male’ and ‘female’. We are then free to re-construct our identity in light of a rich diversity of individually relevant experiences. Therefore, in order to get a better understanding of the highly nuanced cultural interactions that characterize our use of technology, this paper argues against using the inherently stereotyped lens of gender, so that a new set of user needs can emerge.
Abstract:
This paper argues, somewhat along a Simmelian line, that political theory may produce practical and universal theories like those developed in theoretical physics. The reasoning behind this paper is to show that the theory of ‘basic democracy’ may be true by comparing it to Einstein’s Special Relativity, specifically with respect to the parameters of symmetry, unification, simplicity, and utility. These parameters are what validate a theory in physics, as meeting them not only fits with current knowledge but also produces paths towards testing (application). As the theory of ‘basic democracy’ may meet these same parameters, it could settle the debate concerning the definition of democracy. This will be argued firstly by discussing what the theory of ‘basic democracy’ is and why it differs from previous work; secondly by explaining the parameters chosen (that is, why these and not others confirm or scuttle theories); and thirdly by comparing how Special Relativity and the theory of ‘basic democracy’ may match the parameters.
Abstract:
The dominant economic paradigm currently guiding industry policy making in Australia and much of the rest of the world is the neoclassical approach. Although neoclassical theories acknowledge that growth is driven by innovation, such innovation is exogenous to their standard models and hence often not explored. Instead the focus is on the allocation of scarce resources, where innovation is perceived as an external shock to the system. Indeed, analysis of innovation is largely undertaken by other disciplines, such as evolutionary economics and institutional economics. As more has become known about innovation processes, linear models, based on research and development or market demand, have been replaced by more complex interactive models which emphasise the existence of feedback loops between the actors and activities involved in the commercialisation of ideas (Manley 2003). Currently dominant among these approaches is the national or sectoral innovation system model (Breschi and Malerba 2000; Nelson 1993), which is based on the notion of increasingly open innovation systems (Chesbrough, Vanhaverbeke, and West 2008). This chapter reports on the ‘BRITE Survey’ funded by the Cooperative Research Centre for Construction Innovation, which investigated the open sectoral innovation system operating in the Australian construction industry. The BRITE Survey was undertaken in 2004 and is the largest construction innovation survey ever conducted in Australia. The results reported here give an indication of how construction innovation processes operate, as an example that should be relevant to international audiences interested in construction economics. The questionnaire was based on a broad range of indicators recommended in the OECD’s Community Innovation Survey guidelines (OECD/Eurostat 2005). Although the Australian Bureau of Statistics (ABS) has recently begun to undertake regular innovation surveys that include the construction industry (2006), it employs a very narrow definition of the industry and collects only very basic data compared with that provided by the BRITE Survey, which is presented in this chapter. The term ‘innovation’ is defined here as a new or significantly improved technology or organisational practice, based broadly on OECD definitions (OECD/Eurostat 2005). Innovation may be technological or organisational in nature and it may be new to the world, or just new to the industry or the business concerned. The definition thus includes the simple adoption of existing technological and organisational advancements. The survey collected information about respondents’ perceptions of innovation determinants in the industry, comprising various aspects of business strategy and business environment. It builds on a pilot innovation survey undertaken in 2001 by PricewaterhouseCoopers (PWC) for the Australian Construction Industry Forum on behalf of the Australian Commonwealth Department of Industry, Tourism and Resources (PWC 2002). The survey responds to an identified need within the Australian construction industry to have accurate and timely innovation data upon which to base effective management strategies and public policies (Focus Group 2004).
Abstract:
We present a novel modified theory based upon Rayleigh scattering of ultrasound from composite nanoparticles with a liquid core and solid shell. We derive closed-form solutions for the scattering cross-section and have applied this model to an ultrasound contrast agent consisting of a liquid-filled core (perfluorooctyl bromide, PFOB) encapsulated by a polymer shell (poly-caprolactone, PCL). Sensitivity analysis was performed to predict the dependence of the scattering cross-section upon material and dimensional parameters. A rapid increase in the scattering cross-section was achieved by increasing the compressibility of the core, validating the incorporation of the highly compressible PFOB; the compressibility of the shell had little impact on the overall scattering cross-section, although a more compressible shell is desirable. Changes in the density of the shell and the core result in predicted local minima in the scattering cross-section, approximately corresponding to the PFOB-PCL contrast agent considered; hence, incorporation of a lower shell density could potentially significantly improve the scattering cross-section. A 50% reduction in shell thickness relative to external radius increased the predicted scattering cross-section by 50%. Although it has often been considered that the shell has a negative effect on echogenicity due to its low compressibility, we have shown that it can potentially play an important role in the echogenicity of the contrast agent. The challenge for the future is to identify suitable shell and core materials that meet the predicted characteristics in order to achieve optimal echogenicity.
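For orientation, the classical long-wavelength (Rayleigh) result for a small homogeneous fluid sphere, which a core-shell model of this kind generalizes, can be written as below; the notation is standard textbook usage, not necessarily the paper's.

```latex
% Rayleigh (ka << 1) scattering cross-section of a small homogeneous sphere of
% radius a, compressibility \kappa_s and density \rho_s, in a host medium with
% compressibility \kappa and density \rho, at acoustic wavenumber k.
\sigma_s \;=\; \frac{4\pi}{9}\,k^{4}a^{6}
\left[\left(\frac{\kappa_s-\kappa}{\kappa}\right)^{2}
      + \frac{1}{3}\left(\frac{3(\rho_s-\rho)}{2\rho_s+\rho}\right)^{2}\right]
```

The first (monopole) term carries the compressibility contrast, consistent with the strong gain reported for the highly compressible PFOB core, while the second (dipole) term carries the density contrast and can pass through zero, which is one way local minima in the cross-section can arise.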
Abstract:
The demand for high quality rail services in the twenty-first century has put ever increasing pressure on all rail operators. In order to meet the expectations of their patrons, the maintenance regime of railway systems has to be tightened up, the track conditions have to be well looked after, and the rolling stock must be designed to withstand heavy duty. In short, in an ideal world where resources are unlimited, one needs to implement a very rigorous inspection regime in order to take care of the modern needs of a railway system [1]. If cost were not an issue, the maintenance engineers could inspect the train body with the most up-to-date techniques such as ultrasound examination, x-ray inspection, magnetic particle inspection, etc. on a regular basis. However, it is inconceivable to have such a perfect maintenance regime in any commercial railway. Likewise, it is impossible to have a perfect rolling stock which can weather all the heavy duties experienced in a modern railway. Hence it is essential that condition monitoring schemes are devised to pick up potential defects before they manifest as safety hazards. This paper introduces an innovative condition monitoring system for track profile which, together with an instrumented car to carry out surveillance of the track, will provide a comprehensive railway condition monitoring system free from the usual difficulty of electromagnetic compatibility issues in a typical railway environment.
Abstract:
Through a case study analysis, this paper discusses the essential elements of successful university-industry partnerships in the context of the integration of the scholarships of teaching, research and application. This scholarly integration is advocated as the modern paradigm of real-world laboratory activity termed the “living laboratory”. The paper further examines the application of the concepts of experimentation, engagement and regeneration as critical measures for evaluating successful university-industry partnerships. University-industry partnerships play an increasingly important role in the current climate, in which universities are held accountable for transferring the benefits of their scholarship to the wider community and for demonstrating measurable impacts.
Abstract:
The selection of projects and programs of work is a key function of both public and private sector organisations. Ideally, projects and programs that are selected to be undertaken are consistent with the strategic objectives of the organisation; will provide value for money and return on investment; will be adequately resourced and prioritised; will not compete with general operations for resources and will not restrict the ability of operations to provide income to the organisation; will match the capacity and capability of the organisation to deliver; and will produce outputs that are willingly accepted by end users and customers. Unfortunately, this is not always the case. Possible inhibitors to optimal project portfolio selection include: processes that are inconsistent with the needs of the organisation; reluctance to use an approach that may not produce predetermined preferences; loss of control and perceived decision-making power; reliance on quantitative methods rather than qualitative methods for justification; ineffective project and program sponsorship; unclear project governance, processes and linkage to business strategies; ignorance, taboos and perceived effectiveness; and inadequate education and training about the processes and their importance.
Abstract:
Plenary Session: "New Voices in Children's Literature"
Abstract:
This study reports on the impact of a "drink driving education program" taught to grade ten high school students. The program, which involves twelve lessons, uses strategies based on the Ajzen and Madden theory of planned behavior. Students were trained to use alternatives to drink driving and passenger behaviors. One thousand seven hundred and seventy-four students who had been taught the program in randomly assigned control and intervention schools were followed up three years later. There had been a major reduction in drink driving behaviors in both intervention and control students. In addition to this cohort change, there was a trend toward reduced drink driving in the intervention group and a significant reduction in passenger behavior in this group. Readiness to use alternatives suggested that the major impact of the program was on students who were experimenting with the behavior at the time the program was taught. The program seems to have optimized concurrent social attitude and behavior change.
Abstract:
The theory of nonlinear dynamic systems provides new methods to handle complex systems. Chaos theory offers new concepts, algorithms and methods for processing, enhancing and analyzing measured signals. In recent years, researchers have been applying concepts from this theory to bio-signal analysis. In this work, the complex dynamics of bio-signals such as the electrocardiogram (ECG) and electroencephalogram (EEG) are analyzed using the tools of nonlinear systems theory. In modern industrialized countries, several hundred thousand people die every year due to sudden cardiac death. The electrocardiogram (ECG) is an important biosignal representing the sum total of millions of cardiac cell depolarization potentials. It contains important insight into the state of health and the nature of the disease afflicting the heart. Heart rate variability (HRV) refers to the regulation of the sinoatrial node, the natural pacemaker of the heart, by the sympathetic and parasympathetic branches of the autonomic nervous system. Heart rate variability analysis is an important tool to observe the heart's ability to respond to normal regulatory impulses that affect its rhythm. A computer-based intelligent system for analysis of cardiac states is very useful in diagnostics and disease management. Like many bio-signals, HRV signals are non-linear in nature. Higher order spectral analysis (HOS) is known to be a good tool for the analysis of non-linear systems and provides good noise immunity. In this work, we studied the HOS of the HRV signals of normal heartbeat and four classes of arrhythmia. This thesis presents some general characteristics for each of these classes of HRV signals in the bispectrum and bicoherence plots. Several features were extracted from the HOS and subjected to an Analysis of Variance (ANOVA) test. The results are very promising for cardiac arrhythmia classification, with a number of features yielding a p-value < 0.02 in the ANOVA test. An automated intelligent system for the identification of cardiac health is very useful in healthcare technology. In this work, seven features were extracted from the heart rate signals using HOS and fed to a support vector machine (SVM) for classification. The performance evaluation protocol in this thesis uses 330 subjects consisting of five different kinds of cardiac disease conditions. The classifier achieved a sensitivity of 90% and a specificity of 89%. This system is ready to run on larger data sets. In EEG analysis, the search for hidden information for the identification of seizures has a long history. Epilepsy is a pathological condition characterized by spontaneous and unforeseeable occurrence of seizures, during which the perception or behavior of patients is disturbed. Automatic early detection of seizure onsets would help patients and observers to take appropriate precautions. Various methods have been proposed to predict the onset of seizures based on EEG recordings. The use of nonlinear features motivated by the higher order spectra (HOS) has been reported to be a promising approach to differentiate between normal, background (pre-ictal) and epileptic EEG signals. In this work, these features are used to train both a Gaussian mixture model (GMM) classifier and a Support Vector Machine (SVM) classifier. Results show that the classifiers were able to achieve 93.11% and 92.67% classification accuracy, respectively, with selected HOS-based features. About 2 hours of EEG recordings from 10 patients were used in this study.
This thesis introduces unique bispectrum and bicoherence plots for various cardiac conditions and for normal, background and epileptic EEG signals. These plots reveal distinct patterns that are useful for visual interpretation by those without a deep understanding of spectral analysis, such as medical practitioners. The thesis includes original contributions in extracting features from HRV and EEG signals using HOS and entropy measures, in analyzing the statistical properties of these features on real data, and in automated classification using these features with GMM and SVM classifiers.
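A minimal sketch of the pipeline described above might look as follows; the signals are synthetic stand-ins, the direct FFT-based bispectrum estimate and the three summary features are illustrative choices, and the classifier is scikit-learn's SVC rather than the thesis implementation.

```python
# HOS feature extraction (bispectrum) + SVM classification on toy HRV-like signals.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def bispectrum(x, nfft=64, seg_len=64):
    """Direct (FFT-based) bispectrum magnitude estimate, averaged over segments."""
    segs = [x[i:i + seg_len] for i in range(0, len(x) - seg_len + 1, seg_len)]
    idx = np.arange(nfft)
    B = np.zeros((nfft, nfft), dtype=complex)
    for s in segs:
        X = np.fft.fft(s - np.mean(s), nfft)
        # B(f1, f2) = E[ X(f1) X(f2) X*(f1 + f2) ], frequencies taken modulo nfft.
        B += X[idx[:, None]] * X[idx[None, :]] * np.conj(X[(idx[:, None] + idx[None, :]) % nfft])
    return np.abs(B) / len(segs)

def hos_features(x):
    B = bispectrum(x)
    p = B / B.sum()
    entropy = -np.sum(p * np.log(p + 1e-12))   # bispectral entropy
    return np.array([B.mean(), B.max(), entropy])

# Toy "normal" vs "arrhythmic" RR-interval series (synthetic, not real HRV data).
rng = np.random.default_rng(1)
normal = [0.8 + 0.05 * np.sin(0.1 * np.arange(512)) + 0.02 * rng.normal(size=512) for _ in range(40)]
abnormal = [0.8 + 0.15 * rng.normal(size=512) for _ in range(40)]

features = np.array([hos_features(s) for s in normal + abnormal])
labels = np.array([0] * 40 + [1] * 40)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(features[::2], labels[::2])            # simple even/odd train-test split
print("held-out accuracy:", clf.score(features[1::2], labels[1::2]))
```

Replacing the SVC with one sklearn.mixture.GaussianMixture per class and classifying by the higher likelihood would give a GMM variant in the spirit of the EEG experiments mentioned above.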
Abstract:
A consistent finding in the literature is that males report greater usage of drugs and subsequently greater amounts of drug driving. Research also suggests that vicarious influences may be more pertinent to males than to females. Utilising Stafford and Warr’s (1993) reconceptualization of deterrence theory, this study sought to determine whether the relative deterrent impact of zero-tolerance drug driving laws differs between genders. A sample of motorists (N = 899) completed a self-report questionnaire assessing participants’ frequency of drug driving and personal and vicarious experiences with punishment and punishment avoidance. Results show that males were significantly more likely to report future intentions of drug driving. Additionally, vicarious experiences of punishment avoidance were a more influential predictor of future drug driving for males, whereas personal experiences of punishment avoidance were more influential for females. These findings can inform gender-sensitive media campaigns and interventions for convicted drug drivers.
Abstract:
The changes in economic status in Malaysia have led to many psychosocial problems, especially among young people. Counselling and psychotherapy, as practised in Western cultures, have been seen as one of the solutions. Most counselling theorists believe that their theory is universal, but there is limited research to prove it. This paper describes an ongoing study conducted in Malaysia on the applicability of one Western counselling theory, Bowen’s family theory, in which differentiation of self allows a person both to leave the family’s boundaries in search of uniqueness and to continually return to the family in order to further establish a sense of belonging. The study uses four measures: the Differentiation of Self Inventory (DSI), the Family Inventory of Life Events (ILE), the Depression Anxiety and Stress Scale (DASS) and the Connor-Davidson Resilience Scale (CD-RISC). Preliminary findings are discussed and the implications for enhancing the quality of teaching family counselling in universities are explored.