721 results for data dependence
Abstract:
We estimate the cost of droughts by matching rainfall data with individual life satisfaction. Our context is Australia over the period 2001 to 2004, which included a particularly severe drought. Using fixed-effects models, we find that a drought in spring has a detrimental effect on life satisfaction equivalent to an annual reduction in income of A$18,000. This effect, however, is found only for individuals living in rural areas. Using our estimates, we calculate that the predicted doubling of the frequency of spring droughts will lead to a loss in life satisfaction equivalent to just over 1% of GDP annually.
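The income-equivalence figure rests on a standard compensating-income calculation: the drought coefficient from a fixed-effects life-satisfaction regression is set against the income coefficient. A hedged sketch of the idea, with all symbols illustrative rather than taken from the paper:

$$LS_{it} = \alpha_i + \beta\,\mathrm{Drought}_{it} + \gamma \ln(\mathrm{Income}_{it}) + \varepsilon_{it}, \qquad \Delta\mathrm{Income} = \mathrm{Income}\,\bigl(1 - e^{\beta/\gamma}\bigr),$$

where $\Delta\mathrm{Income}$ is the annual income reduction that would lower life satisfaction by as much as a spring drought.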
Abstract:
Patients with chest discomfort or other symptoms suggestive of acute coronary syndrome (ACS) are one of the most common categories seen in many Emergency Departments (EDs). While the recognition of patients at high risk of ACS has improved steadily, identifying the majority of chest pain presentations that fall into the low-risk group remains a challenge. Research in this area needs to be transparent, robust, applicable to all hospitals from large tertiary centres to rural and remote sites, and to allow direct comparison between different studies with minimum patient spectrum bias. To achieve this, a standardised approach to the research framework, using a common language for data definitions, must be adopted. The aim was to create a common framework for a standardised data definitions set that would allow maximum value when extrapolating research findings both within Australasian ED practice and across similar populations worldwide. Therefore a comprehensive data definitions set for the investigation of non-traumatic chest pain patients with possible ACS was developed, specifically for use in the ED setting. This standardised data definitions set will facilitate ‘knowledge translation’ by allowing useful findings to be extrapolated into the real-life practice of emergency medicine.
Abstract:
Seasonal patterns have been found in a remarkable range of health conditions, including birth defects, respiratory infections and cardiovascular disease. Accurately estimating the size and timing of seasonal peaks in disease incidence is an aid to understanding the causes, and possibly to developing interventions. With global warming increasing the intensity of seasonal weather patterns around the world, a review of the methods for estimating seasonal effects on health is timely. This is the first book on statistical methods for seasonal data written for a health audience. It describes methods for a range of outcomes (including continuous, count and binomial data) and demonstrates appropriate techniques for summarising and modelling these data. It has a practical focus and uses interesting examples to motivate and illustrate the methods. The statistical procedures and example data sets are available in an R package called ‘season’. Adrian Barnett is a senior research fellow at Queensland University of Technology, Australia. Annette Dobson is a Professor of Biostatistics at The University of Queensland, Australia. Both are experienced medical statisticians with a commitment to statistical education, and have previously collaborated on methodological developments and applications of biostatistics, especially to time series data. Among other projects, they worked together on revising the well-known textbook "An Introduction to Generalized Linear Models", third edition, Chapman & Hall/CRC, 2008. In their new book they share their knowledge of statistical methods for examining seasonal patterns in health.
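The cosinor model is the standard technique for estimating the size and timing of a seasonal peak of the kind described here; the book's own examples use the R package ‘season’, but the idea can be sketched minimally in Python with simulated data (all numbers hypothetical):

import numpy as np

# Hypothetical monthly disease counts over ten years (t in months).
rng = np.random.default_rng(0)
t = np.arange(120)
y = 50 + 10 * np.cos(2 * np.pi * (t - 1) / 12) + rng.normal(0, 3, t.size)

# Cosinor model: y = b0 + b1*cos(2*pi*t/12) + b2*sin(2*pi*t/12) + error.
X = np.column_stack([np.ones(t.size),
                     np.cos(2 * np.pi * t / 12),
                     np.sin(2 * np.pi * t / 12)])
b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]

amplitude = np.hypot(b1, b2)            # size of the seasonal effect
phase = np.arctan2(b2, b1)              # timing of the peak, in radians
peak_month = (phase * 12 / (2 * np.pi)) % 12
print(f"amplitude = {amplitude:.1f}, peak at month {peak_month:.1f}")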
Abstract:
Aims: To describe a local data linkage project matching hospital data with the Australian Institute of Health and Welfare (AIHW) National Death Index (NDI) to assess long-term outcomes of intensive care unit patients. Methods: Data were obtained from hospital intensive care and cardiac surgery databases on all patients aged 18 years and over admitted to either of two intensive care units at a tertiary-referral hospital between 1 January 1994 and 31 December 2005. Date of death was obtained from the AIHW NDI by probabilistic software matching, in addition to manual checking through hospital databases and other sources. Survival was calculated from the time of ICU admission, with a censoring date of 14 February 2007. Data for patients with multiple hospital admissions requiring intensive care were analysed from the first admission only. Summary and descriptive statistics were used for preliminary data analysis. Kaplan-Meier survival analysis was used to analyse factors determining long-term survival. Results: During the study period, 21 415 unique patients had 22 552 hospital admissions that included an ICU admission; 19 058 surgical procedures were performed, with a total of 20 092 ICU admissions. There were 4936 deaths. Median follow-up was 6.2 years, totalling 134 203 patient-years. The casemix was predominantly cardiac surgery (80%), followed by cardiac medical (6%) and other medical (4%). Unadjusted survival at 1, 5 and 10 years was 97%, 84% and 70%, respectively. One-year survival ranged from 97% for cardiac surgery to 36% for cardiac arrest. An APACHE II score was available for 16 877 patients. In those discharged alive from hospital, 1-, 5- and 10-year survival varied with discharge location. Conclusions: ICU-based linkage projects are feasible for determining the long-term outcomes of ICU patients.
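The Kaplan-Meier estimator used in the analysis can be sketched briefly; a minimal Python implementation with hypothetical follow-up data (the study itself used the linked hospital and NDI records):

import numpy as np

def kaplan_meier(time, event):
    """Kaplan-Meier estimate: S(t) = product over death times of (1 - d/n).

    time  -- follow-up time for each patient (e.g. years from ICU admission)
    event -- 1 if death was observed, 0 if the patient was censored at `time`
    """
    time, event = np.asarray(time, float), np.asarray(event, int)
    at_risk = time.size
    times, surv, s = [], [], 1.0
    for t in np.unique(time):                  # unique times, ascending
        deaths = int(np.sum((time == t) & (event == 1)))
        if deaths:
            s *= 1.0 - deaths / at_risk        # step down at each death time
            times.append(t)
            surv.append(s)
        at_risk -= int(np.sum(time == t))      # drop deaths and censored
    return times, surv

# Hypothetical follow-up data: times in years, 1 = died, 0 = censored.
t = [0.1, 1.2, 2.5, 3.0, 4.8, 6.2, 6.2, 7.0]
e = [1, 0, 1, 0, 1, 0, 1, 0]
print(kaplan_meier(t, e))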
Abstract:
The purpose of this study was to evaluate the comparative cost of treating alcohol dependence with either cognitive behavioral therapy (CBT) alone or CBT combined with naltrexone (CBT+naltrexone). Two hundred ninety-eight outpatients consecutively treated for alcohol dependence participated in this study. One hundred seven (36%) patients received adjunctive pharmacotherapy (CBT+naltrexone). The Drug Abuse Treatment Cost Analysis Program was used to estimate treatment costs. Adjunctive pharmacotherapy (CBT+naltrexone) introduced an additional treatment cost and was 54% more expensive than CBT alone. When treatment abstinence rates (36.1% CBT; 62.6% CBT+naltrexone) were applied to cost-effectiveness ratios, CBT+naltrexone demonstrated an advantage over CBT alone. There were no differences between groups on a preference-based health measure (SF-6D). In this treatment center, to achieve 100 abstainers over a 12-week program, 280 patients would require CBT alone compared with 160 for CBT+naltrexone. The dominant choice was CBT+naltrexone, based on modest economic advantages and significant efficiencies in the numbers needed to treat.
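The treatment numbers quoted follow directly from the abstinence rates: dividing the target of 100 abstainers by each arm's abstinence rate gives

$$N_{\mathrm{CBT}} = \frac{100}{0.361} \approx 277, \qquad N_{\mathrm{CBT+naltrexone}} = \frac{100}{0.626} \approx 160,$$

with the abstract rounding 277 up to the reported 280.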
Abstract:
Alexithymia is characterised by deficits in emotional insight and self-reflection that can impact the efficacy of psychological treatments. Given the high prevalence of alexithymia in Alcohol Use Disorders, valid assessment tools are critical. The majority of research on the relationship between alexithymia and alcohol dependence has employed the self-administered Toronto Alexithymia Scale (TAS-20). The Observer Alexithymia Scale (OAS) has also been recommended. The aim of the present study was to assess the validity and reliability of the OAS and the TAS-20 in an alcohol-dependent sample. Two hundred and ten alcohol-dependent participants in an outpatient Cognitive Behavioral Treatment program were administered the TAS-20 at assessment and upon treatment completion at 12 weeks. Clinical psychologists provided observer assessment data for a subsample of 159 patients. The findings confirmed acceptable internal consistency, test-retest reliability and scale homogeneity for both the OAS and the TAS-20, except for the low internal consistency of the TAS-20 EOT (externally oriented thinking) scale. The TAS-20 was more strongly associated with alcohol problems than the OAS.
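Internal consistency of the kind reported here is conventionally measured with Cronbach's alpha; a minimal sketch in Python with hypothetical item scores (the study's own statistics were computed on the TAS-20 and OAS data):

import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance).

    items -- 2-D array, rows = respondents, columns = scale items.
    """
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical responses from six participants to a five-item subscale.
scores = [[3, 4, 3, 4, 3],
          [2, 2, 3, 2, 2],
          [4, 5, 4, 4, 5],
          [1, 2, 1, 2, 1],
          [3, 3, 4, 3, 3],
          [5, 4, 5, 5, 4]]
print(f"alpha = {cronbach_alpha(scores):.2f}")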
Abstract:
In this thesis we are interested in financial risk, and the instrument we use is Value-at-Risk (VaR). VaR is the maximum loss over a given period of time at a given confidence level. Many definitions of VaR exist, and some will be introduced throughout this thesis. There are two main ways to measure risk and VaR: through volatility and through percentiles. Large volatility in financial returns implies a greater probability of large losses, but also a larger probability of large profits. Percentiles describe tail behaviour. The estimation of VaR is a complex task. It is important to know the main characteristics of financial data in order to choose the best model. The existing literature is very wide, sometimes controversial, but helpful in drawing a picture of the problem. It is commonly recognised that financial data are characterised by heavy tails, time-varying volatility, asymmetric response to bad and good news, and skewness. Ignoring any of these features can lead to underestimating VaR, with a possible ultimate consequence being the default of the protagonist (firm, bank or investor). In recent years, skewness has attracted special attention. An open problem is the detection and modelling of time-varying skewness. Is skewness constant, or is there significant variability which in turn can affect the estimation of VaR? This thesis aims to answer this question and to open the way to a new approach for modelling time-varying volatility (conditional variance) and skewness simultaneously. The new tools are modifications of the Generalised Lambda Distributions (GLDs). These are four-parameter distributions which allow the first four moments to be modelled nearly independently; in particular we are interested in what we will call para-moments, i.e., mean, variance, skewness and kurtosis. The GLDs are used in two different ways. Firstly, semi-parametrically, we consider a moving window to estimate the parameters and calculate the percentiles of the GLDs. Secondly, parametrically, we attempt to extend the GLDs to include time-varying dependence in the parameters. We use local linear regression to estimate the conditional mean and conditional variance semi-parametrically. The method is not efficient enough to capture all the dependence structure in the three indices (ASX 200, S&P 500 and FT 30); however, it provides an idea of the data generating process (DGP) underlying the series and helps in choosing a good technique to model the data. We find that the GLDs suggest that moments up to the fourth order do not always exist; their existence appears to vary over time. This is a very important finding, considering that past papers (see for example Bali et al., 2008; Hashmi and Tay, 2007; Lanne and Pentti, 2007) modelled time-varying skewness while implicitly assuming the existence of the third moment. The GLDs also suggest that the mean, variance, skewness and, in general, the conditional distribution vary over time, as already suggested by the existing literature. The GLDs give good results in estimating VaR on three real indices, the ASX 200, S&P 500 and FT 30, with results very similar to those provided by historical simulation.
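The semi-parametric moving-window idea can be sketched simply: slide a window along the return series and read off the lower-tail percentile as the VaR estimate. A minimal Python sketch with simulated returns standing in for the index data; the thesis fits GLDs within each window, whereas plain empirical percentiles are used here for brevity:

import numpy as np

def rolling_var(returns, window=250, level=0.99):
    """Rolling historical-simulation VaR.

    For each day, VaR is the loss exceeded with probability (1 - level),
    estimated from the empirical percentile of the trailing window.
    """
    returns = np.asarray(returns, float)
    var = np.full(returns.size, np.nan)
    for i in range(window, returns.size):
        var[i] = -np.percentile(returns[i - window:i], 100 * (1 - level))
    return var

# Simulated heavy-tailed daily returns standing in for an equity index.
rng = np.random.default_rng(1)
r = 0.01 * rng.standard_t(df=4, size=1000)
print(f"latest one-day 99% VaR: {rolling_var(r)[-1]:.4f}")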
Abstract:
Boards of directors are thought to provide access to a wealth of knowledge and resources for the companies they serve, and are considered important to corporate governance. Under the Resource Based View (RBV) of the firm (Wernerfelt, 1984), boards are viewed as a strategic resource available to firms. As a consequence, there has been a significant research effort aimed at establishing a link between board attributes and company performance. In this thesis I explore and extend the study of interlocking directorships (Mizruchi, 1996; Scott, 1991a) by examining the links between directors’ opportunity networks and firm performance. Specifically, I use resource dependence theory (Pfeffer & Salancik, 1978) and social capital theory (Burt, 1980b; Coleman, 1988) as the basis for a new measure of a board’s opportunity network. I contend that both directors’ formal company ties and their social ties determine a director’s opportunity network, through which they are able to access and mobilise resources for their firms. This approach is based on recent studies suggesting that measuring interlocks at the director level, rather than at the firm level, may be a more reliable indicator of the phenomenon. The research uses publicly available data drawn from Australia’s top-105 listed companies and their directors in 1999. I employ Social Network Analysis (SNA) (Scott, 1991b), using the UCINET software, to analyse individual directors’ formal and social networks. SNA is used to measure the number of ties a director has to other directors in the top-105 company director network at both one and two degrees of separation, that is, direct ties and indirect (or ‘friend of a friend’) ties. These individual measures of director connectedness are aggregated to produce a board-level network metric, which is compared with measures of firm performance using multiple regression analysis. Performance is measured with both accounting-based and market-based measures. Findings indicate that better-connected boards are associated with higher market-based company performance (measured by Tobin’s q). However, weaker and mostly unreliable associations were found for the accounting-based performance measure, ROA. Furthermore, formal (or corporate) network ties are a stronger predictor of market performance than total network ties (comprising social and corporate ties). Similarly, strong ties (connectedness at degree 1) are better predictors of performance than weak ties (connectedness at degree 2). My research makes four contributions to the literature on director interlocks. First, it develops a new way of measuring a board’s opportunity network, based on the director rather than the company as the unit of interlock. Second, it establishes evidence of a relationship between market-based measures of firm performance and the connectedness of a firm’s board. Third, it establishes that directors’ formal corporate ties matter more to market-based firm performance than their social ties. Fourth, it establishes that directors’ strong direct ties are more important to market-based performance than their weak ties. The thesis concludes with implications for research and practice, including a more speculative interpretation of these results. In particular, I raise the possibility of reverse causality: that is, networked directors may seek to join high-performing companies.
Thus, the relationship may be a result of symbolic action by companies seeking to increase the legitimacy of their firms rather than a reflection of the social capital available to the companies. This is an important consideration worthy of future investigation.
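The director-level connectedness measure can be illustrated with a toy sketch: count each director's direct ties and friend-of-a-friend ties, then aggregate over a board. A minimal Python sketch using networkx in place of UCINET, with an entirely hypothetical director network:

import networkx as nx

# Hypothetical director network: an edge means two directors share a board.
G = nx.Graph([("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"),
              ("D", "E"), ("E", "F")])

def connectedness(g, director):
    """Ties at degree 1 (direct) and degree 2 (friend of a friend)."""
    direct = set(g[director])
    indirect = set()
    for neighbour in direct:
        indirect |= set(g[neighbour])
    indirect -= direct | {director}
    return len(direct), len(indirect)

# Board-level metric: aggregate connectedness over one board's directors.
board = ["A", "C", "E"]
deg1, deg2 = zip(*(connectedness(G, d) for d in board))
print(f"board degree-1 ties: {sum(deg1)}, degree-2 ties: {sum(deg2)}")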
Abstract:
The recently proposed data-driven background dataset refinement technique provides a means of selecting an informative background for support vector machine (SVM)-based speaker verification systems. This paper investigates the characteristics of the impostor examples in such highly informative background datasets. Data-driven dataset refinement individually evaluates the suitability of candidate impostor examples for the SVM background before selecting the highest-ranking examples as a refined background dataset. The characteristics of the refined dataset were then analysed to investigate the desired traits of an informative SVM background. The most informative examples of the refined dataset were found to consist of large amounts of active speech and distinctive language characteristics. The data-driven refinement technique was shown to filter the set of candidate impostor examples to produce a more disperse representation of the impostor population in the SVM kernel space, thereby reducing the number of redundant and less informative examples in the background dataset. Furthermore, data-driven refinement was shown to provide performance gains when applied to the difficult task of refining a small candidate dataset that was mismatched to the evaluation conditions.
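The refinement loop itself is straightforward to sketch: score every candidate impostor example with a suitability metric, rank them, and keep the top fraction as the background. A minimal Python sketch; the suitability function below is a placeholder, not the metric defined in the paper:

import numpy as np

def refine_background(candidates, suitability, keep_frac=0.5):
    """Rank candidate impostor examples and keep the most informative.

    candidates  -- array of impostor feature vectors (e.g. GMM supervectors)
    suitability -- function returning one suitability score per candidate
    """
    scores = suitability(candidates)
    order = np.argsort(scores)[::-1]                # highest-ranking first
    return candidates[order[: int(keep_frac * order.size)]]

# Hypothetical supervectors with a placeholder metric (distance from the
# candidate-set mean), standing in for the paper's suitability measure.
X = np.random.default_rng(2).normal(size=(200, 64))
refined = refine_background(X, lambda c: np.linalg.norm(c - c.mean(0), axis=1))
print(refined.shape)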
Abstract:
This study assesses the recently proposed data-driven background dataset refinement technique for speaker verification using SVM feature sets other than the GMM supervector features for which it was originally designed. The performance improvements brought about in each trialled SVM configuration demonstrate the versatility of background dataset refinement. This work also extends the originally proposed technique to exploit support vector coefficients as an impostor suitability metric in the data-driven selection process. Using support vector coefficients improved the performance of the refined datasets in the evaluation of unseen data. Further, attempts are made to exploit the differences in impostor example suitability measures from varying feature spaces to provide added robustness.
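Using support vector coefficients as the suitability metric can be sketched with scikit-learn: train an SVM separating target examples from candidate impostors, and read each impostor's dual coefficient magnitude as its informativeness. All data and the SVM configuration below are illustrative assumptions, not the paper's setup:

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(3)
target = rng.normal(0.5, 1.0, size=(20, 16))       # hypothetical target speaker
impostors = rng.normal(0.0, 1.0, size=(200, 16))   # hypothetical candidates

X = np.vstack([target, impostors])
y = np.r_[np.ones(len(target)), -np.ones(len(impostors))]
svm = SVC(kernel="linear", C=1.0).fit(X, y)

# Impostor suitability: magnitude of each example's dual coefficient;
# examples that are not support vectors contribute nothing and score zero.
alpha = np.zeros(len(X))
alpha[svm.support_] = np.abs(svm.dual_coef_[0])
impostor_scores = alpha[len(target):]
top = np.argsort(impostor_scores)[::-1][:50]       # 50 most informative
print(top[:10])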
Abstract:
There is a notable shortage of empirical research directed at measuring the magnitude and direction of stress effects on performance in a controlled environment. One reason for this is the inherent difficulty of identifying and isolating direct performance measures for individuals. Additionally, most traditional work environments contain a multitude of exogenous factors impacting individual performance, and controlling for all such factors is generally unfeasible (omitted variable bias). Moreover, instead of asking individuals about their self-reported stress levels, we observe workers' behavior in situations that can be classified as stressful. For these reasons we have stepped outside the traditional workplace, in an attempt to gain greater control over these factors, and use the sports environment as our experimental space. We empirically investigate the relationship between stress and performance in an extreme-pressure situation (football penalty kicks) in a winner-take-all sporting environment (the FIFA World Cup and UEFA European Cup competitions). Specifically, we examine all penalty shootouts between 1976 and 2008, covering 16 events in total. The results indicate that extreme stressors can have a positive or negative impact on individuals' performance, whereas more commonly experienced stressors do not affect professionals' performance.
Abstract:
The technological environment in which Australian SMEs operate can best be described as dynamic and vital. The rate of technological change presents the SME owner/manager with a complex and challenging operational context. Wireless applications are being developed that provide mobile devices with Internet content and e-business services. In Australia the adoption of e-commerce by large organisations has been relatively high; however, the same cannot be said for SMEs, where adoption has been slower than in other developed countries. In contrast, mobile telephone adoption and diffusion by SMEs is relatively high. This exploratory study identifies attitudes, perceptions and issues regarding mobile data technologies among regional SME owner/managers across a range of industry sectors. The major issues include the sector the firm belongs to, the firm's current adoption status, the level of mistrust of the IT industry, the cost of the technologies, and the applications and attributes of the technologies.
Abstract:
The technological environment in which contemporary small and medium-sized enterprises (SMEs) operate can only be described as dynamic. The exponential rate of technological change, characterised by perceived increases in the benefits associated with various technologies, shortening product life cycles and changing standards, provides the SME with a complex and challenging operational context. The primary aim of this research was to identify the needs of SMEs in regional areas for mobile data technologies (MDT). In this study a distinction was drawn between those respondents who were full-adopters of technology, those who were partial-adopters and those who were non-adopters, and these three segments articulated different needs and requirements for MDT. Overall, the needs of regional SMEs for MDT can be conceptualised into three areas where the technology will assist business practices: communication, e-commerce and security.
Abstract:
The seemingly exponential nature of technological change provides SMEs with a complex and challenging operational context. The development of infrastructures capable of supporting the wireless application protocol (WAP) and associated 'wireless' applications represents the latest generation of technological innovation, with potential appeal to SMEs and end-users alike. This paper aims to understand the mobile data technology needs of SMEs in a regional setting. The research was especially concerned with perceived needs across three market segments: non-adopters, partial-adopters and full-adopters of new technology. The research was exploratory in nature, as the phenomenon under scrutiny is relatively new and its uses unclear; thus focus groups were conducted with each of the segments. The paper provides insights for business, industry and academics.