Abstract:
In 2005, Stephen Abram, vice president of Innovation at SirsiDynix, challenged library and information science (LIS) professionals to start becoming “librarian 2.0.” In the last few years, discussion and debate about the “core competencies” needed by librarian 2.0 have appeared in the “biblioblogosphere” (blogs written by LIS professionals). However, beyond these informal blog discussions, few systematic and empirically based studies have taken place. A project funded by the Australian Learning and Teaching Council fills this gap. The project identifies the key skills, knowledge, and attributes required by “librarian 2.0.” Eighty-one members of the Australian LIS profession participated in a series of focus groups. Eight themes emerged as being critical to “librarian 2.0”: technology, communication, teamwork, user focus, business savvy, evidence based practice, learning and education, and personal traits. Guided by these findings, interviews with 36 LIS educators explored the current approaches used within contemporary LIS education to prepare graduates to become “librarian 2.0”. This video presents an example of ‘great practice’ in current LIS education as it strives to foster web 2.0 professionals.
Abstract:
This ALTC Teaching Fellowship aimed to establish Guiding Principles for Library and Information Science Education 2.0. The aim was achieved by (i) identifying the current and anticipated skills and knowledge required by successful library and information science (LIS) professionals in the age of web 2.0 (and beyond), and (ii) establishing the current state of LIS education in Australia in supporting the development of librarian 2.0 and, in doing so, identifying models of best practice.
The fellowship has contributed to curriculum renewal in the LIS profession. It has helped to ensure that LIS education in Australia continues to meet the changing skills and knowledge requirements of the profession it supports. It has also provided a vehicle through which LIS professionals and LIS educators may find opportunities for greater collaboration and more open communication. This will help bridge the gap between LIS theory and practice and will foster more authentic engagement between LIS education and other parts of the LIS industry in the education of the next generation of professionals. Through this fellowship the LIS discipline has become a role model for other disciplines that will face similar issues in the coming years.
Eighty-one members of the Australian LIS profession participated in a series of focus groups exploring the current and anticipated skills and knowledge needed by the LIS professional in the web 2.0 world and beyond. Whilst each focus group tended to draw on specific themes of interest to that particular group of people, there was a great deal of common ground. Eight key themes emerged: technology, learning and education, research or evidence-based practice, communication, collaboration and team work, user focus, business savvy and personal traits.
It was acknowledged that the need for successful LIS professionals to possess transferable skills and interpersonal attributes was not new. It was noted however that the speed with which things are changing in the web 2.0 world was having a significant impact and that this faster pace is placing a new and unexpected emphasis on the transferable skills and knowledge. It was also acknowledged that all librarians need to possess these skills, knowledge and attributes and not just the one or two role models who lead the way.
The most interesting finding however was that web 2.0, library 2.0 and librarian 2.0 represented a ‘watershed’ for the LIS profession. Almost all the focus groups spoke about how they are seeing and experiencing a culture change in the profession. Librarian 2.0 requires a ‘different mindset or attitude’. The Levels of Perspective model by Daniel Kim provides one lens through which to view this finding. The focus group findings suggest that we are witnessing a re-awakening of the Australian LIS profession as it begins to move towards the higher levels of Kim’s model (i.e. mental models, vision).
Thirty-six LIS educators participated in telephone interviews aimed at exploring the current state of LIS education in supporting the development of librarian 2.0. The skills and knowledge of LIS professionals in a web 2.0 world identified and discussed by the LIS educators mirrored those highlighted in the focus group discussions with LIS professionals. Similarly, it was noted that librarian 2.0 needed a focus less on skills and knowledge and more on attitude. However, whilst LIS professionals felt that there was a paradigm shift within the profession, LIS educators did not speak with one voice on this matter, with quite a number of the educators suggesting that this might be ‘overstating it a bit’. This study provides evidence for “disparate viewpoints” (Hallam, 2007) between LIS educators and LIS professionals that can have significant implications for the future, not just of LIS professional education specifically but of the profession generally.
Library and information science education 2.0: guiding principles and models of best practice
Inviting the LIS academics to discuss how their teaching and learning activities support the development of librarian 2.0 was a core part of the interviews conducted. The strategies used and the challenges faced by LIS educators in developing their teaching and learning approaches to support the formation of librarian 2.0 are identified and discussed. A core part of the fellowship was the identification of best practice examples on how LIS educators were developing librarian 2.0. Twelve best practice examples were identified. Each educator was recorded discussing his or her approach to teaching and learning. Videos of these interviews are available via the Fellowship blog at
Abstract:
This paper proposes the use of eigenvoice modeling techniques with the Cross Likelihood Ratio (CLR) as a criterion for speaker clustering within a speaker diarization system. The CLR has previously been shown to be a robust decision criterion for speaker clustering using Gaussian Mixture Models. Recently, eigenvoice modeling techniques have become increasingly popular, due to their ability to adequately represent a speaker based on sparse training data, as well as their improved capture of differences in speaker characteristics. This paper hence proposes that it would be beneficial to capitalize on the advantages of eigenvoice modeling in a CLR framework. Results obtained on the 2002 Rich Transcription (RT-02) Evaluation dataset show an improved clustering performance, resulting in a 35.1% relative improvement in the overall Diarization Error Rate (DER) compared to the baseline system.
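The CLR criterion itself is compact enough to sketch. Below is a minimal illustration on synthetic data, using single diagonal-covariance Gaussians in place of the GMM/eigenvoice speaker models of the paper, and a pooled background model standing in for a UBM; all data and parameters are invented for illustration.

```python
import numpy as np

def avg_loglik(X, mean, var):
    """Average per-frame log-likelihood under a diagonal Gaussian."""
    return np.mean(-0.5 * (np.log(2 * np.pi * var) + (X - mean) ** 2 / var).sum(axis=1))

def fit(X):
    """Fit a diagonal-Gaussian 'speaker model' to a segment of feature frames."""
    return X.mean(axis=0), X.var(axis=0) + 1e-6

def clr(Xi, Xj, ubm):
    """Cross Likelihood Ratio between two clusters, normalised by a background model."""
    mi, vi = fit(Xi)
    mj, vj = fit(Xj)
    mu, vu = ubm
    return (avg_loglik(Xi, mj, vj) - avg_loglik(Xi, mu, vu)
            + avg_loglik(Xj, mi, vi) - avg_loglik(Xj, mu, vu))

rng = np.random.default_rng(0)
a1 = rng.normal(0.0, 1.0, (200, 4))    # two segments from "speaker A"
a2 = rng.normal(0.0, 1.0, (200, 4))
b = rng.normal(4.0, 1.0, (200, 4))     # one segment from "speaker B"
ubm = fit(np.vstack([a1, a2, b]))      # pooled background model

clr_same = clr(a1, a2, ubm)            # high: the clustering stage merges these first
clr_diff = clr(a1, b, ubm)             # low: kept as separate speakers
```

An agglomerative clustering stage would repeatedly merge the pair with the highest CLR until no pair exceeds a stopping threshold.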
Abstract:
Small animal fracture models have gained increasing interest in fracture healing studies. To achieve standardized and defined study conditions, various variables must be carefully controlled when designing fracture healing experiments in mice or rats. The strain, age and sex of the animals may influence the process of fracture healing. Furthermore, the choice of the fracture fixation technique depends on the questions addressed, whereby intra- and extramedullary implants as well as open and closed surgical approaches may be considered. During the last few years, a variety of different, highly sophisticated implants for fracture fixation in small animals have been developed. Rigid fixation with locking plates or external fixators results in predominantly intramembranous healing in both mice and rats. Locking plates, external fixators, intramedullary screws, the locking nail and the pin-clip device allow different degrees of stability resulting in various amounts of endochondral and intramembranous healing. The use of common pins that do not provide rotational and axial stability during fracture stabilization should be discouraged in the future. Analyses should include at least biomechanical and histological evaluations, even if the focus of the study is directed towards the elucidation of molecular mechanisms of fracture healing using the widely available spectrum of antibodies and gene-targeted animals. This review discusses distinct requirements for the experimental setups as well as the advantages and pitfalls of the different fixation techniques in rats and mice.
Abstract:
Polynomial models are shown to simulate accurately the quadratic and cubic nonlinear interactions (e.g. higher-order spectra) of time series of voltages measured in Chua's circuit. For circuit parameters resulting in a spiral attractor, bispectra and trispectra of the polynomial model are similar to those from the measured time series, suggesting that the individual interactions between triads and quartets of Fourier components that govern the process dynamics are modeled accurately. For parameters that produce the double-scroll attractor, both measured and modeled time series have small bispectra, but nonzero trispectra, consistent with higher-than-second order nonlinearities dominating the chaos.
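The bispectrum at the heart of this comparison, B(f1, f2) = E[X(f1) X(f2) X*(f1+f2)], can be estimated directly by averaging over signal segments. The sketch below applies such an estimator to a synthetic quadratically phase-coupled signal (not Chua's circuit data; all parameters are invented for illustration) and shows the peak at the coupled frequency pair.

```python
import numpy as np

def bispectrum(x, nfft):
    """Direct (segment-averaged) bispectrum estimate B[k1, k2] = E[X(k1) X(k2) X*(k1+k2)]."""
    frames = x[: len(x) // nfft * nfft].reshape(-1, nfft)
    X = np.fft.rfft(frames, axis=1)
    n = X.shape[1] // 2                      # keep k1 + k2 inside the band
    B = np.zeros((n, n), dtype=complex)
    for k1 in range(n):
        for k2 in range(n):
            B[k1, k2] = np.mean(X[:, k1] * X[:, k2] * np.conj(X[:, k1 + k2]))
    return B

rng = np.random.default_rng(1)
nfft, nseg = 128, 64
t = np.arange(nfft)
frames = []
for _ in range(nseg):
    p1, p2 = rng.uniform(0.0, 2.0 * np.pi, 2)
    # tones at bins 8, 12 and 20 = 8 + 12, with phases p1, p2 and p1 + p2:
    # a quadratically phase-coupled triad, so B should peak at (8, 12)
    s = (np.cos(2 * np.pi * 8 * t / nfft + p1)
         + np.cos(2 * np.pi * 12 * t / nfft + p2)
         + np.cos(2 * np.pi * 20 * t / nfft + p1 + p2))
    frames.append(s + 0.1 * rng.normal(size=nfft))
B = np.abs(bispectrum(np.concatenate(frames), nfft))
```

If the tone at bin 20 instead had an independent random phase, the triple product would average towards zero, which is exactly the distinction between coupled and uncoupled nonlinear interactions the paper exploits.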
Abstract:
Overall, computer models and simulations have a rather disappointing record within the management sciences as a tool for predicting the future. Social and market environments can be influenced by an overwhelming number of variables, and it is therefore difficult to use computer models to make forecasts or to test hypotheses concerning the relationship between individual behaviours and macroscopic outcomes. At the same time, however, advocates of computer models argue that they can be used to overcome the human mind's inability to cope with several complex variables simultaneously or to understand concepts that are highly counterintuitive. This paper seeks to bridge the gap between these two perspectives by suggesting that management research can indeed benefit from computer models by using them to formulate fruitful hypotheses.
Abstract:
We consider a robust filtering problem for uncertain discrete-time, homogeneous, first-order, finite-state hidden Markov models (HMMs). The class of uncertain HMMs considered is described by a conditional relative entropy constraint on measures perturbed from a nominal regular conditional probability distribution given the previous posterior state distribution and the latest measurement. Under this class of perturbations, a robust infinite horizon filtering problem is first formulated as a constrained optimization problem before being transformed via variational results into an unconstrained optimization problem; the latter can be elegantly solved using a risk-sensitive information-state based filtering.
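The robust filter is built on top of the standard HMM posterior recursion. As a point of reference, here is a minimal numpy sketch of the nominal (unperturbed) forward filter that the relative-entropy-constrained, risk-sensitive solution generalises; the transition and emission matrices are hypothetical.

```python
import numpy as np

def hmm_filter(A, B, pi, obs):
    """Nominal forward filter for a finite-state HMM.

    A[i, j] = P(x_{k+1} = j | x_k = i), B[i, y] = P(y_k = y | x_k = i).
    Returns the posterior P(x_k | y_1..y_k) at each step k.
    """
    post = pi * B[:, obs[0]]
    post /= post.sum()
    history = [post]
    for y in obs[1:]:
        pred = post @ A          # one-step (Chapman-Kolmogorov) prediction
        post = pred * B[:, y]    # measurement update (Bayes rule)
        post /= post.sum()       # normalise
        history.append(post)
    return np.array(history)

A = np.array([[0.9, 0.1], [0.2, 0.8]])    # transition matrix (hypothetical)
B = np.array([[0.8, 0.2], [0.3, 0.7]])    # emission probabilities (hypothetical)
pi = np.array([0.5, 0.5])
posts = hmm_filter(A, B, pi, [0, 0, 1, 1, 1])
```

In the robust formulation, the conditional distribution used in each update step is allowed to deviate from this nominal one within a relative-entropy ball, and the filter is chosen to perform well against the worst such deviation.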
Abstract:
A time series method for the determination of combustion chamber resonant frequencies is outlined. This technique employs the use of Markov-chain Monte Carlo (MCMC) to infer parameters in a chosen model of the data. The development of the model is included and the resonant frequency is characterised as a function of time. Potential applications for cycle-by-cycle analysis are discussed and the bulk temperature of the gas and the trapped mass in the combustion chamber are evaluated as a function of time from resonant frequency information.
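As an illustration of the general approach (not the paper's specific model), a resonant frequency can be inferred from a noisy, decaying pressure trace with a simple random-walk Metropolis sampler; every number below is invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "in-cylinder pressure" record: one resonant mode in noise
# (frequency, decay rate and noise level are all hypothetical).
t = np.linspace(0.0, 0.02, 400)                    # 20 ms observation window
f_true = 6000.0                                    # resonant frequency, Hz
y = np.sin(2 * np.pi * f_true * t) * np.exp(-100 * t) + 0.3 * rng.normal(size=t.size)

def log_post(f, sigma=0.3):
    """Log-posterior of the frequency under a flat prior and Gaussian noise."""
    model = np.sin(2 * np.pi * f * t) * np.exp(-100 * t)
    return -0.5 * np.sum((y - model) ** 2) / sigma ** 2

# Coarse grid scan to initialise (the likelihood is sharply peaked in f),
# then random-walk Metropolis to refine and quantify uncertainty.
grid = np.arange(4000.0, 8000.0, 25.0)
f = grid[np.argmax([log_post(g) for g in grid])]
lp, samples = log_post(f), []
for _ in range(3000):
    prop = f + rng.normal(0.0, 5.0)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:       # Metropolis accept rule
        f, lp = prop, lp_prop
    samples.append(f)
f_hat = np.mean(samples[500:])                     # posterior mean after burn-in
```

In the paper's setting the inferred frequency then feeds the cylinder acoustics relation to recover bulk gas temperature and trapped mass cycle by cycle.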
Abstract:
Emergency Health Services (EHS), encompassing hospital-based Emergency Departments (ED) and pre-hospital ambulance services, are a significant and high profile component of Australia’s health care system and congestion of these, evidenced by physical overcrowding and prolonged waiting times, is causing considerable community and professional concern. This concern relates not only to Australia’s capacity to manage daily health emergencies but also to the ability to respond to major incidents and disasters. EHS congestion is a result of the combined effects of increased demand for emergency care, increased complexity of acute health care, and blocked access to ongoing care (e.g. inpatient beds). Despite this conceptual understanding, there is a lack of robust evidence to explain the factors driving increased demand, or how demand contributes to congestion, and therefore public policy responses have relied upon limited or unsound information. The Emergency Health Services Queensland (EHSQ) research program proposes to determine the factors influencing the growing demand for emergency health care and to establish options for alternative service provision that may safely meet patients’ needs. The EHSQ study is funded by the Australian Research Council (ARC) through its Linkage Program and is supported financially by the Queensland Ambulance Service (QAS). This monograph is part of a suite of publications based on the research findings that examines the existing literature and current operational context. Literature was sourced using standard search approaches and a range of databases as well as a selection of articles cited in the reviewed literature. Public sources including the Australian Institute of Health and Welfare (AIHW), the Council of Ambulance Authorities (CAA) Annual Reports, Australian Bureau of Statistics (ABS) and Department of Health and Ageing (DoHA) were examined for trend data across Australia.
A longitudinal study of corporate earnings guidance in Australia’s continuous disclosure environment
Abstract:
Since the introduction of a statutory‐backed continuous disclosure regime (CDR) in 1994, regulatory reforms have significantly increased litigation risk in Australia for failure to disclose material information or for false and misleading disclosure. However, there is almost no empirical research on the impact of the reforms on corporate disclosure behaviour. Motivated by the absence of research and using management earnings forecasts (MEFs) as a disclosure proxy, this study examines (1) why managers issue earnings forecasts, (2) what firm‐specific factors influence MEF characteristics, and (3) how MEF behaviour changes as litigation risk increases. Based on theories in information economics, a theoretical framework for MEF behaviour is formulated which includes antecedent influencing factors related to firms’ internal and external environments. Applying this framework, hypotheses are developed and tested using multivariate models and a large sample of hand-collected MEFs (7,213) issued by top 500 ASX-listed companies over the 1994 to 2008 period. The results reveal strong support for the hypotheses. First, MEFs are issued to reduce information asymmetry and litigation risk, and to signal superior performance. Second, firms with better financial performance, smaller earnings changes, and lower operating uncertainty provide better quality MEFs. Third, forecast frequency and quality (accuracy, timeliness and precision) noticeably improve as litigation risk increases. However, managers appear to be still reluctant to disclose earnings forecasts when there are large earnings changes, and an asymmetric treatment of news type continues to prevail (a good news bias). Thus, the findings generally provide support for the effectiveness of the CDR regulatory reforms in improving disclosure behaviour and will be valuable to market participants and corporate regulators in understanding the implications of management forecasting decisions and areas for further improvement.
Abstract:
The objective of this thesis is to investigate whether the corporate governance practices adopted by Chinese listed firms are associated with the quality of earnings information. Based on a review of agency and institutional theory, this study develops hypotheses that predict the monitoring effectiveness of the board and the audit committee. Using a combination of univariate and multivariate analyses, the association between corporate governance mechanisms and earnings management is tested from 2004 to 2008. Through analysing the empirical results, a number of findings are summarised below. First, board independence is weakened by the introduction of government officials as independent directors on the boards. Government officials acting as independent directors claim that they meet the definition of independent director set by the regulation. However, they have some connection with the State, which is the controlling shareholder in listed SOE-affiliated companies. Consequently, the effect of the independent director’s expertise in constraining earnings management is mitigated, as demonstrated by an insignificant association between board expertise and earnings management. An alternative explanation for the inefficiency of board independence may point to the pre-selection of independent directors by the powerful CEO. It is argued that a CEO can manipulate the board composition and choose the "desirable" independent directors to monitor themselves. Second, a number of internal mechanisms, such as board size, board activities, and the separation of the roles of the CEO and chair are found to be significantly associated with discretionary accruals. This result suggests that there are advantages in having a large and active board in the Chinese setting. This can offset the disadvantages associated with large boards, such as increased bureaucracy, and hence, increase the constraining effects of a large and resourceful board.
Third, factor analysis identifies two factors: CEO power and board power. CEO power is the factor which consists of CEO duality and turnover, and board power is composed of board size and board activity. The results of CEO power show that if a Chinese listed company has CEO duality and turnover at the same time, it is more likely to have a high level of earnings management. The significant and negative relationship between board power and accruals indicates that large boards with frequent meetings can be associated with a low level of earnings management. Overall, the factor analysis suggests that certain governance mechanisms complement each other to become more efficient monitors of opportunistic earnings management. A combination of board characteristics can increase the negative association with earnings management. Fourth, the insignificant results between audit committees and earnings management in Chinese listed firms suggest that the Chinese regulator should strengthen the audit committee functions. This thesis calls for listed firms to disclose more information on audit committee composition and activities, which can facilitate future research on the Chinese audit committee’s monitoring role. Fifth, the interactive results between State ownership and board characteristics show that dominant State ownership has a moderating effect on board monitoring power as the State totally controls 42% of the issued shares. The high percentage of State ownership makes it difficult for the non-controlling institutional shareholders to challenge the State’s dominant status. As a result, the association between non-controlling institutional ownership and earnings management is insignificant in most situations. Lastly, firms audited by the international Big4 have lower abnormal accruals than firms audited by domestic Chinese audit firms.
In addition, the inverse U-shape relationship between audit tenure and earnings quality demonstrates the changing effects of audit quality after a certain period of appointment. Furthermore, this thesis finds that listing on the Hong Kong Stock Exchange can be an alternative governance mechanism that disciplines Chinese firms to follow strict Hong Kong listing requirements. Management of Hong Kong listed companies is exposed to the scrutiny of international investors and Hong Kong regulators. This in turn reduces their chances of conducting self-interested earnings manipulation. This study is designed to fill the gap in governance literature in China that is related to earnings management. Previous research on corporate governance mechanisms and earnings management in China is not conclusive. The current research builds on previous literature and provides some meaningful implications for practitioners, regulators, academics, and international investors who have investment interests in a transitional country. The findings of this study contribute to corporate governance and earnings management literature in the context of the transitional economy of China. The use of alternative measures for earnings management yields similar results compared with the accruals models and produces additional findings.
Abstract:
Language Modeling (LM) has been successfully applied to Information Retrieval (IR). However, most of the existing LM approaches only rely on term occurrences in documents, queries and document collections. In traditional unigram based models, terms (or words) are usually considered to be independent. In some recent studies, dependence models have been proposed to incorporate term relationships into LM, so that links can be created between words in the same sentence, and term relationships (e.g. synonymy) can be used to expand the document model. In this study, we further extend this family of dependence models in the following two ways: (1) Term relationships are used to expand the query model instead of the document model, so that the query expansion process can be naturally implemented; (2) We exploit more sophisticated inferential relationships extracted with Information Flow (IF). Information Flow relationships are not simply pairwise term relationships as those used in previous studies, but hold between a set of terms and another term. They allow for context-dependent query expansion. Our experiments conducted on TREC collections show that we can obtain large and significant improvements with our approach. This study shows that LM is an appropriate framework to implement effective query expansion.
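The query-side expansion idea can be sketched in a few lines. The toy collection, the relationship table standing in for Information Flow relations, and all weights below are invented for illustration; documents are scored by cross-entropy against a Dirichlet-smoothed document model.

```python
import math
from collections import Counter

docs = {
    "d1": "the cat sat on the mat".split(),
    "d2": "dogs and cats are pets".split(),
    "d3": "stock markets fell sharply".split(),
}
relations = {"cat": {"cats": 0.5, "pets": 0.2}}   # term -> related terms with weights

coll = Counter(w for d in docs.values() for w in d)
coll_total = sum(coll.values())

def query_model(query, alpha=0.7):
    """Mix the maximum-likelihood query model with relationship-based expansion."""
    ml = Counter(query)
    qm = {w: alpha * c / len(query) for w, c in ml.items()}
    for w, c in ml.items():
        for r, weight in relations.get(w, {}).items():
            qm[r] = qm.get(r, 0.0) + (1 - alpha) * (c / len(query)) * weight
    z = sum(qm.values())
    return {w: p / z for w, p in qm.items()}       # renormalise to a distribution

def score(doc, qm, mu=10.0):
    """Negative cross-entropy with a Dirichlet-smoothed document language model."""
    tf = Counter(doc)
    return sum(p * math.log((tf[w] + mu * coll[w] / coll_total) / (len(doc) + mu))
               for w, p in qm.items())

qm = query_model(["cat"])
ranked = sorted(docs, key=lambda d: score(docs[d], qm), reverse=True)
```

With the expanded query model, d2 (which contains "cats" and "pets" but not "cat") is ranked above the unrelated d3, which a pure term-matching unigram model would not distinguish as cleanly.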
Abstract:
PURPOSE: To examine the visual predictors of falls and injurious falls among older adults with glaucoma. METHODS: Prospective falls data were collected for 71 community-dwelling adults with primary open-angle glaucoma, mean age 73.9 ± 5.7 years, for one year using monthly falls diaries. Baseline assessment of central visual function included high-contrast visual acuity and Pelli-Robson contrast sensitivity. Binocular integrated visual fields were derived from monocular Humphrey Field Analyser plots. Rate ratios (RR) for falls and injurious falls with 95% confidence intervals (CIs) were based on negative binomial regression models. RESULTS: During the one year follow-up, 31 (44%) participants experienced at least one fall and 22 (31%) experienced falls that resulted in an injury. Greater visual impairment was associated with increased falls rate, independent of age and gender. In a multivariate model, more extensive field loss in the inferior region was associated with higher rate of falls (RR 1.57, 95%CI 1.06, 2.32) and falls with injury (RR 1.80, 95%CI 1.12, 2.98), adjusted for all other vision measures and potential confounding factors. Visual acuity, contrast sensitivity, and superior field loss were not associated with the rate of falls; topical beta-blocker use was also not associated with increased falls risk. CONCLUSIONS: Falls are common among older adults with glaucoma and occur more frequently in those with greater visual impairment, particularly in the inferior field region. This finding highlights the importance of the inferior visual field region in falls risk and assists in identifying older adults with glaucoma at risk of future falls, for whom potential interventions should be targeted. KEY WORDS: glaucoma, visual field, visual impairment, falls, injury
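Behind the reported rate ratios is a count-per-exposure comparison; the crude (unadjusted) version with a large-sample confidence interval can be computed directly, before any regression adjustment. The counts below are hypothetical, not the study's data.

```python
import math

# Hypothetical falls over one person-year of follow-up per participant,
# split by extent of inferior-field loss (all numbers invented).
falls_worse, n_worse = 38, 35       # falls and participants, worse field loss
falls_better, n_better = 21, 36     # falls and participants, better field loss

rate_worse = falls_worse / n_worse          # falls per person-year
rate_better = falls_better / n_better
rr = rate_worse / rate_better               # crude rate ratio

# Large-sample 95% CI on log(RR): se = sqrt(1/a + 1/b) for Poisson counts.
se = math.sqrt(1 / falls_worse + 1 / falls_better)
lo = math.exp(math.log(rr) - 1.96 * se)
hi = math.exp(math.log(rr) + 1.96 * se)
```

The negative binomial regression used in the study generalises this by allowing extra-Poisson variation between participants and by adjusting the ratio for age, gender and the other vision measures.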
Abstract:
Photochemistry has made significant contributions to our understanding of many important natural processes as well as the scientific discoveries of the man-made world. The measurements from such studies are often complex and may require advanced data interpretation with the use of multivariate or chemometrics methods. In general, such methods have been applied successfully for data display, classification, multivariate curve resolution and prediction in analytical chemistry, environmental chemistry, engineering, medical research and industry. However, in photochemistry, by comparison, applications of such multivariate approaches were found to be less frequent although a variety of methods have been used, especially with spectroscopic photochemical applications. The methods include Principal Component Analysis (PCA; data display), Partial Least Squares (PLS; prediction), Artificial Neural Networks (ANN; prediction) and several models for multivariate curve resolution related to Parallel Factor Analysis (PARAFAC; decomposition of complex responses). Applications of such methods are discussed in this overview and typical examples include photodegradation of herbicides, prediction of antibiotics in human fluids (fluorescence spectroscopy), non-destructive in- and on-line monitoring (near infrared spectroscopy) and fast-time resolution of spectroscopic signals from photochemical reactions. It is also quite clear from the literature that the scope of spectroscopic photochemistry was enhanced by the application of chemometrics. To highlight and encourage further applications of chemometrics in photochemistry, several additional chemometrics approaches are discussed using data collected by the authors. The use of a PCA biplot is illustrated with an analysis of a matrix containing data on the performance of photocatalysts developed for water splitting and hydrogen production. 
In addition, the applications of the Multi-Criteria Decision Making (MCDM) ranking methods and Fuzzy Clustering are demonstrated with an analysis of a water quality data matrix. Other examples of topics include the application of simultaneous kinetic spectroscopic methods for prediction of pesticides, and the use of a response fingerprinting approach for classification of medicinal preparations. In general, the overview endeavours to emphasise the advantages of the chemometric interpretation of multivariate photochemical data, and an Appendix of references and summaries of common and less usual chemometrics methods noted in this work is provided. Crown Copyright © 2010.
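A PCA biplot of the kind described for the photocatalyst data reduces to an SVD of the autoscaled data matrix, with object scores and variable loadings plotted in the same plane. A minimal sketch on an invented catalyst-performance matrix (all values hypothetical):

```python
import numpy as np

# Hypothetical matrix: rows = photocatalysts, columns = measured responses.
X = np.array([
    [2.1, 0.8, 5.0],
    [1.9, 1.1, 4.8],
    [0.4, 3.2, 1.1],
    [0.5, 2.9, 1.0],
    [1.2, 2.0, 3.0],
])
Xc = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # autoscale each column

# PCA via SVD: scores locate the objects, loadings give variable directions;
# plotted together on PC1-PC2 they form the biplot.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                       # object coordinates on the PCs
loadings = Vt.T                      # variable directions
explained = s**2 / np.sum(s**2)      # fraction of variance per component
```

In a biplot, objects lying in the direction of a loading vector score highly on that variable, which is how the display separates, for example, high- and low-activity catalysts along the dominant component.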