970 results for extensive
Abstract:
With a fair share of the blame for the subprime crisis pointing to banks' extensive involvement in trading, this thesis examines three closely related issues. The first essay shows that regulatory capital arbitrage, insolvency risk, and non-interest income are all important motivations for banks to become involved in trading. The second essay supports the widely held perception that trading activities such as off-balance-sheet derivatives, securitization, and asset sales are making banks more opaque. With banks' business model changing from "originate and hold" to "originate, repackage, and sell", the last essay shows that a trading channel exists and that it has weakened the effectiveness of monetary policy transmission through banks' capital and lending channels.
Abstract:
Extensive international research points to an association between changed work arrangements, especially those commonly labelled contingent work, and adverse occupational health and safety (OHS) outcomes. Research also indicates these work arrangements have weakened or bypassed existing OHS and workers’ compensation regulatory regimes. However, there has been little if any research into how OHS inspectors perceive these issues and how they address them during workplace visits or investigations. Between 2003 and 2007, research was undertaken that entailed detailed documentary and statistical analysis, extended interviews with 170 regulatory managers and inspectors, and observational data collected while accompanying inspectors on 118 ‘typical’ workplace visits. Key findings are that inspectors responsible for a range of industries see altered work arrangements as a serious challenge, especially labour hire (agency work) and subcontracting. Though the law imposes clear obligations, inspectors identified misunderstanding, blame-shifting and poor compliance amongst parties to these arrangements. The complexity of these work arrangements also posed logistical challenges for inspectorates.
Abstract:
"Despite the 10 months of grieving for those lost in Queensland’s January floods, new evidence produced during the coronial inquest into the 22 deaths and three disappearances has revealed new shocks for the bereaved families. Brisbane Coroner’s Court yesterday was introduced to a series of high-tech Google Earth animations backed by funereal music, explaining the scope of the unfolding tragedy, which swept away husbands, wives, children and grandparents in less than three hours on the afternoon of January 10 this year. The court was also told of the extensive search for human remains, of 131 kilometres of creeks and rivers from Spring Bluff to Brisbane and hundreds of dams that were searched three times by police divers, 250 army personnel and 200 police."
Abstract:
Australia’s ecosystems are the basis of our current and future prosperity, and our national well-being. A strong and sustainable Australian ecosystem science enterprise is vital for understanding and securing these ecosystems in the face of current and future challenges. This Plan defines the vision and key directions for a national ecosystem science capability that will enable Australia to understand and effectively manage its ecosystems for decades to come. The Plan’s underlying theme is that excellent science supports a range of activities, including public engagement, that enable us to understand and maintain healthy ecosystems. Those healthy ecosystems are the cornerstone of our social and economic well-being. The vision guiding the development of this Plan is that in 20 years’ time the status of Australian ecosystems and how they change will be widely reported and understood, and the prosperity and well-being they provide will be secure. To enable this, Australia’s national ecosystem science capability will be coordinated, collaborative and connected. The Plan is based on an extensive set of collaboratively generated proposals from national town hall meetings that also form the basis for its implementation. Some directions within the Plan are for the Australian ecosystem science community itself to implement; others will involve the users of ecosystem science and the groups that fund ecosystem science. We identify six equal-priority areas for action to achieve our vision: (i) delivering maximum impact for Australia: enhancing relationships between scientists and end-users; (ii) supporting long-term research; (iii) enabling ecosystem surveillance; (iv) making the most of data resources; (v) inspiring a generation: empowering the public with knowledge and opportunities; (vi) facilitating coordination, collaboration and leadership. This shared vision will enable us to consolidate our current successes, overcome remaining barriers and establish the foundations to ensure Australian ecosystem science delivers for the future needs of Australia.
Abstract:
In recent years, research aimed at identifying and relating the antecedents and consequences of diffusing organizational practices/ideas has turned its attention to debating the international adoption and implementation of the Anglo-American model of corporate governance, i.e., a shareholder-value orientation (SVO). While financial economists characterize the adoption of an SVO as necessary and performance-enhancing, behavioral scientists have disputed such claims, invoking institutional contingencies in the appropriateness of an SVO. Our study seeks to provide some resolution to the debate by developing an overarching socio-political perspective that links the antecedents and consequences of the adoption of the contested practice of SVO. We test our framework using extensive longitudinal data from 1992-2006 on the largest listed corporations in the Netherlands, and we find a negative relationship between SVO adoption and subsequent firm performance, although this effect is attenuated when accompanied by greater SVO-alignment among major owners and a firm’s visible commitment to an SVO. This study extends prior research on the diffusion of contested organizational practices that has taken a socio-political perspective by offering an original contingency perspective that addresses how and why the misaligned preferences of corporate owners will affect (i) a company’s inclination to espouse an SVO, and (ii) the performance consequences of such misalignment. This study suggests that when board members are considering the adoption of new ideas/practices (e.g., an SVO), they should consider the contextual fit of the idea/practice with the firm’s owners and their interests.
Abstract:
Descriptions of patients' injuries are recorded in narrative text form by hospital emergency departments. For statistical reporting, this text data needs to be mapped to pre-defined codes. Existing research in this field uses the Naïve Bayes probabilistic method to build classifiers for mapping. In this paper, we focus on providing guidance on the selection of a classification method. We build a number of classifiers belonging to different classification families, such as decision tree, probabilistic, neural network, instance-based, ensemble-based and kernel-based linear classifiers. Extensive pre-processing is carried out to ensure the quality of the data and, hence, the quality of the classification outcome. Records with a null entry in the injury description are removed. Misspellings are corrected by finding and replacing the misspelt word with a sound-alike word. Meaningful phrases are identified and kept intact, rather than having parts of them removed as stop words. Abbreviations that appear in many entry forms are manually identified and normalised to a single form. Clustering is utilised to discriminate between non-frequent and frequent terms. This process reduced the number of text features dramatically, from about 28,000 to 5,000. The medical narrative text injury dataset under consideration is composed of many short documents. The data can be characterized as high-dimensional and sparse, i.e., few features are irrelevant but features are correlated with one another. Therefore, matrix factorization techniques such as Singular Value Decomposition (SVD) and Non-negative Matrix Factorization (NNMF) have been used to map the processed feature space to a lower-dimensional feature space, and classifiers have been built on these reduced feature spaces. In experiments, a set of tests is conducted to establish which classification method is best for medical text classification. Non-negative Matrix Factorization combined with a Support Vector Machine achieves 93% precision, which is higher than all the tested traditional classifiers. We also found that TF/IDF weighting, which works well for long-text classification, is inferior to binary weighting in short-document classification. Another finding is that the top-n terms should be removed only in consultation with medical experts, as their removal affects classification performance.
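To make the best-performing configuration concrete, here is a minimal sketch of a binary-weighted bag-of-words, NNMF dimensionality reduction and linear SVM pipeline, assuming Python with scikit-learn; the injury narratives and codes below are invented placeholders, not the paper's dataset, and the sketch makes no claim to reproduce the reported 93% precision.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import NMF
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Hypothetical stand-ins for the (non-public) injury narratives and codes.
texts = [
    "fell from ladder fractured wrist",
    "burn to hand from hot oil",
    "slipped on wet floor bruised hip",
    "scald from boiling water to forearm",
]
labels = ["fall", "burn", "fall", "burn"]

pipeline = make_pipeline(
    CountVectorizer(binary=True),          # binary weighting, found to beat TF/IDF on short narratives
    NMF(n_components=2, random_state=0),   # NNMF projection to a low-dimensional feature space
    LinearSVC(),                           # linear support vector machine on the reduced features
)
pipeline.fit(texts, labels)
print(pipeline.predict(["fell down stairs fractured ankle"]))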
Abstract:
This paper presents an algorithm for mining unordered embedded subtrees using the balanced-optimal-search canonical form (BOCF). An enumeration approach based on a tree-structure-guided scheme is defined using BOCF for systematically enumerating only the valid subtrees. Based on this canonical form and enumeration technique, the balanced optimal search embedded subtree mining algorithm (BEST) is introduced for mining embedded subtrees from a database of labelled rooted unordered trees. Extensive experiments on both synthetic and real datasets demonstrate the efficiency of BEST over SLEUTH and U3, the two state-of-the-art algorithms for mining embedded unordered subtrees.
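BOCF itself is not defined in the abstract, so as a flavour of what any canonical form for labelled rooted unordered trees must achieve, the following Python sketch uses a generic depth-first encoding that sorts sibling encodings; it is a hypothetical stand-in, not BEST's actual BOCF.

# Generic canonical encoding for labelled rooted unordered trees: two trees
# receive the same string exactly when they are isomorphic as unordered
# trees, so sibling order cannot produce duplicate candidate subtrees.
def canonical(label, children):
    # children is a list of (label, children) pairs; sorting the child
    # encodings removes any dependence on the order siblings are stored in.
    return label + "(" + ",".join(sorted(canonical(l, c) for l, c in children)) + ")"

# The tree a/{b, c/{d}} stored with its children in two different orders:
t1 = ("a", [("b", []), ("c", [("d", [])])])
t2 = ("a", [("c", [("d", [])]), ("b", [])])
assert canonical(*t1) == canonical(*t2)  # same canonical string
print(canonical(*t1))                    # -> a(b(),c(d()))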
Abstract:
Purpose: The purpose of this paper is to review, critique and develop a research agenda for the Elaboration Likelihood Model (ELM). The model was introduced by Petty and Cacioppo over three decades ago and has since been modified, revised and extended. Given modern communication contexts, it is appropriate to question the model’s validity and relevance. Design/methodology/approach: The authors develop a conceptual approach, based on a comprehensive and extensive review and critique of the ELM and its development since its inception. Findings: This paper focuses on major issues concerning the ELM, including the model’s assumptions and its descriptive nature; continuum questions, multi-channel processing and mediating variables; the need to replicate the ELM; and recommendations for its future development. Research limitations/implications: This paper offers a series of questions as research implications: whether the ELM could or should be replicated or extended, whether argument quality could be conceptualized more fully, how movement along the continuum and between central and peripheral routes to persuasion can be explained, and whether new methodologies and technologies can help us better understand consumer thinking and behaviour. All of these relate to the current need to explore the relevance of the ELM in a more modern context. Practical implications: It is time to question the validity and relevance of the ELM. The diversity of on- and off-line media options and the variants of consumer choice raise significant issues. Originality/value: While the ELM continues to be widely cited and taught as one of the major cornerstones of persuasion, questions are raised concerning its relevance and validity in 21st-century communication contexts.
Abstract:
Carbon nanoscrolls (CNSs) are carbon-based nanomaterials similar to carbon nanotubes (CNTs), but they are not widely studied in spite of their great potential applications. Their practical applications are hindered by the challenging fabrication of CNSs. A physical approach has been proposed recently to fabricate a CNS by rolling up a monolayer graphene nanoribbon (GNR) around a CNT, driven by the interaction energy between them. In this study, we perform extensive molecular dynamics (MD) simulations to investigate the various factors that affect the formation of the CNS from the GNR. Our simulation results show that the formation of the CNS is sensitive to the length of the CNT and to temperature. When the GNR is functionalized with hydrogen, the formation of the CNS is determined by the density and distribution of the hydrogen atoms. Graphyne, an allotrope of graphene, is inferior to graphene in the formation of the CNS owing to its weaker bonds and the associated smaller atom density. The mechanism behind the rolling of the GNR into a CNS lies in the balance between the GNR–CNT van der Waals (vdW) interactions and the strain energy of the GNR. The present work reveals important new insights and provides useful guidelines for the fabrication of CNSs.
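The vdW/strain-energy balance invoked in the abstract is often sketched in continuum terms roughly as follows (a hedged sketch, not the paper's own derivation), with D the bending stiffness of graphene, R the scroll radius, A the rolled ribbon area, and \gamma the GNR-CNT adhesion energy per unit area:

\[
  \Delta E \;\approx\; \underbrace{\frac{D}{2R^{2}}\,A}_{\text{bending (strain) cost}}
  \;-\; \underbrace{\gamma\,A_{\text{contact}}}_{\text{vdW adhesion gain}} ,
\]

Rolling proceeds while \Delta E < 0, consistent with the reported sensitivity to CNT length (more contact area) and the weaker performance of graphyne (lower atom density, hence lower adhesion relative to the bending cost).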
Abstract:
Increasing train speeds is conceptually a simple and straightforward method of expanding railway capacity, in comparison, for example, to other more extensive and elaborate alternatives. In this article an analytical capacity model is investigated as a means of performing a sensitivity analysis of train speeds. The results of this sensitivity analysis can help improve the operation of a railway system and help it cope with additional demand in the future. To test our approach, a case study of the Rah Ahane Iran (RAI) national railway network has been selected. The absolute capacity levels for this railway network have been determined, and the analysis shows that increasing train speeds may not be entirely cost effective in all circumstances.
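The abstract does not reproduce the analytical model, but the shape of such a speed sensitivity analysis can be illustrated with a toy fixed-block headway model in Python; all parameter values below are invented for illustration and are not RAI figures.

# Toy illustration of a speed sensitivity analysis on line capacity.
# This is NOT the paper's analytical model: it uses a simple
# fixed-block headway approximation with assumed parameters.
def trains_per_hour(speed_kmh, block_km=2.0, train_km=0.5, margin_s=60.0):
    """Capacity of one track section under a fixed-block headway model."""
    v = speed_kmh / 3.6                                   # speed in m/s
    headway_s = (block_km + train_km) * 1000.0 / v + margin_s
    return 3600.0 / headway_s

for v in (60, 80, 100, 120, 140):
    print(f"{v:>3} km/h -> {trains_per_hour(v):.1f} trains/h")

# Diminishing returns: each extra 20 km/h adds less capacity than the
# last, echoing the finding that raising speeds may not always be
# cost effective.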
Abstract:
Many governments in western democracies conduct the work of leading their societies forward through policy generation and implementation. Despite government attempts at extensive negotiation, collaboration and debate, the general populace in these same countries frequently express feelings of disempowerment and undue pressure to be compliant, often leading to disengagement. Here we outline Plan B: a process for examining how policies that emerge from good intentions are frequently interpreted as burdensome or irrelevant by those on whom they have an impact. Using a case study of professional standards for teachers in Australia, we describe how we distilled Foucault’s notions of archaeology into a research approach centring on the creation of ‘polyhedrons of intelligibility’ as an alternative approach by which both policy makers and those affected by their policies may understand how their respective causes are supported and adversely affected.
Abstract:
As a key element in their response to new media forcing transformations in mass media and media use, newspapers have deployed various strategies not only to establish online and mobile products and develop healthy business plans, but also to set out to become dominant portals. Their response to change was the subject of an early investigation by one of the present authors (Keshvani 2000). That was part of a set of short studies inquiring into what impact new software applications and digital convergence might have on journalism practice (Tickle and Keshvani 2000), and also looking for demonstrations of the way that innovations, technologies and protocols then under development might produce a “wireless, streamlined electronic news production process” (Tickle and Keshvani 2001). The newspaper study compared the online products of The Age in Melbourne and the Straits Times in Singapore. It provided an audit of the Singapore and Australia Information and Communications Technology (ICT) climate, concentrating on the state of development of carrier networks as a determining factor in the potential strength of the two services in their respective markets. In the outcome, contrary to initial expectations, the early cable roll-out and extensive ‘wiring’ of the city in Singapore had not produced a level of uptake of Internet services as strong as that achieved in Melbourne by more ad hoc and varied strategies. By interpretation, while news websites and online content were at an early stage of development everywhere, and much the same as one another, no determining structural imbalance existed to separate these leading media participants in Australia and South-east Asia. The present research revisits that situation by again studying the online editions of the two large newspapers in the original study, together with one other, The Courier Mail (recognising the diversification of types of product in this field by including it as a representative of Newscorp, now a major participant). The inquiry works through the principle of comparison. It is an exercise in qualitative, empirical research that establishes a comparison between the situation in 2000, as described in the earlier work, and the situation in 2014, after a decade of intense development in digital technology affecting the media industries. It is in that sense a follow-up to the earlier work, although this time giving emphasis to the content and style of the actual products as experienced by their users. It compares the online and print editions of each of these three newspapers; then the three mastheads as print and online entities, among themselves; and finally one against the other two, as representing a South-east Asian model and Australian models. This exercise is accompanied by a review of literature on the developments in ICT affecting media production and media organisations, to establish the changed context. The new study of the online editions was conducted as a systematic appraisal of the first-level, or principal, screens of the three publications over the course of six days (10-15.2.14 inclusive). For this, categories for analysis were developed through a preliminary examination of the products over three days in the week before.
That process identified significant elements of media production, such as: variegated sourcing of materials; randomness in the presentation of items; differential production values among the media platforms considered, whether text, video or still images; the occasional repurposing and repackaging of top news stories of the day; and the presence of standard news values – once again drawn out of the trial ‘bundle’ of journalistic items. Reduced in this way, the online artefacts become comparable with the companion print editions from the same days. The categories devised and then used in the appraisal of the online products have been adapted to print, to give the closest match of sets of variables. This device, studying the two sets of publications on like standards (essentially production values and news values), has enabled the comparisons to be made. This comparison of the online and print editions of each of the three publications was set up as the first step in the investigation. In recognition of the nature of the artefacts, as ones that carry very diverse information by subject and level of depth and involve heavy creative investment in the formulation and presentation of the information, the assessment also includes an open section for interpreting and commenting on main points of comparison. This takes the form of a text field for the insertion of notes in the table employed for summarising the features of each product for each day. Once the sets of comparisons outlined above are noted, the process becomes interpretative, guided by the notion of change. In the context of changing media technology and publication processes, what substantive alterations have taken place in the overall effort of news organisations in the print and online fields since 2001, and in their print and online products separately? Have they diverged or continued along similar lines? The remaining task is to begin to make inferences from that. Will the examination of findings support the proposition that a review of the earlier study, and a forensic review of new models, provides evidence of the character and content of change, especially change in journalistic products and practice? Will it permit an authoritative description of the essentials of such change in products and practice? Will it permit generalisation, and provide a reliable base for discussion of the implications of change and future prospects? Preliminary observations suggest a more dynamic and diversified product has been developed in Singapore: well themed, and obviously sustained by public commitment and habituation to diversified online and mobile media services. The Australian products suggest a concentrated corporate and journalistic effort and deployment of resources, with a strong market focus, but less settled and ordered, and showing signs of the limitations imposed by the delay in establishing a uniform, large broadband network. The scope of the study is limited. It is intended to test, and take advantage of, the original study as evidentiary material from the early days of newspaper companies’ experimentation with online formats. Both are small studies. The key opportunity for discovery lies in the ‘time capsule’ factor: the availability of well-gathered and processed information on major newspaper company production at the threshold of a transformational decade of change in the industry. The comparison stands to identify key changes.
It should also be useful as a reference for further inquiries of the same kind, and for monitoring the situation of newspaper portals online into the future.
Abstract:
To enhance the efficiency of regression parameter estimation by modeling the correlation structure of correlated binary error terms in quantile regression with repeated measurements, we propose a Gaussian pseudolikelihood approach for estimating correlation parameters and selecting the most appropriate working correlation matrix simultaneously. The induced smoothing method is applied to estimate the covariance of the regression parameter estimates, which can bypass density estimation of the errors. Extensive numerical studies indicate that the proposed method performs well in selecting an accurate correlation structure and improving regression parameter estimation efficiency. The proposed method is further illustrated by analyzing a dental dataset.
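The abstract does not state the objective function; one common form of Gaussian pseudolikelihood for estimating the correlation parameter \alpha and comparing candidate working correlation structures R(\alpha) (a sketch that may differ from the paper's exact criterion) is, with z_i the standardized residual vector of subject i,

\[
  \ell_G(\alpha) \;=\; -\frac{1}{2} \sum_{i=1}^{n}
  \left( \log \lvert R(\alpha) \rvert + z_i^{\top} R(\alpha)^{-1} z_i \right),
\]

maximized over \alpha within each candidate structure, with the structure attaining the largest maximized value selected as the working correlation matrix.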
Abstract:
Monitoring the environment with acoustic sensors is an effective method for understanding changes in ecosystems. Through extensive monitoring, large-scale, ecologically relevant datasets can be produced to inform environmental policy. The collection of acoustic sensor data is a solved problem; the current challenge is the management and analysis of raw audio data to produce useful datasets for ecologists. This paper presents the applied research we use to analyze big acoustic datasets. Its core contribution is the presentation of practical large-scale acoustic data analysis methodologies. We describe details of the data workflows we use to give both citizen scientists and researchers practical access to large volumes of ecoacoustic data. Finally, we propose a work-in-progress large-scale architecture for analysis, driven by a hybrid cloud-and-local production-grade website.
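As an illustration of the segment-and-summarize pattern such workflows typically follow (a minimal sketch only; the paper's actual indices, chunk sizes and storage layers are not specified in the abstract, and a synthetic numpy signal stands in for real recordings):

# Sketch of a segment-and-summarize acoustic workflow: split a long
# recording into one-minute chunks and reduce each chunk to a single
# summary statistic suitable for large-scale ecological datasets.
import numpy as np

RATE = 22050                            # samples per second (assumed)
audio = np.random.randn(RATE * 600)     # 10 minutes of stand-in "audio"

chunk = RATE * 60                       # one-minute analysis windows
for i, start in enumerate(range(0, len(audio), chunk)):
    segment = audio[start:start + chunk]
    spectrum = np.abs(np.fft.rfft(segment))
    # toy "index": spectral entropy, one number per minute of audio
    p = spectrum / spectrum.sum()
    entropy = -(p * np.log2(p + 1e-12)).sum()
    print(f"minute {i}: spectral entropy = {entropy:.2f}")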
Abstract:
With the increase in the complexity of engineering projects and in design quality requirements in the construction industry, the traditional two-dimensional "Information Island" approach to design is becoming less able to meet current design needs due to its lack of coordination and information sharing. Collaborative design using a Building Information Modeling (BIM) technology platform promises to provide an effective means of designing and communicating through networking and real-time data sharing. This paper first analyzes the shortcomings of the two-dimensional design process and the potential application of collaborative design. By combining the attributes of BIM, a preliminary BIM-based collaborative building design platform is developed to improve the design approach and support a more collaborative design process. A real-life case is presented to demonstrate the feasibility and validity of the platform and its use in practice. From this, it is shown that BIM has the potential to realize effective information sharing and reduce errors, thereby improving design quality. The BIM-based collaborative building design platform presented is expected to provide the support needed for the extensive application of BIM in collaborative design and to promote a new attitude to project management.