862 results for Worth


Relevance:

10.00%

Publisher:

Abstract:

This article presents a preliminary review of, and the limited evidence for, over-regulation in Australian financial services. The 1997 Wallis Report and the CLERP 6 paper led to the amendment of Ch 7 of the Corporations Act 2001 (Cth) by the Financial Services Reform Act. Nearly a decade later, the system, based upon a 'one-size-fits-all' dual-track regime and a uniform licensing regime, has greatly increased the costs of compliance. In the area of enforcement, there has not been a dramatic change in the effectiveness of the techniques applied by ASIC compared with other agencies such as APRA. In particular, there are clear economic arguments, as well as international experience, suggesting that a single financial services regulator is more effective than the multi-layered approach adopted in Australia. Finally, in the superannuation area of financial services, which is worth A$800 billion, there is unnecessary dual licensing and duplicated regulation, with little evidence of any consumer-member benefit but at a much greater cost.

Relevance:

10.00%

Publisher:

Abstract:

In this paper I present an analysis of the language used by the National Endowment for Democracy (NED) on its website (NED, 2008). The specific focus of the analysis is the NED's high usage of the word “should”, revealed by computer-assisted corpus analysis using Leximancer. Typically we use the word “should” to propose specific courses of action for ourselves and others. It is a marker of obligation and “oughtness”. In other words, its systematic institutional use can be read as a statement of ethics, of how the NED thinks the world ought to behave. As an ostensibly democracy-promoting institution, and one with a clear agenda of implementing American foreign policy, the ethics of the NED are worth understanding. Analysis reveals a pattern of grammatical metaphor in which “should” is often deployed counter-intuitively, and sometimes ambiguously, as a truth-making tool rather than one for proposing action. The effect is to present the NED's imperatives for action as matters of fact rather than ethical or obligatory claims.
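The kind of computer-assisted frequency analysis described above is straightforward to sketch. The snippet below is purely illustrative: it does not reproduce Leximancer, and the sample text is invented. It simply counts modal-verb usage in a text, the first step of the pattern the paper interrogates.

```python
import re
from collections import Counter

def modal_frequency(text, modals=("should", "must", "ought", "shall")):
    """Count modal-verb occurrences, raw and per 1,000 words."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w in modals)
    per_thousand = {m: 1000 * counts[m] / len(words) for m in modals}
    return counts, per_thousand

# Invented sample text standing in for website copy.
sample = ("Governments should respect electoral outcomes. "
          "Civil society should be free to organise, and donors "
          "must support independent media.")
counts, rate = modal_frequency(sample)
print(counts["should"], round(rate["should"], 1))
```

A real corpus study would then examine the grammatical context of each hit, which is where the paper's finding about "should" as a truth-making tool emerges.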

Relevance:

10.00%

Publisher:

Abstract:

We review and discuss the literature on small firm growth with the intention of providing a useful vantage point for new research on this important phenomenon. We first discuss conceptual and methodological issues that represent critical choices for those who research growth and that make it challenging to compare results across previous studies. The substantive review of past research is organized into four sections representing two smaller and two larger literatures. The first of the larger literatures focuses on internal and external drivers of small firm growth. Here we find that much has been learnt and that many valuable generalizations can be made. However, we also conclude that more research of the same kind is unlikely to yield much. While interactive and non-linear effects may be worth pursuing, it is unlikely that any new and important growth drivers or strong, linear main effects will be found. The second large literature deals with organizational life-cycles or stages of development. While deservedly criticized for unwarranted determinism and weak empirics, this type of approach addresses problems of high practical and theoretical relevance, and should not be shunned by researchers. We argue that, with a change in the fundamental assumptions and improved empirical design, research on the organizational and managerial consequences of growth is an important line of inquiry. With this we overlap with one of the smaller literatures, namely studies focusing on the effects of growth. We argue that studies too often assume that growth equals success. We advocate instead treating growth as an intermediary variable that influences more fundamental goals in ways that should be carefully examined rather than assumed. The second small literature distinguishes between different modes or forms of growth, including, e.g., organic vs. acquisition-based growth, and international expansion. We note that mode of growth is an important topic that has been understudied in the growth literature, whereas in other branches of research aspects of it may have been studied intensively, but not primarily from a growth perspective. In the final section we elaborate on ways forward for research on small firm growth. We point to rich opportunities for researchers who look beyond drivers of growth, where growth is viewed as a homogeneous phenomenon assumed to unambiguously reflect success, and instead focus on growth as a process and a multi-dimensional phenomenon, as well as on how growth relates to more fundamental outcomes.

Relevance:

10.00%

Publisher:

Abstract:

Campylobacter jejuni, followed by Campylobacter coli, contributes substantially to the economic and public health burden attributed to food-borne infections in Australia. Genotypic characterisation of isolates has provided new insights into the epidemiology and pathogenesis of C. jejuni and C. coli. However, currently available methods are not conducive to the large-scale epidemiological investigations that are necessary to elucidate the global epidemiology of these common food-borne pathogens. This research aims to develop high-resolution C. jejuni and C. coli genotyping schemes that are convenient for high-throughput applications. Real-time PCR and High Resolution Melt (HRM) analysis are fundamental to the genotyping schemes developed in this study and enable rapid, cost-effective interrogation of a range of different polymorphic sites within the Campylobacter genome. While the sources and routes of transmission of campylobacters are unclear, handling and consumption of poultry meat is frequently associated with human campylobacteriosis in Australia. Therefore, chicken-derived C. jejuni and C. coli isolates were used to develop and verify the methods described in this study. The first aim of this study describes the application of MLST-SNP (Multi-Locus Sequence Typing Single Nucleotide Polymorphism) + binary typing to 87 chicken C. jejuni isolates using real-time PCR analysis. These typing schemes were developed previously by our research group using isolates from campylobacteriosis patients. The present study showed that SNP and binary typing, alone or in combination, are effective at detecting epidemiological linkage between chicken-derived Campylobacter isolates and enable data comparisons with other MLST-based investigations. SNP + binary types obtained from chicken isolates in this study were compared with a previously SNP + binary and MLST typed set of human isolates.
Common genotypes between the two collections of isolates were identified, and ST-524 represented a clone that could be worth monitoring in the chicken meat industry. In contrast, ST-48, mainly associated with bovine hosts, was abundant among the human isolates. This genotype was, however, absent from the chicken isolates, indicating the role of non-poultry sources in causing human Campylobacter infections. This demonstrates the potential application of SNP + binary typing for epidemiological investigations and source tracing. While the MLST SNPs and binary genes comprise the more stable backbone of the Campylobacter genome and are indicative of long-term epidemiological linkage of the isolates, the development of a High Resolution Melt (HRM) curve analysis method to interrogate the hypervariable Campylobacter flagellin-encoding gene (flaA) is described in Aim 2 of this study. The flaA gene product appears to be an important pathogenicity determinant of campylobacters and is therefore a popular target for genotyping, especially for short-term epidemiological studies such as outbreak investigations. HRM curve analysis based flaA interrogation is a single-step, closed-tube method that provides portable data that can be easily shared and accessed. Critical to the development of flaA HRM was the use of flaA-specific primers that did not amplify the flaB gene. HRM curve analysis flaA interrogation successfully discriminated the 47 sequence variants identified within the 87 C. jejuni and 15 C. coli isolates and correlated with the epidemiological background of the isolates. In the combinatorial format, the resolving power of flaA was additive to that of SNP + binary typing and CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) HRM, and fits the PHRANA (Progressive Hierarchical Resolving Assays using Nucleic Acids) approach to genotyping. The use of statistical methods to analyse the HRM data enhanced the sophistication of the method.
Therefore, flaA HRM is a rapid and cost-effective alternative to gel- or sequence-based flaA typing schemes. Aim 3 of this study describes the development of a novel bioinformatics-driven method, called ‘SNP Nucleated Minim MLST’ or ‘Minim typing’, to interrogate Campylobacter MLST gene fragments using HRM. The method involves HRM interrogation of MLST fragments that encompass highly informative “Nucleating SNPs” to ensure high resolution. Selection of fragments potentially suited to HRM analysis was conducted in silico using i) the “Minimum SNPs” and ii) the new ‘HRMtype’ software packages. Species-specific sets of six “Nucleating SNPs” and six HRM fragments were identified for both C. jejuni and C. coli to ensure high typeability and resolution relevant to the MLST database. ‘Minim typing’ was tested empirically by typing 15 C. jejuni and five C. coli isolates. The clonal complexes (CC) assigned to each isolate by ‘Minim typing’ and by SNP + binary typing were used to compare the two MLST interrogation schemes. The CCs linked with each C. jejuni isolate were consistent between the two methods. Thus, ‘Minim typing’ is an efficient and cost-effective method for interrogating MLST genes. However, it is not expected to be independent of, or to meet the resolution of, sequence-based MLST gene interrogation. ‘Minim typing’ in combination with flaA HRM is envisaged to comprise a highly resolving combinatorial typing scheme developed around the HRM platform that is amenable to automation and multiplexing. The genotyping techniques described in this thesis involve the combinatorial interrogation of differentially evolving genetic markers on the unified real-time PCR and HRM platform. They provide high resolution and are simple, cost-effective and ideally suited to rapid, high-throughput genotyping of these common food-borne pathogens.
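The resolving power of combinatorial typing schemes like the one described above is commonly quantified with Simpson's index of diversity (the abstract does not name the measure, and the marker profiles below are invented for illustration). The sketch shows why interrogating additional markers can only maintain or increase discrimination:

```python
from collections import Counter

def simpsons_diversity(profiles):
    """Simpson's index of diversity: probability that two isolates
    drawn at random (without replacement) have different types."""
    n = len(profiles)
    counts = Counter(profiles)
    return 1 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

# Invented marker data: (SNP profile, binary-gene profile, flaA type).
isolates = [
    (("A", "G"), (1, 0), "f1"),
    (("A", "G"), (1, 0), "f2"),
    (("A", "T"), (1, 1), "f1"),
    (("C", "G"), (0, 1), "f3"),
]
snp_only = [snp for snp, _, _ in isolates]
combined = isolates  # all three markers interrogated together

# Adding markers can only split existing types, never merge them.
print(simpsons_diversity(snp_only) <= simpsons_diversity(combined))  # → True
```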

Relevance:

10.00%

Publisher:

Abstract:

Art is most often at the margins of community life, seen as a distraction or entertainment only; an individual's whim. It is generally seen as having no useful role to play in the community. This is a perception of grown-ups; children seem to readily accept an engagement with art making. Our research has shown that when individuals are drawn into a crafted art project where they have an actual involvement with the direction and production of the art work, they become deeply engaged on multiple levels. This is true of all age groups. Artists skilled in community collaboration are able to produce art of value that transcends the usual judgements of worth. It gives people a licence to unfetter their imagination and then cooperatively be drawn back to a reachable visual solution. If you engage with children in a community, you engage the extended family at some point. The primary methodology was to produce a series of educationally valid projects at the Cherbourg State School that had a resonance in that community, then revisit and refine them where necessary and develop a new series that extended all of their positive aspects. This was done over a period of five years. The art made during this time is excellent. The children know it, as do their families, staff at the school, members of the local community and others who have viewed it in exhibitions in distant places like Brisbane and Melbourne. This art, and the way it has been made, has been acknowledged as useful by the children, teachers and the community, in educational and social terms. The school is a better place to be. The art making of the last five years has become an integral part of the way the school now operates, and its influence has begun to seep into other parts of the community. Art needs to be taken from the margins and put to work at the centre.

Relevance:

10.00%

Publisher:

Abstract:

This thesis examines advanced North American environmental mitigation schemes for their applicability to Queensland. Compensatory wetland mitigation banking, in particular, is concerned with in-perpetuity management and protection - the basic concerns of the Queensland public about its unique environment. The process has actively engaged the North American market and become a thriving industry that (for the most part) effectively designs, creates and builds (or enhances) environmental habitat. A methodology was designed to undertake a comprehensive review of the history, evolution and concepts of the North American wetland mitigation banking system - before and after the implementation of a significant new compensatory wetland mitigation banking regulation in 2008. The Delphi technique was then used to determine the principles and working components of wetland mitigation banking. The results were then applied to formulate a questionnaire reviewing Australian market-based instruments (including offsetting policies) against these North American principles. Following this, two case studies established guiding principles for implementation based on two components of the North American wetland mitigation banking program. The subsequent outcomes confirmed that environmental banking is a workable concept in North America and is worth applying in Queensland. The majority of offsetting policies in Australia have adopted some principles of the North American mitigation programs. Examination reveals, however, that they fail to provide adequate incentives for private landowners to participate because the essential trading mechanisms are not employed. Much can thus be learnt from the North American situation, where private enterprise has devised appropriate free-market concepts. The consequent environmental banking process (as adapted from the North American programs) should be implemented in Queensland, where it can then focus on engaging the private sector, which manages the majority of naturally productive lands.

Relevance:

10.00%

Publisher:

Abstract:

The construction industry plays a substantial role in a country's national economy, irrespective of the country's level of economic development. The Malaysian Government has given a much-needed boost to the country's construction projects under the 9th Malaysian Plan, where a total of 880 projects worth RM15 billion (US$4.8 billion) is to be tendered (The Star, 2006). However, Malaysia has not escaped the problems of project failure. In 2005, 17.3% of 417 Malaysian government contract projects were considered “sick”. Project procurement is one of the most important stages of project delivery. Even though ethics in project procurement has been identified as one of the contributors to project failure, it has not been systematically studied before from the client's perspective in Malaysia. The aim of this paper is to present an exploration of the ethical issues in project procurement in Malaysian public sector projects. Exploring ethical issues from the client's perspective could provide an ethical standpoint for the project life cycle and help maintain a good affiliation between clients and customers. It is expected that the findings from this review will be somewhat representative of other developing countries.

Relevance:

10.00%

Publisher:

Abstract:

Principal Topic: Venture ideas are at the heart of entrepreneurship (Davidsson, 2004). However, we are yet to learn what factors drive entrepreneurs' perceptions of the attractiveness of venture ideas, and what the relative importance of these factors is for the decision to pursue an idea. The expected financial gain is one factor that will obviously influence the perceived attractiveness of a venture idea (Shepherd & DeTienne, 2005). In addition, the degree of novelty of venture ideas along one or more dimensions, such as new products/services, new methods of production, entry into new markets/customers and new methods of promotion, may affect their attractiveness (Schumpeter, 1934). Further, according to the notion of an individual-opportunity nexus, venture ideas are closely associated with certain individual characteristics (relatedness). Shane (2000) empirically identified that an individual's prior knowledge is closely associated with the recognition of venture ideas. Sarasvathy's (2001; 2008) effectuation theory proposes a high degree of relatedness between venture ideas and the resource position of the individual. This study examines how entrepreneurs weigh considerations of different forms of novelty and relatedness, as well as potential financial gain, in assessing the attractiveness of venture ideas. Method: I use conjoint analysis to determine how expert entrepreneurs develop preferences for venture ideas involving different degrees of novelty, relatedness and potential gain. The conjoint analysis estimates respondents' preferences in terms of utilities (or part-worths) for each level of novelty, relatedness and potential gain of venture ideas. A sample of 32 expert entrepreneurs who had received young entrepreneurship awards was selected for the study. Each respondent was interviewed and presented with 32 scenarios explicating different combinations of possible profiles for their consideration.
Results and Implications: Results indicate that while the respondents do not prefer mere imitation, they derive higher utility from low to medium degrees of newness, suggesting that high degrees of newness are fraught with greater risk and/or greater resource needs. Respondents place considerable weight on alignment with the knowledge and skills they already possess in choosing a particular venture idea. The initial resource position of entrepreneurs is not equally important. Even though expected potential financial gain yields substantial utility, the results indicate that it is not a dominant factor in the attractiveness of a venture idea.
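Part-worth utilities of the kind estimated above are typically obtained by regressing stated preference ratings on dummy-coded attribute levels. A minimal sketch of that estimation step (the scenarios and ratings below are invented, not the study's data):

```python
import numpy as np

# Invented conjoint data: each row dummy-codes a scenario with two
# attributes (novelty: low/high, relatedness: low/high) plus an intercept.
# Columns: [intercept, novelty_high, relatedness_high]
X = np.array([
    [1, 0, 0],
    [1, 1, 0],
    [1, 0, 1],
    [1, 1, 1],
], dtype=float)
# The respondent's attractiveness rating for each scenario.
ratings = np.array([3.0, 4.0, 6.0, 7.0])

# Part-worth utilities are the least-squares coefficients.
part_worths, *_ = np.linalg.lstsq(X, ratings, rcond=None)
print(part_worths.round(2))  # → [3. 1. 3.]
```

Here the invented respondent gains 1 unit of utility from high novelty and 3 from high relatedness, mirroring the study's finding that relatedness carries more weight than novelty.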

Relevance:

10.00%

Publisher:

Abstract:

With regard to the long-standing problem of the semantic gap between low-level image features and high-level human knowledge, the image retrieval community has recently shifted its emphasis from low-level feature analysis to high-level image semantics extraction. User studies reveal that users tend to seek information using high-level semantics. Therefore, image semantics extraction is of great importance to content-based image retrieval because it allows users to freely express what images they want. Semantic content annotation is the basis for semantic content retrieval. The aim of image annotation is to automatically obtain keywords that can be used to represent the content of images. The major research challenges in image semantic annotation are: what is the basic unit of semantic representation? How can the semantic unit be linked to high-level image knowledge? How can contextual information be stored and utilized for image annotation? In this thesis, Semantic Web technology (i.e. ontology) is introduced to the image semantic annotation problem. The Semantic Web, the next generation of the web, aims at making the content of all types of media understandable not only to humans but also to machines. Due to the large amounts of multimedia data prevalent on the Web, researchers and industries are beginning to pay more attention to the Multimedia Semantic Web. Semantic Web technology provides a new opportunity for multimedia-based applications, but research in this area is still in its infancy. Whether ontology can be used to improve image annotation, and how best to use ontology in semantic representation and extraction, is still a worthwhile investigation. This thesis deals with the problem of image semantic annotation using ontology and machine learning techniques in four phases, as below. 1) Salient object extraction.
A salient object serves as the basic unit in image semantic extraction as it captures the common visual property of the objects. Image segmentation is often used as the first step for detecting salient objects, but most segmentation algorithms often fail to generate meaningful regions due to over-segmentation and under-segmentation. We develop a new salient object detection algorithm by combining multiple homogeneity criteria in a region merging framework. 2) Ontology construction. Since real-world objects tend to exist in a context within their environment, contextual information has been increasingly used for improving object recognition. In the ontology construction phase, visual-contextual ontologies are built from a large set of fully segmented and annotated images. The ontologies are composed of several types of concepts (i.e. mid-level and high-level concepts) and domain contextual knowledge. The visual-contextual ontologies stand as a user-friendly interface between low-level features and high-level concepts. 3) Image object annotation. In this phase, each object is labelled with a mid-level concept from the ontologies. First, a set of candidate labels is obtained by training Support Vector Machines with features extracted from salient objects. After that, contextual knowledge contained in the ontologies is used to obtain the final labels by removing ambiguous concepts. 4) Scene semantic annotation. The scene semantic extraction phase obtains the scene type by using both mid-level concepts and domain contextual knowledge in the ontologies. Domain contextual knowledge is used to create a scene configuration that describes which objects co-exist more frequently with which scene type. The scene configuration is represented in a probabilistic graph model, and probabilistic inference is employed to calculate the scene type given an annotated image.
To evaluate the proposed methods, a series of experiments was conducted on a large set of fully annotated outdoor scene images. These include a subset of the Corel database, a subset of the LabelMe dataset, the evaluation dataset of localized semantics in images, the spatial context evaluation dataset, and the segmented and annotated IAPR TC-12 benchmark.
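The final phase's inference over object/scene co-occurrence can be illustrated with a much simpler stand-in than the thesis's probabilistic graph model: a naive Bayes scene classifier over detected object labels. All scene types, objects and probabilities below are invented for illustration:

```python
from math import log

# Invented scene priors and P(object present | scene) co-occurrence table.
priors = {"beach": 0.5, "mountain": 0.5}
likelihood = {
    "beach":    {"sand": 0.9, "water": 0.8, "snow": 0.05},
    "mountain": {"sand": 0.1, "water": 0.3, "snow": 0.7},
}

def scene_posterior(objects):
    """Return the most probable scene given annotated object labels,
    scoring each scene by log prior plus log likelihood of the objects."""
    scores = {}
    for scene, prior in priors.items():
        scores[scene] = log(prior) + sum(log(likelihood[scene][o]) for o in objects)
    return max(scores, key=scores.get)

print(scene_posterior(["sand", "water"]))  # → beach
print(scene_posterior(["snow", "water"]))  # → mountain
```

The naive independence assumption is what the graph model in the thesis relaxes, but the basic idea, of contextual knowledge turning object labels into a scene type, is the same.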

Relevance:

10.00%

Publisher:

Abstract:

Data collection using Autonomous Underwater Vehicles (AUVs) is increasing in importance within the oceanographic research community. In contrast to traditional moored or static platforms, mobile sensors require intelligent planning strategies to manoeuvre through the ocean. However, the ability to navigate to high-value locations and collect data of specific scientific merit is worth the planning effort. In this study, we examine the use of ocean model predictions to determine the locations to be visited by an AUV and to aid in planning the trajectory that the vehicle executes during the sampling mission. The objectives are: a) to provide near-real-time, in situ measurements to a large-scale ocean model to increase the skill of future predictions, and b) to utilize ocean model predictions as a component in an end-to-end autonomous prediction and tasking system for aquatic, mobile sensor networks. We present an algorithm designed to generate paths for AUVs to track a dynamically evolving ocean feature using ocean model predictions. This builds on previous work in the area by incorporating the predicted current velocities into the path planning to assist in solving the 3-D motion planning problem of steering an AUV between two selected locations. We present simulation results for tracking a freshwater plume by use of our algorithm. Additionally, we present experimental results from field trials that test the skill of the model used as well as the incorporation of the model predictions into an AUV trajectory planner. These results indicate a modest, but measurable, improvement in surfacing error when the model predictions are incorporated into the planner.
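The benefit of folding predicted currents into a planner comes from how the current component along the heading changes a leg's travel time. The sketch below shows that cost term only; a uniform current and straight 2-D legs are simplifying assumptions, not the paper's full 3-D planner:

```python
import math

def leg_time(p, q, speed, current):
    """Travel time for a straight leg from p to q for a vehicle moving
    at `speed` through the water, with a uniform current vector added.
    The current component along the heading raises or lowers the
    effective ground speed."""
    dx, dy = q[0] - p[0], q[1] - p[1]
    dist = math.hypot(dx, dy)
    heading = (dx / dist, dy / dist)
    along = current[0] * heading[0] + current[1] * heading[1]
    ground_speed = speed + along
    if ground_speed <= 0:
        return math.inf  # current too strong to make headway on this leg
    return dist / ground_speed

# A favorable current shortens the leg; an opposing one lengthens it.
t_with = leg_time((0, 0), (1000, 0), speed=1.5, current=(0.5, 0.0))
t_against = leg_time((0, 0), (1000, 0), speed=1.5, current=(-0.5, 0.0))
print(t_with < t_against)  # → True
```

A planner that uses this time (rather than raw distance) as its edge cost will prefer routes that ride predicted currents, which is the intuition behind incorporating model-predicted velocities.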

Relevance:

10.00%

Publisher:

Abstract:

Modern society has come to expect electrical energy on demand, while many facilities in power systems are aging beyond repair and maintenance. The risk of failure increases as equipment ages and can pose serious consequences for continuity of electricity supply. As the equipment used in high voltage power networks is very expensive, it may not be economically feasible to purchase and store spares in a warehouse for extended periods of time. On the other hand, there is normally a significant lead time between ordering equipment and receiving it. This situation has created considerable interest in the evaluation and application of probability methods for aging plant and the provision of spares in bulk supply networks, and can be of particular importance for substations. Quantitative adequacy assessment of substation and sub-transmission power systems is generally done using a contingency enumeration approach, which includes the evaluation of contingencies and their classification based on selected failure criteria. The problem is very complex because of the need to model the operation of substation and sub-transmission equipment in detail using network flow evaluation, and to consider multiple levels of component failure. In this thesis a new model for aging equipment is developed that combines the standard treatment of random failures with a specific model for aging failures. This technique is applied to include and examine the impact of aging equipment on the reliability of bulk supply loads and distribution network consumers over a defined range of planning years. The power system risk indices depend on many factors, such as the actual physical network configuration and operation, the aging condition of the equipment, and the relevant constraints.
The contribution of individual equipment to power system risk indices in a network with aging facilities provides valuable information that helps utilities better understand network performance and the weak links in the system. In this thesis, algorithms are developed to measure the contribution of individual equipment to the power system risk indices as part of a novel risk analysis tool. A new cost-worth approach is also developed that supports early planning decisions on the replacement of non-repairable aging components, in order to maintain a level of system reliability that is economically acceptable. The concepts, techniques and procedures developed in this thesis are illustrated numerically using published test systems. It is believed that the methods and approaches presented substantially improve the accuracy of risk predictions by explicitly considering equipment entering a period of increased risk of non-repairable failure.
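Aging-failure models of the kind described above typically combine a constant random failure rate with a wear-out term such as a Weibull hazard. A minimal sketch of how an annual failure probability for a non-repairable component can be derived from such a combined hazard (the parameter values are invented for illustration, not taken from the thesis):

```python
import math

def survival(t, random_rate, beta, eta):
    """Probability a non-repairable component survives to age t (years),
    combining a constant random failure rate with a Weibull aging term
    (shape beta > 1 models wear-out; eta is the characteristic life)."""
    return math.exp(-random_rate * t - (t / eta) ** beta)

def failure_prob_in_year(year, random_rate=0.01, beta=3.0, eta=40.0):
    """Conditional probability the component fails during `year`,
    given that it has survived to the start of that year."""
    s0 = survival(year, random_rate, beta, eta)
    s1 = survival(year + 1, random_rate, beta, eta)
    return 1 - s1 / s0

# Wear-out drives the annual failure probability up with age.
print(failure_prob_in_year(35) > failure_prob_in_year(10))  # → True
```

Feeding an age-dependent failure probability like this into contingency enumeration is what distinguishes an aging-aware adequacy assessment from one that assumes constant failure rates.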

Relevance:

10.00%

Publisher:

Abstract:

Aggressive driving is increasingly a concern for drivers in highly motorised countries. However, the role of driver intent in this behaviour is problematic, and there is little research on driver cognitions in relation to aggressive driving incidents. In addition, while drivers who admit to behaving aggressively on the road also frequently report being recipients of similar behaviours, little is known about the relationship between perpetration and victimisation, or about how road incidents escalate into the more serious events that capture media attention. The current study used qualitative interviews to explore driver cognitions and underlying motivations for aggressive behaviours on the road. A total of 30 drivers aged 18-49 years were interviewed about their experiences with aggressive driving. A key theme identified in responses was driver aggression as an attempt to manage or modify the behaviour of other road users. Two subthemes were identified, each related to a separate motivation for aggressive responses: ‘teaching them a lesson’ referred to situations where respondents intended to convey criticism or disapproval, usually of unintended behaviours by the other driver, and thus encourage self-correction; and ‘justified retaliation’ referred to situations where respondents perceived deliberate intent on the part of the other driver and responded aggressively in return. Mildly aggressive driver behaviour appears to be common. Moreover, such behaviour has a sufficiently negative impact on other drivers that it may be worth addressing because of its potential for triggering retaliation in kind or escalation of aggression, thus compromising safety.

Relevance:

10.00%

Publisher:

Abstract:

As the editor, Gerard de Valence, points out in the preface, this book is neither a textbook nor a guide to what is done by construction managers and construction economists - read quantity surveyors and the like. Rather, de Valence notes, it comprises a collection of chapters, each of which focuses on matters at the industry level and, in doing so, illustrates that a substantially improved understanding of the building and construction industry can be gained beyond the economics of delivering projects. Before giving some thought to how far each of the chapters achieves this, it's worth reflecting on the virtues of developing construction economics as its own discipline or sub-discipline within general economics, and the bold manner in which de Valence proposes we do this. That is, de Valence proposes partitioning industry and project economics - as explained in the preface and in Chapter 1. de Valence's view that "the time seems right" for these developments is also worthy of some consideration.