478 results for Alternative Dispute Resolution


Relevance: 20.00%

Abstract:

Unresolved painful emotional experiences such as bereavement, trauma and disturbances in core relationships are common presenting problems for clients of psychodrama or psychotherapy more generally. Emotional pain is experienced as a shattering of the sense of self and disconnection from others and, when unresolved, produces avoidant responses which inhibit the healing process. There is agreement across therapeutic modalities that exposure to emotional experience can increase the efficacy of therapeutic interventions. Moreno proposes that the activation of spontaneity is the primary curative factor in psychodrama and that healing occurs when the protagonist (client) engages with his or her wider social system and develops greater flexibility in response to that system. An extensive case-report literature describes the application of the psychodrama method in healing unresolved painful emotional experiences, but there is limited empirical research to verify the efficacy of the method or to identify the processes that are linked to therapeutic change. The purpose of this research was to construct a model of protagonist change processes that could extend psychodrama theory, inform practitioners’ therapeutic decisions and contribute to understanding the common factors in therapeutic change. Four studies investigated protagonist processes linked to in-session resolution of painful emotional experiences. Significant therapeutic events were analysed using recordings and transcripts of psychodrama enactments, protagonist and director recall interviews and a range of process and outcome measures. A preliminary study (3 cases) identified four themes that were associated with helpful therapeutic events: enactment; the working alliance with the director and with group members; emotional release or relief; and social atom repair.
The second study (7 cases) used Comprehensive Process Analysis (CPA) to construct a model of protagonists’ processes linked to in-session resolution. This model was then validated across four more cases in Study 3. Five meta-processes were identified: (i) a readiness to engage in the psychodrama process; (ii) re-experiencing and insight; (iii) activating resourcefulness; (iv) social atom repair with emotional release; and (v) integration. Social atom repair with emotional release involved deeply experiencing a wished-for interpersonal experience accompanied by a free-flowing release of previously restricted emotion, and was most clearly linked to protagonists’ reports of reaching resolution and to post-session improvements in interpersonal relationships and sense of self. Acceptance of self in the moment increased protagonists’ capacity to generate new responses within each meta-process and, in resolved cases, there was evidence of spontaneity developing over time. The fourth study tested Greenberg’s allowing and accepting painful emotional experience model as an alternative explanation of protagonist change. The findings of this study suggested that while the process of allowing emotional pain was present in resolved cases, Greenberg’s model was not sufficient to explain the processes that lead to in-session resolution. The protagonist’s readiness to engage and activation of resourcefulness appear to facilitate the transition from problem identification to emotional release. Furthermore, experiencing a reparative relationship was found to be central to the healing process. This research verifies that there can be in-session resolution of painful emotional experience during psychodrama, and protagonists’ reports suggest that in-session resolution can heal the damage to the sense of self and the interpersonal disconnection that are associated with unresolved emotional pain.
A model of protagonist change processes has been constructed that challenges the view of psychodrama as a primarily cathartic therapy, by locating the therapeutic experience of emotional release within the development of new role relationships. The five meta-processes which are described within the model suggest broad change principles which can assist practitioners to make sense of events as they unfold and guide their clinical decision making in the moment. Each meta-process was linked to specific post-session changes, so that the model can inform the development of therapeutic plans for individual clients and can aid communication for practitioners when a psychodrama intervention is used for a specific therapeutic purpose within a comprehensive program of therapy.

Relevance: 20.00%

Abstract:

Road feature extraction from remotely sensed imagery has been a topic of great interest within the photogrammetry and remote sensing communities for over three decades. The majority of early work focused only on linear feature detection approaches, with restrictive assumptions about image resolution and road appearance. The wide availability of high-resolution digital aerial images now makes it possible to extract sub-road features, e.g. road pavement markings. In this paper, we focus on the automatic extraction of road lane markings, which are required by various lane-based vehicle applications such as autonomous vehicle navigation and lane departure warning. The proposed approach consists of three phases: (i) road centerline extraction from a low-resolution image; (ii) road surface detection in the original image; and (iii) pavement marking extraction on the detected road surface. The proposed method was tested on an aerial imagery dataset of the Bruce Highway, Queensland, and the results demonstrate the efficiency of our approach.
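The marking-extraction step of phase (iii) can be sketched in a few lines, since pavement markings are typically far brighter than the surrounding asphalt. This is only a toy illustration, not the paper's actual algorithm; the function name, array values and threshold are hypothetical:

```python
import numpy as np

def extract_markings(image, road_mask, brightness_thresh=200):
    """Toy sketch of phase (iii): markings are far brighter than the
    pavement, so thresholding intensity inside the detected road
    surface recovers candidate marking pixels."""
    bright = image >= brightness_thresh
    return (bright & road_mask).astype(np.uint8)

# Synthetic 8-bit scene: pavement at intensity 80, one bright stripe at 230.
image = np.full((10, 10), 80, dtype=np.uint8)
image[:, 4:6] = 230
road_mask = np.ones_like(image, dtype=bool)  # assume the whole scene is road
mask = extract_markings(image, road_mask)
```

In a real pipeline the `road_mask` would come from phase (ii), and the threshold would be estimated from the image rather than fixed.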

Relevance: 20.00%

Abstract:

With the increasing resolution of remote sensing images, road networks appear as continuous, homogeneous regions of a certain width rather than as the traditional thin lines. Road network extraction from large-scale images therefore amounts to reliable road surface detection rather than road line extraction. In this paper, a novel automatic road network detection approach based on the combination of homogram segmentation and mathematical morphology is proposed, which consists of three main steps: (i) the image is classified using homogram segmentation to roughly identify road network regions; (ii) morphological opening and closing are applied to fill tiny holes and filter out small road branches; and (iii) the extracted road surface is thinned, pruned by a proposed method, and finally simplified with the Douglas-Peucker algorithm. Results on QuickBird images and aerial photos demonstrate the correctness and efficiency of the proposed process.
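The final simplification step uses the standard Douglas-Peucker algorithm, which can be sketched as below. This is a generic textbook implementation, not the paper's code; the sample polyline is made up:

```python
import numpy as np

def _perp_dist(pt, a, b):
    # Perpendicular distance from pt to the line through a and b.
    if np.allclose(a, b):
        return float(np.linalg.norm(pt - a))
    d = b - a
    return float(abs(d[0] * (pt - a)[1] - d[1] * (pt - a)[0]) / np.linalg.norm(d))

def douglas_peucker(points, tol):
    """Keep the point farthest from the chord and recurse on both halves;
    when no intermediate point deviates more than `tol` from the chord,
    keep only the chord endpoints."""
    pts = np.asarray(points, dtype=float)
    if len(pts) < 3:
        return pts.tolist()
    dists = [_perp_dist(p, pts[0], pts[-1]) for p in pts[1:-1]]
    idx = int(np.argmax(dists)) + 1
    if dists[idx - 1] > tol:
        left = douglas_peucker(pts[:idx + 1], tol)
        right = douglas_peucker(pts[idx:], tol)
        return left[:-1] + right          # drop the duplicated split point
    return [pts[0].tolist(), pts[-1].tolist()]

# A noisy centreline with one genuine corner at (2, 0):
line = [(0, 0), (1, 0.01), (2, 0), (2.5, 1), (3, 2)]
simplified = douglas_peucker(line, tol=0.1)
```

The noise point at (1, 0.01) is discarded while the true corner survives, which is exactly the behaviour wanted when simplifying a thinned road skeleton.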

Relevance: 20.00%

Abstract:

Accurate road lane information is crucial for advanced vehicle navigation and safety applications. With digital airborne sources now providing very high resolution (VHR) imagery of astonishing quality, automatically extracting road details from aerial images would greatly facilitate data acquisition and significantly reduce the cost of data collection and updating. In this paper, we propose an effective approach to detecting road lanes from aerial images using image analysis procedures. The algorithm starts by constructing the Digital Surface Model (DSM) and true orthophotos from the stereo images. Next, a maximum likelihood clustering algorithm is used to separate roads from other ground objects. After road surface detection, the road traffic and lane lines are further detected using texture enhancement and morphological operations. Finally, the generated road network is evaluated against datasets provided by the Queensland Department of Main Roads to test the performance of the proposed approach. The experimental results prove the effectiveness of our approach.
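The maximum likelihood decision rule behind the clustering step can be illustrated on per-pixel intensities. This is a deliberately minimal one-dimensional sketch, not the paper's clustering algorithm; the class means, variances and pixel values are hypothetical:

```python
import numpy as np

def ml_classify(pixels, means, variances):
    """Assign each pixel to the class whose 1-D Gaussian likelihood is
    highest: the per-pixel maximum likelihood decision rule."""
    pixels = np.asarray(pixels, dtype=float)[:, None]
    means = np.asarray(means, dtype=float)[None, :]
    var = np.asarray(variances, dtype=float)[None, :]
    # Log-likelihood of each pixel under each class Gaussian.
    loglik = -0.5 * np.log(2 * np.pi * var) - (pixels - means) ** 2 / (2 * var)
    return np.argmax(loglik, axis=1)

# Class 0 ~ road (bright, mean 180), class 1 ~ vegetation (dark, mean 60).
labels = ml_classify([175, 60, 190, 55], means=[180, 60], variances=[100, 100])
```

With equal variances this reduces to nearest-mean classification; in practice the class statistics would be estimated from the imagery, and height from the DSM would help separate roads from rooftops of similar brightness.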

Relevance: 20.00%

Abstract:

The highly variable flagellin-encoding flaA gene has long been used for genotyping Campylobacter jejuni and Campylobacter coli. High-resolution melting (HRM) analysis is emerging as an efficient and robust method for discriminating DNA sequence variants. The objective of this study was to apply HRM analysis to flaA-based genotyping. The initial aim was to identify a suitable flaA fragment. It was found that the PCR primers commonly used to amplify the flaA short variable repeat (SVR) yielded a mixed PCR product unsuitable for HRM analysis. However, a PCR primer set composed of the upstream primer used to amplify the fragment used for flaA restriction fragment length polymorphism (RFLP) analysis and the downstream primer used for flaA SVR amplification generated a very pure PCR product, and this primer set was used for the remainder of the study. Eighty-seven C. jejuni and 15 C. coli isolates were analyzed by flaA HRM and also partial flaA sequencing. There were 47 flaA sequence variants, and all were resolved by HRM analysis. The isolates used had previously also been genotyped using single-nucleotide polymorphisms (SNPs), binary markers, CRISPR HRM, and flaA RFLP. flaA HRM analysis provided resolving power multiplicative to the SNPs, binary markers, and CRISPR HRM and largely concordant with the flaA RFLP. It was concluded that HRM analysis is a promising approach to genotyping based on highly variable genes.
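HRM instruments distinguish variants by the full shape of the fluorescence melt curve, not by a single number. As a back-of-envelope illustration of why even a single base substitution shifts melting behaviour, the classic Wallace rule assigns roughly 2 °C per A/T pair and 4 °C per G/C pair. The two short fragments below are hypothetical, not flaA sequence:

```python
def wallace_tm(seq):
    """Wallace rule-of-thumb melting temperature for short DNA:
    2 degrees per A/T base plus 4 degrees per G/C base."""
    seq = seq.upper()
    return 2 * (seq.count('A') + seq.count('T')) + 4 * (seq.count('G') + seq.count('C'))

# Two hypothetical fragment variants differing by one A -> G substitution:
tm1 = wallace_tm("ATGCATTA")
tm2 = wallace_tm("ATGCGTTA")
```

The single substitution raises the estimated Tm by 2 °C, the kind of small shift that HRM resolves across the whole amplicon by comparing curve shapes.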

Relevance: 20.00%

Abstract:

This paper first presents an extended ambiguity resolution model that deals with an ill-posed problem and with constraints among the estimated parameters. In the extended model, a regularization criterion is used instead of traditional least squares in order to better estimate the float ambiguities. The existing models can be derived from this general model. Second, the paper examines existing ambiguity searching methods from four aspects: exclusion of nuisance integer candidates based on the available integer constraints; integer rounding; integer bootstrapping; and integer least squares estimation. Finally, the paper systematically addresses the similarities and differences between the generalized TCAR and decorrelation methods from both theoretical and practical perspectives.
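Two of the integer estimators surveyed, rounding and bootstrapping, can be contrasted in a few lines. This is a generic textbook sketch, not the paper's extended model; the float ambiguities and the unit lower-triangular factor L are made-up values:

```python
import numpy as np

def integer_rounding(float_amb):
    # Simplest estimator: round each float ambiguity independently.
    return np.round(float_amb).astype(int)

def integer_bootstrapping(float_amb, L):
    """Sequential (bootstrapped) rounding: fix ambiguities one at a time
    and push each rounding residual into the not-yet-fixed components
    through the correlation structure encoded in the unit lower-triangular
    factor L."""
    a = np.asarray(float_amb, dtype=float).copy()
    n = len(a)
    z = np.zeros(n, dtype=int)
    for i in range(n):
        z[i] = round(a[i])
        resid = a[i] - z[i]
        for j in range(i + 1, n):
            a[j] -= L[j, i] * resid
    return z

floats = np.array([1.4, 2.6])
L = np.array([[1.0, 0.0],
              [0.9, 1.0]])
z_round = integer_rounding(floats)          # ignores correlation
z_boot = integer_bootstrapping(floats, L)   # exploits correlation
```

Here naive rounding fixes the second ambiguity to 3, while bootstrapping, after conditioning on the first fix, lands on 2: exactly the kind of difference that motivates comparing the estimators, with integer least squares searching over candidates more exhaustively still.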

Relevance: 20.00%

Abstract:

Identifying an individual from surveillance video is a difficult, time-consuming and labour-intensive process. The proposed system aims to streamline this process by filtering out unwanted scenes and enhancing an individual's face through super-resolution. An automatic face recognition system is then used to identify the subject or present the human operator with likely matches from a database. A person tracker is used to speed up the subject detection and super-resolution process by tracking moving subjects and cropping a region of interest around the subject's face, which reduces the number and the size of the image frames to be super-resolved. In this paper, experiments have been conducted to demonstrate how the optical flow super-resolution method used improves surveillance imagery for visual inspection as well as automatic face recognition on an Eigenface and an Elastic Bunch Graph Matching system. The optical flow based method has also been benchmarked against the "hallucination" algorithm, interpolation methods and the original low-resolution images. Results show that both super-resolution algorithms improved recognition rates significantly. Although the hallucination method resulted in slightly higher recognition rates, the optical flow method produced fewer artifacts and more visually correct images suitable for human consumption.
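The Eigenface side of the recognition pipeline can be sketched as follows. This is a toy reconstruction under stated assumptions: random vectors stand in for face images, and the noisy probe merely simulates a super-resolved frame; none of it is the paper's actual data or code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "gallery": 20 flattened 8x8 face images (hypothetical random data).
faces = rng.normal(size=(20, 64))
mean_face = faces.mean(axis=0)
# Eigenfaces are the principal components of the mean-centred gallery.
_, _, vt = np.linalg.svd(faces - mean_face, full_matrices=False)
eigenfaces = vt[:5]                      # keep the top 5 components

def project(img):
    """Describe an image by its coordinates in eigenface space."""
    return eigenfaces @ (img - mean_face)

def identify(probe, gallery):
    """Nearest neighbour in eigenface space -> index of the best match."""
    dists = [np.linalg.norm(project(probe) - project(g)) for g in gallery]
    return int(np.argmin(dists))

# A lightly corrupted version of face 3 (standing in for a super-resolved
# frame) should still match gallery entry 3.
probe = faces[3] + rng.normal(scale=0.05, size=64)
match = identify(probe, faces)
```

The point of super-resolving the probe before this step is precisely to shrink the distance between the probe's projection and the correct gallery entry.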

Relevance: 20.00%

Abstract:

The call to innovate is ubiquitous across the Australian educational policy context. The claims of innovative practices and environments that occur frequently in university mission statements, strategic plans and marketing literature suggest that this exhortation to innovate appears to have been taken up enthusiastically by the university sector. Throughout the history of universities, a range of reported deficiencies of higher education have worked to produce a notion of crisis. At present, it would seem that innovation is positioned as the solution to the notion of crisis. This thesis is an inquiry into how the insistence on innovation works to both enable and constrain teaching and learning practices in Australian universities. Alongside the interplay between innovation and crisis is the link between resistance and innovation, a link which remains largely unproblematized in the scholarly literature. This thesis works to locate and unsettle understandings of a relationship between innovation and Australian higher education. The aim of this inquiry is to generate new understandings of what counts as innovation within this context and how innovation is enacted. The thesis draws on a number of postmodernist theorists, whose works have informed firstly the research method, and then the analysis and findings. Firstly, there is an assumption that power is capillary and works through discourse to enact power relations which shape certain truths (Foucault, 1990). Secondly, this research scrutinised language practices which frame the capacity for individuals to act, alongside the language practices which encourage an individual to adopt certain attitudes and actions as one’s own (Foucault, 1988). Thirdly, innovation talk is read in this thesis as an example of needs talk, that is, as a medium through which what is considered domestic, political or economic is made and contested (Fraser, 1989). 
Fourthly, relationships between and within discourses were identified and analysed beyond cause-and-effect descriptions, and more productively considered to be in a constant state of becoming (Deleuze, 1987). Finally, the use of ironic research methods assisted in producing alternate configurations of innovation talk which are useful and new (Rorty, 1989). The theoretical assumptions which underpin this thesis inform a document analysis methodology, used to examine how certain texts work to shape the ways in which innovation is constructed. The data consisted of three Federal higher education funding policies, selected on the rationale that these documents, as opposed to state or locally based policy and legislation, represent the only shared policy context for all Australian universities. The analysis first provided a modernist reading of the three documents, followed by postmodernist readings of the same policy documents. The modernist reading worked to locate and describe the current truths about innovation. The historical context in which the policy was produced, as well as the textual features of the document itself, were important to this reading. In the modernist reading, the binaries involved in producing proper and improper notions of innovation were described and analysed. In the process of the modernist analysis and the subsequent location of binary organisation, a number of conceptual collisions were identified, and these sites of struggle were revisited through the application of a postmodernist reading. By applying the theories of Rorty (1989) and Fraser (1989), it became possible to treat these sites not as contradictions requiring resolution, but as spaces in which binary tensions are necessary and productive. This postmodernist reading constructed new spaces for refusing and resisting dominant discourses of innovation which value only certain kinds of teaching and learning practices.
By exploring a number of ironic language practices found within the policies, this thesis proposes an alternative way of thinking about what counts as innovation and how it happens. The new readings of innovation made possible through the work of this thesis were in response to a suite of enduring, inter-related questions: What counts as innovation? Who or what supports innovation? How does innovation occur? And who are the innovators? The truths presented in response to these questions were treated as the language practices which constitute a dominant discourse of innovation talk. The collisions that occur within these truths were the contested sites of most interest for the analysis. The thesis concludes by presenting a theoretical blueprint which works to shift the boundaries of what counts as innovation and how it happens in a manner which is productive, inclusive and powerful. This blueprint forms the foundation upon which a number of recommendations are made for both my own professional practice and broader contexts. In keeping with the conceptual tone of this study, these recommendations are a suite of new questions which focus attention on the boundaries of innovation talk as an attempt to re-configure what is valued about teaching and learning at university.

Relevance: 20.00%

Abstract:

Typically a film producer expects the director and actors to 'do their job' within a scheduled timeframe. Rather than expecting the creative principals to just deliver, a production model can be tailored to help this creative team produce successful outcomes. This research paper contrasts alternative production models with a traditional (or standard) production and presents possibilities for producers to emphasise the collaborative potential for their production.

Relevance: 20.00%

Abstract:

We analyse the puzzling behavior of the volatility of individual stock returns around the turn of the Millennium. There has been much academic interest in this topic, but no convincing explanation has arisen. Our goal is to pull together the many competing explanations currently proposed in the literature to determine which, if any, are capable of explaining the volatility trend. We find that many of the different explanations capture the same unusual trend around the Millennium. We find that many of the variables are very highly correlated and it is thus difficult to disentangle their relative ability to explain the time-series behavior in volatility. It seems that all of the variables that track average volatility well do so mainly by capturing changes in the post-1994 period. These variables have no time-series explanatory power in the pre-1995 years, questioning the underlying idea that any of the explanations currently presented in the literature can track the trend in volatility over long periods.
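The quantity at issue, the time-series of individual stock return volatility, is commonly measured as a rolling sample standard deviation of returns. A minimal sketch under that assumption, with simulated returns standing in for the paper's data:

```python
import numpy as np

def rolling_volatility(returns, window):
    """Rolling sample standard deviation of returns: the raw series whose
    time trend the competing explanations try to account for."""
    r = np.asarray(returns, dtype=float)
    out = [r[i - window:i].std(ddof=1) for i in range(window, len(r) + 1)]
    return np.array(out)

# Hypothetical daily returns: a calm regime followed by a turbulent one.
rng = np.random.default_rng(1)
rets = np.concatenate([rng.normal(0, 0.01, 100), rng.normal(0, 0.03, 100)])
vol = rolling_volatility(rets, window=20)
```

Competing explanatory variables are then regressed against, or correlated with, a series like `vol`; the difficulty the abstract describes is that many such variables move together over the post-1994 period.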

Relevance: 20.00%

Abstract:

Currently the Bachelor of Design is the generic degree offered to the four disciplines of Architecture, Landscape Architecture, Industrial Design, and Interior Design within the School of Design at the Queensland University of Technology. Regardless of discipline, Digital Communication is a core unit taken by the 600 first year students entering the Bachelor of Design degree. Within the design disciplines the communication of the designer's intentions is achieved primarily through the use of graphic images, with written information considered supportive or secondary. As such, Digital Communication attempts to educate learners in the fundamentals of this graphic design communication, using a generic digital or software tool. Past iterations of the unit did not acknowledge the subtle differences in design communication among the design disciplines involved, and used a single generic software tool. Following a review of the unit in 2008, it was decided that a single generic software tool was no longer entirely sufficient. This decision was based on the recognition that discipline-specific digital tools were increasingly emerging, and that students expressed a desire, and showed an apparent aptitude, to learn these discipline-specific tools. As a result the unit was restructured in 2009 to offer both discipline-specific and generic software instruction, if elected by the student. This paper, apart from offering the general context and pedagogy of the existing and restructured units, more importantly offers research data that validates the changes made to the unit. Most significant among these data are the results of surveys that authenticate actual student aptitude versus desire in learning discipline-specific tools. This is done through an exposure of student self-efficacy in problem resolution and technological prowess, both generally and specifically within the unit. More traditional means of validation are also presented, including the results of the generic university-wide Learning Experience Survey of the unit, as well as a comparison between the assessment results of the restructured unit and those of the previous year.

Relevance: 20.00%

Abstract:

Campylobacter jejuni followed by Campylobacter coli contribute substantially to the economic and public health burden attributed to food-borne infections in Australia. Genotypic characterisation of isolates has provided new insights into the epidemiology and pathogenesis of C. jejuni and C. coli. However, currently available methods are not conducive to the large-scale epidemiological investigations that are necessary to elucidate the global epidemiology of these common food-borne pathogens. This research aims to develop high-resolution C. jejuni and C. coli genotyping schemes that are convenient for high throughput applications. Real-time PCR and High Resolution Melt (HRM) analysis are fundamental to the genotyping schemes developed in this study and enable rapid, cost-effective interrogation of a range of different polymorphic sites within the Campylobacter genome. While the sources and routes of transmission of campylobacters are unclear, handling and consumption of poultry meat is frequently associated with human campylobacteriosis in Australia. Therefore, chicken-derived C. jejuni and C. coli isolates were used to develop and verify the methods described in this study. The first aim of this study describes the application of MLST-SNP (Multi Locus Sequence Typing Single Nucleotide Polymorphisms) + binary typing to 87 chicken C. jejuni isolates using real-time PCR analysis. These typing schemes were developed previously by our research group using isolates from campylobacteriosis patients. The present study showed that SNP + binary typing, alone or in combination, is effective at detecting epidemiological linkage between chicken-derived Campylobacter isolates and enables data comparison with other MLST-based investigations. SNP + binary types obtained from chicken isolates in this study were compared with a previously SNP + binary and MLST typed set of human isolates.
Common genotypes between the two collections of isolates were identified, and ST-524 represented a clone that could be worth monitoring in the chicken meat industry. In contrast, ST-48, mainly associated with bovine hosts, was abundant in the human isolates. This genotype was, however, absent in the chicken isolates, indicating the role of non-poultry sources in causing human Campylobacter infections. This demonstrates the potential application of SNP + binary typing for epidemiological investigations and source tracing. While the MLST SNPs and binary genes comprise the more stable backbone of the Campylobacter genome and are indicative of long-term epidemiological linkage of the isolates, the development of a High Resolution Melt (HRM) curve analysis method to interrogate the hypervariable Campylobacter flagellin-encoding gene (flaA) is described in Aim 2 of this study. The flaA gene product appears to be an important pathogenicity determinant of campylobacters and is therefore a popular target for genotyping, especially for short-term epidemiological studies such as outbreak investigations. HRM curve analysis based flaA interrogation is a single-step, closed-tube method that provides portable data that can be easily shared and accessed. Critical to the development of flaA HRM was the use of flaA-specific primers that did not amplify the flaB gene. HRM curve analysis flaA interrogation was successful at discriminating the 47 sequence variants identified within the 87 C. jejuni and 15 C. coli isolates and correlated with the epidemiological background of the isolates. In the combinatorial format, the resolving power of flaA was additive to that of SNP + binary typing and CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) HRM, and fits the PHRANA (Progressive hierarchical resolving assays using nucleic acids) approach to genotyping. The use of statistical methods to analyse the HRM data enhanced the sophistication of the method.
Therefore, flaA HRM is a rapid and cost-effective alternative to gel- or sequence-based flaA typing schemes. Aim 3 of this study describes the development of a novel bioinformatics-driven method to interrogate Campylobacter MLST gene fragments using HRM, called ‘SNP Nucleated Minim MLST’ or ‘Minim typing’. The method involves HRM interrogation of MLST fragments that encompass highly informative “Nucleating SNPs” to ensure high resolution. Selection of fragments potentially suited to HRM analysis was conducted in silico using (i) “Minimum SNPs” and (ii) the new ‘HRMtype’ software packages. Species-specific sets of six “Nucleating SNPs” and six HRM fragments were identified for both C. jejuni and C. coli to ensure high typeability and resolution relevant to the MLST database. ‘Minim typing’ was tested empirically by typing 15 C. jejuni and five C. coli isolates. The association of clonal complexes (CCs) with each isolate by ‘Minim typing’ and by SNP + binary typing was used to compare the two MLST interrogation schemes. The CCs linked with each C. jejuni isolate were consistent for both methods. Thus, ‘Minim typing’ is an efficient and cost-effective method to interrogate MLST genes. However, it is not expected to be independent of, or to meet the resolution of, sequence-based MLST gene interrogation. ‘Minim typing’ in combination with flaA HRM is envisaged to comprise a highly resolving combinatorial typing scheme developed around the HRM platform, amenable to automation and multiplexing. The genotyping techniques described in this thesis involve the combinatorial interrogation of differentially evolving genetic markers on a unified real-time PCR and HRM platform. They provide high resolution and are simple, cost-effective and ideally suited to rapid, high-throughput genotyping of these common food-borne pathogens.
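The idea behind selecting highly informative "Nucleating SNPs" can be illustrated with Simpson's index of diversity, a standard measure of a marker's discriminatory power. This is a toy version of the calculation: the real selection works against the full MLST database, and the allele strings below are hypothetical:

```python
from collections import Counter

def simpsons_d(alleles):
    """Simpson's index of diversity D = 1 - sum(p_i^2): the probability
    that two randomly chosen isolates differ at this position."""
    n = len(alleles)
    counts = Counter(alleles)
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

# Hypothetical alleles at three candidate SNP positions across 8 isolates:
positions = {
    "snp1": "AAAAAAAG",   # nearly fixed -> uninformative
    "snp2": "AACCGGTT",   # evenly split four ways -> highly informative
    "snp3": "AAAACCCC",   # balanced biallelic
}
best = max(positions, key=lambda k: simpsons_d(positions[k]))
```

A small set of positions chosen this way concentrates resolving power into few assays, which is what makes the HRM format cheap and high-throughput.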

Relevance: 20.00%

Abstract:

This chapter is about the role of law in the creation and operation of Australian health systems. Accordingly, this chapter discusses how law regulates the way in which health services in Australia are funded, organised, regulated, managed, operated and governed. (The question of how health professionals are regulated is discussed in Chapter 15.) Although the focus of much of health law is on legal mechanisms for the resolution of disputes or disagreements between the state, health providers, professionals, patients and families and friends, and through dispute resolutions processes setting standards for practice, these are only some of the “jobs” that health law performs. In health systems where the state undertakes a significant role in regulating, funding, managing and providing health services, health law also performs an important constitutive function. Health law declares the values upon which the health system is based, shapes social processes to achieve public ends and provides a structure for the complex interactions that occur within a modern health system. Health law regulates decision-makers in health systems by establishing who has the power to participate in decisions and in what circumstances, establishing processes through which decisions are made and creating mechanisms for decision-makers to be held publicly accountable. It is this broader constitutive function of health law that is a primary focus of much of this chapter — how and why governments use their legislative powers to structure and shape the health system.