364 results for "conflict resolution mechanisms"
Abstract:
In Australia seven schemes (apart from the Superannuation Complaints Tribunal) provide alternative dispute resolution services for complaints brought by consumers against financial services industry members. Recently the Supreme Court of New South Wales held that the decisions of one scheme were amenable to judicial review at the suit of a financial services provider member and the Supreme Court of Victoria has since taken a similar approach. This article examines the juristic basis for such a challenge and contends that judicial review is not available, either at common law or under statutory provisions. This is particularly the case since Financial Industry Complaints Service Ltd v Deakin Financial Services Pty Ltd (2006) 157 FCR 229; 60 ACSR 372 decided that the jurisdiction of a scheme is derived from a contract made with its members. The article goes on to contend that the schemes are required to give procedural fairness and that equitable remedies are available if that duty is breached.
Abstract:
Road feature extraction from remotely sensed imagery has been a topic of great interest within the photogrammetry and remote sensing communities for over three decades. The majority of the early work focused on linear feature detection approaches, with restrictive assumptions on image resolution and road appearance. The wide availability of high-resolution digital aerial images makes it possible to extract sub-road features, e.g. road pavement markings. In this paper, we focus on the automatic extraction of road lane markings, which are required by various lane-based vehicle applications, such as autonomous vehicle navigation and lane departure warning. The proposed approach consists of three phases: i) road centerline extraction from a low resolution image, ii) road surface detection in the original image, and iii) pavement marking extraction on the generated road surface. The proposed method was tested on an aerial imagery dataset of the Bruce Highway, Queensland, and the results demonstrate the efficiency of our approach.
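The abstract leaves phase iii underspecified; as a minimal sketch (not the paper's method), pavement markings can be modelled as bright, compact blobs on a darker road surface and extracted by thresholding plus morphological cleanup. The threshold and minimum-size values below are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def extract_markings(road_surface, intensity_thresh=200, min_size=20):
    """Phase iii sketch: keep bright, sufficiently large blobs as markings."""
    bright = road_surface >= intensity_thresh                 # painted markings are bright
    opened = ndimage.binary_opening(bright, structure=np.ones((3, 3)))  # drop speckle
    labels, n = ndimage.label(opened)                         # connected components
    sizes = ndimage.sum(opened, labels, range(1, n + 1))      # pixels per component
    return np.isin(labels, np.nonzero(sizes >= min_size)[0] + 1)
```

On a synthetic image with a 4x30-pixel bright stripe and an isolated bright pixel, only the stripe survives the opening and size filter.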
Abstract:
With the increasing resolution of remote sensing images, road networks appear as continuous, homogeneous regions with a certain width rather than as traditional thin lines. Road network extraction from large scale images therefore amounts to reliable road surface detection rather than road line extraction. In this paper, a novel automatic road network detection approach based on the combination of homogram segmentation and mathematical morphology is proposed, which includes three main steps: (i) the image is classified based on homogram segmentation to roughly identify the road network regions; (ii) morphological opening and closing are employed to fill tiny holes and filter out small road branches; and (iii) the extracted road surface is thinned by a thinning approach, pruned by a proposed method and finally simplified with the Douglas-Peucker algorithm. Results from some QuickBird images and aerial photos demonstrate the correctness and efficiency of the proposed process.
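The final simplification step names the Douglas-Peucker algorithm, which reduces a polyline to its salient vertices; a self-contained sketch of that standard algorithm (not the authors' implementation) applied to a 2D centreline:

```python
import math

def douglas_peucker(points, tol):
    """Keep the endpoints; recursively keep the interior point farthest
    from the chord whenever its distance exceeds the tolerance."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = math.hypot(dx, dy) or 1.0
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        # perpendicular distance from points[i] to the chord
        d = abs(dy * (points[i][0] - x1) - dx * (points[i][1] - y1)) / norm
        if d > dmax:
            dmax, idx = d, i
    if dmax <= tol:
        return [points[0], points[-1]]
    left = douglas_peucker(points[:idx + 1], tol)
    right = douglas_peucker(points[idx:], tol)
    return left[:-1] + right   # drop the duplicated split point
```

A nearly straight noisy centreline collapses to its endpoints, while a genuine corner is retained.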
Abstract:
Accurate road lane information is crucial for advanced vehicle navigation and safety applications. With the increasing availability of very high resolution (VHR) imagery of astonishing quality from digital airborne sources, automatically extracting road details from aerial images would greatly facilitate data acquisition and significantly reduce the cost of data collection and updates. In this paper, we propose an effective approach to detect road lanes from aerial images using image analysis procedures. The algorithm starts by constructing the Digital Surface Model (DSM) and true orthophotos from the stereo images. Next, a maximum likelihood clustering algorithm is used to separate roads from other ground objects. After the detection of the road surface, the road traffic and lane lines are further detected using texture enhancement and morphological operations. Finally, the generated road network is evaluated to test the performance of the proposed approach, using datasets provided by the Queensland Department of Main Roads. The experimental results demonstrate the effectiveness of our approach.
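The maximum likelihood clustering step can be illustrated with a per-pixel Gaussian classifier; this is a minimal sketch assuming independent per-class intensity distributions with known means and variances (the paper's actual feature set and parameter estimation are not specified here):

```python
import numpy as np

def ml_classify(pixels, means, variances):
    """Assign each pixel to the class with the highest Gaussian log-likelihood."""
    pixels = np.asarray(pixels, dtype=float)
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    # log N(x | mu, sigma^2) up to an additive constant, for every class
    ll = -0.5 * (pixels[..., None] - means) ** 2 / variances - 0.5 * np.log(variances)
    return np.argmax(ll, axis=-1)
```

With equal variances this reduces to nearest-mean labelling; unequal variances shift the decision boundary toward the tighter class.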
Abstract:
The highly variable flagellin-encoding flaA gene has long been used for genotyping Campylobacter jejuni and Campylobacter coli. High-resolution melting (HRM) analysis is emerging as an efficient and robust method for discriminating DNA sequence variants. The objective of this study was to apply HRM analysis to flaA-based genotyping. The initial aim was to identify a suitable flaA fragment. It was found that the PCR primers commonly used to amplify the flaA short variable repeat (SVR) yielded a mixed PCR product unsuitable for HRM analysis. However, a PCR primer set composed of the upstream primer used to amplify the fragment used for flaA restriction fragment length polymorphism (RFLP) analysis and the downstream primer used for flaA SVR amplification generated a very pure PCR product, and this primer set was used for the remainder of the study. Eighty-seven C. jejuni and 15 C. coli isolates were analyzed by flaA HRM and also partial flaA sequencing. There were 47 flaA sequence variants, and all were resolved by HRM analysis. The isolates used had previously also been genotyped using single-nucleotide polymorphisms (SNPs), binary markers, CRISPR HRM, and flaA RFLP. flaA HRM analysis provided resolving power multiplicative to the SNPs, binary markers, and CRISPR HRM and largely concordant with the flaA RFLP. It was concluded that HRM analysis is a promising approach to genotyping based on highly variable genes.
Abstract:
This paper firstly presents an extended ambiguity resolution model that deals with an ill-posed problem and constraints among the estimated parameters. In the extended model, the regularization criterion is used instead of the traditional least squares in order to estimate the float ambiguities better. The existing models can be derived from the general model. Secondly, the paper examines the existing ambiguity searching methods from four aspects: exclusion of nuisance integer candidates based on the available integer constraints; integer rounding; integer bootstrapping and integer least squares estimations. Finally, this paper systematically addresses the similarities and differences between the generalized TCAR and decorrelation methods from both theoretical and practical aspects.
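Integer bootstrapping, one of the searching methods examined above, can be sketched as sequential conditional rounding: each ambiguity is rounded in turn, and the remaining float estimates (and their covariance) are conditioned on that decision. The sketch below assumes a fixed component order; practical implementations typically decorrelate the ambiguities first (e.g. via LAMBDA-style Z-transformations).

```python
import numpy as np

def integer_bootstrap(a_float, Q):
    """Fix float ambiguities one at a time; after each integer decision,
    condition the remaining floats (and their covariance) on it."""
    a = np.array(a_float, dtype=float)
    Q = np.array(Q, dtype=float)
    n = len(a)
    fixed = np.empty(n)
    for i in range(n):
        fixed[i] = np.rint(a[i])
        resid = fixed[i] - a[i]
        if i + 1 < n:
            gain = Q[i + 1:, i] / Q[i, i]
            a[i + 1:] += gain * resid                          # conditional mean shift
            Q[i + 1:, i + 1:] -= np.outer(gain, Q[i, i + 1:])  # conditional covariance
    return fixed.astype(int)
```

With strongly correlated ambiguities the conditioning matters: for floats [1.2, 3.6] with correlation 0.9, bootstrapping fixes [1, 3], whereas componentwise integer rounding would give [1, 4].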
Abstract:
This study was aimed at examining the safety climate and relational conflict within teams at the individual level. A sample of 372 respondents, divided into 50 teams, was used to test our hypothesis. It was proposed - and discovered - that team members’ individual differences in need for closure mitigated the negative relationship between perceptions of team safety climate and team relational conflict. The implications of our findings and the study’s limitations are discussed.
Abstract:
Identifying an individual from surveillance video is a difficult, time consuming and labour intensive process. The proposed system aims to streamline this process by filtering out unwanted scenes and enhancing an individual's face through super-resolution. An automatic face recognition system is then used to identify the subject or present the human operator with likely matches from a database. A person tracker is used to speed up the subject detection and super-resolution process by tracking moving subjects and cropping a region of interest around the subject's face to reduce the number and size of the image frames to be super-resolved respectively. In this paper, experiments have been conducted to demonstrate how the optical flow super-resolution method used improves surveillance imagery for visual inspection as well as automatic face recognition on an Eigenface and Elastic Bunch Graph Matching system. The optical flow based method has also been benchmarked against the "hallucination" algorithm, interpolation methods and the original low-resolution images. Results show that both super-resolution algorithms improved recognition rates significantly. Although the hallucination method resulted in slightly higher recognition rates, the optical flow method produced fewer artifacts and more visually correct images suitable for human consumption.
Abstract:
In this thesis an investigation into theoretical models for formation and interaction of nanoparticles is presented. The work presented includes a literature review of current models followed by a series of five chapters of original research. This thesis has been submitted in partial fulfilment of the requirements for the degree of Doctor of Philosophy by publication, and therefore each of the five chapters consists of a peer-reviewed journal article. The thesis is then concluded with a discussion of what has been achieved during the PhD candidature, the potential applications for this research and ways in which the research could be extended in the future. In this thesis we explore stochastic models pertaining to the interaction and evolution mechanisms of nanoparticles. In particular, we explore in depth the stochastic evaporation of molecules due to thermal activation and its ultimate effect on nanoparticle sizes and concentrations. Secondly, we analyse the thermal vibrations of nanoparticles suspended in a fluid and subject to standing oscillating drag forces (as would occur in a standing sound wave), and finally on lattice surfaces in the presence of high heat gradients. We have described in this thesis a number of new models for the description of multicompartment networks joined by multiple, stochastically evaporating links. The primary motivation for this work is the description of thermal fragmentation, in which multiple molecules holding parts of a carbonaceous nanoparticle may evaporate. Ultimately, these models predict the rate at which the network or aggregate fragments into smaller networks/aggregates and with what aggregate size distribution. The models are highly analytic and describe the fragmentation of a link holding multiple bonds using Markov processes that best describe different physical situations; these processes have been analysed using a number of mathematical methods.
The fragmentation of the network/aggregate is then predicted using combinatorial arguments. Whilst there is some scepticism in the scientific community pertaining to the proposed mechanism of thermal fragmentation, we have presented compelling evidence in this thesis supporting the currently proposed mechanism and shown that our models can accurately match experimental results. This was achieved using a realistic simulation of the fragmentation of the fractal carbonaceous aggregate structure using our models. Furthermore, in this thesis a method of manipulation using acoustic standing waves is investigated. In our investigation we analysed the effect of frequency and particle size on the ability for the particle to be manipulated by means of a standing acoustic wave. In our results, we report the existence of a critical frequency for a particular particle size. This frequency is inversely proportional to the Stokes time of the particle in the fluid. We also find that for large frequencies the subtle Brownian motion of even larger particles plays a significant role in the efficacy of the manipulation. This is due to the decreasing size of the boundary layer between acoustic nodes. Our model utilises a multiple time scale approach to calculating the long term effects of the standing acoustic field on the particles that are interacting with the sound. These effects are then combined with the effects of Brownian motion in order to obtain a complete mathematical description of the particle dynamics in such acoustic fields. Finally, in this thesis, we develop a numerical routine for the description of "thermal tweezers". Currently, the technique of thermal tweezers is predominantly theoretical; however, there has been a handful of successful experiments which demonstrate the effect in practice. Thermal tweezers is the name given to the way in which particles can be easily manipulated on a lattice surface by careful selection of a heat distribution over the surface.
Typically, the theoretical simulations of the effect can be rather time consuming, with supercomputer facilities processing data over days or even weeks. Our alternative numerical method for the simulation of particle distributions pertaining to the thermal tweezers effect uses the Fokker-Planck equation to derive a quick calculation of the effective diffusion constant resulting from the lattice and the temperature. We then use this diffusion constant and solve the diffusion equation numerically using the finite volume method. This saves the algorithm from calculating many individual particle trajectories, since it describes the flow of the probability distribution of particles in a continuous manner. The alternative method that is outlined in this thesis can produce a larger quantity of accurate results on a household PC in a matter of hours, which is much better than was previously achievable.
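The finite-volume step described above can be sketched in one dimension: once an effective diffusion constant D is known, the probability distribution is evolved by exchanging fluxes across cell faces, which conserves total probability by construction. This is a minimal explicit scheme with zero-flux walls, assuming a constant D (the thesis derives D from the lattice and temperature).

```python
import numpy as np

def diffuse_fv(p0, D, dx, dt, steps):
    """Explicit finite-volume update of dp/dt = D d2p/dx2 with zero-flux walls.
    Stability requires dt <= dx**2 / (2 * D)."""
    p = np.array(p0, dtype=float)
    F = np.zeros(len(p) + 1)              # fluxes at cell faces; wall faces stay 0
    for _ in range(steps):
        F[1:-1] = -D * np.diff(p) / dx    # Fick's law at interior faces
        p += dt / dx * (F[:-1] - F[1:])   # net inflow updates each cell
    return p
```

Starting from a unit spike, the distribution spreads symmetrically while its integral stays fixed, which is exactly the property that makes finite volumes preferable to tracking individual trajectories here.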
Abstract:
The literature identifies several models that describe inter-phase mass transfer, key to the emission process. While the emission process is complex and these models may be more or less successful at predicting mass transfer rates, they identify three key variables for a system involving a liquid and an air phase in contact with it:
• A concentration (or partial pressure) gradient driving force;
• The fluid dynamic characteristics within the liquid and air phases; and
• The chemical properties of the individual components within the system.
In three applied research projects conducted prior to this study, samples collected with two well-known sampling devices resulted in very different odour emission rates. It was not possible to adequately explain the differences observed. It appeared likely, however, that the sample collection device might have artefact effects on the emission of odorants, i.e. the sampling device appeared to have altered the mass transfer process. This raised the obvious question: Where two different emission rates are reported for a single source (differing only in the selection of sampling device), and a credible explanation for the difference in emission rate cannot be provided, which emission rate is correct? This research project aimed to identify the factors that determine odour emission rates, the impact that the characteristics of a sampling device may exert on the key mass transfer variables, and ultimately, the impact of the sampling device on the emission rate itself. To meet these objectives, a series of targeted reviews, and laboratory and field investigations, were conducted. Two widely-used, representative devices were chosen to investigate the influence of various parameters on the emission process. These investigations provided insight into the odour emission process generally, and the influence of the sampling device specifically.
Abstract:
This naturalistic study investigated the mechanisms of change in measures of negative thinking and in 24-h urinary metabolites of noradrenaline (norepinephrine), dopamine and serotonin in a sample of 43 depressed hospital patients attending an eight-session group cognitive behavior therapy program. Most participants (91%) were taking antidepressant medication throughout the therapy period according to their treating psychiatrists' prescriptions. The sample was divided into outcome categories (19 Responders and 24 Non-responders) on the basis of a clinically reliable change index [Jacobson, N.S., & Truax, P., 1991. Clinical significance: a statistical approach to defining meaningful change in psychotherapy research. Journal of Consulting and Clinical Psychology, 59, 12–19.] applied to the Beck Depression Inventory scores at the end of the therapy. Results of repeated-measures analyses of variance (ANOVA) indicated that all measures of negative thinking improved significantly during therapy, and significantly more so in the Responders, as expected. The treatment had a significant impact on urinary adrenaline and metadrenaline excretion; however, these changes occurred in both Responders and Non-responders. Acute treatment did not significantly influence the six other monoamine metabolites. In summary, changes in urinary monoamine levels during combined treatment for depression were not associated with self-reported changes in mood symptoms.
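The Jacobson and Truax (1991) reliable change index divides the observed pre-post score change by the standard error of the difference between two test scores; a sketch of that calculation (the illustrative scores, standard deviation, and reliability below are hypothetical, not the study's data):

```python
import math

def reliable_change_index(pre, post, sd_pre, reliability):
    """Jacobson & Truax (1991): RC = (post - pre) / S_diff, where
    S_diff = sqrt(2) * SE and SE = SD * sqrt(1 - test-retest reliability)."""
    se = sd_pre * math.sqrt(1.0 - reliability)
    s_diff = math.sqrt(2.0) * se
    return (post - pre) / s_diff

def is_reliable_change(pre, post, sd_pre, reliability, criterion=1.96):
    """|RC| > 1.96 means the change is unlikely (p < .05) to reflect
    measurement error alone."""
    return abs(reliable_change_index(pre, post, sd_pre, reliability)) > criterion
```

For example, a drop from 30 to 15 on a scale with SD 10 and reliability 0.9 gives RC ≈ -3.35, a reliable improvement, while a drop to 27 does not clear the criterion.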
Abstract:
Principal Topic: Resource decisions are critical to the venture creation process, which has important subsequent impacts on venture creation and performance (Boeker, 1989). Most entrepreneurs, however, suffer substantial resource constraints in venture creation and during venture growth (Shepherd et al., 2000). Little is known about how high potential, sustainability ventures (the ventures of interest in this research) achieve continued venture persistence and venture success despite resource constraints. One promising theory that explicitly links to resource constraints is the concept developed by Claude Lévi-Strauss (1967) termed bricolage. Bricolage aligns with notions of resourcefulness: using what's on hand, making do, and recombining resources for new or novel purposes (Baker & Nelson 2005). To the best of our knowledge, previous studies have not systematically investigated internal and external constraints, their combinations, and subsequent bricolage patterns. The majority of the bricolage literature focuses on external environmental constraints (e.g. Weick 1989; Baker & Nelson 2005), thereby paying less attention to internal constraints (e.g. skills and capabilities) or constraint combinations. In this paper we focus on ventures that typically face resource-poor environments. High potential, nascent and young sustainability ventures are often created and developed under resource constraints and, in some cases, have greater resource requirements owing to the higher technical sophistication of their products (Rothaermel & Deeds 2006). These ventures usually have high aspirations and potential for growth, and seek to "meet the needs and aspirations without compromising the ability to meet those of the future" (Brundtland Commission 1983). High potential ventures are increasingly attributed a central role in the development of innovation and employment in developed economies (Acs 2008).
Further, increasing awareness of environmental and sustainability issues has fostered demand for business processes that reduce the detrimental environmental impacts of global development (Dean & McMullen 2007) and for more environmentally sensitive products and services: representing an opportunity for the development of ventures that seek to satisfy this demand through entrepreneurial action. These ventures may choose to "make do" with existing resources in developing resource combinations that produce the least impact on the environment. The continuous conflict between greater resource requirements and limited resource availability in high potential sustainable ventures, with the added complexity of balancing this against an uncompromising focus on using "what's on hand" to lessen environmental impacts, may make bricolage behaviours critical for these ventures. Research into bricolage behaviour is, however, the exception rather than the rule (Cunha 2005). More research is therefore needed to further develop and extend this emerging concept, especially in the context of sustainability ventures that are committed to personal and social goals of resourcefulness. To date, however, bricolage has not been studied specifically among high potential sustainable ventures. This research seeks to develop an in-depth understanding of the impact of internal and external constraints, and their combinations, on the mechanisms employed in bricolage behaviours in differing dynamic environments. The following research question was developed to investigate this: How do internal and external resource constraints (or their combinations) impact bricolage resource decisions in high potential sustainability ventures? ---------- Methodology/Key Propositions: Six case studies will be developed utilizing survey data from the Comprehensive Australian Study of Entrepreneurial Emergence (CAUSEE), a large-scale longitudinal study of new venture start-ups in Australia.
Prior to commencing the case studies, six scoping interviews were conducted with key stakeholders, including industry members, established businesses and government, to ensure practical relevance in case development. The venture is considered the unit of analysis, with the key informant being the entrepreneur and other management team members where appropriate. Triangulation techniques used in this research include semi-structured interviews, survey data, onsite visits and secondary documentation (website analyses, resumes, and business plans). The six sustainability ventures have been selected to span different environmental dynamism conditions, including a traditionally mature market (building industry) and a more dynamic, evolving industry (renewable energy/solar ventures). From our evaluation of the multidisciplinary literature, we expect the following external constraints to be critical to nascent and young venture bricolage processes: technology constraints (seen through lock-in to incumbents' existing technology), institutional regulation and standards, and access to markets, knowledge and training. The case studies will investigate internal constraints that may further influence bricolage decisions, including resource fungibility, resource combination capabilities, the translation of complex science/engineering knowledge into salient, valuable market propositions (i.e. appropriate market outcomes), and the leveraging of relationships. ---------- Results and Implications: The intended ventures have been identified within the CAUSEE sample and have agreed to participate, and secondary data collection for triangulation purposes has already commenced. Data collection for the case studies commenced on 27 May 2009. Analysis is expected to be completed by 25 September 2009. This paper will report on the pattern of resource constraints and its impact on bricolage behaviours, and the subsequent impact on resource deployment within venture creation and venture growth.
As such, this research extends the theory of bricolage through the systematic analysis of constraints on resource management processes in sustainability ventures. For practice, this research may assist in providing a better understanding of the resource requirements and processes needed for continued venture persistence and growth in sustainability ventures. In these times of economic uncertainty, a better understanding of the influence of constraints on bricolage (the interplay of behaviours, processes and outcomes) may enable greater venture continuance and success.
Abstract:
Since the Good Friday Agreement of 1998, large sums have been invested in community theatre projects in Northern Ireland, in the interests of conflict transformation and peace building. While this injection of funds has resulted in an unprecedented level of applied theatre activity, opportunities to maximise learning from this activity are being missed. It is generally assumed that project evaluation is undertaken at least partly to assess the degree of success of projects against important social objectives, with a view to learning what works, what does not, and what might work in the future. However, three ethnographic case studies of organisations delivering applied theatre projects in Northern Ireland indicate that current processes used to evaluate such projects are both flawed and inadequate for this purpose. Practitioners report that the administrative work involved in applying for and justifying funding is onerous, burdensome, and occurs at the expense of artistic activity. This is a very real concern when the time and effort devoted to ‘filling out the forms’ does not ultimately result in useful evaluative information. There are strong disincentives for organisations to report honestly on their experiences of difficulties, or undesirable impacts of projects, and this problem is not transcended by the use of external evaluators. Current evaluation processes provide little opportunity to capture unexpected benefits of projects, and small but significant successes which occur in the context of over-ambitious objectives. Little or no attempt is made to assess long-term impacts of projects on communities. Finally, official evaluation mechanisms fail to capture the reflective practice and dialogic analysis of practitioners, which would richly inform future projects. The authors argue that there is a need for clearer lines of communication, and more opportunities for mutual learning, among stakeholders involved in community development. 
In particular, greater involvement of the higher education sector in partnership with government and non-government agencies could yield significant benefits in terms of optimizing learning from applied theatre project evaluations.
Abstract:
Currently the Bachelor of Design is the generic degree offered to the four disciplines of Architecture, Landscape Architecture, Industrial Design, and Interior Design within the School of Design at the Queensland University of Technology. Regardless of discipline, Digital Communication is a core unit taken by the 600 first year students entering the Bachelor of Design degree. Within the design disciplines the communication of the designer's intentions is achieved primarily through the use of graphic images, with written information being considered as supportive or secondary. As such, Digital Communication attempts to educate learners in the fundamentals of this graphic design communication, using a generic digital or software tool. Past iterations of the unit have not acknowledged the subtle differences in design communication among the different design disciplines involved, and have used a single generic software tool. Following a review of the unit in 2008, it was decided that a single generic software tool was no longer entirely sufficient. This decision was based on the recognition of the increasing emergence of discipline specific digital tools, and an expressed student desire and apparent aptitude to learn these discipline specific tools. As a result the unit was reconstructed in 2009 to offer both discipline specific and generic software instruction, if elected by the student. This paper, apart from offering the general context and pedagogy of the existing and restructured units, will more importantly offer research data that validates the changes made to the unit. Most significant among these new data are the results of surveys that authenticate actual student aptitude versus desire in learning discipline specific tools. This is done through an exposure of student self-efficacy in problem resolution and technological prowess - generally and specifically within the unit.
More traditional means of validation are also presented, including the results of the generic university-wide Learning Experience Survey of the unit, as well as a comparison between the assessment results of the restructured unit and those of the previous year.