863 results for “Gaussian Plume model for multiple sources for Cochin”


Relevance: 30.00%

Publisher:

Abstract:

A cooperative collision warning system for road vehicles, enabled by recent advances in positioning systems and wireless communication technologies, can potentially reduce traffic accidents significantly. To improve the system, we propose a graph model to represent interactions between multiple road vehicles in a specific region and at a specific time. Given a list of vehicles in the vicinity, we can generate the interaction graph using several rules that consider vehicles' properties such as position, speed, heading, etc. Safety applications can use the model to improve emergency warning accuracy and optimize wireless channel usage. The model also allows us to develop congestion control strategies for an efficient multi-hop broadcast protocol.
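The rule-based construction of the interaction graph described in this abstract can be sketched in a few lines. The distance and heading thresholds and the vehicle fields below are illustrative assumptions, not values from the paper:

```python
import math

def interaction_graph(vehicles, max_dist=150.0):
    """Build an undirected interaction graph as an adjacency dict.

    vehicles: dicts with 'id', 'x', 'y' (metres), 'speed', 'heading' (degrees).
    An edge is added when two vehicles are close enough and their headings
    suggest either a platoon (similar heading) or a potential conflict
    (roughly opposing headings).
    """
    graph = {v['id']: set() for v in vehicles}
    for i, a in enumerate(vehicles):
        for b in vehicles[i + 1:]:
            dist = math.hypot(a['x'] - b['x'], a['y'] - b['y'])
            if dist > max_dist:
                continue  # rule 1: too far apart to interact
            diff = abs(a['heading'] - b['heading']) % 360
            diff = min(diff, 360 - diff)
            if diff <= 30 or diff >= 150:  # rule 2: platoon or head-on geometry
                graph[a['id']].add(b['id'])
                graph[b['id']].add(a['id'])
    return graph
```

A safety application could then broadcast warnings only along graph edges, which is one way such a model can reduce wireless channel usage.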

Relevance: 30.00%

Publisher:

Abstract:

When complex projects go wrong they can go horribly wrong, with severe financial consequences. We are undertaking research to develop leading performance indicators for complex projects: metrics to provide early warning of potential difficulties. The assessment of success of complex projects can be made by a range of stakeholders over different time scales, against different levels of project results: the project’s outputs at the end of the project; the project’s outcomes in the months following project completion; and the project’s impact in the years following completion. We aim to identify leading performance indicators, which may include both success criteria and success factors, and which can be measured by the project team during project delivery to forecast success as assessed by key stakeholders in the days, months and years following the project. The hope is that the leading performance indicators will act as alarm bells to show if a project is deviating from plan so that early corrective action can be taken. It may be that different combinations of the leading performance indicators will be appropriate depending on the nature of project complexity. In this paper we develop a new model of project success, whereby success is assessed by different stakeholders over different time frames against different levels of project results. We then relate this to measurements that can be taken during project delivery. A methodology is described to evaluate the early parts of this model. Its implications and limitations are discussed. This paper describes work in progress.

Relevance: 30.00%

Publisher:

Abstract:

The proliferation of innovative schemes to address climate change at international, national and local levels signals a fundamental shift in the priority and role of the natural environment to society, organizations and individuals. This shift in shared priorities invites academics and practitioners to consider the role of institutions in shaping and constraining responses to climate change at multiple levels of organisations and society. Institutional theory provides an approach to conceptualising and addressing climate change challenges by focusing on the central logics that guide society, organizations and individuals and their material and symbolic relationship to the environment. For example, framing a response to climate change in the form of an emission trading scheme evidences a practice informed by a capitalist market logic (Friedland and Alford 1991). However, not all responses need necessarily align with a market logic. Indeed, Thornton (2004) identifies six broad societal sectors each with its own logic (markets, corporations, professions, states, families, religions). Hence, understanding the logics that underpin successful –and unsuccessful– climate change initiatives contributes to revealing how institutions shape and constrain practices, and provides valuable insights for policy makers and organizations. This paper develops models and propositions to consider the construction of, and challenges to, climate change initiatives based on institutional logics (Thornton and Ocasio 2008). We propose that the challenge of understanding and explaining how climate change initiatives are successfully adopted be examined in terms of their institutional logics, and how these logics evolve over time. To achieve this, a multi-level framework of analysis that encompasses society, organizations and individuals is necessary (Friedland and Alford 1991). 
However, to date most extant studies of institutional logics have tended to emphasize one level over the others (Thornton and Ocasio 2008: 104). In addition, existing studies related to climate change initiatives have largely been descriptive (e.g. Braun 2008) or prescriptive (e.g. Boiral 2006) in terms of the suitability of particular practices. This paper contributes to the literature on logics in two ways. First, it examines multiple levels: the proliferation of the climate change agenda provides a site in which to study how institutional logics are played out across multiple, yet embedded, levels within society through institutional forums in which change takes place. Second, the paper specifically examines how institutional logics provide society with organising principles –material practices and symbolic constructions– which enable and constrain actors’ actions and help define their motives and identity. Based on this model, we develop a series of propositions about the conditions required for the successful introduction of climate change initiatives. The paper proceeds as follows. We present a review of literature related to institutional logics and develop a generic model of the operation of institutional logics. We then consider how this applies to key initiatives related to climate change. Finally, we develop a series of propositions which might guide insights into the successful implementation of climate change practices.

Relevance: 30.00%

Publisher:

Abstract:

The host specificity of the five published sewage-associated Bacteroides markers (i.e., HF183, BacHum, HuBac, BacH and Human-Bac) was evaluated in Southeast Queensland, Australia, by testing fecal DNA samples (n = 186) from 11 animal species, including human fecal samples collected via influent to a sewage treatment plant (STP). All human fecal samples (n = 50) were positive for all five markers, indicating 100% sensitivity of these markers. The overall specificity of the HF183 marker in differentiating between humans and animals was 99%. The specificities of the BacHum and BacH markers were > 94%, suggesting that these markers are suitable for detecting sewage pollution in environmental waters in Australia. The HuBac (i.e., 63% specificity) and Human-Bac (i.e., 79% specificity) markers performed poorly in distinguishing between human and animal fecal samples. It is recommended that the specificity of sewage-associated markers be rigorously tested prior to their application to identify the sources of fecal pollution in environmental waters.
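The sensitivity and specificity figures quoted above follow from the standard confusion-matrix definitions, sketched here for reference (the example counts in the usage note are assumptions for illustration, not data from the study):

```python
def sensitivity(true_pos, false_neg):
    # fraction of known-positive (human/sewage) samples the marker detects
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    # fraction of known-negative (animal) samples the marker correctly rejects
    return true_neg / (true_neg + false_pos)
```

With all 50 human samples positive, sensitivity is 50 / (50 + 0) = 100%, matching the abstract; a marker that misclassified one of 136 animal samples would have specificity 135 / 136 ≈ 99%.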

Relevance: 30.00%

Publisher:

Abstract:

An asset registry arguably forms the core system that needs to be in place before other systems can operate or interoperate. Most systems have rudimentary asset registry functionality that stores assets, relationships, or characteristics, and this leads to different asset management systems storing similar sets of data in multiple locations in an organisation. As organisations have been slowly moving their information architecture toward a service-oriented architecture, they have also been consolidating their multiple data stores to form a “single point of truth”. As part of a strategy to integrate several asset management systems in an Australian railway organisation, a case study for developing a consolidated asset registry was conducted. A decision was made to use the MIMOSA OSA-EAI CRIS data model as well as the OSA-EAI Reference Data in building the platform, due to the standard’s relative maturity and completeness. A pilot study of electrical traction equipment was selected, and the data sources feeding into the asset registry were primarily diagram-based. This paper presents the pitfalls encountered, approaches taken, and lessons learned during the development of the asset registry.
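The idea of a consolidated registry acting as a “single point of truth” for assets, their characteristics, and their relationships can be sketched as a minimal in-memory model. The field names and asset types below are illustrative assumptions, not the MIMOSA OSA-EAI CRIS schema:

```python
class AssetRegistry:
    """Minimal sketch of a consolidated asset registry: one store for assets,
    their characteristics, and the relationships between them."""

    def __init__(self):
        self.assets = {}     # asset_id -> {'type': ..., 'characteristics': {...}}
        self.relations = []  # (parent_id, relation, child_id) triples

    def register(self, asset_id, asset_type, **characteristics):
        self.assets[asset_id] = {'type': asset_type,
                                 'characteristics': characteristics}

    def relate(self, parent_id, relation, child_id):
        if parent_id not in self.assets or child_id not in self.assets:
            raise KeyError('both assets must be registered first')
        self.relations.append((parent_id, relation, child_id))

    def children(self, parent_id, relation='contains'):
        return [c for p, r, c in self.relations
                if p == parent_id and r == relation]
```

Because every system reads and writes the one store, duplicate copies of the same asset data in separate systems are avoided, which is the consolidation goal the case study pursued.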

Relevance: 30.00%

Publisher:

Abstract:

The assessment of intellectual ability is a core competency in psychology. The results of intelligence tests have many potential implications and are used frequently as the basis for decisions about educational placements, eligibility for various services, and admission to specific groups. Given the importance of intelligence test scores, accurate test administration and scoring are essential; yet there is evidence of unacceptably high rates of examiner error. This paper discusses competency and postgraduate training in intelligence testing and presents a training model for postgraduate psychology students. The model aims to achieve high levels of competency in intelligence testing through a structured method of training, practice and feedback that incorporates peer support, self-reflection and multiple methods for evaluating competency.

Relevance: 30.00%

Publisher:

Abstract:

Established Monte Carlo user codes BEAMnrc and DOSXYZnrc permit the accurate and straightforward simulation of radiotherapy experiments and treatments delivered from multiple beam angles. However, when an electronic portal imaging detector (EPID) is included in these simulations, treatment delivery from non-zero beam angles becomes problematic. This study introduces CTCombine, a purpose-built code for rotating selected CT data volumes, converting CT numbers to mass densities, combining the results with model EPIDs and writing output in a form which can easily be read and used by the dose calculation code DOSXYZnrc. The geometric and dosimetric accuracy of CTCombine’s output has been assessed by simulating simple and complex treatments applied to a rotated planar phantom and a rotated humanoid phantom and comparing the resulting virtual EPID images with the images acquired using experimental measurements and independent simulations of equivalent phantoms. It is expected that CTCombine will be useful for Monte Carlo studies of EPID dosimetry as well as other EPID imaging applications.
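One of CTCombine's steps, converting CT numbers to mass densities, is commonly done by piecewise-linear interpolation over a calibration table; a minimal sketch follows. The calibration points here are assumed placeholders for illustration, not CTCombine's actual table:

```python
# Assumed (HU, g/cm^3) calibration points, ordered by CT number.
HU_DENSITY_TABLE = [(-1000, 0.001), (0, 1.0), (1000, 1.6), (3000, 2.8)]

def hu_to_density(hu):
    """Interpolate mass density (g/cm^3) for a CT number from the table,
    clamping below the first and above the last calibration point."""
    pts = HU_DENSITY_TABLE
    if hu <= pts[0][0]:
        return pts[0][1]
    for (h0, d0), (h1, d1) in zip(pts, pts[1:]):
        if hu <= h1:
            t = (hu - h0) / (h1 - h0)
            return d0 + t * (d1 - d0)
    return pts[-1][1]
```

Applying this voxel-by-voxel to a rotated CT volume yields the mass-density grid that a dose calculation code such as DOSXYZnrc consumes.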

Relevance: 30.00%

Publisher:

Abstract:

We consider the problem of designing a surveillance system to detect a broad range of invasive species across a heterogeneous sampling frame. We present a model to detect a range of invertebrate invasives whilst addressing the challenges of multiple data sources, stratifying for differential risk, managing labour costs and providing sufficient power of detection. We determine the number of detection devices required and their allocation across the landscape within limiting resource constraints. The resulting plan will lead to reduced financial and ecological costs and an optimal surveillance system.
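Sizing such a system often reduces to choosing the number of devices n per stratum so that the power of detection 1 - (1 - p)^n reaches a target, assuming independent devices each with per-device detection probability p. This is a simplified sketch of the sizing calculation, not the paper's model; the probabilities in the usage example are assumptions:

```python
import math

def devices_needed(p_single, target_power):
    """Smallest n with 1 - (1 - p_single)**n >= target_power, assuming
    independent devices that each detect an incursion with p_single."""
    return math.ceil(math.log(1.0 - target_power) / math.log(1.0 - p_single))

def total_devices(strata):
    """Allocate devices across risk strata: strata maps a stratum name to
    (per-device detection probability, required power of detection)."""
    return {name: devices_needed(p, power) for name, (p, power) in strata.items()}
```

Stratifying for differential risk then amounts to demanding higher power (and hence more devices) in high-risk strata, which is where the labour-cost trade-off enters.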

Relevance: 30.00%

Publisher:

Abstract:

The researcher’s professional role as an Education Officer was the impetus for this study. Designing and implementing professional development activities is a significant component of the researcher’s position description, and as a result of reflection and feedback from participants and colleagues, the creation of a more effective model of professional development became the focus for this study. Few studies have examined all three links between the purposes of professional development, that is, increasing teacher knowledge, improving teacher practice, and improving student outcomes. This study is significant in that it investigates the nature of the growth of teachers who participated in a model of professional development based upon the principles of Lesson Study. The research provides qualitative and empirical data to establish some links between teacher knowledge, teacher practice, and student learning outcomes. Teacher knowledge in this study refers to mathematics content knowledge as well as pedagogical-content knowledge. The outcomes for students include achievement outcomes, attitudinal outcomes, and behavioural outcomes. As the study was conducted at one school site, existence-proof research was the focus of the methodology and data collection. Conducted over the 2007 school year with five teacher-participants and approximately 160 students from Year Levels 6 to 9, the Lesson Study-principled model of professional development provided the teacher-participants with on-site, ongoing, and reflective learning based on their classroom environment. The focus area for the professional development was strategising the engagement with and solution of worded mathematics problems. A design experiment was used to develop the professional development as an intervention in prevailing teacher practice, for which data were collected prior to and after the period of intervention.
A model of teacher change was developed as an underpinning framework for the study, and was useful in making decisions about data collection and analyses. Data sources consisted of questionnaires, pre-tests and post-tests, interviews, and researcher observations and field notes. The data clearly showed that content knowledge and pedagogical-content knowledge increased among the teacher-participants; teacher practice changed in a positive manner; and a majority of students demonstrated improved learning outcomes. The positive changes to teacher practice are described in this study as the demonstrated use of mixed pedagogical practices rather than a polarisation to either traditional or contemporary pedagogical practices. The improvement in student learning outcomes was most evident in achievement outcomes, as indicated by the comparison of pre-test and post-test scores. The effectiveness of the Lesson Study-principled model of professional development used in this study was evaluated using Guskey’s (2005) Five Levels of Professional Development Evaluation.

Relevance: 30.00%

Publisher:

Abstract:

This study explores strategic decision-making (SDM) in micro-firms, an economically significant business subsector. As extant large- and small-firm literature currently proffers an incomplete characterization of SDM in very small enterprises, a multiple-case methodology was used to investigate how these firms make strategic decisions. Eleven Australian Information Technology service micro-firms participated in the study. Using an information-processing lens, the study uncovered patterns of SDM in micro-firms and derived a theoretical micro-firm SDM model. This research also identifies several implications for micro-firm management and directions for future research, contributing to the understanding of micro-firm SDM in both theory and practice.

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Previous epidemiological investigations of associations between dietary glycemic intake and insulin resistance have used average daily measures of glycemic index (GI) and glycemic load (GL). We explored multiple and novel measures of dietary glycemic intake to determine which was most predictive of an association with insulin resistance. METHODS: Usual dietary intakes were assessed by diet history interview in women aged 42-81 years participating in the Longitudinal Assessment of Ageing in Women. Daily measures of dietary glycemic intake (n = 329) were carbohydrate, GI, GL, and GL per megacalorie (GL/Mcal), while meal-based measures (n = 200) were breakfast, lunch and dinner GL, and a new measure, GL peak score, to represent meal peaks. Insulin-resistant status was defined as a homeostasis model assessment (HOMA) value of > 3.99; HOMA as a continuous variable was also investigated. RESULTS: GL, GL/Mcal, carbohydrate (all P < 0.01), GL peak score (P = 0.04) and lunch GL (P = 0.04) were positively and independently associated with insulin-resistant status. Daily measures were more predictive than meal-based measures, with minimal difference between GL/Mcal, GL and carbohydrate. No significant associations were observed with HOMA as a continuous variable. CONCLUSION: A dietary pattern with high peaks of GL above the individual's average intake was a significant independent predictor of insulin resistance in this population; however, the contribution was less than that of the daily GL and carbohydrate variables. Accounting for energy intake slightly increased the predictive ability of GL, which is potentially important when examining disease risk in more diverse populations with wider variations in energy requirements.
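The daily measures above are simple functions of diet and fasting blood values; a minimal sketch using the standard definitions (the numbers in the usage example are illustrative, not study data):

```python
def glycemic_load(gi, carb_g):
    """Glycemic load of a food: glycemic index x available carbohydrate (g) / 100."""
    return gi * carb_g / 100.0

def gl_per_mcal(daily_gl, energy_kcal):
    """Daily GL normalised per megacalorie of energy intake (GL/Mcal)."""
    return daily_gl / (energy_kcal / 1000.0)

def homa(glucose_mmol_l, insulin_mu_l):
    """Homeostasis model assessment of insulin resistance (fasting values)."""
    return glucose_mmol_l * insulin_mu_l / 22.5
```

For example, a food with GI 70 and 30 g of carbohydrate contributes a GL of 21, and a HOMA value above 3.99 would classify a participant as insulin resistant under the study's cutoff.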

Relevance: 30.00%

Publisher:

Abstract:

An experimental investigation has been made of a round, non-buoyant plume of nitric oxide, NO, in a turbulent grid flow of ozone, O3, using the Turbulent Smog Chamber at the University of Sydney. The measurements have been made at a resolution not previously reported in the literature. The reaction is conducted at non-equilibrium so there is significant interaction between turbulent mixing and chemical reaction. The plume has been characterized by a set of constant initial reactant concentration measurements consisting of radial profiles at various axial locations. Whole-plume behaviour can thus be characterized, and parameters are selected for a second set of fixed physical location measurements where the effects of varying the initial reactant concentrations are investigated. Careful experimental design and specially developed chemiluminescent analysers, which measure fluctuating concentrations of reactive scalars, ensure that spatial and temporal resolutions are adequate to measure the quantities of interest. Conserved scalar theory is used to define a conserved scalar from the measured reactive scalars and to define frozen, equilibrium and reaction-dominated cases for the reactive scalars. Reactive scalar means and the mean reaction rate are bounded by the frozen and equilibrium limits, but this is not always the case for the reactant variances and covariances. The plume reactant statistics are closer to the equilibrium limit than those for the ambient reactant. The covariance term in the mean reaction rate is found to be negative and significant for all measurements made. The Toor closure was found to overestimate the mean reaction rate by 15 to 65%. Gradient model turbulent diffusivities had significant scatter and were not observed to be affected by reaction. The ratio of turbulent diffusivities for the conserved scalar mean and that for the r.m.s. was found to be approximately 1.
Estimates of the ratio of the dissipation timescales of around 2 were found downstream. Estimates of the correlation coefficient between the conserved scalar and its dissipation (parallel to the mean flow) were found to be between 0.25 and the significant value of 0.5. Scalar dissipations for non-reactive and reactive scalars were found to be significantly different. Conditional statistics are found to be a useful way of investigating the reactive behaviour of the plume, effectively decoupling the interaction of chemical reaction and turbulent mixing. It is found that conditional reactive scalar means lack significant transverse dependence as has previously been found theoretically by Klimenko (1995). It is also found that conditional variance around the conditional reactive scalar means is relatively small, simplifying the closure for the conditional reaction rate. These properties are important for the Conditional Moment Closure (CMC) model for turbulent reacting flows recently proposed by Klimenko (1990) and Bilger (1993). Preliminary CMC model calculations are carried out for this flow using a simple model for the conditional scalar dissipation. Model predictions and measured conditional reactive scalar means compare favorably. The reaction dominated limit is found to indicate the maximum reactedness of a reactive scalar and is a limiting case of the CMC model. Conventional (unconditional) reactive scalar means obtained from the preliminary CMC predictions using the conserved scalar p.d.f. compare favorably with those found from experiment except where measuring position is relatively far upstream of the stoichiometric distance. Recommendations include applying a full CMC model to the flow and investigations both of the less significant terms in the conditional mean species equation and the small variation of the conditional mean with radius. 
Forms for the p.d.f.s, in addition to those found from experiments, could be useful for extending the CMC model to reactive flows in the atmosphere.
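The role of the negative covariance term discussed above can be illustrated numerically. For a second-order reaction with instantaneous rate w = k·a·b, averaging gives ⟨w⟩ = k(⟨a⟩⟨b⟩ + ⟨a′b′⟩), so an estimate that ignores a negative covariance overestimates the mean rate. The sample values below are made up purely for illustration:

```python
def mean_rate(k, a_samples, b_samples):
    """True mean reaction rate k*<a*b> from paired concentration samples."""
    n = len(a_samples)
    return k * sum(a * b for a, b in zip(a_samples, b_samples)) / n

def mean_field_rate(k, a_samples, b_samples):
    """Estimate k*<a>*<b> that neglects the covariance term."""
    n = len(a_samples)
    return k * (sum(a_samples) / n) * (sum(b_samples) / n)
```

With perfectly anti-correlated samples the true mean rate is zero while the covariance-blind estimate is k: neglecting or under-modelling a negative covariance biases the mean rate upward, the same direction of error as the Toor-closure overestimate reported above.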

Relevance: 30.00%

Publisher:

Abstract:

Since the formal recognition of practice-led research in the 1990s, many higher research degree candidates in art, design and media have submitted creative works along with an accompanying written document or ‘exegesis’ for examination. Various models for the exegesis have been proposed in university guidelines and academic texts during the past decade, and students and supervisors have experimented with its contents and structure. With a substantial number of exegeses submitted and archived, it has now become possible to move beyond proposition to empirical analysis. In this article we present the findings of a content analysis of a large, local sample of submitted exegeses. We identify the emergence of a persistent pattern in the types of content included as well as overall structure. Besides an introduction and conclusion, this pattern includes three main parts, which can be summarized as situating concepts (conceptual definitions and theories); precedents of practice (traditions and exemplars in the field); and researcher’s creative practice (the creative process, the artifacts produced and their value as research). We argue that this model combines earlier approaches to the exegesis, which oscillated between academic objectivity, by providing a contextual framework for the practice, and personal reflexivity, by providing commentary on the creative practice. But this model is more than simply a hybrid: it provides a dual orientation, which allows the researcher to both situate their creative practice within a trajectory of research and do justice to its personally invested poetics. By performing the important function of connecting the practice and creative work to a wider emergent field, the model helps to support claims for a research contribution to the field. We call it a connective model of exegesis.

Relevance: 30.00%

Publisher:

Abstract:

Police work tasks are diverse and require the ability to take command, demonstrate leadership, make serious decisions and be self-directed (Beck, 1999; Brunetto & Farr-Wharton, 2002; Howard, Donofrio & Boles, 2002). This work is usually performed in pairs or sometimes by an officer working alone. Operational police work is seldom performed under the watchful eyes of a supervisor, and a great amount of reliance is placed on the high levels of motivation and professionalism of individual officers. Research has shown that highly motivated workers produce better outcomes (Whisenand & Rush, 1998; Herzberg, 2003). It is therefore important that Queensland police officers are highly motivated to provide a quality service to the Queensland community. This research aims to identify factors which motivate Queensland police to perform quality work. Researchers acknowledge that there is a lack of research and knowledge in regard to the factors which motivate police (Beck, 1999; Bragg, 1998; Howard, Donofrio & Boles, 2002; McHugh & Verner, 1998). The motivational factors were identified in regard to the demographic variables of age, sex, rank, tenure and education. The model for this research is Herzberg’s (1959) two-factor theory of workplace motivation. Herzberg found that there are two broad types of workplace motivational factors: those driven by a need to prevent loss or harm, and those driven by a need to gain personal satisfaction or achievement. His study identified 16 basic sub-factors that operate in the workplace. The research utilised a questionnaire instrument based on the sub-factors identified by Herzberg (1959). The questionnaire consisted of an initial section that sought demographic information about the participant, followed by 51 Likert-scale questions. The instrument is an expanded version of an instrument previously used in doctoral studies to identify sources of police motivation (Holden, 1980; Chiou, 2004).
The questionnaire was forwarded to approximately 960 police in the Brisbane, Metropolitan North Region. The data were analysed using factor analysis, MANOVAs, ANOVAs and multiple regression analysis to identify the key sources of police motivation and to determine the relationships between demographic variables such as age, rank, educational level, tenure and generation cohort and the motivational factors. A total of 484 officers responded to the questionnaire from the sample population of 960. Factor analysis revealed five broad Prime Motivational Factors that motivate police in their work. The Prime Motivational Factors are: Feeling Valued, Achievement, Workplace Relationships, the Work Itself, and Pay and Conditions. The factor Feeling Valued highlighted the importance of positive, supportive leaders in motivating officers. Many officers commented that supervisors who only provided negative feedback diminished their sense of feeling valued and were a key source of de-motivation. Officers also frequently commented that they were motivated by operational police work itself whilst demonstrating a strong sense of identity with their team and colleagues. The study showed a general need for acceptance by peers and an idealistic motivation to assist members of the community in need and protect victims of crime. Generational cohorts were not found to exert a significant influence on police motivation. The demographic variable with the single greatest influence on police motivation was tenure. Motivation levels were found to drop dramatically during the first two years of an officer’s service and generally not to improve significantly until near retirement age. The findings of this research provide the foundation for a number of recommendations in regard to police retirement, training and work allocation that are aimed at improving police motivation levels.
The model of five Prime Motivational Factors developed in this study is recommended for use as a planning tool by police leaders to improve the motivational and job-satisfaction components of police Service policies. The findings of this study also provide a better understanding of the current sources of police motivation. They are expected to have valuable application for Queensland police human resource management when considering policies and procedures in the areas of motivation, stress reduction and attracting suitable staff to specific areas of responsibility.

Relevance: 30.00%

Publisher:

Abstract:

Uninhabited aerial vehicles (UAVs) are a cutting-edge technology at the forefront of aviation/aerospace research and development worldwide. Many consider their current military and defence applications as just a token of their enormous potential. Unlocking and fully exploiting this potential will see UAVs in a multitude of civilian applications and routinely operating alongside piloted aircraft. The key to realising the full potential of UAVs lies in addressing a host of regulatory, public relations, and technological challenges never encountered before. Aircraft collision avoidance is considered to be one of the most important issues to be addressed, given its safety-critical nature. The collision avoidance problem can be roughly organised into three areas: 1) Sense; 2) Detect; and 3) Avoid. Sensing is concerned with obtaining accurate and reliable information about other aircraft in the air; detection involves identifying potential collision threats based on available information; avoidance deals with the formulation and execution of appropriate manoeuvres to maintain safe separation. This thesis tackles the detection aspect of collision avoidance, via the development of a target detection algorithm that is capable of real-time operation onboard a UAV platform. One of the key challenges of the detection problem is the need to provide early warning. This translates to detecting potential threats whilst they are still far away, when their presence is likely to be obscured by noise. Another important consideration is the choice of sensors to capture target information, which has implications for the design and practical implementation of the detection algorithm.
The main contributions of the thesis are: 1) the proposal of a dim target detection algorithm combining image morphology and hidden Markov model (HMM) filtering approaches; 2) the novel use of relative entropy rate (RER) concepts for HMM filter design; 3) the characterisation of algorithm detection performance based on simulated data as well as real in-flight target image data; and 4) the demonstration of the proposed algorithm's capacity for real-time target detection. We also consider the extension of HMM filtering techniques and the application of RER concepts for target heading angle estimation. In this thesis we propose a computer-vision based detection solution, due to the commercial-off-the-shelf (COTS) availability of camera hardware and the hardware's relatively low cost, power, and size requirements. The proposed target detection algorithm adopts a two-stage processing paradigm that begins with an image enhancement pre-processing stage followed by a track-before-detect (TBD) temporal processing stage that has been shown to be effective in dim target detection. We compare the performance of two candidate morphological filters for the image pre-processing stage, and propose a multiple hidden Markov model (MHMM) filter for the TBD temporal processing stage. The role of the morphological pre-processing stage is to exploit the spatial features of potential collision threats, while the MHMM filter serves to exploit the temporal characteristics or dynamics. The problem of optimising our proposed MHMM filter has been examined in detail. Our investigation has produced a novel design process for the MHMM filter that exploits information theory and entropy related concepts. The filter design process is posed as a mini-max optimisation problem based on a joint RER cost criterion. 
We provide proof that this joint RER cost criterion provides a bound on the conditional mean estimate (CME) performance of our MHMM filter, and this in turn establishes a strong theoretical basis connecting our filter design process to filter performance. Through this connection we can intelligently compare and optimise candidate filter models at the design stage, rather than having to resort to time-consuming Monte Carlo simulations to gauge the relative performance of candidate designs. Moreover, the underlying entropy concepts are not constrained to any particular model type. This suggests that the RER concepts established here may be generalised to provide a useful design criterion for multiple model filtering approaches outside the class of HMM filters. In this thesis we also evaluate the performance of our proposed target detection algorithm under realistic operating conditions, and give consideration to the practical deployment of the detection algorithm onboard a UAV platform. Two fixed-wing UAVs were engaged to recreate various collision-course scenarios to capture highly realistic vision (from an onboard camera perspective) of the moments leading up to a collision. Based on this collected data, our proposed detection approach was able to detect targets out to distances ranging from about 400 m to 900 m. These distances (with some assumptions about closing speeds and aircraft trajectories) translate to an advanced warning ahead of impact that approaches the 12.5 second response time recommended for human pilots. Furthermore, readily available graphics processing unit (GPU) based hardware is exploited for its parallel computing capabilities to demonstrate the practical feasibility of the proposed target detection algorithm. A prototype hardware-in-the-loop system has been found to be capable of achieving data processing rates sufficient for real-time operation. There is also scope for further improvement in performance through code optimisations.
Overall, our proposed image-based target detection algorithm offers UAVs a cost-effective real-time target detection capability that is a step forward in addressing the collision avoidance issue that is currently one of the most significant obstacles preventing widespread civilian applications of uninhabited aircraft. We also highlight that the algorithm development process has led to the discovery of a powerful multiple-HMM filtering approach and a novel RER-based multiple filter design process. The utility of our multiple-HMM filtering approach and RER concepts, however, extends beyond the target detection problem. This is demonstrated by our application of HMM filters and RER concepts to a heading angle estimation problem.
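At the core of the HMM filtering approach described in this thesis abstract is the recursive Bayesian forward update, which accumulates evidence for a dim target across frames. The toy two-state sketch below (the transition matrix and likelihoods are assumed for illustration only) shows one such update:

```python
def hmm_forward_step(prior, transition, likelihood):
    """One Bayesian forward-filter update over N discrete hidden states
    (e.g. "target present in this pixel region" vs "target absent").

    prior:      list of N state probabilities from the previous frame.
    transition: N x N matrix, transition[i][j] = P(next state j | state i).
    likelihood: P(current observation | state j) for each state j.
    Returns the normalised posterior over the N states.
    """
    n = len(prior)
    predicted = [sum(prior[i] * transition[i][j] for i in range(n))
                 for j in range(n)]
    unnorm = [predicted[j] * likelihood[j] for j in range(n)]
    total = sum(unnorm)
    return [u / total for u in unnorm]
```

Iterating this update over frames is the track-before-detect idea: targets invisible in any single frame become detectable through accumulated evidence, and a multiple-HMM (MHMM) filter runs a bank of such filters with different dynamic models.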