129 results for ESTABLISHING OPERATIONS


Relevance: 20.00%

Abstract:

Snakehead fishes in the family Channidae are obligate freshwater fishes represented by two extant genera, the African Parachanna and the Asian Channa. These species prefer still or slow-flowing water bodies, where they are top predators that exercise high levels of parental care, have the ability to breathe air, can tolerate poor water quality, and, interestingly, can aestivate or traverse terrestrial habitat in response to seasonal changes in freshwater habitat availability. These attributes suggest that snakehead fishes may possess high dispersal potential, irrespective of the terrestrial barriers that would otherwise constrain the distribution of most freshwater fishes. A number of biogeographical hypotheses have been developed to account for the modern distributions of snakehead fishes across two continents, including ancient vicariance during Gondwanan break-up, or recent colonisation tracking the formation of suitable climatic conditions. Taxonomic uncertainty also surrounds some members of the Channa genus, as the geographical distributions of some taxa across southern and Southeast (SE) Asia are very large, and in one case highly disjunct. The current study adopted a molecular genetics approach to gain an understanding of the evolution of this group of fishes, and in particular how the phylogeography of two Asian species may have been influenced by contemporary versus historical levels of dispersal and vicariance. First, a molecular phylogeny was constructed based on multiple DNA loci and calibrated with fossil evidence to provide a dated chronology of divergence events among extant species, and also within species with widespread geographical distributions. The data provide strong evidence that the trans-continental distribution of the Channidae arose as a result of dispersal out of Asia and into Africa in the mid-Eocene.
Among Asian Channa, deep divergence among lineages indicates that the Oligocene-Miocene boundary was a time of significant species radiation, potentially associated with historical changes in climate and drainage geomorphology. Mid-Miocene divergence among lineages suggests that a taxonomic revision is warranted for two taxa. Deep intra-specific divergence (~8 Mya) was also detected between C. striata lineages that occur sympatrically in the Mekong River Basin. The study then examined the phylogeography and population structure of two major taxa, Channa striata (the chevron snakehead) and C. micropeltes (the giant snakehead), across SE Asia. Species-specific microsatellite loci were developed and used in addition to a mitochondrial DNA marker (Cyt b) to screen neutral genetic variation within and among wild populations. C. striata individuals were sampled across SE Asia (n=988), with the major focus being the Mekong Basin, the largest drainage basin in the region. The distributions of two divergent lineages were identified, and admixture analysis showed that where they co-occur they are interbreeding, indicating that after long periods of evolution in isolation, divergence has not resulted in reproductive isolation. One lineage is predominantly confined to upland areas of northern Lao PDR to the north of the Khorat Plateau, while the other, which is more closely related to individuals from southern India, has a widespread distribution across mainland SE Asia and Sumatra. The phylogeographical pattern recovered is associated with past river networks, and high diversity and divergence among all populations sampled reveal that contemporary dispersal is very low for this taxon, even where populations occur in contiguous freshwater habitats. C. micropeltes (n=280) were also sampled from across the Mekong River Basin, focusing on the lower basin, where the species constitutes an important wild fishery resource. In comparison with C. striata, allelic diversity and genetic divergence among populations were extremely low, suggesting very recent colonisation of the greater Mekong region. Populations were significantly structured into at least three discrete populations in the lower Mekong. The results of this study have implications for establishing effective conservation plans for managing both species, which represent economically important wild fishery resources for the region. For C. micropeltes, it is likely that a single fisheries stock in the Tonle Sap Great Lake is being exploited by multiple fisheries operations, and future management initiatives for this species in this region will need to account for this. For C. striata, conservation of natural levels of genetic variation will require management initiatives designed to promote population persistence at very localised spatial scales, as the high level of population structuring uncovered for this species indicates that significant unique diversity is present at this fine spatial scale.

Relevance: 20.00%

Abstract:

Operations management (OM) is concerned with the production of goods and services, ensuring that business operations are efficient in utilising resources and effective in meeting customer requirements. It deals with the design and management of products, processes, services and supply chains, and considers the acquisition, development, and effective and efficient utilisation of resources. Unlike other engineering subjects, the content of these units can be very wide-ranging. It is therefore necessary to cover the content most relevant to contemporary industries, and to understand which engineering management skills are critical for engineers working in contemporary organisations. Most operations management books cover traditional OM techniques. For example, inventory management is an important topic in operations management, and all OM books deal with effective methods of inventory management. However, a recent trend in OM is just-in-time (JIT) delivery, or minimisation of inventory. It is therefore important to decide whether to emphasise keeping inventory (as most books suggest) or minimising it. Similarly, for OM decisions such as forecasting, optimisation and linear programming, most organisations now use software, so it is important to determine whether some of this software should be introduced in tutorial and laboratory classes, and if so, which packages. It is established in the teaching and learning literature that there must be strong alignment between unit objectives, assessment and learning activities to engage students in learning, and that engaging students is vital for learning. However, engineering units (more specifically, operations management) differ from other majors: alignment between objectives, assessment and learning activities alone cannot guarantee student engagement. Unit content must be practically oriented, and the skills developed should be those demanded by industry. The present active-learning research, using a multi-method research approach, redesigned the operations management content based on the latest developments in the engineering management area and the needs of Australian industries. The redesigned unit significantly improved student engagement and learning. It was found that students engage in learning when they find the content helpful in developing skills that are necessary in their professional lives.
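As a concrete instance of the textbook inventory-management techniques the abstract weighs against JIT, the classic Economic Order Quantity (EOQ) model can be sketched in a few lines. This is a generic textbook illustration, not material from the unit described above.

```python
import math

def eoq(annual_demand, order_cost, holding_cost_per_unit):
    """Economic Order Quantity: the order size that minimises the sum of
    annual ordering cost (D/Q * S) and holding cost (Q/2 * H)."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost_per_unit)

# Example: 1,000 units/year demand, $10 per order, $0.50/unit/year holding cost
print(eoq(1000, 10, 0.5))  # → 200.0
```

JIT, by contrast, treats the holding cost as something to be driven towards zero rather than traded off, which is exactly the tension the unit redesign had to resolve.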

Relevance: 20.00%

Abstract:

The Victorian Aboriginal Community Controlled Health Organisation (VACCHO) conducted a research project to investigate the impact of social determinants such as education, employment, income, racism and housing on the health and wellbeing of Aboriginal peoples. Forums and workshops were conducted to establish future research outcomes and priorities. The project will help develop localised, community-oriented solutions to Aboriginal health issues.

Relevance: 20.00%

Abstract:

Background: In response to the need for more comprehensive quality assessment within Australian residential aged care facilities, the Clinical Care Indicator (CCI) Tool was developed to collect outcome data as a means of making inferences about quality. A national trial of its effectiveness and a Brisbane-based trial of its use within the quality improvement context determined that the CCI Tool represented a potentially valuable addition to the Australian aged care system. This document describes the next phase in the CCI Tool's development, the aims of which were to establish the validity and reliability of the CCI Tool and to develop quality indicator thresholds (benchmarks) for use in Australia. The CCI Tool is now known as the ResCareQA (Residential Care Quality Assessment). Methods: The study aims were achieved through a combination of quantitative data analysis and expert panel consultations using a modified Delphi process. The expert panel consisted of experienced aged care clinicians, managers, and academics; they were initially consulted to determine face and content validity of the ResCareQA, and later to develop thresholds of quality. To analyse its psychometric properties, ResCareQA forms were completed for all residents (N=498) of nine aged care facilities throughout Queensland. Kappa statistics were used to assess inter-rater and test-retest reliability, and Cronbach's alpha coefficient was calculated to determine internal consistency. For concurrent validity, equivalent items on the ResCareQA and the Resident Classification Scale (RCS) were compared using Spearman's rank order correlations, while discriminative validity was assessed using the known-groups technique, comparing ResCareQA results between groups with differing care needs, as well as between male and female residents.
Rank-ordered facility results for each clinical care indicator (CCI) were circulated to the panel; upper and lower thresholds for each CCI were nominated by panel members and refined through a Delphi process. These thresholds indicate excellent care at one extreme and questionable care at the other. Results: Minor modifications were made to the assessment, and it was renamed the ResCareQA. Agreement on its content was reached after two Delphi rounds; the final version contains 24 questions across four domains, enabling generation of 36 CCIs. Both test-retest and inter-rater reliability were sound, with median kappa values of 0.74 (test-retest) and 0.91 (inter-rater); internal consistency was not as strong, with a Cronbach's alpha of 0.46. Because the ResCareQA does not provide a single combined score, comparisons for concurrent validity were made with the RCS on an item-by-item basis, with most resultant correlations being quite low. Discriminative validity analyses, however, revealed highly significant differences in the total number of CCIs between high-care and low-care groups (t(199)=10.77, p<0.001), while the differences between male and female residents were not significant (t(414)=0.56, p=0.58). Clinical outcomes varied both within and between facilities; agreed upper and lower thresholds were finalised after three Delphi rounds. Conclusions: The ResCareQA provides a comprehensive, easily administered means of monitoring quality in residential aged care facilities that can be reliably used on multiple occasions. The relatively modest internal consistency score was likely due to the multi-factorial nature of quality and the absence of an aggregate result for the assessment. Measurement of concurrent validity proved difficult in the absence of a gold standard, but the sound discriminative validity results suggest that the ResCareQA has acceptable validity and could be confidently used as an indication of care quality within Australian residential aged care facilities. The thresholds, while preliminary due to the small sample size, enable users to make judgements about quality within and between facilities. It is therefore recommended that the ResCareQA be adopted for wider use.
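The reliability statistics reported above (kappa for rater agreement, Cronbach's alpha for internal consistency) can be computed with short routines. The sketch below uses illustrative data, not the study's; it is a minimal implementation of the standard formulas.

```python
from collections import Counter
from statistics import pvariance

def cohen_kappa(ratings_a, ratings_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    n = len(ratings_a)
    p_observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    counts_a, counts_b = Counter(ratings_a), Counter(ratings_b)
    p_expected = sum(counts_a[k] * counts_b[k] for k in counts_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

def cronbach_alpha(items):
    """Cronbach's alpha for a scale; `items` is a list of per-item score
    lists (one list per question, aligned across respondents)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_variance_sum = sum(pvariance(item) for item in items)
    return k / (k - 1) * (1 - item_variance_sum / pvariance(totals))
```

Perfect rater agreement gives kappa = 1, chance-level agreement gives kappa near 0; alpha rises as items covary, which is why a deliberately multi-factorial instrument such as the ResCareQA can be sound yet score a modest alpha.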

Relevance: 20.00%

Abstract:

This paper presents a method for calculating the in-bucket payload volume on a dragline, for the purpose of estimating the material's bulk density in real time. Knowledge of the bulk density can provide instant feedback to mine planning and scheduling to improve blasting, and in turn provide a more uniform bulk density across the excavation site. Furthermore, costs and emissions in dragline operation, maintenance and downstream material processing can be reduced. The main challenge is to determine an accurate position and orientation of the bucket under the constraint of real-time performance. The proposed solution uses a range, bearing and tilt sensor to locate and scan the bucket between the lift and dump stages of the dragline cycle. Various scanning strategies are investigated for their benefits in this real-time application. The bucket is segmented from the scene using cluster analysis, while the pose of the bucket is calculated using the iterative closest point (ICP) algorithm. Payload points are segmented from the bucket by a fixed-distance neighbour clustering method, to preserve boundary points and to exclude low-density clusters introduced by overhead chains and the spreader bar. A height grid is then used to represent the payload, from which the volume can be calculated by summing over the grid cells. We demonstrate volume calculation on a scaled system with an accuracy greater than 95 per cent.
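The final height-grid step lends itself to a compact sketch. Assuming the payload has already been segmented into 3-D points, a height grid keeps the highest return per (x, y) cell and sums cell area times height. This is an illustrative simplification, not the authors' implementation, and the cell size and datum are hypothetical parameters.

```python
def payload_volume(points, cell=0.1, floor=0.0):
    """Estimate volume from segmented payload points (x, y, z) by binning
    them into a height grid: keep the maximum height above `floor` per
    cell, then sum cell_area * height over all occupied cells."""
    grid = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        height = z - floor
        if height > 0 and height > grid.get(key, 0.0):
            grid[key] = height
    return cell * cell * sum(grid.values())
```

Because each cell contributes independently, the summation is trivially fast, which matters for the real-time constraint the paper emphasises.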

Relevance: 20.00%

Abstract:

Franchised convenience stores successfully operate throughout Taiwan, but the convenience store market is approaching saturation point. Creating a cooperative long-term franchising relationship between franchisors and franchisees is essential to maintain the proportion of convenience stores...

Relevance: 20.00%

Abstract:

In the face of increasing concern over global warming and climate change, interest in the utilization of solar energy for building operations is rapidly growing. In this entry, the importance of using renewable energy in building operations is first introduced. This is followed by a general overview of energy from the sun and the methods of utilizing solar energy. Possible applications of solar energy in building operations are then discussed, including the use of solar energy in the forms of daylighting, hot water heating, space heating and cooling, and building-integrated photovoltaics.

Relevance: 20.00%

Abstract:

Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance. Capacity may be predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models lack an understanding of the nature of the processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes while providing for the assessment of performance, through measures of capacity and delay. However, these models are limited to only a few circumstances. The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to portray accurately the mechanisms of freeway operations at the specific locations under consideration, to be calibrated using data acquired at those locations, and to be validated with data acquired at the same sites, so that their outputs would be truly descriptive of the performance of the facility. A theoretical basis needed to underlie the form of these models, rather than the empiricism of the macroscopic models currently used. Finally, the models needed to be adaptable to variable operating conditions, so that they could be applied, where possible, to other similar systems and facilities. It was not possible to produce a stand-alone model applicable to all facilities and locations in this single study; however, the scene has been set for the application of the models to a much broader range of operating conditions.
Opportunities for further development of the models were identified, and procedures provided for their calibration and validation under a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations due to variability in road rules and driving cultures. Not all manoeuvres evident were modelled; some unusual manoeuvres were considered unwarranted to model. However, the models developed contain the principal processes of freeway operations: merging and lane changing. Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream, the kerb-lane traffic, exercises only a limited priority over the minor stream, the on-ramp traffic. Theory was established to account for this activity. Kerb-lane drivers were also found to change to the median lane where possible, to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major stream flow rate, which excludes lane changers. Cowan's M3 model was calibrated for both streams; on-ramp and total upstream flow are required as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps downstream of signalised intersections and those downstream of unsignalised intersections. Constant-departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated. Critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995).
The minimum average minor stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor and major stream delays across all minor and major stream flows. Pseudo-empirical relationships were established to predict average delays. Major stream average delays are limited to 0.5 s, insignificant compared with minor stream delays, which approach infinity at capacity. Minor stream delays were shown to be smaller when unsignalised intersections, rather than signalised intersections, are located upstream of on-ramps, and smaller still when ramp metering is installed. Smaller delays correspond to improved merge area performance. A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model. Merging probabilities can be predicted for given taper lengths, a most useful performance measure. This model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration; from these, practical capacities can be estimated. Further calibration of the traffic inputs, critical gap and minimum follow-on time is required, for both merging and lane changing. A general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models to assess performance, and to provide further insight into the nature of operations.
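The gap-acceptance quantities discussed above can be illustrated with the standard Troutbeck-style capacity formula for a minor stream merging into major-stream traffic whose headways follow Cowan's M3 model (a proportion alpha of free vehicles with shifted-exponential headways, the rest bunched at the minimum headway). The parameter values below are illustrative only, not the study's calibrated results.

```python
import math

def m3_capacity(q, alpha, tm, tc, tf):
    """Minor-stream (on-ramp) capacity in veh/s, given major-stream flow
    q (veh/s), proportion of free vehicles alpha, minimum headway tm (s),
    critical gap tc (s), and follow-on time tf (s)."""
    lam = alpha * q / (1.0 - tm * q)          # decay rate of free headways
    return (alpha * q * math.exp(-lam * (tc - tm))
            / (1.0 - math.exp(-lam * tf)))

# Illustrative: 720 veh/h major flow, 60% free vehicles, tm=1 s, tc=3 s, tf=2 s
print(3600 * m3_capacity(q=0.2, alpha=0.6, tm=1.0, tc=3.0, tf=2.0))
```

The abstract's finding that critical gaps lie between tf and tf + 1 s can be explored directly by sweeping tc over that interval and watching the capacity change.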

Relevance: 20.00%

Abstract:

Object identification and tracking have become critical for automated on-site construction safety assessment. The primary objective of this paper is to present the development of a testbed to analyze the impact of object identification and tracking errors caused by data collection devices and algorithms used for safety assessment. The testbed models workspaces for earthmoving operations and simulates safety-related violations, including speed limit violations, access violations to dangerous areas, and close proximity violations between heavy machinery. Three different cases were analyzed based on actual earthmoving operations conducted at a limestone quarry. Using the testbed, the impacts of device and algorithm errors were investigated for safety planning purposes.
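The violation types the testbed simulates can be sketched from tracked positions alone. The routine below flags speed-limit and close-proximity violations on synthetic 2-D tracks; it is only an illustration of the idea, not the testbed's code, and the track format and thresholds are hypothetical.

```python
import math

def detect_violations(tracks, speed_limit, min_sep, dt=1.0):
    """tracks: {machine_id: [(x, y), ...]} positions sampled every dt seconds.
    Returns ("speed", id, step) and ("proximity", id_a, id_b, step) tuples."""
    violations = []
    ids = list(tracks)
    steps = min(len(path) for path in tracks.values())
    for t in range(steps):
        # speed-limit check: displacement over one sampling interval
        for i in ids:
            if t > 0:
                (x0, y0), (x1, y1) = tracks[i][t - 1], tracks[i][t]
                if math.hypot(x1 - x0, y1 - y0) / dt > speed_limit:
                    violations.append(("speed", i, t))
        # close-proximity check between every pair of machines
        for a in range(len(ids)):
            for b in range(a + 1, len(ids)):
                (xa, ya), (xb, yb) = tracks[ids[a]][t], tracks[ids[b]][t]
                if math.hypot(xa - xb, ya - yb) < min_sep:
                    violations.append(("proximity", ids[a], ids[b], t))
    return violations
```

Injecting positional noise into `tracks` before calling this function mimics the device and algorithm errors whose impact the testbed was built to analyze.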

Relevance: 20.00%

Abstract:

Short overview of the VACCHO Social Determinants Research Forum (2010).

Relevance: 20.00%

Abstract:

Despite the benefits of these technologies, safety planners still face difficulties explaining errors related to the use of different technologies and evaluating how those errors affect the performance of safety decision making. This paper presents a preliminary error impact analysis testbed to model object identification and tracking errors caused by image-based devices and algorithms, and to analyze the impact of the errors on spatial safety assessment of earthmoving and surface mining activities. More specifically, this research designed a testbed to model workspaces for earthmoving operations, to simulate safety-related violations, and to apply different object identification and tracking errors to the data collected and processed for spatial safety assessment. Three different cases were analyzed based on actual earthmoving operations conducted at a limestone quarry. Using the testbed, the impacts of the errors were investigated for safety planning purposes.

Relevance: 20.00%

Abstract:

While the use of unmanned systems in combat is not new, what will be new in the foreseeable future is how such systems are used and integrated in the civilian space. The potential use of Unmanned Aerial Vehicles (UAVs) in civil and commercial applications is becoming a reality, and is receiving considerable attention from industry and the research community. The majority of UAVs performing civilian tasks are restricted to flying only in segregated airspace, not within the National Airspace. The areas that UAVs are restricted to are typically not above populated areas, which in turn are the areas most useful for civilian applications. The reasoning behind the current restrictions is mainly that current UAV technologies cannot yet demonstrate an Equivalent Level of Safety to manned aircraft, particularly in the case of an engine failure, which would require an emergency or forced landing. This chapter will present and guide the reader through a number of developments that would facilitate the integration of UAVs into the National Airspace. Algorithms for UAV sense-and-avoid and forced landings are recognised as two major enabling technologies for the integration of UAVs into civilian airspace. The following sections describe some of the techniques currently being tested at the Australian Research Centre for Aerospace Automation (ARCAA), with emphasis on the detection of candidate landing sites using computer vision, the planning of the descent path trajectory for the UAV, and the decision-making process behind the selection of the final landing site.
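To give a flavour of the landing-site selection problem, the sketch below scores windows of a terrain elevation grid by height variation, preferring flat patches. ARCAA's actual system detects candidate sites from camera imagery, so this grid-based scorer is purely a hypothetical stand-in; the window size and slope threshold are illustrative parameters.

```python
def score_landing_sites(elev, win=3, max_slope=0.05, cell=1.0):
    """elev: 2-D list of terrain heights on a regular grid of spacing `cell`.
    Each win x win window is scored by its height spread (max - min); windows
    whose spread-per-width exceeds max_slope are rejected as too steep.
    Returns ((row, col), spread) candidates, flattest first."""
    rows, cols = len(elev), len(elev[0])
    sites = []
    for r in range(rows - win + 1):
        for c in range(cols - win + 1):
            patch = [elev[r + i][c + j] for i in range(win) for j in range(win)]
            spread = max(patch) - min(patch)
            if spread / (win * cell) <= max_slope:
                sites.append(((r, c), spread))
    return sorted(sites, key=lambda site: site[1])
```

A real forced-landing planner would combine such a flatness score with the descent-path feasibility and decision-making stages the chapter goes on to describe.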