587 results for Multi-path mitigation
Abstract:
Multi-resolution modelling has become essential as modern 3D applications demand 3D objects with higher levels of detail (LODs). Multi-modal devices such as PDAs and UMPCs do not have sufficient resources to handle the original 3D objects. The increased use of collaborative applications has created many challenges for remote manipulation of 3D objects of differing quality. This paper studies how multi-resolution techniques can be improved by performing multi-edge decimation and using annotative commands. It also investigates how devices holding poorer-quality 3D objects can participate in collaborative actions.
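The decimation idea behind multi-resolution modelling can be sketched with the simplest possible collapse rule. This is a toy illustration only: the paper's multi-edge decimation and annotative commands are not reproduced here, and the shortest-edge cost metric is an assumption.

```python
# Toy sketch of edge-collapse mesh decimation (illustrative; not the
# paper's multi-edge algorithm): repeatedly collapse the shortest edge
# until the face budget of the target device is met.
import math

def decimate(vertices, faces, target_faces):
    """vertices: list of (x, y, z); faces: list of (i, j, k) index triples."""
    vertices = [list(v) for v in vertices]
    faces = [list(f) for f in faces]
    while len(faces) > target_faces:
        # Collect all edges of the current faces.
        edges = {tuple(sorted((f[a], f[(a + 1) % 3]))) for f in faces for a in range(3)}
        # Pick the shortest edge (simplest possible cost metric).
        u, v = min(edges, key=lambda e: math.dist(vertices[e[0]], vertices[e[1]]))
        # Collapse v into u: move u to the midpoint, remap every v to u.
        # (Vertex v becomes unused but is kept for index stability.)
        vertices[u] = [(a + b) / 2 for a, b in zip(vertices[u], vertices[v])]
        faces = [[u if i == v else i for i in f] for f in faces]
        # Drop degenerate faces that now reference a vertex twice.
        faces = [f for f in faces if len(set(f)) == 3]
    return vertices, faces
```

Each collapse removes at least one face (every edge belongs to some face, and collapsing it degenerates that face), so the loop terminates.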
Abstract:
Cooperative collision warning systems for road vehicles, enabled by recent advances in positioning systems and wireless communication technologies, can potentially reduce traffic accidents significantly. To improve such systems, we propose a graph model to represent interactions between multiple road vehicles in a specific region at a specific time. Given a list of vehicles in the vicinity, we can generate the interaction graph using several rules that consider vehicle properties such as position, speed, and heading. Safety applications can use the model to improve emergency warning accuracy and optimize wireless channel usage. The model also allows us to develop congestion control strategies for an efficient multi-hop broadcast protocol.
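The rule-based graph construction can be sketched as follows. The two rules shown (a proximity radius and a heading tolerance) and their thresholds are illustrative assumptions, not the paper's exact model.

```python
# Hedged sketch of an interaction graph for cooperative collision warning
# (rule names and thresholds are illustrative, not the paper's exact model):
# connect two vehicles when they are close AND either travelling in roughly
# the same direction or roughly head-on.
import math

def interaction_graph(vehicles, radius=50.0, heading_tol=45.0):
    """vehicles: dict id -> (x, y, heading_deg, speed).
    Returns adjacency as a dict id -> set of interacting vehicle ids."""
    graph = {vid: set() for vid in vehicles}
    ids = list(vehicles)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            xa, ya, ha, _ = vehicles[a]
            xb, yb, hb, _ = vehicles[b]
            if math.hypot(xb - xa, yb - ya) > radius:
                continue  # Rule 1: only nearby vehicles interact.
            # Rule 2 (illustrative): angular difference in [0, 180] degrees;
            # interact when near-parallel or near-head-on.
            diff = abs((ha - hb + 180) % 360 - 180)
            if diff <= heading_tol or diff >= 180 - heading_tol:
                graph[a].add(b)
                graph[b].add(a)
    return graph
```

A safety application would rebuild this graph each broadcast interval and suppress warnings (or relay hops) between vehicles with no edge.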
Abstract:
Background Diagnosis and treatment of cancer can contribute to psychological distress and anxiety amongst patients. Evidence indicates that information giving can be beneficial in reducing patient anxiety, so oncology-specific information may have a major impact on this patient group. This study investigates the effects of an orientation program on levels of anxiety and self-efficacy amongst newly registered cancer patients who are about to undergo chemotherapy and/or radiation therapy in the cancer care centre of a large tertiary Australian hospital. Methods The concept of interventions for orienting new cancer patients needs revisiting due to the dynamic health care system. Historically, most orientation programs at this cancer centre were conducted by one nurse. A randomised controlled trial has been designed to test the effectiveness of an orientation program with bundled interventions: a face-to-face program which includes an introduction to the hospital facilities, an introduction to the multi-disciplinary team and an overview of treatment side effects and self-care strategies. The aim is to orientate patients to the cancer centre and to meet the health care team. We hypothesize that patients who receive this orientation will experience lower levels of anxiety and distress, and a higher level of self-efficacy. Discussion An orientation program is a common health care service provided by cancer care centres for new cancer patients. Such programs aim to give information to patients at the beginning of their encounter with a cancer care centre. It is clear in the literature that interventions that aim to improve self-efficacy in patients may demonstrate potential improvement in health outcomes. Yet, evidence on the effects of orientation programs for cancer patients on self-efficacy remains scarce, particularly with respect to the use of multidisciplinary team members.
This paper presents the design of a randomised controlled trial that will evaluate the effects and feasibility of a multidisciplinary orientation program for new cancer patients.
Abstract:
The identification of attractors is one of the key tasks in studies of neurobiological coordination from a dynamical systems perspective, with a considerable body of literature resulting from this task. However, with regard to the movement models typically investigated, the overwhelming majority of actions studied previously belong to the class of continuous, rhythmical movements. In contrast, very few studies have investigated coordination of discrete movements, particularly multi-articular discrete movements. In the present study, we investigated phase transition behavior in a basketball throwing task in which participants were instructed to shoot at the basket from different distances. Adopting the ubiquitous scaling paradigm, throwing distance was manipulated as a candidate control parameter. Using a cluster analysis approach, clear phase transitions between different movement patterns were observed in the performance of only two of eight participants. The remaining participants used a single movement pattern and varied it according to throwing distance, thereby exhibiting hysteresis effects. The results suggested that, in movement models involving many biomechanical degrees of freedom in degenerate systems, greater movement variation across individuals is available for exploitation. This observation stands in contrast to the movement variation typically observed in studies using more constrained bi-manual movement models. This degenerate system behavior provides new insights and poses fresh challenges to the dynamical systems theoretical approach, requiring further research beyond conventional movement models.
Abstract:
This paper describes the current status of a program to develop an automated forced landing system for a fixed-wing Unmanned Aerial Vehicle (UAV). This automated system seeks to emulate human pilot thought processes when planning for and conducting an engine-off emergency landing. Firstly, a path planning algorithm that extends Dubins curves to 3D space is presented. This planning element is then combined with a nonlinear guidance and control logic, and simulated test results demonstrate the robustness of this approach to strong winds during a glided descent. The average path deviation errors incurred are comparable to or even better than those of manned, powered aircraft. Secondly, a study into suitable multi-criteria decision making approaches and the problems that confront the decision-maker is presented. From this study, it is believed that decision processes that utilize human expert knowledge and fuzzy logic reasoning are most suited to the problem at hand, and further investigations will be conducted to identify the particular technique(s) to be implemented in simulations and field tests. The automated UAV forced landing approach presented in this paper is promising, and will allow the progression of this technology from the development and simulation stages through to a prototype system.
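The glide geometry that constrains any engine-off 3D path can be sketched with a back-of-the-envelope check. This is illustrative only (it is not the paper's Dubins extension), and the fixed glide ratio is an assumed aircraft parameter.

```python
# Back-of-the-envelope sketch of the glide constraint behind 3D forced-landing
# path planning (illustrative; not the paper's Dubins-curve extension): a
# candidate path is flyable only if the altitude it consumes at a fixed glide
# ratio does not exceed the current altitude; any excess altitude must be
# bled off by lengthening the path (e.g. with a spiral leg).
def glide_check(altitude_m, path_length_m, glide_ratio=10.0):
    """glide_ratio: horizontal metres travelled per metre of altitude lost."""
    altitude_needed = path_length_m / glide_ratio
    if altitude_needed > altitude_m:
        return {"reachable": False, "excess_altitude_m": 0.0}
    return {"reachable": True,
            "excess_altitude_m": altitude_m - altitude_needed}
```

A planner would run a check like this over candidate landing sites to prune those outside the glide footprint before any detailed path is constructed.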
Abstract:
One of the ways in which university departments and faculties can enhance the quality of learning and assessment is to develop a ‘well thought out criterion‐referenced assessment system’ (Biggs, 2003, p. 271). In designing undergraduate degrees (courses), this entails making decisions about the levelling of expectations across different years through devising objectives and their corresponding criteria and standards: a process of alignment analogous to what happens in unit (subject) design. These decisions about levelling have important repercussions in terms of supporting students’ work‐related learning, especially in relation to their ability to cope with the increasing cognitive and skill demands made on them as they progress through their studies. They also affect the accountability of teacher judgments of students’ responses to assessment tasks, achievement of unit objectives and, ultimately, whether students are awarded their degrees and are sufficiently prepared for the world of work. Research reveals that this decision‐making process is rarely underpinned by an explicit educational rationale (Morgan et al., 2002). The decision to implement criterion referenced assessment in an undergraduate microbiology degree was the impetus for developing such a rationale because of the implications for alignment, and therefore ‘levelling’ of expectations across different years of the degree. This paper provides supporting evidence for a multi‐pronged approach to levelling, through backward mapping of two revised units (foundation and exit year). This approach adheres to the principles of alignment while combining a work‐related approach (via industry input) with the blended disciplinary and learner‐centred approaches proposed by Morgan et al. (2002). It is suggested that this multi‐pronged approach has the potential for making expectations, especially work‐related ones across different year levels of degrees, more explicit to students and future employers.
Abstract:
Classical negotiation models are weak in supporting real-world business negotiations because these models often assume that the preference information of each negotiator is made public. Although parametric learning methods have been proposed for acquiring the preference information of negotiation opponents, these methods suffer from the strong assumptions about the specific utility function and negotiation mechanism employed by the opponents. Consequently, it is difficult to apply these learning methods to the heterogeneous negotiation agents participating in e‑marketplaces. This paper illustrates the design, development, and evaluation of a nonparametric negotiation knowledge discovery method which is underpinned by the well-known Bayesian learning paradigm. According to our empirical testing, the novel knowledge discovery method can speed up the negotiation processes while maintaining negotiation effectiveness. To the best of our knowledge, this is the first nonparametric negotiation knowledge discovery method developed and evaluated in the context of multi-issue bargaining over e‑marketplaces.
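The Bayesian idea behind nonparametric opponent modelling can be sketched as a belief update over the opponent's reservation price. The discrete hypothesis set and the geometric-decay likelihood below are assumptions for illustration, not the paper's exact knowledge discovery method.

```python
# Minimal sketch of Bayesian opponent modelling in negotiation (hypothetical
# prior and likelihood; not the paper's exact nonparametric method): maintain
# a discrete belief over the opponent's reservation price and update it from
# each observed counter-offer.
def update_belief(belief, offer, concession=0.9):
    """belief: dict reservation_price -> probability.
    Assumed likelihood: offers near the opponent's true reservation price are
    most probable, decaying geometrically as offers sit further above it;
    offers below the reservation price are treated as rare."""
    posterior = {}
    for rp, prior in belief.items():
        if offer < rp:
            likelihood = 0.01
        else:
            likelihood = concession ** (offer - rp)
        posterior[rp] = prior * likelihood
    total = sum(posterior.values())
    return {rp: p / total for rp, p in posterior.items()}  # normalise
```

Because no parametric utility function or mechanism is assumed of the opponent, only a likelihood over observed offers, this is the sense in which such learning can stay nonparametric across heterogeneous agents.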
Abstract:
This article describes the development and validation of a multi-dimensional scale for measuring managers’ perceptions of the range of factors that routinely guide their decision-making processes. An instrument for identifying managerial ethical profiles (MEP) is developed by measuring the perceived role of different ethical principles in the decision-making of managers. Evidence as to the validity of the multidimensionality of the ethical scale is provided, based on the comparative assessment of different models of managerial ethical decision-making. Confirmatory Factor Analysis (CFA) supported an eight-factor model including two factors for each of the four main schools of moral philosophy. Future research needs and the value of this measure to business ethics are discussed.
Abstract:
This paper reports the application of multicriteria decision making techniques, PROMETHEE and GAIA, and receptor models, PCA/APCS and PMF, to data from an air monitoring site located on the campus of Queensland University of Technology in Brisbane, Australia and operated by the Queensland Environmental Protection Agency (QEPA). The data consisted of the concentrations of 21 chemical species and meteorological data collected between 1995 and 2003. PROMETHEE/GAIA separated the samples into those collected when leaded and unleaded petrol were used to power vehicles in the region. The number and source profiles of the factors obtained from PCA/APCS and PMF analyses were compared. There are noticeable differences in the outcomes, possibly because of the non-negativity constraints imposed in the PMF analysis. While PCA/APCS identified 6 sources, PMF reduced the data to 9 factors. Each factor had a distinctive composition suggesting that motor vehicle emissions, controlled burning of forests, secondary sulphate, sea salt and road dust/soil were the most important sources of fine particulate matter at the site. The most plausible locations of the sources were identified by combining the results obtained from the receptor models with meteorological data. The study demonstrated the potential benefits of combining results from multi-criteria decision making analysis with those from receptor models to gain insights that could enhance the development of air pollution control measures.
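The PCA step of a PCA/APCS-style receptor analysis can be sketched in a few lines. The data here are synthetic (not the QEPA dataset), and the full APCS source-apportionment workflow is not reproduced.

```python
# Sketch of the PCA step of a PCA/APCS-style receptor analysis (synthetic
# data; not the QEPA dataset or the paper's full workflow): standardise the
# samples x species concentration matrix and extract principal components
# via SVD. Assumes every species column actually varies (non-zero std).
import numpy as np

def pca(concentrations, n_components=2):
    """Returns (scores, loadings, explained_variance_ratio)."""
    X = np.asarray(concentrations, dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)       # standardise each species
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    var = s ** 2 / np.sum(s ** 2)                  # variance explained
    scores = U[:, :n_components] * s[:n_components]
    loadings = Vt[:n_components].T                 # species x components
    return scores, loadings, var[:n_components]
```

In receptor modelling the retained loadings are then interpreted as candidate source profiles (e.g. a factor loading heavily on Pb and Br would suggest leaded-petrol vehicle emissions).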
Abstract:
Purpose: The purpose of this paper is to analyse the risk management process conducted by some private and not-for-profit affordable housing providers in South East Queensland, and draw conclusions about the relationship between risk assessments/responses and past experiences.----- Design/methodology/approach: In-depth interviews of selected non-government housing providers have been conducted to facilitate an understanding of their approach to risk assessment in developing and in managing affordable housing projects. Qualitative data are analysed using thematic analysis to find emerging themes suggested by interview participants.----- Findings: The paper finds that an informal risk management process is used as part of normal business process in accordance with industry standards. Many interviewees agree that the recognition of financial risk and the fear of community rejection of such housing projects have restrained them from committing to such investment projects. The levels of acceptance of risk are not always consistent across housing providers, which creates opportunities for multi-stakeholder partnerships to reduce overall risk.----- Research limitations/implications: The paper has implications for developers or investors who seek to include affordable housing as part of their portfolio. However, data collected in the study are a cross-section of interviews that do not capture the impact of recent tax incentives offered by the Australian Commonwealth Government.----- Practical implications: The study suggests that implementing improvements to the risk mitigation and management framework may assist in promoting the supply of affordable housing by non-government providers.----- Originality/value: The focus of the study is the interaction between partnerships and risk management in development and management of affordable rental housing.
Abstract:
Tzeng et al. proposed a new threshold multi-proxy multi-signature scheme with threshold verification. In their scheme, a subset of original signers authenticates a designated proxy group to sign on behalf of the original group. A message m has to be signed by a subset of proxy signers who can represent the proxy group. The proxy signature is then sent to the verifier group, in which a subset of verifiers can likewise represent the group to authenticate the proxy signature. Two improved schemes were subsequently proposed to eliminate the security leaks of Tzeng et al.’s scheme. In this paper, we point out security leakages in all three schemes and propose a novel threshold multi-proxy multi-signature scheme with threshold verification.
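Threshold schemes of this kind are typically built on a (t, n) secret-sharing primitive, where any t participants can act for the group. The minimal Shamir sharing sketch below illustrates only that underlying primitive; it is not the scheme proposed in the paper, and the modulus choice is an assumption for the example.

```python
# Minimal (t, n) Shamir secret sharing over a prime field: the basic
# threshold primitive underlying multi-proxy / threshold-verification
# schemes (illustrative only; not the paper's scheme).
import random

P = 2 ** 127 - 1  # a Mersenne prime, used here as the field modulus

def share(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):  # degree t-1 polynomial with f(0) == secret
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 from any t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret
```

In a threshold multi-signature setting, a signing (or verification) key is distributed this way so that no subset smaller than t can represent the proxy or verifier group.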
Abstract:
Association rule mining is a technique widely used when querying databases, especially transactional ones, in order to obtain useful associations or correlations among sets of items. Much work has been done focusing on efficiency, effectiveness and redundancy. There has also been a focus on the quality of rules from single-level datasets, with many interestingness measures proposed. However, with multi-level datasets now being common, there is a lack of interestingness measures developed for multi-level and cross-level rules. Single-level measures do not take into account the hierarchy found in a multi-level dataset. This leaves the Support-Confidence approach, which does not consider the hierarchy anyway and has other drawbacks, as one of the few measures available. In this paper we propose two approaches to measuring multi-level association rules to help evaluate their interestingness. These measures of diversity and peculiarity can be used to help identify those rules from multi-level datasets that are potentially useful.
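A hierarchy-aware measure of the kind described can be sketched as follows. The encoding of items as hierarchy codes and the specific distance-based definition are hypothetical illustrations; the paper's actual diversity and peculiarity definitions may differ.

```python
# Hypothetical sketch of a level-based diversity measure for a multi-level
# association rule (the paper's actual definitions may differ): items are
# concept-hierarchy codes such as "1-2-3" (level by level, root first), and
# diversity grows with how far apart the rule's items sit in the hierarchy.
from itertools import combinations

def item_distance(a, b):
    """Steps from each item up to their deepest common ancestor, summed."""
    pa, pb = a.split("-"), b.split("-")
    common = 0
    for x, y in zip(pa, pb):
        if x != y:
            break
        common += 1
    return (len(pa) - common) + (len(pb) - common)

def diversity(rule_items):
    """Average pairwise hierarchy distance over the items of a rule."""
    pairs = list(combinations(rule_items, 2))
    if not pairs:
        return 0.0
    return sum(item_distance(a, b) for a, b in pairs) / len(pairs)
```

Under this sketch, a rule relating siblings under one parent (e.g. "1-1-1" and "1-1-2") scores lower than a cross-branch rule (e.g. "1-1-1" and "2-1-1"), which is exactly the hierarchy information a plain support-confidence measure ignores.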
Abstract:
Association rule mining has made many advances in the area of knowledge discovery. However, the quality of the discovered association rules is a big concern and has drawn increasing attention recently. One problem with the quality of the discovered association rules is the huge size of the extracted rule set. Often, a huge number of rules can be extracted from a dataset, but many of them are redundant to other rules and thus useless in practice. Mining non-redundant rules is a promising approach to solving this problem. In this paper, we first propose a definition of redundancy; we then propose a concise representation, called the Reliable basis, for representing non-redundant association rules, covering both exact rules and approximate rules. An important contribution of this paper is that we propose to use the certainty factor as the criterion for measuring the strength of the discovered association rules. With this criterion, we can determine the boundary between redundancy and non-redundancy, ensuring that as many redundant rules as possible are eliminated without reducing the inference capacity of, or the belief in, the remaining extracted non-redundant rules. We prove that redundancy elimination based on the proposed Reliable basis does not reduce the belief in the extracted rules. We also prove that all association rules can be deduced from the Reliable basis; the Reliable basis is therefore a lossless representation of association rules. Experimental results show that the proposed Reliable basis can significantly reduce the number of extracted rules.
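The certainty factor of a rule can be computed directly from supports. The sketch below uses the standard Shortliffe-style definition; the paper's exact formulation may vary in detail.

```python
# Sketch of the certainty factor of a rule A -> B computed from relative
# supports (standard Shortliffe-style definition; the paper's exact
# formulation may differ in detail). Assumes supp_a > 0 and 0 < supp_b < 1.
def certainty_factor(supp_ab, supp_a, supp_b):
    """All arguments are relative supports in [0, 1]."""
    conf = supp_ab / supp_a              # confidence of A -> B
    if conf > supp_b:
        return (conf - supp_b) / (1 - supp_b)   # positive evidence for B
    if conf < supp_b:
        return (conf - supp_b) / supp_b         # evidence against B
    return 0.0                                   # A tells us nothing about B
```

Unlike confidence alone, this value is positive only when A genuinely raises the probability of B above its base rate, which is what makes it usable as a strength threshold when pruning redundant rules.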
Abstract:
Mobile robots are widely used in many industrial fields, and path planning is one of the most important topics in mobile robot research. Path planning for a mobile robot involves finding a collision-free route, through the robot’s environment with obstacles, from a specified start location to a desired goal destination while satisfying certain optimization criteria. Most existing path planning methods, such as the visibility graph, cell decomposition, and potential field methods, are designed with a focus on static environments, in which there are only stationary obstacles. However, in practical systems such as marine science research, mining robotics, and RoboCup games, robots usually face dynamic environments, in which both moving and stationary obstacles exist. Because of the complexity of dynamic environments, research on path planning in environments with dynamic obstacles is limited: only a small number of papers have been published in this area, in comparison with hundreds of reports on path planning in stationary environments in the open literature. Recently, a genetic algorithm based approach was introduced to plan the optimal path for a mobile robot in a dynamic environment with moving obstacles. However, as the number of obstacles in the environment increases, and as the moving speed and direction of the robot and obstacles change, the size of the problem to be solved increases sharply, and the performance of the genetic algorithm based approach deteriorates significantly. This motivates the present work, which develops and implements a simulated annealing algorithm based approach to find the optimal path for a mobile robot in a dynamic environment with moving obstacles. The simulated annealing algorithm is an optimization algorithm similar in principle to the genetic algorithm.
However, our investigation and simulations have indicated that the simulated annealing algorithm based approach is simpler and easier to implement. Its performance is also shown to be superior to that of the genetic algorithm based approach, in both online and offline processing times as well as in obtaining the optimal solution for path planning of the robot in a dynamic environment. The first step of many path planning methods is to search for an initial feasible path for the robot. A commonly used method for finding the initial path is to randomly pick some vertices of the obstacles in the search space. This is time consuming in both static and dynamic path planning, and has an important impact on the efficiency of dynamic path planning. This research proposes a heuristic method to search for the feasible initial path efficiently. The heuristic method is then incorporated into the proposed simulated annealing algorithm based approach for dynamic robot path planning. Simulation experiments have shown that, with the incorporation of the heuristic method, the developed simulated annealing algorithm based approach requires much shorter processing times to obtain optimal solutions to the dynamic path planning problem. Furthermore, the quality of the solution, as characterized by the length of the planned path, is also improved with the incorporated heuristic method, for both online and offline path planning.
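The simulated annealing approach can be sketched on a toy static instance. This is a minimal illustration under stated assumptions (fixed waypoint count, circular obstacles, a penalty cost, geometric cooling); the thesis's formulation, neighbourhood moves and moving obstacles are richer than this.

```python
# Toy sketch of simulated-annealing path planning in the plane (illustrative;
# the thesis's dynamic-obstacle formulation is richer): minimise path length
# plus a large penalty for waypoints inside circular obstacles.
import math
import random

def path_cost(path, obstacles, penalty=1000.0):
    length = sum(math.dist(path[i], path[i + 1]) for i in range(len(path) - 1))
    for (ox, oy, r) in obstacles:
        for (x, y) in path[1:-1]:
            if math.hypot(x - ox, y - oy) < r:
                length += penalty          # waypoint inside an obstacle
    return length

def anneal(start, goal, obstacles, n_waypoints=3, iters=2000, seed=0):
    rng = random.Random(seed)
    # Initial guess: waypoints on the straight line from start to goal.
    path = [start] + [
        (start[0] + (goal[0] - start[0]) * k / (n_waypoints + 1),
         start[1] + (goal[1] - start[1]) * k / (n_waypoints + 1))
        for k in range(1, n_waypoints + 1)] + [goal]
    cost = path_cost(path, obstacles)
    temp = 10.0
    for _ in range(iters):
        cand = list(path)
        i = rng.randrange(1, len(path) - 1)        # never move start/goal
        x, y = cand[i]
        cand[i] = (x + rng.uniform(-1, 1), y + rng.uniform(-1, 1))
        c = path_cost(cand, obstacles)
        # Metropolis acceptance: always take improvements, sometimes worse.
        if c < cost or rng.random() < math.exp((cost - c) / temp):
            path, cost = cand, c
        temp *= 0.999                               # geometric cooling
    return path, cost
```

A heuristic initial path (as proposed in the thesis) would replace the straight-line guess above, so the annealer starts from a feasible path instead of one that may cross obstacles.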
Abstract:
The study reported here constitutes a full review of the major geological events that have influenced the morphological development of the southeast Queensland region. Most importantly, it provides evidence that the region’s physiography continues to be geologically ‘active’ and, although earthquakes are presently few and of low magnitude, many past events and tectonic regimes continue to be strongly influential over drainage, morphology and topography. Southeast Queensland is typified by highland terrain of metasedimentary and igneous rocks that are parallel and close to younger, lowland coastal terrain. The region is currently situated in a passive margin tectonic setting that is now under compressive stress, although in the past, the region was subject to alternating extensional and compressive regimes. As part of the investigation, the effects of many past geological events upon landscape morphology have been assessed at multiple scales using features such as the location and orientation of drainage channels, topography, faults, fractures, scarps, cleavage, volcanic centres and deposits, and recent earthquake activity. A number of hypotheses for local geological evolution are proposed and discussed. This study has also utilised a geographic information system (GIS) approach that successfully amalgamates the various types and scales of datasets used. A new method of stream ordination has been developed and is used to compare the orientation of channels of similar orders with rock fabric, in a topologically controlled approach that other ordering systems are unable to achieve. Stream pattern analysis has been performed and the results provide evidence that many drainage systems in southeast Queensland are controlled by known geological structures and by past geological events.
The results show that drainage at a fine scale is controlled by cleavage, joints and faults, and at a broader scale, large river valleys, such as those of the Brisbane River and North Pine River, closely follow the location of faults. These rivers appear to have become entrenched by differential weathering along these planes of weakness. Significantly, stream pattern analysis has also identified some ‘anomalous’ drainage that suggests the orientations of these watercourses are geologically controlled, but by unknown causes. To the north of Brisbane, a ‘coastal drainage divide’ has been recognized and is described here. The divide crosses several lithological units of different age, continues parallel to the coast and prevents drainage from the highlands flowing directly to the coast for its entire length. Diversion of low-order streams away from the divide may be evidence that a more recent process is the driving force. Although there is no conclusive evidence for this at present, it is postulated that the divide may have been generated by uplift or doming associated with mid-Cenozoic volcanism or a blind thrust at depth. Also north of Brisbane, on the D’Aguilar Range, an elevated valley (the ‘Kilcoy Gap’) has been identified that may have once drained towards the coast and now displays reversed drainage that may have resulted from uplift along the coastal drainage divide and of the D’Aguilar blocks. An assessment of the distribution and intensity of recent earthquakes in the region indicates that activity may be associated with ancient faults. However, recent movement on these faults during these events would have been unlikely, given that earthquakes in the region are characteristically of low magnitude. There is, however, evidence that compressive stress is building and being released periodically, and ancient faults may be a likely place for this stress to be released.
The relationship between ancient fault systems and the Tweed Shield Volcano is also discussed, and it is suggested that the volcanic activity was associated with renewed faulting on the Great Moreton Fault System during the Cenozoic. The geomorphology and drainage patterns of southeast Queensland have been compared with expected morphological characteristics found at passive and other tectonic settings, both in Australia and globally. Of note are the comparisons with the East Brazilian Highlands, the Gulf of Mexico and the Blue Ridge Escarpment. In conclusion, the results of the study clearly show that, although the region is described as a passive margin, its complex past geological history and present compressive stress regime provide a more intricate and varied landscape than would be expected along typical passive continental margins. The literature review provides background to the subject and discusses previous work and methods, whilst the findings are presented in three peer-reviewed, published papers. The methods, hypotheses, suggestions and evidence are discussed at length in the final chapter.