Abstract:
We advocate for the use of predictive techniques in interactive computer music systems. We suggest that the inclusion of prediction can assist in the design of proactive rather than reactive computational performance partners. We summarize the significant role prediction plays in human musical decisions, and the comparatively modest use of prediction in interactive music systems to date. After describing how we are working toward employing predictive processes in our own metacreation software, we reflect on future extensions to these approaches.
Abstract:
The Beyond Compliance project, which began in July 2011 with funding from the Standards and Trade Development Facility for 2 years, aims to enhance competency and confidence in the South East Asian sub-region by applying a Systems Approach for pest risk management. The Systems Approach involves the use of integrated measures, at least two of which are independent, that cumulatively reduce the risk of introducing exotic pests through trade. Although useful in circumstances where single measures are inappropriate or unavailable, the Systems Approach is inherently more complicated than single-measure approaches, which may inhibit its uptake. The project methodology is to take prototype decision-support tools, such as Control Point-Bayesian Networks (CP-BN), developed in recent plant health initiatives in other regions, including the European PRATIQUE project, and to refine them within this sub-regional context. Case studies of high-priority potential agricultural trade will be conducted by National Plant Protection Organizations of participating South East Asian countries in trials of the tools, before further modifications. Longer term outcomes may include: more robust pest risk management in the region (for exports and imports); greater inclusion of stakeholders in development of pest risk management plans; increased confidence in trade negotiations; and new opportunities for trade.
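The cumulative effect of independent measures described above can be illustrated numerically. This is a minimal sketch, not part of the project: it assumes the independent measures fail independently, so their failure probabilities multiply; the measure names and rates are hypothetical.

```python
# Illustrative sketch of cumulative risk reduction under a Systems Approach,
# assuming independent control measures whose failure probabilities multiply.
# The baseline risk and per-measure failure rates below are invented.

def residual_risk(baseline_risk, measure_failure_rates):
    """Probability that an exotic pest slips through every measure."""
    risk = baseline_risk
    for p_fail in measure_failure_rates:
        risk *= p_fail  # each independent measure lets pests through at rate p_fail
    return risk

# One measure (e.g. orchard inspection) vs. two independent measures
# (inspection plus a post-harvest treatment).
single = residual_risk(0.10, [0.20])
combined = residual_risk(0.10, [0.20, 0.05])
```

Adding a second independent measure cuts the residual risk from 2% to 0.1% in this toy example, which is the quantitative intuition behind combining measures when no single measure suffices.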
Abstract:
During the last three decades, restorative justice has emerged in numerous localities around the world as an accepted approach to responding to crime. This article, which stems from a doctoral study on the history of restorative justice, provides a critical analysis of accepted histories of restorative practices. It revisits the celebrated historical texts of the restorative justice movement, and re-evaluates their contribution to the emergence of restorative justice measures. It traces the emergence of the term 'restorative justice', and reveals that it emerged in much earlier writings than is commonly thought to be the case by scholars in the restorative justice field. It also briefly considers some 'power struggles' in relation to producing an accepted version of the history of restorative justice, and scholars' attempts to 'rewrite history' to align with current views on restorative justice. Finally, this article argues that some histories of restorative justice selectively and inaccurately portray key figures from the history of criminology as restorative justice supporters. This, it is argued, gives restorative justice a false lineage and operates to legitimise the widespread adoption of restorative justice around the globe.
Abstract:
This paper presents a new framework for distributed intrusion detection based on taint marking. Our system tracks information flows between applications on multiple hosts gathered in groups (i.e., sets of hosts sharing the same distributed information flow policy) by attaching taint labels to system objects such as files, sockets, Inter Process Communication (IPC) abstractions, and memory mappings. Labels are carried over the network by tainting network packets. A distributed information flow policy is defined for each group at the host level by labeling information and defining how users and applications can legally access, alter, or transfer information towards other trusted or untrusted hosts. As opposed to existing approaches, where information is most often represented by two security levels (low/high, public/private, etc.), our model identifies each piece of information within a distributed system and defines their legal interactions in a fine-grained manner. Hosts store and exchange security labels in a peer-to-peer fashion, with no central monitor. Our IDS is implemented in the Linux kernel as a Linux Security Module (LSM) and runs standard software on commodity hardware with no modification required. The only trusted code is our modified operating system kernel. Finally, we present a scenario of intrusion in a web service running on multiple hosts, and show how our distributed IDS is able to report security violations at each host.
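The core mechanism — labels attached to objects and propagated along information flows, then checked against a policy — can be sketched in a few lines. This is a hedged toy model, not the authors' LSM implementation; the object names, label set, and clearance check are invented for illustration.

```python
# Toy sketch of taint-label propagation: objects carry label sets, a flow
# from source to destination transfers labels, and a policy check decides
# whether a host cleared for a given label set may receive an object.

class TaintedObject:
    def __init__(self, name, labels=None):
        self.name = name
        self.labels = set(labels or [])

def flow(src, dst):
    """Information flows src -> dst: dst inherits src's taint labels."""
    dst.labels |= src.labels

def allowed(obj, host_clearance):
    """A host may receive obj only if it is cleared for every label obj carries."""
    return obj.labels <= host_clearance

# Reading a labelled file through a socket taints the socket, so the policy
# check can later block the socket's contents from leaving towards an
# untrusted host.
secret = TaintedObject("/etc/payroll.db", {"payroll"})
sock = TaintedObject("socket:1234")
flow(secret, sock)
```

In the real system these labels live on kernel objects and ride inside network packets, but the propagation rule is the same set-union idea.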
Abstract:
1. Expert knowledge continues to gain recognition as a valuable source of information in a wide range of research applications. Despite recent advances in defining expert knowledge, comparatively little attention has been given to how to view expertise as a system of interacting contributory factors, and thereby, to quantify an individual’s expertise. 2. We present a systems approach to describing expertise that accounts for many contributing factors and their interrelationships, and allows quantification of an individual’s expertise. A Bayesian network (BN) was chosen for this purpose. For the purpose of illustration, we focused on taxonomic expertise. The model structure was developed in consultation with professional taxonomists. The relative importance of the factors within the network was determined by a second set of senior taxonomists. This second set of experts (i.e. supra-experts) also provided validation of the model structure. Model performance was then assessed by applying the model to hypothetical career states in the discipline of taxonomy. Hypothetical career states were used to incorporate the greatest possible differences in career states and provide an opportunity to test the model against known inputs. 3. The resulting BN model consisted of 18 primary nodes feeding through one to three higher-order nodes before converging on the target node (Taxonomic Expert). There was strong consistency among node weights provided by the supra-experts for some nodes, but not others. The higher-order nodes, “Quality of work” and “Total productivity”, had the greatest weights. Sensitivity analysis indicated that although some factors had stronger influence in the outer nodes of the network, there was relatively equal influence of the factors leading directly into the target node.
Despite differences in the node weights provided by our supra-experts, there was remarkably good agreement among assessments of our hypothetical experts that accurately reflected differences we had built into them. 4. This systems approach provides a novel way of assessing the overall level of expertise of individuals, accounting for multiple contributory factors, and their interactions. Our approach is adaptable to other situations where it is desirable to understand components of expertise.
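The converging node structure — primary factor scores feeding weighted higher-order nodes that in turn feed the target node — can be illustrated with a small numeric sketch. The weights and scores below are hypothetical, not the values elicited from the supra-experts, and the weighted-average aggregation is a simplification of full BN inference.

```python
# Toy sketch of the converging node structure: child scores in [0, 1] are
# combined by weighted average into higher-order nodes ("Quality of work",
# "Total productivity"), which then feed the target "Taxonomic Expert" node.

def node_score(child_scores, weights):
    """Weighted average of child node scores."""
    total = sum(weights)
    return sum(s * w for s, w in zip(child_scores, weights)) / total

quality = node_score([0.9, 0.7], [2.0, 1.0])       # e.g. accuracy, peer recognition
productivity = node_score([0.6, 0.8], [1.0, 1.0])  # e.g. papers, specimens handled
expert = node_score([quality, productivity], [3.0, 2.0])
```

Changing the supra-expert weights shifts the target score, which is exactly the sensitivity the study probed when comparing hypothetical career states.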
Abstract:
IEEE 802.11p is the new standard for Inter-Vehicular Communications (IVC) using the 5.9 GHz frequency band, as part of the DSRC framework; it will enable applications based on Cooperative Systems. Simulation is widely used to estimate or verify the potential benefits of such cooperative applications, notably in terms of safety for the drivers. We have developed a performance model for 802.11p that can be used by simulations of cooperative applications (e.g. collision avoidance) without requiring intricate models of the whole IVC stack. Instead, it provides a straightforward yet realistic model of IVC performance. Our model uses data from extensive field trials to infer the correlation between speed, distance, and performance metrics such as maximum range, latency, and frame loss. We then improve this model to limit the number of profiles that have to be generated when there are more than a few emitter–receiver pairs in a given location. Our model generates realistic performance for rural or suburban environments among small groups of IVC-equipped vehicles and road side units.
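A profile-based performance model of this kind boils down to a lookup: given the geometry of two communicating vehicles, interpolate a metric such as frame loss from empirically derived profiles. The sketch below is a hedged stand-in, not the authors' model; the distance-to-loss table is invented, standing in for field-trial data for one speed class.

```python
# Sketch of a profile-based IVC performance lookup: frame-loss probability
# as a piecewise-linear interpolation over (distance, loss) sample points.
import bisect

# distance (m) -> frame-loss probability, for one hypothetical speed class
PROFILE = [(0, 0.01), (100, 0.05), (300, 0.20), (500, 0.60)]

def frame_loss(distance):
    """Interpolate frame loss at a given emitter-receiver distance."""
    xs = [d for d, _ in PROFILE]
    if distance <= xs[0]:
        return PROFILE[0][1]
    if distance >= xs[-1]:
        return PROFILE[-1][1]
    i = bisect.bisect_right(xs, distance)
    (d0, p0), (d1, p1) = PROFILE[i - 1], PROFILE[i]
    t = (distance - d0) / (d1 - d0)
    return p0 + t * (p1 - p0)
```

A cooperative-application simulator can call such a lookup per transmission instead of simulating the whole PHY/MAC stack, which is the efficiency argument the abstract makes.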
Abstract:
Digital Human Models (DHM) have been used for over 25 years. They have evolved from simple drawing templates, which are nowadays still used in architecture, to complex and Computer Aided Engineering (CAE) integrated design and analysis tools for various ergonomic tasks. DHM are most frequently used for applications in product design and production planning, with many successful implementations documented. DHM from other domains, such as computer user interfaces, artificial intelligence, training and education, or the entertainment industry, show that there is also an ongoing development towards a comprehensive understanding and holistic modeling of human behavior. While the development of DHM for the game sector has seen significant progress in recent years, advances of DHM in the area of ergonomics have been comparatively modest. As a consequence, we need to question whether current DHM systems are fit for the design of future mobile work systems. So far it appears that DHM in ergonomics are rather limited to some traditional applications. According to Dul et al. (2012), future characteristics of Human Factors and Ergonomics (HFE) can be assigned to six main trends: (1) global change of work systems, (2) cultural diversity, (3) ageing, (4) information and communication technology (ICT), (5) enhanced competitiveness and the need for innovation, and (6) sustainability and corporate social responsibility. Based on a literature review, we systematically investigate the capabilities of current ergonomic DHM systems against these ‘Future of Ergonomics’ requirements. It is found that DHM already provide broad functionality in support of trends (1) and (2), and more limited options in regards to trend (3). Today’s DHM provide access to a broad range of national and international databases for correct differentiation and characterization of anthropometry for global populations. Some DHM explicitly address social and cultural modeling of groups of people.
In comparison, the trends of growing importance of ICT (4), the need for innovation (5), and sustainability (6) are addressed primarily from a hardware-oriented and engineering perspective and are not reflected in DHM. This reflects a persistent separation between hardware design (engineering) and software design (information technology) in the view of DHM — a disconnection which urgently needs to be overcome in the era of software-defined user interfaces and mobile devices. The design of a mobile ICT device is discussed to exemplify the need for a comprehensive future DHM solution. Designing such mobile devices requires an approach that includes organizational aspects as well as technical and cognitive ergonomics. Multiple interrelationships between the different aspects result in a challenging setting for future DHM. In conclusion, the ‘Future of Ergonomics’ poses particular challenges for DHM in regards to the design of mobile work systems and, moreover, mobile information access.
Abstract:
Background The incidence of malignant mesothelioma is increasing. There is the perception that survival is worse in the UK than in other countries. However, it is important to compare survival in different series based on accurate prognostic data. The European Organisation for Research and Treatment of Cancer (EORTC) and the Cancer and Leukaemia Group B (CALGB) have recently published prognostic scoring systems. We have assessed the prognostic variables, validated the EORTC and CALGB prognostic groups, and evaluated survival in a series of 142 patients. Methods Case notes of 142 consecutive patients presenting in Leicester since 1988 were reviewed. Univariate analysis of prognostic variables was performed using a Cox proportional hazards regression model. Statistically significant variables were analysed further in a forward, stepwise multivariate model. EORTC and CALGB prognostic groups were derived, Kaplan-Meier survival curves plotted, and survival rates were calculated from life tables. Results Significant poor prognostic factors in univariate analysis included male sex, older age, weight loss, chest pain, poor performance status, low haemoglobin, leukocytosis, thrombocytosis, and non-epithelial cell type (p<0.05). The prognostic significance of cell type, haemoglobin, white cell count, performance status, and sex were retained in the multivariate model. Overall median survival was 5.9 (range 0-34.3) months. One and two year survival rates were 21.3% (95% CI 13.9 to 28.7) and 3.5% (0 to 8.5), respectively. Median, one, and two year survival data within prognostic groups in Leicester were equivalent to the EORTC and CALGB series. Survival curves were successfully stratified by the prognostic groups. Conclusions This study validates the EORTC and CALGB prognostic scoring systems which should be used both in the assessment of survival data of series in different countries and in the stratification of patients into randomised clinical studies.
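The survival curves reported above are Kaplan-Meier (product-limit) estimates. As a hedged illustration of that estimator — in pure Python, with made-up follow-up times rather than the Leicester data — the curve can be computed like this:

```python
# Pure-Python sketch of the Kaplan-Meier product-limit estimator.
# times: follow-up in months; events: 1 = death observed, 0 = censored.
# Censored subjects leave the risk set without contributing an event.

def kaplan_meier(times, events):
    """Return [(time, survival probability)] at each observed event time."""
    at_risk = len(times)
    surv, curve = 1.0, []
    for t, e in sorted(zip(times, events)):
        if e:  # an event (death) occurred at time t
            surv *= (at_risk - 1) / at_risk
            curve.append((t, surv))
        at_risk -= 1  # subject leaves the risk set (event or censoring)
    return curve

# Five hypothetical patients: deaths at 2, 5, 5, 8 months; one censored at 3.
curve = kaplan_meier([2, 3, 5, 5, 8], [1, 0, 1, 1, 1])
```

Stratifying such curves by EORTC/CALGB prognostic group, as the study does, simply means computing one curve per group and comparing them.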
Abstract:
The capacity of current and future high-data-rate wireless communications depends significantly on how well changes in the wireless channel are predicted and tracked. Generally, the channel can be estimated by transmitting known symbols; however, this increases overhead if the channel varies over time. Given today’s bandwidth demands and the growing reliance on mobile wireless devices, the contributions of this research are significant. This study has developed a novel and efficient channel-tracking algorithm that recursively updates the channel estimate for wireless broadband communications, reducing overhead and therefore increasing the speed of wireless communication systems.
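The thesis's specific algorithm is not given in the abstract, so as a hedged stand-in the sketch below uses the classic LMS (least mean squares) recursion to show what "recursively updating a channel estimate from known symbols" looks like in the simplest single-tap case. All symbol values are invented, and the observations are noiseless for clarity.

```python
# Stand-in sketch: LMS recursion tracking a single-tap channel gain h from
# known pilot symbols x and observations y = h * x. Not the thesis algorithm.

def lms_track(pilots, observations, mu=0.1):
    """Recursively refine the channel estimate h_hat, one symbol at a time."""
    h_hat = 0.0
    for x, y in zip(pilots, observations):
        err = y - h_hat * x   # prediction error for this symbol
        h_hat += mu * err * x # gradient-style correction
    return h_hat

true_h = 0.7
pilots = [1.0, -1.0] * 50            # known BPSK-like training symbols
obs = [true_h * x for x in pilots]   # noiseless channel output
h_hat = lms_track(pilots, obs)
```

The overhead argument in the abstract is that a recursive tracker like this can keep following a slowly varying channel from sparse pilots, instead of retransmitting a full training sequence every time the channel changes.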
Abstract:
With the implementation of the Personally Controlled eHealth Records system (PCEHR) in Australia, shared Electronic Health Records (EHR) are now a reality. However, because the PCEHR puts the consumer (i.e. the patient) in control of managing his or her health information, healthcare professionals (HCPs) cannot trust that a complete record of the consumer's health history is available to them through it, and so cannot use it as a one-stop shop for information in point-of-care decision making. As a result, whilst reaching a major milestone in Australia's eHealth journey, the PCEHR does not reap the full benefits that such a shared EHR system can offer.
Abstract:
This thesis presents an analysis of the resource allocation problem in Orthogonal Frequency Division Multiplexing (OFDM) based multi-hop wireless communication systems. The study analysed the tractability of the problem and designed several heuristic, fairness-aware resource allocation algorithms. These algorithms are fast and efficient, and can therefore significantly improve power management in wireless systems.
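As a hedged illustration of what a fast, fairness-aware heuristic for OFDM resource allocation can look like — not the thesis's actual algorithms — the sketch below assigns each subcarrier greedily, biasing the choice towards users who have accumulated the least rate so far (a proportional-fair-style metric). The channel gains are invented.

```python
# Greedy, fairness-aware subcarrier allocation sketch: each subcarrier goes
# to the user maximising channel gain divided by rate already received.

def allocate(gains, n_users):
    """gains[s][u]: channel gain of user u on subcarrier s.
    Returns (subcarrier -> user assignment, accumulated per-user rates)."""
    rates = [0.0] * n_users
    assignment = []
    for sub_gains in gains:
        # proportional-fair style metric favours under-served users
        user = max(range(n_users),
                   key=lambda u: sub_gains[u] / (1.0 + rates[u]))
        assignment.append(user)
        rates[user] += sub_gains[user]
    return assignment, rates

# Three subcarriers, two users: user 0 has the better channel on the first
# two, but fairness steers the third subcarrier to user 1.
gains = [[0.9, 0.5], [0.8, 0.4], [0.3, 0.6]]
assignment, rates = allocate(gains, 2)
```

Heuristics of this shape run in time linear in the number of subcarriers times users, which is why they are attractive when the exact allocation problem is intractable.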