962 results for Open network
Abstract:
This paper presents the design of μAV, a palm-sized open-source micro quadrotor constructed on a single printed circuit board. The aim of the micro quadrotor is to provide a lightweight (approximately 86 g) and inexpensive robotic research platform that can be used for a range of robotic applications; one possible application is a low-cost test bed for robotic swarm research. The goal of this paper is to give an overview of the design and capabilities of the micro quadrotor. The platform is equipped with a 9-degree-of-freedom Inertial Measurement Unit and a Gumstix Overo® Computer-On-Module that can run the widely used Robot Operating System (ROS) for use with other research algorithms.
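Because the onboard computer runs ROS, consuming the quadrotor's sensor data from a research algorithm reduces to subscribing to a topic. The sketch below is a minimal illustration; the topic name /imu/data is an assumption, not a detail from the paper.

```python
#!/usr/bin/env python
# Minimal ROS node that logs orientation data from an IMU topic.
# Assumes the platform publishes sensor_msgs/Imu on /imu/data (hypothetical name).
import rospy
from sensor_msgs.msg import Imu

def imu_callback(msg):
    q = msg.orientation
    rospy.loginfo("orientation quaternion: [%.3f, %.3f, %.3f, %.3f]",
                  q.x, q.y, q.z, q.w)

if __name__ == "__main__":
    rospy.init_node("imu_listener")
    rospy.Subscriber("/imu/data", Imu, imu_callback)
    rospy.spin()  # process callbacks until the node is shut down
```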
Abstract:
This study numerically investigates thermomagnetic convection of paramagnetic fluids placed in a micro-gravity condition (g ≈ 0) under a uniform vertical gradient magnetic field, in an open-ended square enclosure with a ramp heating temperature condition applied on a vertical wall. In the presence of a strong magnetic gradient field, thermal convection of the paramagnetic fluid may take place even in a zero-gravity environment as a direct consequence of temperature differences occurring within the fluid. The thermal boundary layer develops adjacent to the hot wall as soon as the ramp temperature condition is applied to it. Two scenarios can be observed, depending on the ramp heating time: the thermal boundary layer reaches steady state either before or after the ramp time is finished. If the ramp time is larger than the quasi-steady time, the thermal boundary layer is in a quasi-steady mode after the quasi-steady time, with convection balancing conduction; further increases in heat input simply accelerate the flow to maintain the proper thermal balance, and the boundary layer becomes fully steady once the ramp time is finished. Effects of the magnetic Rayleigh number, the Prandtl number and the paramagnetic fluid parameter on the flow pattern and heat transfer are presented.
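The driving mechanism behind this zero-gravity convection can be made explicit with a short, generic sketch (the paper's exact nondimensionalisation is not reproduced here). For a paramagnetic fluid, the magnetic body force is proportional to the susceptibility times the gradient of the squared field strength, and Curie's law makes the susceptibility inversely proportional to temperature, so a temperature difference produces a net driving force even when g ≈ 0:

```latex
% Magnetic body force per unit volume on a paramagnetic fluid,
% with volumetric susceptibility chi and vacuum permeability mu_0:
\[
  \mathbf{f}_m = \frac{\chi}{2\mu_0}\,\nabla B^2 ,
  \qquad \chi = \frac{C}{T} \quad \text{(Curie's law)} .
\]
% Linearising about a reference temperature T_0 gives the thermomagnetic
% "buoyancy" term that plays the role of g*beta*dT in the usual Rayleigh
% number, yielding a magnetic Rayleigh number (a sketch; definitions vary):
\[
  \delta\mathbf{f}_m \approx -\,\frac{C}{2\mu_0 T_0^{2}}\,\delta T\,\nabla B^2 .
\]
```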
Abstract:
This recent decision of the New South Wales Court of Appeal considers the scope of the parens patriae jurisdiction in cases where the jurisdiction is invoked for the protection of a Gillick-competent minor. As outlined below, in certain circumstances the law recognises that mature minors are able to make their own decisions concerning medical treatment. However, a number of Commonwealth decisions have addressed the issue of whether mature minors are able to refuse medical procedures in circumstances where refusal will result in the minor dying. Ultimately, this case confirms that the minor does not necessarily have a right to make autonomous decisions; the minor's right to make an autonomous decision exists only when that decision accords with what is deemed to be in his or her best interests.
Abstract:
This study presents the largest known investigation of discomfort glare, with 493 surveys collected from five green buildings in Brisbane, Australia. The study was conducted on full-time employees working under their everyday lighting conditions, none of whom had any affiliation with the research institution. The survey consisted of a specially tailored questionnaire to assess potential factors relating to discomfort glare. Luminance maps extracted from high dynamic range (HDR) images were used to capture the luminous environment of the occupants. Occupants who experienced glare on their monitor and/or electric glare were excluded from the analysis, leaving 419 usable surveys. Occupants were more sensitive to glare than any of the tested indices accounted for, so a new index, the UGP, was developed to take into account the scope of results in the investigation. The index is based on a linear transformation of the UGR to calculate a probability of disturbed persons. Nevertheless, all glare indices showed some correlation with discomfort, and statistically there was no difference between the DGI, UGR and CGI. The UGP broadly reflects the demographics of the working population in Australia, and the new index is applicable to open-plan green buildings.
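The abstract specifies only that the UGP is a linear transformation of the UGR onto a probability scale; the following is a generic sketch of such a mapping, in which the coefficients a and b are placeholders that would be fitted to the survey data (the published values are not reproduced here):

```latex
% Linear transformation of UGR to a probability of disturbed persons,
% clamped to [0, 1]; a and b are placeholder coefficients.
\[
  \mathrm{UGP} = \min\bigl(1,\ \max\bigl(0,\ a\,\mathrm{UGR} + b\bigr)\bigr),
  \qquad 0 \le \mathrm{UGP} \le 1 .
\]
```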
Abstract:
The use of Wireless Sensor Networks (WSNs) for Structural Health Monitoring (SHM) has become a promising approach owing to advantages such as low cost and fast, flexible deployment. However, inherent technical issues such as data synchronization error and data loss have prevented these systems from being used extensively. Recently, several SHM-oriented WSNs have been proposed and are believed to be able to overcome a large number of these technical uncertainties. Nevertheless, there is limited research examining the effects of uncertainties in generic WSN platforms or verifying the capability of SHM-oriented WSNs, particularly in demanding SHM applications such as modal analysis and damage identification of real civil structures. This article first reviews the major technical uncertainties of both generic and SHM-oriented WSN platforms and the efforts of the SHM research community to cope with them. Then, the effects of the most significant inherent WSN uncertainty on the first level of a common Output-only Modal-based Damage Identification (OMDI) approach are intensively investigated. Experimental accelerations collected by a wired sensory system on a benchmark civil structure are used as clean data before being contaminated with different levels of data pollutants to simulate practical uncertainties in both WSN platforms. Statistical analyses are comprehensively employed to uncover the distribution pattern of the uncertainty influence on the OMDI approach. The results show that uncertainties of generic WSNs can seriously affect Level 1 OMDI methods utilizing mode shapes. They also show that SHM-oriented WSNs can substantially lessen this impact and recover true structural information without resorting to costly computational solutions.
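The contamination step described above can be illustrated with a small sketch: clean wired-sensor accelerations are degraded with a random synchronisation offset and random sample dropouts before modal identification. The offset range and loss rate below are illustrative placeholders, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def pollute(signal, fs, max_sync_error_s=0.01, loss_rate=0.02):
    """Simulate two common WSN uncertainties on a clean acceleration record:
    a synchronisation error (circular time shift) and random data loss
    (dropped samples replaced by NaN). Parameter values are illustrative."""
    # Synchronisation error: shift by a random number of samples.
    shift = rng.integers(0, int(max_sync_error_s * fs) + 1)
    shifted = np.roll(signal, shift)
    # Data loss: drop a random subset of samples.
    lost = rng.random(shifted.size) < loss_rate
    polluted = shifted.copy()
    polluted[lost] = np.nan
    return polluted

# Example: a 10 s, 100 Hz record of a decaying 2 Hz mode plus noise.
fs = 100
t = np.arange(0, 10, 1 / fs)
clean = np.exp(-0.1 * t) * np.sin(2 * np.pi * 2 * t) + 0.01 * rng.standard_normal(t.size)
noisy = pollute(clean, fs)
print(f"{np.isnan(noisy).sum()} of {noisy.size} samples lost")
```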
Abstract:
A multi-resource multi-stage scheduling methodology is developed to solve short-term open-pit mine production scheduling problems as a generic multi-resource multi-stage scheduling problem. It is modelled using essential characteristics of short-term mining production operations such as drilling, sampling, blasting and excavating under the capacity constraints of mining equipment at each processing stage. Based on an extended disjunctive graph model, a shifting-bottleneck-procedure algorithm is enhanced and applied to obtain feasible short-term open-pit mine production schedules and near-optimal solutions. The proposed methodology and its solution quality are verified and validated using a real mining case study.
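The disjunctive-graph formulation can be sketched compactly: each (block, stage) operation is a node, conjunctive arcs encode the drill → sample → blast → excavate precedence, and the makespan is the longest path once machine (disjunctive) orderings are fixed. The sketch below uses networkx and illustrative durations, and fixes a simple machine ordering by hand rather than running the shifting bottleneck procedure itself.

```python
import networkx as nx

# Stages in precedence order for every mining block.
STAGES = ["drill", "sample", "blast", "excavate"]
# Illustrative processing times (hours) for two blocks.
DURATION = {("b1", s): d for s, d in zip(STAGES, [3, 1, 2, 4])}
DURATION.update({("b2", s): d for s, d in zip(STAGES, [2, 1, 2, 5])})

G = nx.DiGraph()
# Conjunctive arcs: stage precedence within each block.
for block in ("b1", "b2"):
    for a, b in zip(STAGES, STAGES[1:]):
        G.add_edge((block, a), (block, b), weight=DURATION[(block, a)])
# Disjunctive arcs, here fixed to one ordering: with a single machine per
# stage, block b1 precedes b2 at every stage (the shifting bottleneck
# procedure would instead choose these orderings to minimise the makespan).
for s in STAGES:
    G.add_edge(("b1", s), ("b2", s), weight=DURATION[("b1", s)])

# Makespan = longest-path length plus the duration of its final operation.
path = nx.dag_longest_path(G, weight="weight")
makespan = nx.dag_longest_path_length(G, weight="weight") + DURATION[path[-1]]
print("critical path:", path, "makespan:", makespan)
```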
Access to commercial destinations within the neighbourhood and walking among Australian older adults
Abstract:
BACKGROUND: Physical activity, particularly walking, is greatly beneficial to health; yet a sizeable proportion of older adults are insufficiently active. The importance of built environment attributes for walking is known, but few studies of older adults have examined neighbourhood destinations, and none have investigated access to specific, objectively measured commercial destinations and walking. METHODS: We undertook a secondary analysis of data from the Western Australian state government's health surveillance survey for those aged 65–84 years and living in the Perth metropolitan region from 2003–2009 (n = 2,918). Individual-level road network service areas were generated at 400 m and 800 m distances, and the presence or absence of six commercial destination types within the neighbourhood service areas identified (food retail, general retail, medical care services, financial services, general services, and social infrastructure). Adjusted logistic regression models examined access to and mix of commercial destination types within neighbourhoods for associations with self-reported walking behaviour. RESULTS: On average, the sample was aged 72.9 years (SD = 5.4), and was predominantly female (55.9%) and married (62.0%). Overall, 66.2% reported some weekly walking and 30.8% reported sufficient walking (≥150 min/week). Older adults with access to general services within 400 m (OR = 1.33, 95% CI = 1.07–1.66) and 800 m (OR = 1.20, 95% CI = 1.02–1.42), and social infrastructure within 800 m (OR = 1.19, 95% CI = 1.01–1.40), were more likely to engage in some weekly walking. Access to medical care services within 400 m (OR = 0.77, 95% CI = 0.63–0.93) and 800 m (OR = 0.83, 95% CI = 0.70–0.99) reduced the odds of sufficient walking. Access to food retail, general retail, financial services, and the mix of commercial destination types within the neighbourhood were all unrelated to walking. CONCLUSIONS: The types of neighbourhood commercial destinations that encourage older adults to walk appear to differ slightly from those reported for adult samples. Destinations that facilitate more social interaction (for example, eating at a restaurant or church involvement) or provide opportunities for incidental social contact (for example, visiting the pharmacy or hairdresser) were the strongest predictors of walking among seniors in this study. This underscores the importance of planning neighbourhoods with proximate access to social infrastructure, and highlights the need to create residential environments that support activity across the life course.
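The adjusted models above report odds ratios with 95% confidence intervals; the sketch below shows, with entirely synthetic data, how such estimates are typically obtained from a logistic regression (statsmodels is an assumption; the study's actual covariates and data are not reproduced).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2918  # sample size matching the study; the data here are synthetic

# Synthetic predictors: access to general services within 400 m (0/1),
# age (years), sex (1 = female).
access = rng.integers(0, 2, n)
age = rng.uniform(65, 84, n)
female = rng.integers(0, 2, n)
# Synthetic outcome: any weekly walking, with a positive access effect.
logit = -0.5 + 0.29 * access + 0.01 * (age - 73) + 0.1 * female
walk = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(np.column_stack([access, age, female]))
fit = sm.Logit(walk.astype(int), X).fit(disp=False)

# Odds ratios and 95% CIs are the exponentiated coefficients/bounds.
or_access = np.exp(fit.params[1])
ci_lo, ci_hi = np.exp(fit.conf_int()[1])
print(f"OR for access = {or_access:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")
```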
Abstract:
Agent-based modelling (ABM), like other modelling techniques, is used to answer specific questions about real-world systems that could otherwise be expensive or impractical to study. Its recent gain in popularity can be attributed in part to its capacity to use information at a fine level of detail of the system, both geographically and temporally, and to generate information at a higher level, where emerging patterns can be observed. The technique is data-intensive, as explicit data at a fine level of detail are used, and computer-intensive, as it requires many interactions between agents, which can learn and pursue goals. With the growing availability of data and the increase in computing power, these concerns are fading. Nonetheless, being able to update or extend the model as more information becomes available can be problematic, because of the tight coupling of the agents and their dependence on the data, especially when modelling very large systems. One large system to which ABM is currently applied is electricity distribution, where thousands of agents representing the network and the consumers' behaviours interact with one another. A framework that aims at answering a range of questions regarding the potential evolution of the grid has been developed and is presented here. It uses agent-based modelling to represent the engineering infrastructure of the distribution network and has been built with flexibility and extensibility in mind. What distinguishes the method presented here from usual ABMs is that it has been developed in a compositional manner; this encompasses not only the software tool, whose core is named MODAM (MODular Agent-based Model), but also the model itself. Such an approach enables the model to be extended as more information becomes available, or modified as the electricity system evolves, leading to an adaptable model. Two well-known modularity principles in the software engineering domain are information hiding and separation of concerns; these principles were used to develop the agent-based model on top of OSGi and Eclipse plugins, which have good support for modularity. Information regarding the model entities was separated into (a) assets, which describe the entities' physical characteristics, and (b) agents, which describe their behaviour according to their goals and previous learning experiences. This approach diverges from the traditional approach, where both aspects are often conflated. It has many advantages in terms of reusability of one or the other aspect for different purposes, as well as composability when building simulations. For example, the way an asset is used on a network can vary greatly while its physical characteristics stay the same: two identical battery systems will be used differently depending on the purpose of their installation. While any battery can be described by its physical properties (e.g. capacity, lifetime, and depth of discharge), its behaviour will vary depending on who is using it and to what end. The model is populated using data describing both aspects (physical characteristics and behaviour) and can be updated as required depending on the simulation to be run. For example, data can be used to describe the environment to which the agents respond (e.g. weather for solar panels), or to describe the assets and their relation to one another (e.g. the network assets).
Finally, when running a simulation, MODAM calls on its module manager, which coordinates the different plugins, automates the creation of the assets and agents using factories, and schedules their execution, which can proceed sequentially or in parallel for faster runs. Building agent-based models in this way has proven fast when adding new complex behaviours as well as new types of assets. Simulations have been run to understand the potential impact of changes on the network in terms of assets (e.g. installation of decentralised generators) or behaviours (e.g. response to different management aims). While this platform has been developed within the context of a project focussing on the electricity domain, the core of the software, MODAM, can be extended to other domains, such as transport, which is part of future work with the addition of electric vehicles.
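The asset/agent separation described above can be sketched in a few lines: the same physical battery description is reused by agents with different goals. Class and parameter names below are illustrative, not MODAM's actual API.

```python
from dataclasses import dataclass

@dataclass
class BatteryAsset:
    """Physical characteristics only -- no behaviour (illustrative fields)."""
    capacity_kwh: float
    max_depth_of_discharge: float  # fraction of capacity usable
    lifetime_cycles: int

class PeakLoppingAgent:
    """Behaviour: discharge the battery when feeder load exceeds a threshold."""
    def __init__(self, asset: BatteryAsset, threshold_kw: float):
        self.asset, self.threshold_kw = asset, threshold_kw

    def step(self, load_kw: float) -> float:
        usable = self.asset.capacity_kwh * self.asset.max_depth_of_discharge
        excess = max(0.0, load_kw - self.threshold_kw)
        return min(excess, usable)  # kW shaved this step (1 h steps assumed)

class VoltageSupportAgent:
    """Behaviour: same asset class, different goal (inject at low voltage)."""
    def __init__(self, asset: BatteryAsset, min_voltage_pu: float = 0.95):
        self.asset, self.min_voltage_pu = asset, min_voltage_pu

    def step(self, voltage_pu: float) -> bool:
        return voltage_pu < self.min_voltage_pu  # True = discharge to support

# Two identical assets, two different behaviours:
battery = BatteryAsset(capacity_kwh=10.0, max_depth_of_discharge=0.8,
                       lifetime_cycles=4000)
agents = [PeakLoppingAgent(battery, threshold_kw=50.0), VoltageSupportAgent(battery)]
```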
Abstract:
Global awareness of cleaner and renewable energy is transforming the electricity sector at many levels. New technologies are being integrated into the electricity grid at high, medium and low voltage levels, new taxes on carbon emissions are being introduced, and individuals can now produce electricity, mainly through rooftop photovoltaic (PV) systems. While leading to improvements, these changes also introduce challenges, and a question that often arises is 'how can we manage this constantly evolving grid?' The Queensland Government and Ergon Energy, one of the two Queensland distribution companies, have partnered with Australian and German universities on a project to answer this question in a holistic manner. The project investigates the impact that the integration of renewables and other new technologies has on the physical structure of the grid, and how this evolving system can be managed in a sustainable and economical manner. To aid understanding of what the future might bring, a software platform has been developed that integrates two modelling techniques: agent-based modelling (ABM), to capture the characteristics of the different system units accurately and dynamically, and particle swarm optimisation (PSO), to find the most economical mix of network extension and integration of distributed generation over long periods of time. Using data from Ergon Energy, two types of networks have been modelled: three-phase networks, usually used in dense networks such as urban areas, and Single Wire Earth Return (SWER) networks, widely used in rural Queensland. Simulations can be performed on these networks to identify the required upgrades, following a three-step process: (a) assess what is already in place and how it performs under current and future loads, (b) determine what can be done to manage it and plan the future grid, and (c) evaluate how these upgrades/new installations will perform over time. The number of small-scale distributed generators, e.g. PV and batteries, is now sufficient (and expected to increase) to impact the operation of the grid, which in turn needs to be considered by the distribution network manager when planning upgrades and/or installations to stay within regulatory limits. Different scenarios can be simulated, with different levels of distributed generation, in place as well as expected, so that a large number of options can be assessed (Step a). Once the location, sizing and timing of asset upgrades and/or installations are found using optimisation techniques (Step b), it is possible to assess the adequacy of their daily performance using agent-based modelling (Step c). One distinguishing feature of this software is that it is possible to analyse a whole area at once while still having a tailored solution for each of the sub-areas. To illustrate this, consider the impact that batteries and PV can have on the two types of networks mentioned above; three design conditions can be identified (among others):
- Urban conditions:
  - Feeders with a low take-up of solar generators may benefit from adding solar panels.
  - Feeders that need voltage support at specific times may be assisted by installing batteries.
- Rural conditions (SWER network):
  - Feeders that need voltage support as well as peak lopping may benefit from both battery and solar panel installations.
This small example demonstrates that no single solution can be applied across all three cases, and there is a need to be selective in which one is applied to each branch of the network.
This is currently the function of the engineer, who can define various scenarios against a configuration, test them, and iterate towards an appropriate solution. Future work will focus on increasing the level of automation in identifying areas where particular solutions are applicable.
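The optimisation step (Step b) uses particle swarm optimisation to choose the mix, sizing and timing of upgrades. The sketch below is a minimal generic PSO minimising an invented cost function over two decision variables (battery and PV capacity); it is not the project's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(2)

def cost(x):
    """Illustrative placeholder: upgrade cost plus a penalty for unserved peak."""
    battery_kwh, pv_kw = x
    capex = 400 * battery_kwh + 1200 * pv_kw
    unserved = max(0.0, 80.0 - 0.5 * battery_kwh - 0.7 * pv_kw)  # kW
    return capex + 5000 * unserved

# Standard PSO with inertia, cognitive and social terms.
n, dim, iters = 30, 2, 200
w, c1, c2 = 0.7, 1.5, 1.5
lo, hi = np.array([0.0, 0.0]), np.array([200.0, 150.0])  # variable bounds

pos = rng.uniform(lo, hi, (n, dim))
vel = np.zeros((n, dim))
pbest = pos.copy()
pbest_f = np.array([cost(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([cost(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("best (battery kWh, PV kW):", gbest.round(1), "cost:", round(cost(gbest)))
```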
Abstract:
An Artificial Neural Network (ANN) is a computational modeling tool which has found extensive acceptance in many disciplines for modeling complex real-world problems. An ANN can model problems through learning by example, rather than by fully understanding the detailed characteristics and physics of the system. In the present study, the accuracy and predictive power of an ANN were evaluated in predicting the kinematic viscosity of biodiesels over the wide range of temperatures typically encountered in diesel engine operation. In this model, temperature and chemical composition of biodiesel were used as input variables. To obtain the necessary data for model development, the chemical composition and temperature-dependent fuel properties of ten different types of biodiesel were measured experimentally using laboratory-standard testing equipment, following internationally recognized testing procedures. The Neural Network Toolbox of MATLAB R2012a was used to train, validate and simulate the ANN model on a personal computer. The network architecture was optimised by trial and error to obtain the best prediction of the kinematic viscosity. The predictive performance of the model was determined by calculating the absolute fraction of variance (R2), root mean squared (RMS) error and maximum average error percentage (MAEP) between predicted and experimental results. This study found that the ANN is highly accurate in predicting the viscosity of biodiesel, demonstrating the ability of the model to find a meaningful relationship between biodiesel chemical composition and fuel properties at different temperature levels. The model developed in this study can therefore be a useful tool for accurately predicting biodiesel fuel properties, instead of undertaking costly and time-consuming experimental tests.
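The study's model was built with MATLAB's Neural Network Toolbox; a minimal equivalent sketch in Python (scikit-learn, synthetic data) shows the same structure, with temperature and composition in and kinematic viscosity out. The input features and the data-generating rule below are placeholders, not the study's measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)

# Synthetic training data (placeholders for the measured values):
# inputs = [temperature degC, % saturated esters, % monounsaturated esters]
X = np.column_stack([
    rng.uniform(10, 90, 300),   # temperature
    rng.uniform(5, 50, 300),    # saturated fraction
    rng.uniform(20, 70, 300),   # monounsaturated fraction
])
# Viscosity falls roughly exponentially with temperature (synthetic rule).
y = 8.0 * np.exp(-0.02 * X[:, 0]) + 0.02 * X[:, 1] + 0.01 * X[:, 2]

# Small feed-forward network; the architecture would be tuned by trial and error.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0),
)
model.fit(X, y)
print("predicted viscosity (mm^2/s):", model.predict([[40.0, 20.0, 45.0]]).round(3))
```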
Abstract:
A decision-making framework for image-guided radiotherapy (IGRT) is being developed using a Bayesian Network (BN) to graphically describe, and probabilistically quantify, the many interacting factors involved in this complex clinical process. Outputs of the BN will provide decision support for radiation therapists, assisting them to make correct inferences about the likelihood of treatment delivery accuracy for a given image-guided set-up correction. The framework is being developed as a dynamic object-oriented BN, allowing for complex modelling with specific sub-regions, as well as representation of the sequential decision-making and belief updating associated with IGRT. A prototype graphic structure for the BN was developed by analysing IGRT practices at a local radiotherapy department and incorporating results obtained from a literature review. Clinical stakeholders reviewed the BN to validate its structure. The BN includes a sub-network for evaluating the accuracy of IGRT practices and technology. The directed acyclic graph (DAG) contains nodes and directional arcs representing the causal relationships between the many interacting factors, such as the tumour site and its associated critical organs, technology and technique, and inter-user variability. The BN was extended to support on-line and off-line decision-making with respect to treatment plan compliance. Following conceptualisation of the framework, the BN will be quantified. It is anticipated that the finalised decision-making framework will provide a foundation for developing better decision-support strategies and automated correction algorithms for IGRT.
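As a toy illustration of the kind of belief updating such a BN supports (not the clinical model itself), the sketch below enumerates a two-node network in which the adequacy of a set-up correction influences the probability of accurate treatment delivery. The structure and all numbers are invented for illustration.

```python
# Toy two-node Bayesian network: Correction -> DeliveryAccurate.
# All probabilities are invented placeholders, not clinical values.
P_correction_ok = 0.9                     # P(set-up correction adequate)
P_accurate = {True: 0.95, False: 0.40}    # P(accurate delivery | correction)

def posterior_correction_given_accurate():
    """Bayes' rule: P(correction ok | delivery observed to be accurate)."""
    joint_ok = P_correction_ok * P_accurate[True]
    joint_not = (1 - P_correction_ok) * P_accurate[False]
    return joint_ok / (joint_ok + joint_not)

print(f"P(correction ok | accurate) = {posterior_correction_given_accurate():.3f}")
```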
Abstract:
Synaptic changes at sensory inputs to the dorsal nucleus of the lateral amygdala (LAd) play a key role in the acquisition and storage of associative fear memory. However, neither the temporal nor the spatial architecture of the LAd network response to sensory signals is understood. We developed a method for elucidating this network behavior. Using this approach, temporally patterned polysynaptic recurrent network responses were found in the LAd (intra-LA), both in vitro and in vivo, in response to activation of thalamic sensory afferents. Potentiation of thalamic afferents resulted in a depression of intra-LA synaptic activity, indicating a homeostatic response to changes in synaptic strength within the LAd network. Additionally, the latencies of thalamic-afferent-triggered recurrent network activity within the LAd overlap with the known, later-occurring cortical afferent latencies. Thus, this recurrent network may facilitate temporal coincidence of sensory afferents within the LAd during associative learning.
Abstract:
This case study examines the way in which Knowledge Unlatched is combining collective action and open access licenses to encourage innovation in markets for specialist academic books. Knowledge Unlatched is a not-for-profit organisation established to help a global community of libraries coordinate their book purchasing activities more effectively and, in so doing, to ensure that the books librarians select for their own collections become available for anyone in the world to read free of charge. The Knowledge Unlatched model is an attempt to re-coordinate a market in order to facilitate a transition to digitally appropriate publishing models that include open access. It offers librarians an opportunity to facilitate the open access publication of books that their own readers would value access to. It provides publishers with a stable income stream on titles selected by libraries, as well as the ability to continue selling those books to a wider market on their own terms. Knowledge Unlatched provides a rich case study for researchers and practitioners interested in understanding how innovations in procurement practices can be used to stimulate more effective and equitable markets for socially valuable products.
Abstract:
Although digital technology has made it possible for many more people to access content at no extra cost, fewer people than ever before are able to read the books written by university-based researchers. This workshop explores the role that open access licenses and collective action might play in reviving the scholarly monograph: a specialised area of academic publishing that has seen sales decline by more than 90 per cent over the past three decades. It also introduces Knowledge Unlatched, an ambitious attempt to create an internationally coordinated, sustainable route to open access for scholarly books. Knowledge Unlatched is now in its pilot phase.
Abstract:
Fierce debates characterised 2013, as the implications of government mandates for open access were debated and pathways for implementation worked out. There is no doubt that journals will move to a mix of gold and green, and that there will be an unsettled relationship between the two. But what of books? Is it conceivable that in those subjects, such as the humanities and social sciences, where something longer than the journal article is still the preferred form of scholarly communication, works will stay closed? Will it be acceptable to have some publicly funded research made available only in closed book form (regardless of whether print or digital) while other subjects, where articles are favoured, go open access? Frances Pinter is in the middle of these debates, having founded Knowledge Unlatched (see www.knowledgeunlatched.org). KU is a global library consortium enabling open access books. Knowledge Unlatched is helping libraries to work together for a sustainable open future for specialist academic books. Its vision is a healthy market that includes free access for end users. In this session she will review the different models being experimented with around the world, including author-side payments, institutional subsidies and research funding body approaches. She will compare and contrast these models with those already in place for journal articles. She will also review the policy landscape and report on how open access scholarly books are faring to date.
Frances Pinter, Founder, Knowledge Unlatched, UK