938 results for ENVIRONMENTAL APPLICATIONS
Abstract:
If Australian scientists are to participate fully and actively in international scientific collaborations utilising online technologies, policies and laws must support the data access and reuse objectives of these projects. To date, Australia lacks a comprehensive policy and regulatory framework for environmental information and data generally. Instead there exists a series of unconnected Acts that adopt historically based, sector-specific approaches to the collection, use and reuse of environmental information. This paper sets out the findings of an analysis of a representative sample of Australian statutes relating to environmental management and protection, undertaken to determine the extent to which they meet the best-practice criteria for access to and reuse of environmental information established in international initiatives. It identifies issues that need to be addressed in the legislation governing environmental information to ensure that Australian scientists are able to engage fully in international research collaborations.
Abstract:
The progress of technology has led to the increased adoption of energy monitors among household energy consumers. While the monitors available on the market deliver real-time energy usage feedback to the consumer, the format of this data is usually unengaging and mundane. Moreover, it fails to address consumers who have different motivations and needs for saving and comparing energy. This paper presents a study that seeks to provide initial indications for motivation-specific design of energy-related feedback. We focus on comparative feedback supported by a community of energy consumers. In particular, we examine eco-visualisations, temporal self-comparison, norm comparison, one-on-one comparison and ranking, whereby the last three allow us to explore the potential of socialising energy-related feedback. These feedback types were integrated into EnergyWiz – a mobile application that enables users to compare with their past performance, neighbours, contacts from social networking sites and other EnergyWiz users. The application was evaluated in personal, semi-structured interviews, which provided initial insights into how to design motivation-related comparative feedback.
Abstract:
The ability to reproducibly load bioactive molecules into polymeric microspheres is a challenge. Traditional microsphere fabrication methods typically provide inhomogeneous release profiles and suffer from poor batch-to-batch reproducibility, hindering their potential for scale-up and their translation to the clinic. This deficit in homogeneity is in part attributed to broad size distributions and variability in the morphology of particles. It is thus desirable to control the morphology and size of non-loaded particles in the first instance, in preparation for obtaining the desired release profiles of loaded particles at a later stage. This is achieved by identifying the key parameters involved in particle production and understanding how adapting these parameters affects the final characteristics of the particles. In this study, electrospraying is presented as a promising technique for generating reproducible particles made of polycaprolactone, a biodegradable, FDA-approved polymer. Narrow size distributions were obtained by controlling the electrospraying flow rate and polymer concentration, with average particle sizes ranging from 10 to 20 µm. Particles were shown to be spherical with a homogeneous embossed texture, determined by the polymer entanglement regime taking place during electrospraying. No toxic residue from the process was detected in preliminary cell work using DNA quantification assays, validating this method as suitable for subsequent loading of bioactive components.
Abstract:
Throughout this workshop session we have looked at various configurations of Sage as well as using the Sage UI to run Sage applications (e.g. the image viewer). More advanced usage of Sage has been demonstrated using a Sage-compatible version of Paraview, highlighting the potential of parallel rendering. The aim of this tutorial session is to give a practical introduction to developing visual content for a tiled display using the Sage libraries. After completing this tutorial you should have the basic tools required to develop your own custom Sage applications. This tutorial is designed for software developers; intermediate programming knowledge is assumed, along with some introductory OpenGL. You will be required to write small portions of C/C++ code to complete this worksheet. However, if you do not feel comfortable writing code (or have never written in C or C++), we will be on hand throughout this session, so feel free to ask for help. We have a number of machines in this lab running a VNC client to a virtual machine running Fedora 12. You should all be able to log in with the username “escience” and password “escience10”. Some of the commands in this worksheet require you to run them as the root user, so note the password as you may need to use it a few times. If you need to access the Internet, use the username “qpsf01” and password “escience10”.
Abstract:
Osteoarthritis (OA) is a chronic, non-inflammatory type of arthritis, which usually affects the movable and weight-bearing joints of the body. It is the most common joint disease in human beings and is especially common in elderly people. To date, there are no safe and effective disease-modifying OA drugs (DMOADs) to treat the millions of patients suffering from this serious and debilitating disease. However, recent studies provide strong evidence for the use of mesenchymal stem cell (MSC) therapy in curing cartilage-related disorders. Due to their natural differentiation properties, MSCs can serve as vehicles for the delivery of effective, targeted treatment to damaged cartilage in OA disease. In vitro, MSCs can readily be tailored with transgenes with anti-catabolic or pro-anabolic effects to create cartilage-friendly therapeutic vehicles. On the other hand, tissue-engineering constructs with scaffolds and biomaterials hold promise for biological cartilage therapy. Many of these strategies have been validated in a wide range of in vitro and in vivo studies assessing treatment feasibility or efficacy. In this review, we provide an outline of the rationale and status of stem-cell-based treatments for OA cartilage, and we discuss prospects for clinical implementation and the factors crucial for maintaining the drive towards this goal.
Abstract:
This technical report is concerned with one aspect of environmental monitoring: the detection and analysis of acoustic events in sound recordings of the environment. Sound recordings offer ecologists the advantage of cheaper and more extensive sampling, but they make available so much data that automated analysis becomes essential. The report describes a number of tools for automated analysis of recordings, including noise removal from spectrograms, acoustic event detection, event pattern recognition, spectral peak tracking, syntactic pattern recognition applied to call syllables, and oscillation detection. These algorithms are applied to a number of animal call recognition tasks, chosen because they illustrate quite different modes of analysis: (1) the detection of diffuse events caused by wind and rain, which are frequent contaminants of recordings of the terrestrial environment; (2) the detection of bird calls; and (3) the preparation of acoustic maps for whole-ecosystem analysis. This last task utilises the temporal distribution of events over a daily, monthly or yearly cycle.
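To illustrate the kind of pipeline the report describes, the Python sketch below computes a spectrogram, estimates a per-frequency noise floor, and flags frames whose residual energy exceeds a threshold. The function name, parameter values and the use of NumPy/SciPy are assumptions made for illustration; they are not the report's actual algorithms.

```python
# Hypothetical sketch of spectrogram noise removal and energy-based event
# detection, in the spirit of the report (names and thresholds are
# illustrative, not the report's actual algorithms).
import numpy as np
from scipy.signal import spectrogram

def detect_events(signal, fs, frame=512, threshold_db=6.0):
    """Return (start, end) times of frames whose energy exceeds the
    estimated per-frequency noise floor by `threshold_db`."""
    f, t, sxx = spectrogram(signal, fs=fs, nperseg=frame, noverlap=frame // 2)
    sxx_db = 10.0 * np.log10(sxx + 1e-12)

    # Noise removal: estimate each frequency bin's noise floor as its median
    # level over time (robust to sparse calls), then subtract it.
    noise_floor = np.median(sxx_db, axis=1, keepdims=True)
    denoised = np.clip(sxx_db - noise_floor, 0.0, None)

    # Event detection: a frame is "active" if its mean residual energy
    # exceeds the threshold; merge consecutive active frames into events.
    active = denoised.mean(axis=0) > threshold_db
    events, start = [], None
    for i, flag in enumerate(active):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            events.append((t[start], t[i - 1]))
            start = None
    if start is not None:
        events.append((t[start], t[-1]))
    return events

# Example (hypothetical file): one minute of audio sampled at 22.05 kHz.
# audio = np.fromfile("recording.raw", dtype=np.int16).astype(float)
# print(detect_events(audio, fs=22050))
```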
Abstract:
Research in structural dynamics has received considerable attention due to problems associated with emerging slender structures, the increased vulnerability of structures to random loads, and aging infrastructure. This paper briefly describes some such research carried out on (i) the dynamics of a composite floor structure, (ii) the dynamics of a cable-supported footbridge, (iii) seismic mitigation of a frame-shear wall structure using passive dampers, and (iv) the development of a damage assessment model for use in structural health monitoring.
Abstract:
Intelligible and accurate risk-based decision-making requires a complex balance of information from different sources, appropriate statistical analysis of this information, and consequent intelligent inference and decisions made on the basis of these analyses. Importantly, this requires an explicit acknowledgement of uncertainty in the inputs and outputs of the statistical model. The aim of this paper is to progress a discussion of these issues in the context of several motivating problems related to the wider scope of agricultural production. These problems include biosecurity surveillance design, pest incursion, environmental monitoring and import risk assessment. The information to be integrated includes observational and experimental data, remotely sensed data and expert information. We describe our efforts in addressing these problems using Bayesian models and Bayesian networks. These approaches provide a coherent and transparent framework for modelling complex systems, combining the different information sources, and allowing for uncertainty in inputs and outputs. While the theory underlying Bayesian modelling has a long and well-established history, its application is only now becoming feasible for complex problems, due to the increased availability of methodological and computational tools. Of course, there are still hurdles and constraints, which we also address by sharing our endeavours and experiences.
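To give a flavour of how such approaches carry uncertainty from inputs to outputs, here is a minimal Bayesian sketch: a conjugate Beta-Binomial update that combines an expert-elicited prior on pest prevalence with surveillance counts and reports a posterior credible interval. The prior, the data and the variable names are invented for illustration and are not taken from the paper's models.

```python
# Minimal illustrative sketch (not the authors' models): a conjugate
# Beta-Binomial update combining an expert prior on pest prevalence with
# observed surveillance data, so the output carries explicit uncertainty.
from scipy import stats

# Expert prior: prevalence believed to be low, centred near 5%.
prior_alpha, prior_beta = 2.0, 38.0          # Beta(2, 38), mean = 0.05

# Surveillance data (hypothetical): 3 infested plants in 200 inspected.
detections, inspected = 3, 200

# Conjugate update: posterior is Beta(alpha + detections, beta + misses).
post = stats.beta(prior_alpha + detections,
                  prior_beta + (inspected - detections))

print(f"posterior mean prevalence: {post.mean():.3f}")
print(f"95% credible interval: ({post.ppf(0.025):.3f}, {post.ppf(0.975):.3f})")
```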
Abstract:
As computer applications become more available, both technically and economically, construction project managers are increasingly able to access advanced computer tools capable of transforming the role that project managers have typically performed. Competence in using these tools requires a dual commitment to training, from both the individual and the firm. Improving the computer skills of project managers can provide construction firms with a competitive advantage that differentiates them from others in an increasingly competitive international market. Yet few published studies have quantified the existing level of competence of construction project managers. Identification of project managers’ existing computer application skills is a necessary first step in developing more directed training to better capture the benefits of computer applications. This paper discusses the yet-to-be-released results of a series of surveys undertaken in Malaysia, Singapore, Indonesia, Australia and the United States through QUT’s School of Construction Management and Property and the M.E. Rinker, Sr. School of Building Construction at the University of Florida. This international survey reviews the use of, and reported competence in using, a series of commercially available computer applications by construction project managers. The five country locations of the survey allow cross-national comparisons to be made between project managers undertaking continuing professional development programs. The results highlight a shortfall in the ability of construction project managers to capture the potential benefits provided by advanced computer applications, and they provide directions for targeted industry training programs. This international survey also provides a unique insight into the cross-national usage of advanced computer applications and forms an important step in this ongoing joint review of technology and the construction project manager.
Abstract:
A trend in the design and implementation of modern industrial automation systems is to integrate computing, communication and control into a unified framework at different levels of machine/factory operations and information processing. These distributed control systems are referred to as networked control systems (NCSs). They are composed of sensors, actuators and controllers interconnected over communication networks. As most communication networks are not designed for NCS applications, the communication requirements of NCSs may not be satisfied. For example, traditional control systems require data to be accurate, timely and lossless; because of random transmission delays and packet losses, however, the control performance of a control system may be badly degraded and the control system rendered unstable. The main challenge of NCS design is to maintain and improve stable control performance of an NCS. To achieve this, communication and control methodologies have to be designed jointly. In recent decades, Ethernet and 802.11 networks have been introduced into control networks and have even replaced traditional fieldbus products in some real-time control applications, because of their high bandwidth and good interoperability. As Ethernet and 802.11 networks are not designed for distributed control applications, two aspects of NCS research need to be addressed to make these communication networks suitable for control systems in industrial environments. From the perspective of networking, communication protocols need to be designed to satisfy NCS communication requirements such as real-time communication and high-precision clock consistency. From the perspective of control, methods to compensate for network-induced delays and packet losses are important for NCS design. To make Ethernet-based and 802.11 networks suitable for distributed control applications, this thesis develops a high-precision relative clock synchronisation protocol and an analytical model for analysing the real-time performance of 802.11 networks, and designs a new predictive compensation method. Firstly, a hybrid NCS simulation environment based on the NS-2 simulator is designed and implemented. Secondly, a high-precision relative clock synchronisation protocol is designed and implemented. Thirdly, transmission delays in 802.11 networks for soft real-time control applications are modelled using a Markov chain model in which real-time Quality-of-Service parameters are analysed under a periodic traffic pattern; this model accurately captures the tradeoff between real-time performance and throughput performance. Furthermore, a cross-layer optimisation scheme, featuring application-layer flow rate adaptation, is designed to achieve the tradeoff between certain real-time and throughput performance characteristics in a typical NCS scenario with a wireless local area network. Fourthly, as a co-design approach for both the network and the controller, a new predictive compensation method for variable delay and packet loss in NCSs is designed, in which simultaneous end-to-end delays and packet losses during packet transmissions from sensors to actuators are tackled. The effectiveness of the proposed predictive compensation approach is demonstrated using our hybrid NCS simulation environment.
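As a generic illustration of model-based delay compensation (not the thesis's specific method), the sketch below propagates the last received, time-stamped state forward through a discrete-time plant model to cover the measured network delay and any lost samples, then applies state feedback to the prediction rather than to the stale measurement. The plant matrices, feedback gain and delay values are assumed for illustration only.

```python
# Illustrative sketch only: a generic model-based compensator that predicts
# the current plant state from a delayed measurement before computing the
# control. All numbers and the gain K are hypothetical.
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # discrete-time plant: x_{k+1} = A x_k + B u_k
B = np.array([[0.005], [0.1]])
K = np.array([[3.2, 1.8]])               # state-feedback gain (assumed pre-designed)

def compensated_control(x_received, u_history, delay_steps):
    """Predict the current state from a measurement that is `delay_steps`
    samples old by replaying the controls applied in the meantime."""
    x = x_received.copy()
    for u in (u_history[-delay_steps:] if delay_steps > 0 else []):
        x = A @ x + B @ u                 # roll the model forward one sample
    return -K @ x                         # act on the predicted current state

# Example: the latest sensor packet is 2 samples old; packets carrying newer
# samples were lost, so the same prediction also bridges the loss.
x_old = np.array([[0.4], [-0.1]])
u_hist = [np.array([[0.0]]), np.array([[0.2]]), np.array([[0.15]])]
print(compensated_control(x_old, u_hist, delay_steps=2))
```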
Abstract:
In today’s information society, electronic tools, such as computer networks for the rapid transfer of data and composite databases for information storage and management, are critical in ensuring effective environmental management. In particular, environmental policies and programs of federal, state and local governments need a large volume of up-to-date information on the quality of water, air and soil in order to conserve and protect natural resources and to carry out meteorological monitoring. In line with this, the utilization of information and communication technologies (ICTs) is crucial to preserving and improving the quality of life. In handling tasks in the field of environmental protection, a range of environmental and technical information is often required for complex, shared decision-making in a multidisciplinary team environment. In this regard, e-government provides the foundation of a transformative ICT initiative that can lead to better environmental governance, better services, and increased public participation in environmental decision-making processes.
Abstract:
This paper demonstrates the capabilities of the wavelet transform (WT) for systematically analyzing important features related to bottleneck activations and traffic oscillations in congested traffic. In particular, the analysis of loop detector data from a freeway shows that wavelet-based energy can effectively identify the location of an active bottleneck, the arrival time of the resulting queue at each upstream sensor location, and the start and end of a transition during the onset of a queue. Vehicle trajectories were also analyzed using the WT, and our analysis shows that the wavelet-based energies of individual vehicles can effectively detect the origins of deceleration waves and shed light on possible triggers (e.g., lane-changing). The spatiotemporal propagation of oscillations, identified by tracing wavelet-based energy peaks from vehicle to vehicle, enables analysis of oscillation amplitude, duration and intensity.
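The following sketch illustrates the general idea of wavelet-based energy on a synthetic loop-detector speed series: the energy of a continuous wavelet transform, summed over scales, peaks where the series changes abruptly, which is how a queue's arrival at a sensor can be localised. The wavelet choice, scales and synthetic data are assumptions for illustration, not the paper's.

```python
# Illustrative sketch: locate an abrupt speed drop in a detector series by
# the peak of wavelet-based energy summed over scales of a continuous
# wavelet transform (all parameters and data are hypothetical).
import numpy as np
import pywt

# Synthetic 20-minute speed series (km/h, one sample per 20 s): free flow,
# then a queue arrives around sample 40.
t = np.arange(60)
speed = np.where(t < 40, 95.0, 30.0) + np.random.normal(0.0, 2.0, t.size)

scales = np.arange(1, 17)
coeffs, _ = pywt.cwt(speed, scales, "mexh")     # Mexican-hat CWT
energy = (coeffs ** 2).sum(axis=0)              # wavelet energy per time step

print("estimated queue arrival at sample:", int(np.argmax(energy)))
```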
Abstract:
Raman spectroscopy has been used to study selected mineral samples of the copiapite group. Copiapite (Fe2+Fe3+4(SO4)6(OH)2·20H2O) is a secondary mineral formed through the oxidation of pyrite. Minerals of the copiapite group have the general formula AFe4(SO4)6(OH)2·20H2O, where A has a +2 charge and can be magnesium, iron, copper, calcium and/or zinc. The formula can also be B2/3Fe4(SO4)6(OH)2·20H2O, where B has a +3 charge and may be either aluminum or iron. For each mineral, two Raman bands are observed at around 992 and 1029 cm-1, assigned to the (SO4)2- ν1 symmetric stretching mode. The observation of two bands provides evidence for the existence of two non-equivalent sulfate anions in the mineral structure. Three Raman bands at 1112, 1142 and 1161 cm-1 are observed in the Raman spectrum of the copiapites, indicating a reduction in symmetry of the sulfate anion in the copiapite structure. This reduction in symmetry is supported by multiple bands in the ν2 and ν4 (SO4)2- spectral regions.