883 results for active distribution system
Abstract:
Cassava contributes significantly to biobased material development. Conventional approaches to producing and applying its bio-derivatives generate significant waste and complicate tailored material development, with negative environmental impacts and application limitations. Transforming cassava into sustainable value-added resources therefore requires redesigned approaches; harnessing unexplored material sources and downstream process innovations can mitigate these challenges. The ultimate goal of this work was an integrated sustainable process system for cassava biomaterial development and application. An improved simultaneous release recovery cyanogenesis (SRRC) methodology, incorporating intact bitter cassava, was developed and standardized. Films were formulated and characterised, their mass transport behaviour was quantified under simulated real-distribution-chain conditions, and their properties were optimised. An integrated process design system for sustainable waste elimination and biomaterial development was developed, and films and bio-derivatives were demonstrated for modified atmosphere packaging (MAP), fast-delivery nutraceutical excipient and antifungal active coating applications. SRRC-processed intact bitter cassava produced significantly higher yields of safe bio-derivatives than peeled cassava, guaranteeing 16% waste elimination. Process standardization transformed the entire root into higher-yield, clearer-coloured bio-derivatives with an efficient material balance at optimal global desirability. Solvent mass transport through temperature- and humidity-stressed films induced structural changes and influenced water vapour and oxygen permeability. A seven-unit integrated process design led to cost-effective, energy-efficient and green cassava processing and biomaterials with zero environmental footprint. The optimised bio-derivatives and films demonstrated desirable in-package O2/CO2 levels, mould-growth inhibition, faster dissolution and release from tablet excipient nutraceuticals, and smooth thymol-encapsulated antifungal coatings. Novel material resources, processing without root peeling, zero-waste operation and a standardised methodology present promising process integration tools for sustainable cassava biobased system development. The design outcomes have potential applications in mitigating cyanide challenges and providing bio-derivative development pathways. The process system leads to zero waste, with the potential to reshape current one-way processes into circular designs modelled on nature's effective approaches. Indigenous cassava components as natural material reinforcements, together with the SRRC processing approach, have initiated a process with potential for wider deployment in broad product research and development. This research contributes to scientific knowledge in materials science and engineering process design.
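The abstract reports process optimisation "at optimal global desirability" without giving the responses involved. The sketch below illustrates the Derringer-style composite desirability commonly used for such multi-response optimisation; the response names, target ranges and values are hypothetical, not taken from the study.

```python
import numpy as np

def desirability_larger_is_better(y, low, high):
    """Map a response to [0, 1]: 0 at or below `low`, 1 at or above `high`."""
    return float(np.clip((y - low) / (high - low), 0.0, 1.0))

def global_desirability(d_values):
    """Geometric mean of individual desirabilities (Derringer-style composite)."""
    d = np.asarray(d_values, dtype=float)
    return 0.0 if np.any(d == 0) else float(d.prod() ** (1.0 / d.size))

# Hypothetical responses for one candidate processing condition:
# derivative yield (%), paste clarity (%T), cyanide reduction (%)
d_yield   = desirability_larger_is_better(78.0, low=60.0, high=85.0)
d_clarity = desirability_larger_is_better(90.0, low=70.0, high=95.0)
d_cyanide = desirability_larger_is_better(96.0, low=90.0, high=99.0)

print(round(global_desirability([d_yield, d_clarity, d_cyanide]), 3))
```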
Abstract:
A new Expiratory Droplet Investigation System (EDIS) was used to conduct the most comprehensive study to date of the dilution-corrected droplet size distributions produced during different respiratory activities. Distinct physiological processes were responsible for specific size distribution modes. For all activities, the majority of particles were produced in one or more modes with diameters below 0.8 µm; that mode occurred during all respiratory activities, including normal breathing. A second mode at 1.8 µm was produced during all activities, but at lower concentrations. Speech produced particles in modes near 3.5 µm and 5 µm, which became most pronounced during continuous vocalization, suggesting that aerosolization of the secretions lubricating the vocal cords is a major source of droplets in terms of number. Non-equilibrium droplet evaporation was not detectable for particles between 0.5 and 20 µm, implying that evaporation to the equilibrium droplet size occurred within 0.8 s.
Abstract:
This report presents the demonstration of a software agents prototype system for improving maintenance management [AIMM], including: • Developing and implementing a user-focused approach for mining the maintenance data of buildings. • Refining the development of a multi-agent system for data mining in virtual environments (Active Worlds) by developing and implementing a filtering agent on the results obtained from applying data mining techniques to the maintenance data. • Integrating the filtering agent within the multi-agent system in an interactive networked multi-user 3D virtual environment. • Populating maintenance data and discovering new rules of knowledge.
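The report does not detail how the filtering agent decides which mined results to pass on. The sketch below is a hypothetical illustration of the idea: an agent that keeps only sufficiently strong rules from a data-mining step before they are presented in the virtual environment. The rule schema, field names and thresholds are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class MaintenanceRule:
    """A rule mined from building maintenance records (hypothetical schema)."""
    antecedent: str      # e.g. "asset=chiller AND age>10y"
    consequent: str      # e.g. "fault=refrigerant_leak"
    support: float       # fraction of records matching the rule
    confidence: float    # conditional probability of the consequent

class FilteringAgent:
    """Keeps only rules strong enough to show to users in the 3D environment."""
    def __init__(self, min_support=0.05, min_confidence=0.7):
        self.min_support = min_support
        self.min_confidence = min_confidence

    def filter(self, rules):
        return [r for r in rules
                if r.support >= self.min_support and r.confidence >= self.min_confidence]

mined = [
    MaintenanceRule("asset=chiller AND age>10y", "fault=refrigerant_leak", 0.08, 0.81),
    MaintenanceRule("asset=lift", "fault=door_misalignment", 0.02, 0.55),
]
print(FilteringAgent().filter(mined))  # only the first rule survives
```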
Abstract:
There is evidence that many heating, ventilating and air conditioning (HVAC) systems installed in larger buildings have more capacity than is ever required to keep the occupants comfortable. This paper explores the reasons why this can occur by examining a typical brief/design/documentation process. Over-sized HVAC systems cost more to install and operate and may not control thermal comfort as well as a "right-sized" system. These impacts are evaluated where data exist. Finally, some suggestions are developed to minimise both the extent and the negative impacts of HVAC system over-sizing, for example: • Challenge "rules of thumb" and/or brief requirements which may be out of date. • Conduct an accurate load estimate, using AIRAH design data specific to the project location, and then resist the temptation to apply "safety factors". • Use a load estimation program that accounts for thermal storage and diversification of peak loads for each zone and air handling system. • Select chiller sizes and staged or variable speed pumps and fans to ensure good part-load performance. • Allow for unknown future tenancies by designing flexibility into the system, not by over-sizing. For example, generous sizing of distribution pipework and ductwork will allow available capacity to be redistributed. • Provide an auxiliary tenant condenser water loop to handle high-load areas. • Consider using an Integrated Design Process: build an integrated load and energy use simulation model and test different operational scenarios. • Use comprehensive Life Cycle Cost analysis to select the optimal design solutions. This paper is an interim report on the findings of CRC-CI project 2002-051-B, Right-Sizing HVAC Systems, which is due for completion in January 2006.
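The final recommendation, life cycle cost analysis, amounts to comparing the present value of capital plus discounted operating costs across design options. The sketch below shows that calculation; all capital, energy and discount figures are hypothetical, as the paper quotes no costs.

```python
def life_cycle_cost(capital, annual_operating, years, discount_rate):
    """Net present value of capital plus discounted annual operating costs."""
    pv_operating = sum(annual_operating / (1 + discount_rate) ** t
                       for t in range(1, years + 1))
    return capital + pv_operating

# Hypothetical figures: an oversized chiller plant costs more up front and
# runs less efficiently at part load, so it also costs more to run each year.
right_sized = life_cycle_cost(capital=400_000, annual_operating=60_000, years=20, discount_rate=0.07)
over_sized  = life_cycle_cost(capital=520_000, annual_operating=72_000, years=20, discount_rate=0.07)
print(f"Right-sized LCC: ${right_sized:,.0f}  Over-sized LCC: ${over_sized:,.0f}")
```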
Abstract:
A matching method for heavy-truck rear air suspensions is discussed, and a fuzzy control strategy is presented that improves both the ride comfort and the road friendliness of the truck by adjusting the damping coefficients of the suspension system. First, a ten-DOF whole-vehicle road model of a Dongfeng EQ1141G7DJ heavy truck was set up based on Matlab/Simulink and vehicle dynamics. Appropriate passive air suspensions were then chosen to replace the original rear leaf springs of the truck according to truck-suspension matching criteria, and the stiffness of the front leaf springs was adjusted accordingly. Semi-active fuzzy controllers were then designed to further enhance the truck's ride comfort and road friendliness. Simulation of the semi-active fuzzy control strategy indicated that both ride comfort and road friendliness could be enhanced effectively under various road conditions. The proposed strategy may provide a theoretical basis for the design and development of truck suspension systems in China.
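The abstract does not give the controller's rule base or membership functions. The sketch below is a minimal Mamdani-style fuzzy controller in the same spirit, mapping sprung-mass velocity and suspension relative velocity to a damping coefficient; the membership ranges, rules and damping limits are assumptions for illustration, not the controller used in the study.

```python
def tri(x, a, b, c):
    """Triangular membership value of x for a triangle with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_damping(body_vel, rel_vel, c_min=1500.0, c_max=9000.0):
    """Return a damping coefficient (N s/m) from two velocity inputs (m/s)."""
    # Input membership functions: negative / zero / positive (ranges are assumptions)
    neg = lambda v: tri(v, -2.0, -1.0, 0.0)
    zer = lambda v: tri(v, -0.5, 0.0, 0.5)
    pos = lambda v: tri(v, 0.0, 1.0, 2.0)

    # Rule base (skyhook-like): damp hard when body and relative velocity
    # share a sign, damp soft otherwise or when the body is barely moving.
    hard = max(min(pos(body_vel), pos(rel_vel)), min(neg(body_vel), neg(rel_vel)))
    soft = max(min(pos(body_vel), neg(rel_vel)), min(neg(body_vel), pos(rel_vel)),
               zer(body_vel))

    # Weighted-average defuzzification between the soft and hard damping levels.
    if hard + soft == 0.0:
        return c_min
    return (soft * c_min + hard * c_max) / (soft + hard)

print(round(fuzzy_damping(0.8, 0.6)))   # same sign -> closer to c_max
print(round(fuzzy_damping(0.8, -0.6)))  # opposite sign -> closer to c_min
```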
Abstract:
Emissions from airport operations are of significant concern because of their potential impact on local air quality and human health. The currently limited scientific knowledge of aircraft emissions is an important issue worldwide when considering air pollution associated with airport operation, especially for ultrafine particles. This limited knowledge is due to the scientific complexities of measuring aircraft emissions during normal operations on the ground. In particular, this type of research has required the development of novel sampling techniques which must take into account aircraft plume dispersion and dilution, as well as the various particle dynamics that can affect measurements of the engine plume from an operational aircraft. To address this problem, a novel mobile emission measurement method called the Plume Capture and Analysis System (PCAS) was developed and tested. The PCAS permits the capture and analysis of aircraft exhaust during ground-level operations including landing, taxiing, takeoff and idle. The PCAS uses a sampling bag to temporarily store a sample, providing sufficient time for sensitive but slow instrumental techniques to be employed to measure gas and particle emissions simultaneously and to record detailed particle size distributions. The challenges in developing the technique include the assessment of the various particle loss and deposition mechanisms which are active during storage in the PCAS. Laboratory-based assessment of the method showed that the bag sampling technique can be used to accurately measure particle emissions (e.g. particle number, mass and size distribution) from a moving aircraft or vehicle. Further assessment of the sensitivity of PCAS results to distance from the source and plume concentration was conducted in the airfield with taxiing aircraft. The results showed that the PCAS is a robust method capable of capturing the plume in only 10 seconds. The PCAS is able to account for aircraft plume dispersion and dilution at distances of 60 to 180 meters downwind of a moving aircraft, along with particle deposition loss mechanisms during the measurements. Characterization of the plume in terms of particle number, mass (PM2.5), gaseous emissions and particle size distribution takes only 5 minutes, allowing large numbers of tests to be completed in a short time. The results were broadly consistent and compared well with the available data. Comprehensive measurements and analyses of the aircraft plumes during various modes of the landing and takeoff (LTO) cycle (e.g. idle, taxi, landing and takeoff) were conducted at Brisbane Airport (BNE). Gaseous (NOx, CO2) emission factors, particle number and mass (PM2.5) emission factors, and size distributions were determined for a range of Boeing and Airbus aircraft as a function of aircraft type and engine thrust level. The scientific complexities, including analysis of the often multimodal particle size distributions to describe the contributions of different particle source processes during the various stages of aircraft operation, were addressed through comprehensive data analysis and interpretation. The measurement results were used to develop an inventory of aircraft emissions at BNE, including all modes of the aircraft LTO cycle and ground running procedures (GRP).
Measuring the actual duration of aircraft activity in each mode of operation (time-in-mode) and compiling a comprehensive matrix of gas and particle emission rates as a function of aircraft type and engine thrust level for real-world situations were crucial for developing the inventory. The significance of the resulting matrix of emission rates lies in the estimate it provides of the annual particle emissions due to aircraft operations, especially in terms of particle number. In summary, this PhD thesis presents for the first time a comprehensive study of particle and NOx emission factors and rates, along with particle size distributions, from aircraft operations, and provides a basis for estimating such emissions at other airports. This is a significant addition to scientific knowledge of particle emissions from aircraft operations, since standard particle number emission rates are not currently available for aircraft activities.
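The thesis abstract does not state the dilution-correction formula used. A common approach in plume-capture studies, sketched here as an assumption rather than the PCAS method itself, is to ratio the background-corrected pollutant concentration to the background-corrected CO2 and scale by the fuel's CO2 emission index to obtain a fuel-based emission factor.

```python
def emission_factor(pollutant_plume, pollutant_background,
                    co2_plume_ppm, co2_background_ppm,
                    ei_co2_g_per_kg_fuel=3160.0):
    """
    Fuel-based emission factor: pollutant emitted per kg of fuel burned.
    The background-corrected pollutant is ratioed to the background-corrected
    CO2 and scaled by the CO2 emission index of jet fuel (~3160 g CO2/kg fuel).
    The result carries the pollutant's units (e.g. particles, or mg) per kg fuel.
    """
    d_pollutant = pollutant_plume - pollutant_background
    d_co2_g_m3 = (co2_plume_ppm - co2_background_ppm) * 1.8e-3  # ~1.8 mg/m3 per ppm at 25 C
    return d_pollutant / d_co2_g_m3 * ei_co2_g_per_kg_fuel

# Hypothetical bag sample: particle number concentration (particles/m3) and CO2 (ppm)
ef_pn = emission_factor(pollutant_plume=2.0e12, pollutant_background=1.0e10,
                        co2_plume_ppm=450.0, co2_background_ppm=400.0)
print(f"{ef_pn:.2e} particles per kg fuel")
```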
Groundwater flow model of the Logan River alluvial aquifer system, Josephville, South East Queensland
Abstract:
The study focuses on an alluvial plain situated within a large meander of the Logan River at Josephville near Beaudesert, which supports a factory that processes gelatine. The plant draws water from on-site bores, as well as the Logan River, for its production processes and produces approximately 1.5 ML per day (Douglas Partners, 2004) of waste water containing high levels of dissolved ions. At present a series of treatment ponds are used to aerate the waste water, reducing the level of organic matter; the water is then used to irrigate grazing land around the site. Within the study the hydrogeology is investigated, a conceptual groundwater model is produced and a numerical groundwater flow model is developed from this. On the site are several bores that access groundwater, plus a network of monitoring bores. Assessment of drilling logs shows the area is formed from a mixture of poorly sorted Quaternary alluvial sediments with a laterally continuous aquifer comprised of coarse sands and fine gravels that is in contact with the river. This aquifer occurs at a depth of between 11 and 15 metres and is overlain by a heterogeneous mixture of silts, sands and clays. The study investigates the degree of interaction between the river and the groundwater within the fluvially derived sediments for reasons of both environmental monitoring and sustainability of the potential local groundwater resource. A conceptual hydrogeological model of the site proposes two hydrostratigraphic units, a basal aquifer of coarse-grained materials overlain by a thick semi-confining unit of finer materials. From this, a two-layer groundwater flow model and hydraulic conductivity distribution were developed based on bore monitoring and rainfall data using MODFLOW (McDonald and Harbaugh, 1988) and PEST (Doherty, 2004) within the GMS 6.5 software (EMSI, 2008). A second model was also considered with the alluvium represented as a single hydrogeological unit. Both models were calibrated to steady state conditions, and sensitivity analyses of the parameters have demonstrated that both models are very stable for changes in the range of ± 10% for all parameters and still reasonably stable for changes up to ± 20%, with RMS errors in the model always less than 10%. The preferred two-layer model was found to give the more realistic representation of the site, where water level variations and the numerical modelling showed that the basal layer of coarse sands and fine gravels is hydraulically connected to the river and the upper layer, comprising a poorly sorted mixture of silt-rich clays and sands of very low permeability, limits infiltration from the surface to the lower layer. The paucity of historical data has limited the numerical modelling to a steady state model based on groundwater levels during a drought period, and forecasts for varying hydrological conditions (e.g. short term as well as prolonged dry and wet conditions) cannot reasonably be made from such a model. If future modelling is to be undertaken it is necessary to establish a regular program of groundwater monitoring and maintain a long term database of water levels to enable a transient model to be developed at a later stage. This will require a valid monitoring network to be designed, with additional bores required for adequate coverage of the hydrogeological conditions at the Josephville site. Further investigations would also be enhanced by undertaking pump testing to investigate hydrogeological properties of the aquifer.
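A numerical model such as the one described is needed for the full head distribution, but the role of the semi-confining layer in limiting infiltration can be illustrated with a single Darcy's-law leakage estimate. The conductivity, heads and thickness below are assumed values, not calibrated parameters from the Josephville model.

```python
def darcy_leakage(k_v_m_per_day, head_top_m, head_bottom_m, thickness_m):
    """Vertical specific discharge (m/day) through a semi-confining layer (Darcy's law)."""
    return k_v_m_per_day * (head_top_m - head_bottom_m) / thickness_m

# Hypothetical values: silty-clay confining layer over the coarse basal aquifer
leakage = darcy_leakage(k_v_m_per_day=1e-3, head_top_m=12.0, head_bottom_m=10.5, thickness_m=12.0)
print(f"{leakage:.2e} m/day downward leakage")   # on the order of 1e-4 m/day: infiltration is limited
```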
Abstract:
The study reported here, constitutes a full review of the major geological events that have influenced the morphological development of the southeast Queensland region. Most importantly, it provides evidence that the region’s physiography continues to be geologically ‘active’ and although earthquakes are presently few and of low magnitude, many past events and tectonic regimes continue to be strongly influential over drainage, morphology and topography. Southeast Queensland is typified by highland terrain of metasedimentary and igneous rocks that are parallel and close to younger, lowland coastal terrain. The region is currently situated in a passive margin tectonic setting that is now under compressive stress, although in the past, the region was subject to alternating extensional and compressive regimes. As part of the investigation, the effects of many past geological events upon landscape morphology have been assessed at multiple scales using features such as the location and orientation of drainage channels, topography, faults, fractures, scarps, cleavage, volcanic centres and deposits, and recent earthquake activity. A number of hypotheses for local geological evolution are proposed and discussed. This study has also utilised a geographic information system (GIS) approach that successfully amalgamates the various types and scales of datasets used. A new method of stream ordination has been developed and is used to compare the orientation of channels of similar orders with rock fabric, in a topologically controlled approach that other ordering systems are unable to achieve. Stream pattern analysis has been performed and the results provide evidence that many drainage systems in southeast Queensland are controlled by known geological structures and by past geological events. The results conclude that drainage at a fine scale is controlled by cleavage, joints and faults, and at a broader scale, large river valleys, such as those of the Brisbane River and North Pine River, closely follow the location of faults. These rivers appear to have become entrenched by differential weathering along these planes of weakness. Significantly, stream pattern analysis has also identified some ‘anomalous’ drainage that suggests the orientations of these watercourses are geologically controlled, but by unknown causes. To the north of Brisbane, a ‘coastal drainage divide’ has been recognized and is described here. The divide crosses several lithological units of different age, continues parallel to the coast and prevents drainage from the highlands flowing directly to the coast for its entire length. Diversion of low order streams away from the divide may be evidence that a more recent process may be the driving force. Although there is no conclusive evidence for this at present, it is postulated that the divide may have been generated by uplift or doming associated with mid-Cenozoic volcanism or a blind thrust at depth. Also north of Brisbane, on the D’Aguilar Range, an elevated valley (the ‘Kilcoy Gap’) has been identified that may have once drained towards the coast and now displays reversed drainage that may have resulted from uplift along the coastal drainage divide and of the D’Aguilar blocks. An assessment of the distribution and intensity of recent earthquakes in the region indicates that activity may be associated with ancient faults. However, recent movement on these faults during these events would have been unlikely, given that earthquakes in the region are characteristically of low magnitude. 
There is, however, evidence that compressive stress is building and being released periodically and ancient faults may be a likely place for this stress to be released. The relationship between ancient fault systems and the Tweed Shield Volcano has also been discussed and it is suggested here that the volcanic activity was associated with renewed faulting on the Great Moreton Fault System during the Cenozoic. The geomorphology and drainage patterns of southeast Queensland have been compared with expected morphological characteristics found at passive and other tectonic settings, both in Australia and globally. Of note are the comparisons with the East Brazilian Highlands, the Gulf of Mexico and the Blue Ridge Escarpment, for example. In conclusion, the results of the study clearly show that, although the region is described as a passive margin, its complex, past geological history and present compressive stress regime provide a more intricate and varied landscape than would be expected along typical passive continental margins. The literature review provides background to the subject and discusses previous work and methods, whilst the findings are presented in three peer-reviewed, published papers. The methods, hypotheses, suggestions and evidence are discussed at length in the final chapter.
Abstract:
1. Species' distribution modelling relies on adequate data sets to build reliable statistical models with high predictive ability. However, the money spent collecting empirical data might be better spent on management. A less expensive source of species' distribution information is expert opinion. This study evaluates expert knowledge and its source. In particular, we determine whether models built on expert knowledge apply over multiple regions or only within the region where the knowledge was derived. 2. The case study focuses on the distribution of the brush-tailed rock-wallaby Petrogale penicillata in eastern Australia. We brought together substantial, well-designed field data and the knowledge of nine experts from two biogeographically different regions. We used a novel elicitation tool within a geographical information system to systematically collect expert opinions. The tool utilized an indirect approach to elicitation, asking experts simpler questions about observable rather than abstract quantities, with measures in place to identify uncertainty and offer feedback. Bayesian analysis was used to combine field data and expert knowledge in each region to determine: (i) how expert opinion affected models based on field data and (ii) how similar expert-informed models were within regions and across regions. 3. The elicitation tool effectively captured the experts' opinions and their uncertainties. Experts were comfortable with the map-based elicitation approach used, especially with graphical feedback. Experts tended to predict lower values of species occurrence compared with the field data. 4. Across experts, consensus on effect sizes occurred for several habitat variables. Expert opinion generally influenced predictions from field data. However, south-east Queensland and north-east New South Wales experts had different opinions on the influence of elevation and geology, with these differences attributable to geological differences between the regions. 5. Synthesis and applications. When formulated as priors in Bayesian analysis, expert opinion is useful for modifying or strengthening patterns exhibited by empirical data sets that are limited in size or scope. Nevertheless, the ability of an expert to extrapolate beyond their region of knowledge may be poor. Hence there is significant merit in obtaining information from local experts when compiling species' distribution models across several regions.
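As a deliberately simplified illustration of point 5, formulating expert opinion as a prior, the sketch below uses a Beta prior on site occupancy elicited from an expert and updates it with binomial field detections. The actual study used a richer habitat model, so the prior form and all numbers here are assumptions.

```python
from scipy import stats

# Expert elicitation (hypothetical): "about 30% of sites of this habitat type
# are occupied, and I'm moderately sure" -> Beta(6, 14) prior (mean 0.3).
prior_a, prior_b = 6.0, 14.0

# Field data (hypothetical): rock-wallabies detected at 11 of 60 surveyed sites.
detections, surveys = 11, 60

# A Beta prior is conjugate to the binomial likelihood, so the posterior is also Beta.
post_a = prior_a + detections
post_b = prior_b + (surveys - detections)
posterior = stats.beta(post_a, post_b)

print(f"posterior mean occupancy: {posterior.mean():.3f}")
print("95% credible interval:", [round(q, 3) for q in posterior.ppf([0.025, 0.975])])
```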
Abstract:
An algorithm based on the concept of Kalman filtering is proposed in this paper for the estimation of power system signal attributes such as amplitude, frequency and phase angle. This technique can be used in protection relays, digital AVRs, DSTATCOMs, FACTS devices and other power electronics applications. The algorithm is particularly suitable for the integration of distributed generation sources into power grids, where fast and accurate detection of small variations in signal attributes is needed. Practical considerations such as the effect of noise, higher-order harmonics, and the computational cost of the algorithm are considered and tested in the paper. Several computer simulations are presented to highlight the usefulness of the proposed approach. Simulation results show that the proposed technique can simultaneously estimate the signal attributes, even if the signal is highly distorted due to the presence of non-linear loads and noise.
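The paper's exact formulation is not reproduced here. The sketch below shows a standard linear Kalman filter for a known nominal frequency that tracks the in-phase and quadrature components of the fundamental and recovers amplitude and phase from the state; estimating frequency as well, as the paper claims, would require an extended or adaptive variant not shown. Signal parameters and noise settings are assumptions.

```python
import numpy as np

# Nominal system frequency and sampling (assumptions for the sketch)
f0, fs = 50.0, 2000.0
w0 = 2 * np.pi * f0
t = np.arange(0, 0.2, 1 / fs)

# Test signal: 230 V at 20 degrees, plus a 5th harmonic and measurement noise
rng = np.random.default_rng(0)
z = 230 * np.cos(w0 * t + np.deg2rad(20)) + 10 * np.cos(5 * w0 * t) + 5 * rng.standard_normal(t.size)

# State x = [in-phase, quadrature] of the fundamental; the measurement model is
# z_k = x1*cos(w0 t_k) - x2*sin(w0 t_k) + noise, so H_k is time-varying.
x = np.zeros(2)
P = np.eye(2) * 1e4          # initial state uncertainty
Q = np.eye(2) * 1e-2         # small process noise lets the filter track slow changes
R = 25.0                     # measurement noise variance (matches the added noise)

for k, tk in enumerate(t):
    H = np.array([np.cos(w0 * tk), -np.sin(w0 * tk)])
    P = P + Q                                 # predict (state transition is identity)
    S = H @ P @ H + R                         # innovation variance (scalar)
    K = P @ H / S                             # Kalman gain
    x = x + K * (z[k] - H @ x)                # update state
    P = P - np.outer(K, H @ P)                # update covariance

amplitude = np.hypot(x[0], x[1])
phase_deg = np.degrees(np.arctan2(x[1], x[0]))
print(f"amplitude ~ {amplitude:.1f} V, phase ~ {phase_deg:.1f} deg")
```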
Abstract:
This paper proposes a method for enhancing the stability of an autonomous microgrid with a distribution static compensator (DSTATCOM) and power sharing among multiple distributed generators (DGs). It is assumed that all the DGs are connected through voltage source converters (VSCs) and that all connected loads are passive, making the microgrid totally inertia-less. The VSCs are controlled in either state feedback or current feedback mode to achieve the desired voltage-current or power outputs, respectively. A modified angle droop is used for DG voltage reference generation. The power sharing ratio of the proposed droop control is established through derivation and verified by simulation results. A DSTATCOM is connected in the microgrid to provide ride-through capability during power imbalance, thereby enhancing system stability. This is established through extensive simulation studies using PSCAD.
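The paper's modified angle droop and its derived sharing ratio are not reproduced here; the sketch below illustrates the conventional angle-droop idea it builds on, where each DG's voltage reference angle falls linearly with its output power, so steady-state outputs share in inverse proportion to the droop gains. The gains and load value are assumptions.

```python
def angle_droop_sharing(total_load_mw, droop_gains_rad_per_mw, delta_rated_rad=0.0):
    """
    Steady-state power sharing under angle droop: each DG sets its voltage
    angle reference as delta_i = delta_rated - m_i * P_i. Neglecting network
    impedance, the angles settle to a common value, so P_i is proportional
    to 1/m_i and the outputs sum to the total load.
    """
    inv_sum = sum(1.0 / m for m in droop_gains_rad_per_mw)
    delta_common = delta_rated_rad - total_load_mw / inv_sum
    return [(delta_rated_rad - delta_common) / m for m in droop_gains_rad_per_mw]

# Two DGs with droop gains in a 2:1 ratio (assumed values) share a 3 MW load 1:2.
print(angle_droop_sharing(3.0, [0.02, 0.01]))   # -> [1.0, 2.0] MW
```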
Abstract:
This paper proposes a model of complex system design for sustainable architecture within a framework of entropy evolution. The spectrum of sustainable architecture consists of the efficient use of energy and material resources over the life cycle of buildings, the active involvement of occupants in micro-climate control within buildings, and the natural environmental context. The interactions of these parameters compose a complex system of sustainable architectural design, for which conventional linear and fragmented design technologies are insufficient to indicate holistic and ongoing environmental performance. The complexity theory of dissipative structures provides a microscopic formulation of open-system evolution, which in turn provides a system design framework for evolving building environmental performance towards an optimization of sustainability in architecture.
Abstract:
Automatic recognition of people is an active field of research with important forensic and security applications. In these applications, it is not always possible for the subject to be in close proximity to the system. Voice represents a human behavioural trait which can be used to recognise people in such situations. Automatic Speaker Verification (ASV) is the process of verifying a person's identity through the analysis of their speech, and enables recognition of a subject at a distance over a telephone channel, wired or wireless. A significant amount of research has focussed on the application of Gaussian mixture model (GMM) techniques to speaker verification systems, providing state-of-the-art performance. GMMs are a type of generative classifier trained to model the probability distribution of the features used to represent a speaker. Recently introduced to the field of ASV research is the support vector machine (SVM). An SVM is a discriminative classifier requiring examples from both positive and negative classes to train a speaker model. The SVM is based on margin maximisation, whereby a hyperplane attempts to separate classes in a high-dimensional space. SVMs applied to the task of speaker verification have shown high potential, particularly when used to complement current GMM-based techniques in hybrid systems. This work aims to improve the performance of ASV systems using novel and innovative SVM-based techniques. Research was divided into three main themes: session variability compensation for SVMs; unsupervised model adaptation; and impostor dataset selection. The first theme investigated the differences between the GMM and SVM domains for the modelling of session variability, an aspect crucial for robust speaker verification. Techniques developed to improve the robustness of GMM-based classification were shown to bring about similar benefits to discriminative SVM classification through their integration in the hybrid GMM mean supervector SVM classifier. Further, the domains for the modelling of session variation were contrasted to find a number of common factors; however, the SVM domain consistently provided marginally better session variation compensation. Minimal complementary information was found between the techniques due to the similarities in how they achieved their objectives. The second theme saw the proposal of a novel model for the purpose of session variation compensation in ASV systems. Continuous progressive model adaptation attempts to improve speaker models by retraining them after exploiting all encountered test utterances during normal use of the system. The introduction of the weight-based factor analysis model provided significant performance improvements of over 60% in an unsupervised scenario. SVM-based classification was then integrated into the progressive system, providing further benefits in performance over the GMM counterpart. Analysis demonstrated that SVMs also hold several characteristics beneficial to the task of unsupervised model adaptation, prompting further research in the area. In pursuing the final theme, an innovative background dataset selection technique was developed. This technique selects the most appropriate subset of examples from a large and diverse set of candidate impostor observations for use as the SVM background by exploiting the SVM training process. This selection was performed on a per-observation basis so as to overcome the shortcomings of the traditional heuristic-based approach to dataset selection.
Results demonstrate that the approach provides performance improvements over both the use of the complete candidate dataset and the best heuristically selected dataset, whilst being only a fraction of the size. The refined dataset was also shown to generalise well to unseen corpora and to be highly applicable to the selection of impostor cohorts required in alternative techniques for speaker verification.
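The hybrid GMM mean supervector SVM classifier mentioned above can be sketched as follows: a universal background model's means are MAP-adapted to each utterance and stacked into a fixed-length supervector, which a support vector machine then classifies. The example uses synthetic features and scikit-learn, omits the thesis's session-compensation and dataset-selection techniques, and its dimensions, relevance factor and data are assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

rng = np.random.default_rng(0)
DIM, N_MIX, RELEVANCE = 12, 8, 16.0   # feature dim, mixtures, MAP relevance factor (assumed)

def fake_utterance(speaker_shift, n_frames=300):
    """Synthetic stand-in for the acoustic feature frames of one utterance."""
    return rng.standard_normal((n_frames, DIM)) + speaker_shift

# Universal background model (UBM) trained on pooled "non-target" speech.
ubm = GaussianMixture(n_components=N_MIX, covariance_type="diag", random_state=0)
ubm.fit(np.vstack([fake_utterance(rng.normal(0, 0.5, DIM)) for _ in range(20)]))

def supervector(frames):
    """MAP-adapt only the UBM means to one utterance and stack them into a vector."""
    resp = ubm.predict_proba(frames)                     # (frames, mixtures)
    n_c = resp.sum(axis=0) + 1e-10
    x_bar = (resp.T @ frames) / n_c[:, None]             # per-mixture weighted mean
    alpha = (n_c / (n_c + RELEVANCE))[:, None]
    adapted = alpha * x_bar + (1 - alpha) * ubm.means_
    return adapted.ravel()

# Target speaker (positive class) versus an impostor background set (negative class).
target_shift = rng.normal(0, 0.5, DIM)
X = [supervector(fake_utterance(target_shift)) for _ in range(10)] + \
    [supervector(fake_utterance(rng.normal(0, 0.5, DIM))) for _ in range(30)]
y = [1] * 10 + [0] * 30

svm = SVC(kernel="linear").fit(X, y)
test = supervector(fake_utterance(target_shift))
print("accept" if svm.predict([test])[0] == 1 else "reject")
```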
Abstract:
The challenge of persistent navigation and mapping is to develop an autonomous robot system that can simultaneously localize, map and navigate over the lifetime of the robot with little or no human intervention. Most solutions to the simultaneous localization and mapping (SLAM) problem aim to produce highly accurate maps of areas that are assumed to be static. In contrast, solutions for persistent navigation and mapping must produce reliable goal-directed navigation outcomes in an environment that is assumed to be in constant flux. We investigate the persistent navigation and mapping problem in the context of an autonomous robot that performs mock deliveries in a working office environment over a two-week period. The solution was based on the biologically inspired visual SLAM system, RatSLAM. RatSLAM performed SLAM continuously while interacting with global and local navigation systems, and a task selection module that selected between exploration, delivery, and recharging modes. The robot performed 1,143 delivery tasks to 11 different locations with only one delivery failure (from which it recovered), traveled a total distance of more than 40 km over 37 hours of active operation, and recharged autonomously a total of 23 times.
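The task selection module is described only as choosing between exploration, delivery and recharging modes. The sketch below is a hypothetical priority-based selector in that spirit; the thresholds, inputs and task names are assumptions, not the module used on the robot.

```python
def select_task(battery_fraction, pending_deliveries, map_confidence,
                low_battery=0.25, min_map_confidence=0.6):
    """Pick the robot's next mode: recharging beats delivering, which beats exploring."""
    if battery_fraction < low_battery:
        return "recharge"
    if pending_deliveries and map_confidence >= min_map_confidence:
        return "delivery"
    return "explore"

print(select_task(0.9, ["room 11"], 0.8))   # -> delivery
print(select_task(0.2, ["room 11"], 0.8))   # -> recharge
print(select_task(0.9, [], 0.8))            # -> explore
```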