962 results for system improvements
Abstract:
A microgrid can span a large area, especially in rural townships. In such cases, the distributed generators (DGs) must be controlled in a decentralized fashion, based on locally available measurements. The main concerns are control of the system voltage magnitude and frequency, errors in which can lead to system instability or voltage collapse. In this chapter, the operational challenges of load frequency control in a microgrid are discussed and a few methods are proposed to meet these challenges. In particular, issues of power sharing, power quality and system stability are addressed when the system operates under decentralized control. The main focus of this chapter is to provide solutions that improve system performance in different situations. The scenarios considered are (a) when the system stability margin is low, (b) when the line impedance has a high R to X ratio, and (c) when the system contains unbalanced and/or distorted loads. A scheme is also proposed in which a microgrid can be frequency-isolated from a utility grid while remaining capable of bidirectional power transfer. In all these cases, the use of angle droop in converter-interfaced DGs is adopted. It has been shown that this results in a more responsive control action than traditional frequency-based droop control.
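The angle droop idea can be sketched in a few lines: the converter's voltage angle, rather than its frequency, is drooped against real power. This is only an illustrative sketch; the gains, ratings and per-unit values below are hypothetical, not figures from the chapter.

```python
# Minimal sketch of angle droop for a converter-interfaced DG.
# All gains and ratings are illustrative assumptions.

def angle_droop(p, q, p_rated=1.0, q_rated=0.5,
                delta_rated=0.0, v_rated=1.0,
                m=0.1, n=0.05):
    """Return voltage angle (rad) and magnitude (pu) references.

    The angle itself is drooped against real power, avoiding the
    integrating lag of a frequency-droop loop; magnitude is drooped
    against reactive power as usual.
    """
    delta = delta_rated - m * (p - p_rated)  # angle vs. real power
    v = v_rated - n * (q - q_rated)          # magnitude vs. reactive power
    return delta, v
```

With these hypothetical gains, a DG exporting 20% above its rated real power pulls its angle reference down slightly, shedding load to its neighbours without any frequency deviation.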
Abstract:
In this thesis, various schemes using custom power devices for power quality improvement in low-voltage distribution networks are studied. Customer-operated distributed generators make a typical network non-radial and affect power quality. A scheme based on different DSTATCOM control algorithms is proposed for power circulation and islanded operation of the system. To compensate for reactive power overflow and achieve unity power factor, a UPQC is introduced. Stochastic analysis is carried out for different scenarios to build a comprehensive picture of a real-life distribution network. The combined operation of a static compensator and a voltage regulator is tested for optimum quality and stability of the system.
Abstract:
In a previous blog post I was critical of the US health care system for not using cost-effectiveness information to plan its services. Today I’m going to talk about the implementation of innovation in health services, something the US does really well compared to Australia.
Abstract:
Decision-making is such an integral aspect of health care routine that the ability to make the right decisions at crucial moments can lead to improvements in patient health. Evidence-based practice (EBP), the paradigm used to make those informed decisions, relies on the current best evidence from systematic research such as randomized controlled trials (RCTs). Limitations in the quantity and quality of evidence generated by RCTs have lowered healthcare professionals’ confidence in using EBP. An alternative paradigm, practice-based evidence, has evolved, its key feature being evidence drawn from practice settings. Through the use of health information technology, electronic health records (EHRs) capture relevant clinical practice “evidence”. A data-driven approach is proposed to capitalize on the benefits of EHRs. The issues of data privacy, security and integrity are addressed by an information accountability concept. A data warehouse architecture completes the data-driven approach by integrating health data from the multiple source systems unique to the healthcare environment.
Abstract:
Peanut (Arachis hypogaea L.) is an economically important legume crop in irrigated production areas of northern Australia. Although the potential pod yield of the crop in these areas is about 8 t/ha, most growers generally obtain around 5 t/ha, partly due to poor irrigation management. Better information and tools that are easy to use, accurate, and cost-effective are therefore needed to help local peanut growers improve irrigation management. This paper introduces a new web-based decision support system called AQUAMAN that was developed to help Australian peanut growers schedule irrigations. It simulates the timing and depth of future irrigations by combining procedures from the Food and Agriculture Organization (FAO) guidelines for irrigation scheduling (FAO-56) with those of the Agricultural Production Systems Simulator (APSIM) modeling framework. Here, we present a description of AQUAMAN and the results of a series of activities (i.e., extension activities, case studies, and a survey) conducted to assess its level of acceptance among Australian peanut growers, obtain feedback for future improvements, and evaluate its performance. Application of the tool for scheduling irrigations on commercial peanut farms since its release in 2004-2005 has shown good acceptance by local peanut growers and potential for significantly improving yield. A limited comparison with the farmer practice of matching pan evaporation demand during rain-free periods in 2006-2007 and 2008-2009 suggested that AQUAMAN enabled irrigation water savings of up to 50% and the realization of enhanced water and irrigation use efficiencies.
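The FAO-56 scheduling logic that such tools build on can be sketched as a daily root-zone water balance: depletion grows by crop evapotranspiration (ETc = Kc × ET0), shrinks with rain, and an irrigation is triggered once depletion exceeds the readily available water (RAW = p × TAW). The crop coefficient, water-holding values and trigger fraction below are illustrative assumptions, not AQUAMAN's actual parameters.

```python
# Hedged sketch of an FAO-56-style daily water balance for
# irrigation scheduling; all parameter values are illustrative.

def schedule_irrigation(et0_series, rain_series, kc=1.1,
                        taw=120.0, p=0.5, depletion0=0.0):
    """Return the day indices on which irrigation is triggered.

    et0_series, rain_series: daily reference ET and rain in mm.
    taw: total available water in the root zone (mm);
    p: depletion fraction defining readily available water (RAW).
    """
    raw = p * taw
    depletion = depletion0
    irrigation_days = []
    for day, (et0, rain) in enumerate(zip(et0_series, rain_series)):
        depletion = max(0.0, depletion + kc * et0 - rain)  # daily balance
        if depletion > raw:
            irrigation_days.append(day)  # refill to field capacity
            depletion = 0.0
    return irrigation_days
```

For a rain-free spell with ET0 of 6 mm/day, this sketch triggers roughly every ten days, which is the kind of timing-and-depth advice the tool delivers.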
Abstract:
The study of organisational culture in the construction industry is still at the stage of debate (Oney-Yazıcı et al., 2007). Despite the complexities involved in measuring the culture of the construction industry (Tijhuis and Fellows, 2012), this culture is regarded as worthy of research, especially in relation to the organisational culture needed to support quality management systems (Koh and Low, 2008; Watson and Howarth, 2011) and to improve organisational effectiveness and, therefore, organisational performance (Coffey, 2010; Cheung et al., 2011). A number of recent studies have examined construction companies’ organisational culture using Cameron and Quinn’s Competing Values Framework (CVF) and their Organizational Culture Assessment Instrument (OCAI) as the conceptual paradigm for analysis (Thomas et al., 2002; Nummelin, 2006; Oney-Yazıcı et al., 2007; Koh and Low, 2008). However, there has been little research based on Cameron and Quinn’s CVF-OCAI tool for identifying types of construction companies’ organisational culture and their influence on the implementation of QMS-ISO 9001. Research output and information are also very limited on the strength of companies’ organisational culture in driving an effective QMS-ISO 9001 implementation and thereby affecting company effectiveness. To address these research gaps, this research studies organisational culture types (based on the CVF) and their influence on the implementation of QMS-ISO 9001:2008 principles and elements, which ultimately leads to improved company quality performance.
To fully examine the status of the QMS being implemented, the research studied the relationships of the barriers to QMS implementation with the implementation of QMS-ISO 9001:2008 principles and elements and with the business performance of the companies, and also examined the relationship between the implementation of QMS-ISO 9001:2008 principles and elements and the companies’ business performance. The research output is a set of fundamental and original studies on these topics, providing the knowledge needed to improve the quality performance and quality outcomes of Indonesian construction companies.
Abstract:
Indoor air quality (IAQ) is a critical factor in the classroom because of the high concentration of people in a single space. Indoor air pollutants may increase the chance of both long- and short-term health problems among students and staff, reduce the productivity of teachers, and degrade students’ learning environment and comfort. Adequate air distribution strategies may reduce the risk of infection in the classroom. The purpose of an air distribution system in a classroom is therefore not only to maximize thermal comfort but also to remove indoor contaminants. Natural ventilation has the potential to play a significant role in achieving improvements in IAQ. The present study compares the risk of airborne infection between natural ventilation (opening windows and doors) and a split-system air conditioner in a university classroom. The Wells-Riley model was used to predict the risk of indoor airborne transmission of infectious diseases such as influenza, measles and tuberculosis. For each case, the air exchange rate was measured using a CO2 tracer gas technique. Opening windows and doors provided an air exchange rate of 2.3 air changes per hour (ACH), while the split system provided 0.6 ACH. The risk of airborne infection ranged from 4.24% to 30.86% with natural ventilation and from 8.99% to 43.19% with the split system. The difference in airborne infection risk between the split system and natural ventilation ranged from 47% to 56%. Opening windows and doors maximizes natural ventilation, so the risk of airborne contagion is much lower than with the split system.
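The Wells-Riley model is a single closed-form expression, P = 1 − exp(−Iqpt/Q), where I is the number of infectors, q the quanta generation rate, p the breathing rate, t the exposure time, and Q the room ventilation rate (ACH × room volume). A minimal sketch; the quanta rate, breathing rate, exposure time and room volume below are illustrative, not the study's measured values.

```python
import math

# Sketch of the Wells-Riley airborne infection model; only the ACH
# values (2.3 vs 0.6) come from the abstract, the rest are assumptions.

def wells_riley(infectors, quanta_rate, breathing_rate,
                exposure_hours, ach, room_volume):
    """Probability of infection, P = 1 - exp(-I q p t / Q).

    quanta_rate in quanta/h, breathing_rate in m^3/h,
    room ventilation Q = ACH * room_volume in m^3/h.
    """
    q_vent = ach * room_volume
    dose = (infectors * quanta_rate * breathing_rate * exposure_hours) / q_vent
    return 1.0 - math.exp(-dose)

# Same hypothetical classroom, the two measured air-change rates:
risk_nat = wells_riley(1, 100, 0.48, 2, 2.3, 200)  # natural ventilation
risk_ac = wells_riley(1, 100, 0.48, 2, 0.6, 200)   # split system
```

Because the dose scales as 1/Q, nearly quadrupling the air-change rate cuts the inhaled dose by the same factor, which is why the natural-ventilation risk comes out well below the split-system risk.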
Abstract:
Lignin is a hydrophobic polymer that is synthesised in the secondary cell walls of all vascular plants. It enables water conduction through the stem, supports the upright growth habit and protects against invading pathogens. On the other hand, lignin hinders the utilisation of the cellulosic cell walls of plants in the pulp and paper industry and as forage. Lignin precursors are synthesised in the cytoplasm through the phenylpropanoid pathway, transported to the cell wall and oxidised by peroxidases or laccases to phenoxy radicals that couple to form the lignin polymer. This study was conducted to characterise the lignin biosynthetic pathway in Norway spruce (Picea abies (L.) Karst.). We focused on the less well-known polymerisation stage, aiming to identify the enzymes and regulatory mechanisms involved. Available data on lignin biosynthesis in gymnosperms are scarce and, for example, the latest findings on precursor biosynthesis have only been verified in herbaceous plants. We therefore also studied in detail the roles of individual gene family members during developmental and stress-induced lignification, using EST sequencing and real-time RT-PCR. As a model, we used a Norway spruce tissue culture line that produces extracellular lignin in the culture medium, and showed that lignin polymerisation in the tissue culture depends on peroxidase activity. We identified in the culture medium a significant NADH oxidase activity that could generate H2O2 for the peroxidases. Two basic culture medium peroxidases were shown to have high affinity for coniferyl alcohol. Conservation of the putative substrate-binding amino acids was observed when the spruce peroxidase sequences were compared with other peroxidases with high affinity for coniferyl alcohol.
We also used different peroxidase fractions to produce synthetic lignins in vitro from coniferyl alcohol; however, the linkage pattern of the suspension culture lignin could not be reproduced in vitro with the purified peroxidases, nor with the full complement of culture medium proteins. This emphasised the importance of the precursor radical concentration in the reaction zone, which is controlled by the cells through the secretion of both the lignin precursors and the oxidative enzymes to the apoplast. In addition, we identified basic peroxidases that were reversibly bound to the lignin precipitate. They could be involved, for example, in the oxidation of polymeric lignin, which is required for polymer growth. The dibenzodioxocin substructure was used as a marker for polymer oxidation in the in vitro polymerisation studies, as it is a typical substructure in wood lignin and in the suspension culture lignin. Using immunolocalisation, we found the structure mainly in the S2+S3 layers of the secondary cell walls of Norway spruce tracheids. The structure was primarily formed during the late phases of lignification. Contrary to earlier assumptions, it appears to be a terminal structure in the lignin macromolecule. Most lignin biosynthetic enzymes are encoded by several genes, not all of which necessarily participate in lignin biosynthesis. In order to identify the gene family members responsible for developmental lignification, ESTs were sequenced from the lignin-forming tissue culture and developing xylem of spruce. Expression of the identified lignin biosynthetic genes was studied using real-time RT-PCR. Candidate genes for developmental lignification were identified by the coordinated, high expression of certain genes within the gene families in all lignin-forming tissues. However, such coordinated expression was not found for peroxidase genes.
We also studied stress-induced lignification, either during compression wood formation induced by bending the stems or after Heterobasidion annosum infection. Based on gene expression profiles, stress-induced monolignol biosynthesis appeared similar to the developmental process, and only single PAL and C3H genes were specifically up-regulated by stress. In contrast, the up-regulated peroxidase genes differed between developmental and stress-induced lignification, indicating specific responses.
Abstract:
This report summarises the development of an Unmanned Aerial System (UAS) and an integrated Wireless Sensor Network (WSN) suitable for real-world application in remote sensing tasks. Several aspects are discussed and analysed to improve the flight duration, performance and mobility of the unmanned aerial vehicle (UAV), while ensuring the accuracy and range of data from the wireless sensor system.
Abstract:
Computational grids with multiple batch systems (batch grids) can be powerful infrastructures for executing long-running multi-component parallel applications. In this paper, we evaluate the potential improvements in throughput of long-running multi-component applications when the different components of the applications are executed on multiple batch systems of a batch grid. We compare multiple-batch executions with executions of the components on a single batch system, without increasing the number of processors used. We perform our analysis with a prominent long-running multi-component application for climate modeling, the Community Climate System Model (CCSM). We have built a robust simulator that models the characteristics of both the multi-component application and the batch systems. By conducting a large number of simulations with different workload characteristics and queuing policies of the systems, processor allocations to components of the application, distributions of the components to the batch systems, and inter-cluster bandwidths, we show that multiple-batch executions lead to a 55% average increase in throughput over single-batch executions for long-running CCSM. We also conducted real experiments with a practical middleware infrastructure and showed that multi-site executions lead to effective utilization of batch systems for executions of CCSM and give higher simulation throughput than single-site executions. Copyright (c) 2011 John Wiley & Sons, Ltd.
Abstract:
In recent decades, great improvements have been made in the field of computer-aided learning, building on advances in computer science and computer systems. Although the field has always lagged somewhat behind, not adopting the very latest solutions, it has constantly moved forward, taking advantage of innovations as they appear. As long as the train of computer science does not stop (and it won’t, at least in the near future), the systems that profit from those improvements will not stop either, because we humans will always need to study, sometimes for pleasure and many other times out of need. Not all attempts in the field of computer-aided learning have gone in the same direction. Most address one or a few of the problems that arise while studying and do not take into account solutions proposed for other problems. The reasons for this vary. Sometimes the solutions are simply not compatible. Other times, because the project is an investigation, it is useful to isolate the problem. And in commercial products, licences and patents often prevent new projects from using previous work. The world has moved forward, and this work is an attempt to use some of the options offered by technology, mixing some old ideas with new ones.
Abstract:
Adaptive optics (AO) corrects distortions created by atmospheric turbulence and delivers diffraction-limited images on ground-based telescopes. The vastly improved spatial resolution and sensitivity have been used to study everything from the magnetic fields of sunspots up to the internal dynamics of high-redshift galaxies. This thesis on AO science with small and large telescopes is divided into two parts: Robo-AO and magnetar kinematics.
In the first part, I discuss the construction and performance of the world’s first fully autonomous visible-light AO system, Robo-AO, at the Palomar 60-inch telescope. Robo-AO operates extremely efficiently, with an overhead of less than 50 s, typically observing about 22 targets every hour. We have performed large AO programs, observing a total of over 7,500 targets since May 2012. In the visible band, the images have a Strehl ratio of about 10% and achieve a contrast of up to 6 magnitudes at a separation of 1′′. The full width at half maximum achieved is 110–130 milli-arcseconds. I describe how Robo-AO is used to constrain the evolutionary models of low-mass pre-main-sequence stars by measuring resolved spectral energy distributions of stellar multiples in the visible band, more than doubling the current sample. I conclude this part with a discussion of possible future improvements to the Robo-AO system.
In the second part, I describe a study of magnetar kinematics using high-resolution near-infrared (NIR) AO imaging from the 10-meter Keck II telescope. Measuring the proper motions of five magnetars with a precision of up to 0.7 milli-arcseconds/yr, we have more than tripled the previously known sample of magnetar proper motions and shown that magnetar kinematics are equivalent to those of radio pulsars. We conclusively showed that SGR 1900+14 and SGR 1806-20 were ejected from the stellar clusters with which they were traditionally associated. The inferred kinematic ages of these two magnetars are 6±1.8 kyr and 650±300 yr, respectively. These ages are a factor of three to four greater than their respective characteristic ages. The calculated braking index is close to unity, as compared to three for the vacuum dipole model and 2.5-2.8 as measured for young pulsars. I conclude this section by describing a search for NIR counterparts of new magnetars and the future promise of polarimetric investigation of magnetars’ NIR emission mechanism.
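The age argument rests on standard spin-down arithmetic: assuming a birth period much shorter than the current one, the true age is t = P / ((n − 1) Ṗ), while the characteristic age τc = P / (2Ṗ) corresponds to the n = 3 vacuum dipole case, so t = k τc implies n = 1 + 2/k. A sketch with hypothetical period values (not the measured parameters of these magnetars):

```python
# Spin-down age arithmetic: invert t = P / ((n - 1) * Pdot) for the
# braking index n. Period values in the usage below are hypothetical.

SECONDS_PER_YEAR = 3.156e7

def characteristic_age_yr(period_s, period_derivative):
    """Characteristic age tau_c = P / (2 * Pdot), i.e. the n = 3 case."""
    return period_s / (2.0 * period_derivative * SECONDS_PER_YEAR)

def braking_index(period_s, period_derivative, true_age_yr):
    """Braking index implied by an independently known true age,
    assuming the birth period was much shorter than the current one."""
    return 1.0 + period_s / (
        period_derivative * true_age_yr * SECONDS_PER_YEAR)
```

For example, a kinematic age four times the characteristic age gives n = 1 + 2/4 = 1.5, illustrating how ages a factor of three to four above τc push the braking index toward unity.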
Abstract:
Plankton and larval fish sampling programs are often limited by a trade-off between sampling frequency (for precision) and cost. Advancements in sampling techniques hold the potential to add considerable efficiency and, therefore, sampling frequency to improve precision. We compare a newly developed plankton imaging system, the In Situ Ichthyoplankton Imaging System (ISIIS), with a bongo sampler, a traditional plankton sampling gear developed in the 1960s. Comparative sampling was conducted along 2 transects ~30–40 km long. Over 2 days, we completed 36 ISIIS tow-yo undulations and 11 bongo oblique tows, each from the surface to within 10 m of the seafloor. Overall, the 2 gears detected comparable numbers of larval fishes, representing similar taxonomic compositions, although larvae captured with the bongo could be identified to lower taxonomic levels, especially larvae in the small (<5 mm), preflexion stages. Size distributions of the sampled larval fishes differed considerably between the 2 sampling methods, with both the size range and the mean size of larval fishes larger with ISIIS than with the bongo sampler. The high frequency and fine spatial scale of ISIIS allow it to add considerable sampling precision (i.e., more vertical sections) to plankton surveys. Improvements in the ISIIS technology (including greater depth of field and image resolution) should also increase taxonomic resolution and decrease processing time. When coupled with appropriate net sampling (for the purpose of collecting and verifying the identification of biological samples), the use of ISIIS could improve overall survey design and simultaneously provide detailed, process-oriented information for fisheries scientists and oceanographers.
Abstract:
Based on a comprehensive theoretical optical orthogonal frequency division multiplexing (OOFDM) system model rigorously verified by comparing numerical results with end-to-end real-time experimental measurements at 11.25 Gb/s, detailed explorations are undertaken, for the first time, of the impacts of various physical factors on the OOFDM system performance over directly modulated DFB laser (DML)-based, intensity modulation and direct detection (IMDD), single-mode fibre (SMF) systems without in-line optical amplification and chromatic dispersion compensation. It is shown that the low extinction ratio (ER) of the DML-modulated OOFDM signal is the predominant factor limiting the maximum achievable optical power budget, and the subcarrier intermixing effect associated with square-law photon detection in the receiver reduces the optical power budget by at least 1 dB. Results also indicate that, immediately after the DML in the transmitter, the insertion of a 0.02 nm bandwidth optical Gaussian bandpass filter with a 0.01 nm wavelength offset with respect to the optical carrier wavelength can enhance the OOFDM signal ER by approximately 1.24 dB, thus resulting in a 7 dB optical power budget improvement at a total channel BER of 1 × 10^-3.
Abstract:
The optimization of dialogue policies using reinforcement learning (RL) is now an accepted part of the state of the art in spoken dialogue systems (SDS). Yet the commonly used training algorithms for SDS require a large number of dialogues, and hence most systems still rely on artificial data generated by a user simulator. Optimization is therefore performed off-line before releasing the system to real users. Gaussian processes (GPs) for RL have recently been applied to dialogue systems. One advantage of GPs is that they compute an explicit measure of uncertainty in the value function estimates obtained during learning. In this paper, a class of novel learning strategies is described that uses this uncertainty to control exploration on-line. Comparisons between several exploration schemes show that significant improvements in learning speed can be obtained and that rapid and safe online optimisation is possible, even on a complex task. Copyright © 2011 ISCA.
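The core mechanism, using the GP's posterior uncertainty to decide where to explore, can be sketched with an upper-confidence-bound rule. This is a toy one-dimensional illustration with an RBF kernel, not the paper's actual GP formulation for dialogue policies; the kernel length-scale, noise level and exploration weight are illustrative assumptions.

```python
import numpy as np

# Sketch of uncertainty-driven exploration with a GP value estimate:
# act where posterior mean + beta * posterior std is largest.

def rbf(a, b, length=0.5):
    """RBF kernel matrix between two 1-D point sets."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-3):
    """Posterior mean and std of a unit-variance RBF-kernel GP."""
    k = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    k_star = rbf(x_query, x_train)
    mean = k_star @ np.linalg.solve(k, y_train)
    var = 1.0 - np.sum(k_star * np.linalg.solve(k, k_star.T).T, axis=1)
    return mean, np.sqrt(np.clip(var, 0.0, None))

def pick_action(x_train, y_train, candidates, beta=2.0):
    """Upper-confidence-bound rule: prefer high estimated value,
    but also actions where the GP is still uncertain."""
    mean, std = gp_posterior(x_train, y_train, candidates)
    return candidates[np.argmax(mean + beta * std)]
```

With observations at 0.0 and 0.5, a far-away candidate such as 2.0 has near-prior uncertainty, so the UCB rule selects it for exploration even though its estimated value is low; shrinking beta recovers a purely greedy, exploitation-only policy.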