957 results for Chile Power Food
Abstract:
In the education of the physical sciences, the role of the laboratory cannot be overemphasised. Laboratory exercises enable the student to assimilate the theoretical basis, verify it through bench-top experiments, and internalise the subject discipline to acquire mastery of it. However, the resources essential to put together such an environment are substantial. As a result, students go through a curriculum which is wanting in this respect. This paper presents a low-cost alternative for imparting such an experience to the student, aimed at the subject of switched mode power conversion. The resources are based on an open-source circuit simulator (Sequel) developed at IIT Mumbai and inexpensive construction kits developed at IISc Bangalore. The Sequel program, developed at IIT Mumbai, is a circuit simulation program for the Linux operating system, distributed free of charge. The construction kits developed at IISc Bangalore are fully documented so that anyone can assemble the circuits with minimal equipment such as a soldering iron, multimeter and power supply. This paper uses a simple forward dc-dc converter as a vehicle to introduce programming under Sequel for evaluating the transient performance and small-signal dynamic model of the converter. Bench tests on the assembled construction kit may be carried out by the student to study its operation, transient performance, closed-loop stability margins, etc.
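The kind of transient study mentioned above can be previewed, before turning to Sequel, with a minimal averaged-model simulation. The sketch below uses a buck-type stage with assumed component values as a stand-in for the paper's forward converter.

```python
# Minimal sketch, assuming a buck-type averaged model and invented component
# values; the paper itself simulates a forward converter in Sequel.
import numpy as np

L, C, R = 100e-6, 220e-6, 10.0   # assumed filter inductance, capacitance, load
Vin, D = 24.0, 0.5               # assumed input voltage and duty ratio
dt, T = 1e-6, 5e-3               # integration step and simulated interval

iL, vC = 0.0, 0.0                # inductor current and capacitor voltage states
vout = []
for _ in range(int(T / dt)):
    # Averaged model: the switch network applies D*Vin to the output LC filter.
    diL = (D * Vin - vC) / L
    dvC = (iL - vC / R) / C
    iL += diL * dt
    vC += dvC * dt
    vout.append(vC)

print(f"output after {T*1e3:.1f} ms: {vout[-1]:.2f} V (ideal D*Vin = {D*Vin:.2f} V)")
```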
Abstract:
This paper presents a power, latency and throughput trade-off study on NoCs, obtained by varying microarchitectural (e.g. pipelining) and circuit-level (e.g. frequency and voltage) parameters. We change the pipelining depth, operating frequency and supply voltage for three example NoCs: a 16-node 2D Torus, a Tree network and a Reduced 2D Torus. We use an in-house NoC exploration framework capable of topology generation and comparison using parameterized models of routers and links developed in SystemC. The framework utilizes interconnect power and delay models from a low-level modelling tool called Intacte [1]. We find that increased pipelining can actually reduce latency. We also find that there exists an optimal degree of pipelining which is the most energy efficient in terms of minimizing the energy-delay product.
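The energy-delay-product sweep that identifies such an optimal pipeline depth can be pictured with a toy cost model, as in the sketch below; the per-stage overheads and energies are assumptions, not figures from the paper or from Intacte.

```python
# Minimal sketch, assuming a toy per-hop model of a pipelined router: a deeper
# pipeline shortens the critical path (higher clock) at the cost of latch
# delay and energy, so the energy-delay product has an interior optimum.
def per_hop(depth, flits=4, base_delay_ns=4.0, latch_delay_ns=0.2,
            base_energy_pj=10.0, latch_energy_pj=0.8):
    cycle = base_delay_ns / depth + latch_delay_ns      # clock period vs. depth
    latency = (depth + flits - 1) * cycle               # pipeline fill + serialization
    energy = base_energy_pj + latch_energy_pj * depth   # extra energy for latches
    return latency, energy

for d in range(1, 9):
    lat, en = per_hop(d)
    print(f"depth={d}: latency={lat:.1f} ns, energy={en:.1f} pJ, EDP={lat*en:.0f}")
print("best EDP depth:", min(range(1, 9), key=lambda d: per_hop(d)[0] * per_hop(d)[1]))
```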
Abstract:
Many wireless applications demand a fast mechanism to detect the packet from a node with the highest priority ("best node") only, while packets from nodes with lower priority are irrelevant. In this paper, we introduce an extremely fast contention-based multiple access algorithm that selects the best node and requires only local information of the priorities of the nodes. The algorithm, which we call Variable Power Multiple Access Selection (VP-MAS), uses the local channel state information from the accessing nodes to the receiver, and maps the priorities onto the receive power. It is based on a key result that shows that mapping onto a set of discrete receive power levels is optimal, when the power levels are chosen to exploit packet capture that inherently occurs in a wireless physical layer. The VP-MAS algorithm adjusts the expected number of users that contend in each step and their respective transmission powers, depending on whether previous transmission attempts resulted in capture, idle channel, or collision. We also show how reliable information regarding the total received power at the receiver can be used to improve the algorithm by enhancing the feedback mechanism. The algorithm detects the packet from the best node in 1.5 to 2.1 slots, which is considerably lower than the 2.43 slot average achieved by the best algorithm known to date.
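The slot-by-slot behaviour of such a selection scheme can be mimicked with a toy contention loop. The sketch below is a heavily simplified stand-in for VP-MAS: the number of power levels, the capture rule and the priority-to-power mapping are all invented, and the real algorithm's optimised level choices and feedback are not reproduced.

```python
# Minimal sketch, assuming a toy priority-to-power mapping and capture rule;
# not the actual VP-MAS design.
import random

def select_best_node(priorities, n_levels=4, max_slots=10):
    lo, hi = 0.0, 1.0                        # priority window probed this slot
    for _ in range(max_slots):
        w = max(hi - lo, 1e-12)
        # Nodes whose priority lies in the window transmit; a higher priority
        # maps to a higher discrete receive power level.
        txs = [(i, int((p - lo) / w * n_levels))
               for i, p in enumerate(priorities) if lo <= p < hi]
        if not txs:                          # idle: probe the range just below
            lo, hi = max(0.0, lo - w * n_levels), lo
            continue
        levels = sorted((lvl for _, lvl in txs), reverse=True)
        if len(txs) == 1 or levels[0] > levels[1]:   # capture: unique strongest signal
            return max(txs, key=lambda t: t[1])[0]
        # Collision at the top level: zoom into that level's priority sub-range.
        lo, hi = lo + w * levels[0] / n_levels, lo + w * (levels[0] + 1) / n_levels
    return None                              # unresolved within the slot budget

prios = [random.random() for _ in range(8)]
print("selected:", select_best_node(prios), " true best:", prios.index(max(prios)))
```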
Abstract:
The power system network is assumed to be in steady state even during low-frequency transients. However, depending on the generator dynamics and the load and control characteristics, the system model and the nature of the power flow equations can vary. The nature of the power flow equations describing the system during a contingency is investigated in detail. It is shown that, under some mild assumptions on the load-voltage characteristics, the power flow equations can be decoupled in an exact manner. When the generator dynamics are considered, the solutions for the load voltages are exact if load nodes are not directly connected to each other.
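For reference, the power flow equations in question have the standard bus power balance form; the notation below is the usual textbook one and is assumed here rather than taken from the paper.

```latex
% Standard power flow (bus power balance) equations; notation assumed.
% P_i, Q_i: net injected real and reactive power at bus i,
% |V_i|, \theta_i: voltage magnitude and angle, Y_{ij} = G_{ij} + jB_{ij}.
\begin{align}
P_i &= \sum_{j=1}^{N} |V_i||V_j|\left(G_{ij}\cos\theta_{ij} + B_{ij}\sin\theta_{ij}\right),\\
Q_i &= \sum_{j=1}^{N} |V_i||V_j|\left(G_{ij}\sin\theta_{ij} - B_{ij}\cos\theta_{ij}\right),
\qquad \theta_{ij} = \theta_i - \theta_j .
\end{align}
```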
Abstract:
The application of multilevel control strategies for load-frequency control of interconnected power systems is assuming importance. A large multiarea power system may be viewed as an interconnection of several lower-order subsystems, with possible change of interconnection pattern during operation. The solution of the control problem involves the design of a set of local optimal controllers for the individual areas, in a completely decentralised environment, plus a global controller to provide the corrective signal to account for interconnection effects. A global controller, based on the least-square-error principle suggested by Siljak and Sundareshan, has been applied for the LFC problem. A more recent work utilises certain possible beneficial aspects of interconnection to permit more desirable system performances. The paper reports the application of the latter strategy to LFC of a two-area power system. The power-system model studied includes the effects of excitation system and governor controls. A comparison of the two strategies is also made.
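The two-level structure described above can be summarised, in generic notation assumed here rather than the paper's, as a decentralised optimal controller per area plus a global corrective term for the interconnection effects.

```latex
% Generic two-level (decentralised + global) control structure; notation assumed.
\begin{align}
\dot{x}_i &= A_i x_i + B_i u_i + \sum_{j \neq i} A_{ij} x_j, \qquad i = 1,\dots,N,\\
u_i &= \underbrace{-K_i x_i}_{\text{local optimal controller}}
      \; + \; \underbrace{u_i^{g}}_{\text{global corrective signal}},
\end{align}
% where each K_i is designed area-wise ignoring interconnections, and u_i^g is
% chosen (e.g. in the least-square-error sense) to compensate the
% interconnection term \sum_{j \neq i} A_{ij} x_j.
```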
Abstract:
This paper presents three methodologies for determining optimum locations and magnitudes of reactive power compensation in power distribution systems. Method I and Method II are suitable for complex distribution systems with a combination of both radial and ring-main feeders and different voltage levels. Method III is suitable for low-tension, single-voltage-level radial feeders. Method I is based on an iterative scheme with successive power-flow analyses, with the optimization problem formulated and solved using linear programming. Method II and Method III are essentially based on the steady-state performance of distribution systems. These methods are simple to implement and yield satisfactory results comparable with those of Method I. The proposed methods have been applied to a few distribution systems, and results obtained for two typical systems are presented for illustration purposes.
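One iteration of a successive power-flow / linear-programming scheme of the kind used in Method I might look like the sketch below; the sensitivities, costs and limits are invented numbers standing in for quantities that would come from the latest power-flow solution.

```python
# Minimal sketch, assuming linearised loss and voltage sensitivities from a
# power-flow solution and invented costs/limits; not the paper's formulation.
import numpy as np
from scipy.optimize import linprog

loss_sens = np.array([0.030, 0.045, 0.025])    # kW loss reduction per kvar at each candidate bus
volt_sens = np.array([0.0020, 0.0035, 0.0015]) # p.u. voltage rise per kvar
cost      = np.array([0.010, 0.010, 0.012])    # annualised $ per kvar
headroom  = np.array([0.040, 0.050, 0.030])    # allowable p.u. voltage rise
q_max = 300.0                                  # kvar limit per location

# Minimise (cost - loss benefit) subject to voltage-rise limits at each bus.
res = linprog(c=cost - loss_sens,
              A_ub=np.diag(volt_sens), b_ub=headroom,
              bounds=[(0.0, q_max)] * 3, method="highs")
print("compensation (kvar) at candidate buses:", np.round(res.x, 1))
```

In the full Method I scheme, an LP step of this kind would alternate with a fresh power-flow analysis until the solution stops changing.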
Abstract:
The export of sediments from coastal catchments can have detrimental impacts on estuaries and near-shore reef ecosystems such as the Great Barrier Reef. Catchment management approaches aimed at reducing sediment loads require monitoring to evaluate their effectiveness in reducing loads over time. However, load estimation is not a trivial task due to the complex behaviour of constituents in natural streams, the variability of water flows and an often limited amount of data. Regression is commonly used for load estimation and provides a fundamental tool for trend estimation by standardising for other time-specific covariates such as flow. This study investigates whether load estimates and the resultant power to detect trends can be enhanced by (i) modelling the error structure so that temporal correlation can be better quantified, (ii) making use of predictive variables, and (iii) identifying an efficient and feasible sampling strategy that may be used to reduce sampling error. To achieve this, we propose a new regression model that includes an innovative compounding-errors model structure and uses two additional predictive variables (average discounted flow and turbidity). By combining this modelling approach with a new, regularly optimised sampling strategy, which adds uniformity to the event sampling strategy, the predictive power was increased to 90%. Using the enhanced regression model proposed here, it was possible to detect a trend of 20% over 20 years. This result is in stark contrast to previous conclusions presented in the literature.
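A concentration regression with serially correlated errors, of the general kind referred to above, can be fitted in a few lines; the sketch below uses an AR(1) error structure and synthetic data as a simplified stand-in for the compounding-errors model and the flow/turbidity covariates.

```python
# Minimal sketch, assuming a log-log rating-curve regression with AR(1) errors
# and synthetic data; the trend term and compounding-errors structure of the
# paper's model are omitted.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
log_flow = rng.normal(2.0, 0.8, n)                      # synthetic daily flow (log scale)
log_turbidity = 0.5 * log_flow + rng.normal(0.0, 0.3, n)

noise = np.zeros(n)                                     # AR(1) error process
for t in range(1, n):
    noise[t] = 0.6 * noise[t - 1] + rng.normal(0.0, 0.2)
log_conc = 0.3 + 0.7 * log_flow + 0.4 * log_turbidity + noise

X = sm.add_constant(np.column_stack([log_flow, log_turbidity]))
model = sm.GLSAR(log_conc, X, rho=1)                    # regression with AR(1) errors
fit = model.iterative_fit(maxiter=5)
print("coefficients (const, log flow, log turbidity):", np.round(fit.params, 3))
print("estimated AR(1) coefficient:", np.round(model.rho, 3))
```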
Abstract:
Power calculation and sample size determination are critical in designing environmental monitoring programs. The traditional approach based on comparing mean values may become statistically inappropriate, and even invalid, when substantial proportions of the response values are below the detection limits or censored, because strong distributional assumptions have to be made about the censored observations when implementing the traditional procedures. In this paper, we propose a quantile methodology that is robust to outliers and can also handle data with a substantial proportion of below-detection-limit observations without the need to impute the censored values. As a demonstration, we applied the methods to a nutrient monitoring project, which is part of the Perth Long-Term Ocean Outlet Monitoring Program. In this example, the sample size required by our quantile methodology is, in fact, smaller than that required by the traditional t-test, illustrating the merit of our method.
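The kind of power calculation involved can be approximated by Monte Carlo simulation. The sketch below uses a simple binomial test on exceedances of a reference quantile as a stand-in for the quantile methodology; the distributions, detection limit and effect size are all assumptions, and the point it mimics is that censoring below the detection limit does not disturb a comparison made above it.

```python
# Minimal sketch, assuming lognormal data, an invented effect size and a
# binomial exceedance test as a stand-in for the paper's quantile method.
import numpy as np
from scipy.stats import binomtest

def detect_power(n, q=0.8, ref=1.0, dl=0.5, alpha=0.05, reps=2000, seed=1):
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        x = rng.lognormal(mean=0.3, sigma=0.6, size=n)   # assumed "elevated" scenario
        x = np.maximum(x, dl)           # left-censor values below the detection limit
        exceed = int(np.sum(x > ref))   # exceedances of the reference q-th quantile
        p = binomtest(exceed, n, 1 - q, alternative="greater").pvalue
        hits += p < alpha
    return hits / reps

for n in (20, 40, 80):
    print(f"n={n}: power ~ {detect_power(n):.2f}")
```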
Abstract:
This thesis examines the development and operations of the Roman navy and its part in the expansion through which Rome grew from a city-state into the ruler of the Mediterranean. Earlier research has viewed Rome as a land power with no interest in seafaring. It has been held that the only significant naval war in Roman history was the First Punic War (264-241 BC), and that even there the Romans (whom the historian Polybius describes as beginners) succeeded against Carthage by resorting to boarding bridges, with which they could turn a sea battle into a land battle. Polybius' account has until now always been trusted. It has been held that Rome had no navy before the First Punic War and that in all its wars Rome invested as little as possible in naval warfare. This thesis seeks to refute these notions. The navy was involved in, and absolutely indispensable to, every turn of Rome's expansionist policy. Archaeological evidence shows that before the First Punic War Rome developed into a significant city precisely as a result of trade and foreign contacts; it was therefore not a purely agrarian state. The Romans had a navy from the sixth century BC at the latest, and it was used as Rome extended its power in Italy. In the First Punic War, then, two naval powers, Rome and Carthage, competed for supremacy in the western Mediterranean. The Second Punic War (218-201) is generally known for Hannibal's crossing of the Alps into Italy, but it was also a major naval war, and the Carthaginians lost it precisely at sea. Rome went on to join the contest for control of the eastern Mediterranean and crushed the fleets of Macedonia and Syria, which were in no way a match for the Roman navy. Of all Rome's opponents, Carthage would have had the best chance of halting the Roman navy's victorious advance in the Second Punic War. The navy was used for many different purposes. Great sea battles are not the only evidence of the fleets' involvement and importance; the construction and operating requirements of warships must also be taken into account. Warships were built for battle and had very little storage space. They had to be able to put ashore whenever the crew needed water, food and rest. Fleets could operate only close to coasts whose harbours and landing places they could safely reach. The Romans were well aware of this. The great overseas campaigns to Africa, Spain, Greece and the coast of Asia Minor all rested on the Roman navy controlling the sailing routes and suitable landing places, and on its ability to transport troops and supplies to armies fighting far away. At the same time the Roman navy waged an independent war at sea, challenging and defeating all the naval powers of the Mediterranean. By the 130s BC it had defeated its enemies and disarmed its allies; the Roman navy ruled the Mediterranean alone.
Abstract:
The problem of optimal scheduling of the generation of a hydro-thermal power system that is faced with a shortage of energy is studied. The deterministic version of the problem is first analyzed, and the results are then extended to cases where the loads and the hydro inflows are random variables.
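In generic notation (assumed here, not the paper's), the deterministic version of such a scheduling problem can be written as minimising thermal cost plus a penalty on unserved load over the planning horizon, subject to the limited hydro energy available.

```latex
% Generic deterministic hydro-thermal scheduling under an energy shortage;
% notation assumed. P^T_t, P^H_t: thermal and hydro generation, D_t: load,
% S_t: unserved load, E^H: available hydro energy, c_s: shortage penalty.
\begin{align}
\min_{\{P^T_t,\,P^H_t,\,S_t\}} \;& \sum_{t=1}^{T}\Bigl[C\!\left(P^T_t\right) + c_s S_t\Bigr]\\
\text{s.t.}\;\; & P^T_t + P^H_t + S_t = D_t, \qquad S_t \ge 0,\\
& \sum_{t=1}^{T} P^H_t \le E^H,\\
& 0 \le P^T_t \le \bar{P}^T, \qquad 0 \le P^H_t \le \bar{P}^H .
\end{align}
% In the stochastic version studied in the paper, the loads D_t and the hydro
% inflows (hence E^H) become random variables.
```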
Abstract:
Despite ongoing controversies regarding possible directions for the nuclear plants program throughout Japan since the Fukushima disaster, little research has examined people's beliefs about future society and what may affect their attitudes toward different policy options. Going beyond policy debates, the present study focused on how people see a future society under the assumptions of different policy options. A total of 125 students at Japanese universities were asked to compare society today with a future society in which one of two alternative policies had been adopted (i.e., shutdown or expansion of nuclear reactors), in terms of characteristics of individuals and of society in general. While the perceived dangerousness of nuclear power predicted attitudes and behavioural intentions to make personal sacrifices for nuclear power policies, beliefs about the social consequences of the policies, especially on economic development and dysfunction, appeared to play a stronger role in predicting those measures. The importance of sociological dimensions in understanding how people perceive the future of society under alternative nuclear power policies, and the subtle discrepancies between attitudes and behavioural intentions, are discussed.
Abstract:
The impact of global positioning systems (GPS) and plotter systems on the relative fishing power of the northern prawn fishery fleet on tiger prawns (Penaeus esculentus Haswell, 1879, and P. semisulcatus de Haan, 1850) was investigated from commercial catch data. A generalized linear model was used to account for differences in fishing power between boats and changes in prawn abundance. It was found that boats that used a GPS alone had 4% greater fishing power than boats without a GPS. The addition of a plotter raised the power by 7% over boats without the equipment. For each year, from the first to the third, that a fisher has been working with a plotter, there is an additional 2-3% increase. It appears that when all boats have had a GPS and plotter for at least 3 years, the fishing power of the fleet will have increased by 12%. Management controls have reduced the efficiency of each boat and lowered the number of days available to fish, but this may not have been sufficient to counteract the increases. Further limits will be needed to maintain the desired levels of mortality.
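A standardisation model of this general kind can be set up as a log-link GLM, in which the exponentiated coefficients give the proportional changes in fishing power. The sketch below is a toy version with invented column names and synthetic data, not the paper's actual model specification.

```python
# Minimal sketch, assuming hypothetical logbook columns and a log-link Gamma
# GLM; the paper's covariates and model structure are not reproduced.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "catch_kg":    rng.gamma(2.0, 50.0, n),              # synthetic catch per day fished
    "year":        rng.integers(1988, 1998, n).astype(str),
    "has_gps":     rng.integers(0, 2, n),
    "has_plotter": rng.integers(0, 2, n),
    "plotter_yrs": rng.integers(0, 4, n),                 # years of plotter experience
})

# Log link: effects are multiplicative on catch rate, so 100*(exp(coef) - 1)
# is the percentage change in fishing power attributed to each term.
fit = smf.glm("catch_kg ~ C(year) + has_gps + has_plotter + plotter_yrs",
              data=df,
              family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print(100 * (np.exp(fit.params[["has_gps", "has_plotter", "plotter_yrs"]]) - 1))
```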
Abstract:
This paper describes a method of adjusting the stator power factor angle for the control of an induction motor fed from a current source inverter (CSI), based on the concept of space vectors (or Park vectors). It is shown that, under steady state, keeping the torque angle constant over the entire operating range has the advantage of keeping the slip frequency constant. This can be exploited to dispense with the speed feedback and simplify the control scheme for the drive, so that the stator voltage integral zero crossings alone can be used as feedback for deciding the triggering instants of the CSI thyristors under stable operation of the system. A closed-loop control strategy based on this principle is developed for the drive using a microprocessor-based control system and is implemented on a laboratory prototype CSI-fed induction motor drive.
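The link between a constant torque angle and a constant slip frequency can be seen from the standard steady-state relations in a rotor-flux-oriented frame; the notation below is generic and assumed here, not necessarily the paper's.

```latex
% Steady-state relations in a rotor-flux-oriented frame; notation assumed.
% i_{ds}, i_{qs}: flux- and torque-producing stator current components,
% \tau_r = L_r/R_r: rotor time constant, \delta: torque angle, \omega_{sl}: slip frequency.
\begin{align}
\tan\delta = \frac{i_{qs}}{i_{ds}}, \qquad
\omega_{sl} = \frac{1}{\tau_r}\,\frac{i_{qs}}{i_{ds}} = \frac{\tan\delta}{\tau_r},
\end{align}
% so holding the torque angle \delta constant over the operating range holds
% the slip frequency \omega_{sl} constant as well.
```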
Abstract:
This research concerns jazz in Chile in relation to modernity and identity. The final chapters focus on and highlight the latest generation of jazz musicians, which emerged in the 1990s, and the composer and guitarist Ángel Parra. A historical and sociological approach is developed, which serves as the basis for the analysis of modernity and identity, as well as of postmodernity and globalization. Modernity is studied through the texts of Adorno, Baudrillard, Brünner, García Canclini, Habermas and Jameson; identity through the texts of Aharonián, Cordúa, Garretón, Gissi, Larraín and others. Chapter 3 deals with Latin American musicology and jazz research, in relation to the approach developed in Chapter 2. Chapters 4 and 5 cover the history of jazz in Chile up to the beginning of the twenty-first century. Chapter 6 focuses on Ángel Parra Orrego. The conclusions of this investigation highlight the modernist dynamic that has driven the development of jazz in Chile, which in Ángel Parra's case has been overcome by a postmodernist stance. This stance has resolved, in a creative way, questions of modernity and identity in the practice of jazz in a Latin American country.
Abstract:
Consumer risk assessment is a crucial step in the regulatory approval of pesticide use on food crops. Recently, an additional hurdle has been added to the formal consumer risk assessment process with the introduction of short-term intake or exposure assessment and a comparable short-term toxicity reference, the acute reference dose. Exposure to residues during one meal or over one day is what matters for short-term or acute intake. Exposure in the short term can be substantially higher than average, because the consumption of a food on a single occasion can be very large compared with typical long-term or mean consumption, and the food may carry a much larger residue than average. Furthermore, the residue level in a single unit of a fruit or vegetable may be higher by a factor (defined as the variability factor, which we have shown to be typically ×3 for the 97.5th percentile unit) than the average residue in the lot. Available marketplace data and supervised residue trial data are examined in an investigation of the variability of residues in units of fruit and vegetables. A method is described for estimating the 97.5th percentile value from sets of unit residue data. Variability appears to be generally independent of the pesticide, the crop, the crop unit size and the residue level. The deposition of pesticide on the individual unit during application is probably the most significant factor. The diets used in the calculations ideally come from individual and household surveys with enough consumers of each specific food to determine large portion sizes. The diets should distinguish the different forms of a food consumed, e.g. canned, frozen or fresh, because the residue levels associated with the different forms may be quite different. Dietary intakes may be calculated by a deterministic method or a probabilistic method. In the deterministic method the intake is estimated under the assumption of large-portion consumption of a 'high residue' food (high residue in the sense that the pesticide was used at the highest recommended label rate, the crop was harvested at the smallest interval after treatment and the residue in the edible portion was the highest found in any of the supervised trials in line with these use conditions). The deterministic calculation also includes a variability factor for those foods consumed as units (e.g. apples, carrots) to allow for the elevated residue in some single units which may not be seen in composited samples. In the probabilistic method the distribution of dietary consumption and the distribution of possible residues are combined in repeated probabilistic calculations to yield a distribution of possible residue intakes. Additional information, such as the percentage of the commodity treated and the combination of residues from multiple commodities, may be incorporated into probabilistic calculations. The IUPAC Advisory Committee on Crop Protection Chemistry has made 11 recommendations relating to acute dietary exposure.
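The deterministic calculation described above amounts to a single worst-case multiplication. The sketch below shows a simplified version of it with invented numbers; regulatory short-term intake (IESTI-style) equations contain additional case distinctions and unit-weight terms that are omitted here.

```python
# Minimal sketch, assuming a simplified deterministic acute-intake formula
# (large portion x highest residue x variability factor / body weight);
# all numbers are invented for illustration.
def acute_intake_mg_per_kg_bw(large_portion_g, highest_residue_mg_per_kg,
                              variability_factor, body_weight_kg):
    # The variability factor (typically about 3 at the 97.5th percentile unit)
    # scales the composite-sample residue up to a single high-residue unit.
    return (large_portion_g / 1000.0) * highest_residue_mg_per_kg \
           * variability_factor / body_weight_kg

intake = acute_intake_mg_per_kg_bw(large_portion_g=300,           # assumed large portion
                                   highest_residue_mg_per_kg=0.8,  # assumed supervised-trial residue
                                   variability_factor=3,
                                   body_weight_kg=60)
arfd = 0.05  # assumed acute reference dose, mg/kg bw
print(f"short-term intake = {intake:.3f} mg/kg bw "
      f"({'exceeds' if intake > arfd else 'within'} the assumed ARfD of {arfd})")
```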