924 results for "state-controlled contexts"
Abstract:
This paper presents a new algorithm based on a Modified Particle Swarm Optimization (MPSO) to estimate the harmonic state variables in distribution networks. The proposed algorithm estimates both the amplitude and the phase of each injected harmonic current by minimizing the error between the values measured by Phasor Measurement Units (PMUs) and the values computed from the estimated parameters during the estimation process. The algorithm can take into account the uncertainty of the harmonic pseudo-measurements and the tolerance in the line impedances of the network, as well as the uncertainty of Distributed Generators (DGs) such as Wind Turbines (WTs). The main features of the proposed MPSO algorithm are the use of primary and secondary PSO loops and the application of a mutation function. Simulation results on the IEEE 34-bus radial and a realistic 70-bus radial test network are presented. The results demonstrate that the proposed Distribution Harmonic State Estimation (DHSE) algorithm is markedly faster and more accurate than algorithms such as Weighted Least Squares (WLS), the Genetic Algorithm (GA), the original PSO, and Honey-Bee Mating Optimization (HBMO).
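The core of any such estimator is the measurement-mismatch objective the swarm minimizes. The following is a minimal sketch under assumed data (a small hypothetical 3-bus harmonic impedance matrix and noise-free PMU readings; none of the values come from the paper):

```python
import numpy as np

# Illustrative sketch of the DHSE objective: at one harmonic order, bus
# voltages relate to injected harmonic currents through a harmonic
# impedance matrix Z; the estimator minimizes the mismatch between
# PMU-measured and computed voltages.
Z = np.array([[0.5 + 1.0j, 0.1 + 0.2j, 0.0 + 0.0j],
              [0.1 + 0.2j, 0.6 + 1.1j, 0.1 + 0.2j],
              [0.0 + 0.0j, 0.1 + 0.2j, 0.7 + 1.2j]])   # assumed impedances
i_true = np.array([1.0 * np.exp(1j * 0.3),
                   0.5 * np.exp(-1j * 0.7),
                   0.2 * np.exp(1j * 1.1)])            # "unknown" injections
v_meas = Z @ i_true                                     # ideal PMU readings

def residual(x):
    """Sum-squared voltage mismatch for a candidate state x = [A..., phi...],
    i.e. the amplitude and phase of each injected harmonic current."""
    n = x.size // 2
    i_est = x[:n] * np.exp(1j * x[n:])
    return float(np.sum(np.abs(v_meas - Z @ i_est) ** 2))

# The true amplitudes and phases drive the residual to zero; a swarm-based
# search (MPSO, GA, ...) can use this directly as its fitness function.
x_true = np.concatenate([np.abs(i_true), np.angle(i_true)])
```

Any of the metaheuristics compared in the abstract can be plugged in on top of `residual`; only the search strategy changes, not the objective.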
Abstract:
This paper presents a new algorithm based on honey-bee mating optimization (HBMO) to estimate harmonic state variables in distribution networks including distributed generators (DGs). The proposed algorithm estimates both the amplitude and the phase of each harmonic by minimizing the error between the values measured by phasor measurement units (PMUs) and the values computed from the estimated parameters during the estimation process. Simulation results on two distribution test systems demonstrate that the proposed distribution harmonic state estimation (DHSE) algorithm is markedly faster and more accurate than conventional algorithms such as weighted least squares (WLS), the genetic algorithm (GA), and tabu search (TS).
Abstract:
This paper presents a new algorithm, called PSO-SA, based on a hybrid of Particle Swarm Optimization (PSO) and Simulated Annealing (SA) to estimate harmonic state variables in distribution networks. The proposed algorithm estimates both the amplitude and the phase of each injected harmonic current by minimizing the error between the values measured by Phasor Measurement Units (PMUs) and the values computed from the estimated parameters during the estimation process. The algorithm can take into account the uncertainty of the harmonic pseudo-measurements and the tolerance in the line impedances of the network, as well as the uncertainty of Distributed Generators (DGs) such as Wind Turbines (WTs). The main feature of the proposed PSO-SA algorithm is that PSO, with a mutation function enabled, quickly reaches the neighbourhood of the global optimum, and the SA search then locates the optimum itself. Simulation results on the IEEE 34-bus radial and a realistic 70-bus radial test network demonstrate that the proposed Distribution Harmonic State Estimation (DHSE) algorithm is markedly faster and more accurate than conventional algorithms such as Weighted Least Squares (WLS), the Genetic Algorithm (GA), the original PSO, and the Honey-Bee Mating Optimization (HBMO) algorithm.
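The two-phase structure described above can be sketched on a toy objective. This is an illustrative hybrid in the spirit of the abstract, not the authors' exact scheme; all parameters (swarm size, inertia, mutation rate, cooling schedule) are assumed values:

```python
import math
import random

def pso_then_sa(f, dim, bounds, seed=0):
    """Toy PSO-SA hybrid: a mutation-enabled PSO run locates the
    neighbourhood of the global optimum, then simulated annealing
    refines the best particle found."""
    rng = random.Random(seed)
    lo, hi = bounds
    n_particles, pso_iters = 20, 100

    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n_particles), key=pbest_f.__getitem__)
    gbest, gbest_f = pbest[g][:], pbest_f[g]

    for _ in range(pso_iters):                      # --- PSO phase ---
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            if rng.random() < 0.05:                 # mutation keeps diversity
                pos[i][rng.randrange(dim)] = rng.uniform(lo, hi)
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi

    x, fx, temp = gbest[:], gbest_f, 1.0            # --- SA refinement ---
    best, best_f = x[:], fx
    while temp > 1e-4:
        cand = [xi + rng.gauss(0.0, 0.1 * temp) for xi in x]
        fc = f(cand)
        if fc < fx or rng.random() < math.exp((fx - fc) / temp):
            x, fx = cand, fc
            if fx < best_f:
                best, best_f = x[:], fx
        temp *= 0.99
    return best, best_f

# Usage: minimize the 3-D sphere function (minimum 0 at the origin).
x_opt, f_opt = pso_then_sa(lambda v: sum(t * t for t in v),
                           dim=3, bounds=(-5.0, 5.0))
```

The design rationale matches the abstract: the population-based phase is good at finding the right basin quickly, while the single-point annealing phase is cheap and effective for the final local descent.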
Abstract:
This paper presents a novel algorithm based on particle swarm optimization (PSO) to estimate the states of electric distribution networks. To improve the performance, accuracy, and convergence speed of the original PSO, and to eliminate its stagnation effect, a secondary PSO loop, a mutation algorithm, and a stretching function are proposed. To account for load uncertainties in distribution networks, pseudo-measurements are modelled as loads with realistic errors. Simulation results on the 6-bus radial and the IEEE 34-bus test distribution networks show that distribution state estimation based on the proposed DLM-PSO yields lower estimation error and standard deviation than algorithms such as WLS, GA, HBMO, and the original PSO.
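The anti-stagnation idea behind a stretching function can be sketched as follows. This is a generic function-stretching transform (in the spirit of the Parsopoulos-Vrahatis technique, not necessarily the paper's exact form); the parameter values and the quadratic test function are assumed for illustration:

```python
import numpy as np

def stretched(f, x_hat, gamma1=1e3, gamma2=1.0, mu=1e-6):
    """Function-stretching transform: points no better than a detected
    stagnation point x_hat are raised, while the region with lower
    objective values is left untouched, so a restarted swarm can escape."""
    x_hat = np.asarray(x_hat, dtype=float)
    f_hat = f(x_hat)

    def g(x):
        x = np.asarray(x, dtype=float)
        fx = f(x)
        s = np.sign(fx - f_hat) + 1.0   # 0 below f_hat, positive otherwise
        if s == 0.0:
            return fx                   # better region: unchanged
        gx = fx + gamma1 * np.linalg.norm(x - x_hat) * s
        return gx + gamma2 * s / np.tanh(mu * (gx - f_hat) + 1e-30)

    return g

# Usage with an assumed quadratic: x_hat = [2.0] plays the role of a
# stagnation point; values below f(x_hat) = 4 pass through unchanged,
# values at or above it are stretched upward.
f = lambda x: float(np.sum(np.asarray(x, dtype=float) ** 2))
g = stretched(f, [2.0])
```

Running the secondary PSO loop on `g` instead of `f` pushes particles away from the already-explored basin without distorting the still-unexplored, better region.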
Abstract:
In this thesis we investigate the use of quantum probability theory for ranking documents. Quantum probability theory is used to estimate the probability of relevance of a document given a user's query. We posit that quantum probability theory can lead to a better estimation of the probability of a document being relevant to a user's query than the common approach, i.e., the Probability Ranking Principle (PRP), which is based upon Kolmogorovian probability theory. Following our hypothesis, we formulate an analogy between the document retrieval scenario and a physical scenario, that of the double slit experiment. Through the analogy, we propose a novel ranking approach, the quantum probability ranking principle (qPRP). Key to our proposal is the presence of quantum interference. Mathematically, this is the statistical deviation between empirical observations and the expected values predicted by the Kolmogorovian rule of additivity of probabilities of disjoint events, in configurations such as that of the double slit experiment. We propose an interpretation of quantum interference in the document ranking scenario, and examine how quantum interference can be effectively estimated for document retrieval. To validate our proposal and to gain more insight into approaches for document ranking, we (1) analyse PRP, qPRP and other ranking approaches, exposing the assumptions underlying their ranking criteria and formulating the conditions for the optimality of the two ranking principles, (2) empirically compare three ranking principles (i.e., PRP, interactive PRP, and qPRP) and two state-of-the-art ranking strategies in two retrieval scenarios, those of ad-hoc retrieval and diversity retrieval, (3) analytically contrast the ranking criteria of the examined approaches, exposing similarities and differences, (4) study the ranking behaviours of approaches alternative to PRP in terms of the kinematics they impose on relevant documents, i.e.,
by considering the extent and direction of the movements of relevant documents across the ranking recorded when comparing PRP against its alternatives. Our findings show that the effectiveness of the examined ranking approaches strongly depends upon the evaluation context. In the traditional evaluation context of ad-hoc retrieval, PRP is empirically shown to be better than or comparable to alternative ranking approaches. However, when we turn to evaluation contexts that account for interdependent document relevance (i.e., when the relevance of a document is assessed also with respect to the other retrieved documents, as is the case in the diversity retrieval scenario), the use of quantum probability theory, and thus of qPRP, is shown to improve retrieval and ranking effectiveness over the traditional PRP and alternative ranking strategies, such as Maximal Marginal Relevance, Portfolio theory, and Interactive PRP. This work represents a significant step forward regarding the use of quantum theory in information retrieval. It demonstrates that the application of quantum theory to problems within information retrieval can lead to improvements both in modelling power and in retrieval effectiveness, allowing the construction of models that capture the complexity of information retrieval situations. Furthermore, the thesis opens up a number of lines for future research. These include: (1) investigating estimations and approximations of quantum interference in qPRP; (2) exploiting complex numbers for the representation of documents and queries; and (3) applying the concepts underlying qPRP to tasks other than document ranking.
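The "statistical deviation" at the heart of the proposal can be written compactly. Under the Kolmogorovian rule, disjoint events add as $p(A \cup B) = p(A) + p(B)$; the quantum formulation (a standard textbook form, with generic symbols not taken from the thesis) adds an interference term:

```latex
p(A \cup B) \;=\; p(A) + p(B) + \underbrace{2\sqrt{p(A)\,p(B)}\,\cos\theta_{AB}}_{I_{AB}}
```

In the ranking analogy, $p(A)$ and $p(B)$ play the role of relevance probabilities of two documents and $I_{AB}$ captures their interdependence; when $\theta_{AB} = \pi/2$ the interference vanishes and the Kolmogorovian additivity, and hence PRP-style ranking, is recovered.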
Abstract:
We report a new approach that uses the single-beam Z-scan technique to discriminate between excited state absorption (ESA) and two- and three-photon nonlinear absorption. By measuring the apparent delay or advance of the pulse in reaching the detector, the nonlinear absorption can be unambiguously identified as either instantaneous or transient. The simple method does not require a large range of input fluences or a sophisticated pump-probe experimental apparatus. The technique is easily extended to any absorption process dependent on pulse width, and to nonlinear refraction measurements. We demonstrate, in particular, that the large nonlinear absorption in ZnO nanocones exposed to nanosecond 532 nm pulses is due mostly to ESA, not pure two-photon absorption.
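The instantaneous/transient distinction can be made explicit with a standard open-aperture propagation model (generic symbols, and with the excited state here assumed to be populated by linear absorption; neither the coefficients nor the pumping pathway are values from the paper):

```latex
\frac{dI}{dz} \;=\; -\,\alpha_0 I \;-\; \beta I^{2} \;-\; \gamma I^{3}
\;-\; \sigma_{\mathrm{ESA}}\, N_{\mathrm{ex}}(t)\, I,
\qquad
\frac{dN_{\mathrm{ex}}}{dt} \;=\; \frac{\alpha_0 I}{\hbar\omega} \;-\; \frac{N_{\mathrm{ex}}}{\tau}
```

The $\beta I^{2}$ and $\gamma I^{3}$ terms respond instantaneously to the intensity, whereas $N_{\mathrm{ex}}$ integrates the pulse history; this is why ESA makes the apparent transmitted pulse shape, and hence its apparent arrival time, pulse-width dependent, which is the signature the technique exploits.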
Abstract:
The policies and regulations governing the practice of state asset management have emerged as an urgent question in many countries worldwide, owing to heightened awareness of the complex and crucial role that state assets play in public service provision. Indonesia is an example of such a country, having introduced a 'big-bang' reform of state asset management laws, policies, regulations, and technical guidelines. Indonesia exemplified its enthusiasm for reforming state asset management policies and practices through the establishment of the Directorate General of State Assets in 2006. The Directorate General of State Assets has stressed the new direction it is taking state asset management laws and policies through the introduction of Republic of Indonesia Law Number 38 Year 2008, an amended regulation overruling Republic of Indonesia Law Number 6 Year 2006 on Central/Regional Government State Asset Management. Law Number 38/2008 aims to further exemplify good governance principles and puts forward a 'highest and best use of assets' principle in state asset management. The purpose of this study is to explore and analyze the specific influences on state asset management practices, answering the question of why innovative state asset management policy implementation is stagnant. The methodology is a qualitative case study approach, utilizing an empirical sample of four Indonesian regional governments. Through a thematic analytical approach, this study provides an in-depth analysis of each factor influencing state asset management reform. Such analysis suggests the potential of an 'excuse rhetoric', whereby the identified influencing factors are a smoke-screen, or myths that public policy makers and implementers believe in, as a means to explain the stagnant implementation of innovative state asset management practice.
This study thus offers state asset management policy makers deeper insight into the intricate web of influences on innovative state asset management policies, to be taken into consideration in future policy writing.
Abstract:
Beginning in 1974, the State of Maryland created spatial databases under the MAGI (Maryland's Automated Geographic Information) system. Since that early GIS, other state and local agencies have begun GISs covering a range of applications from critical-lands inventories to cadastral mapping. In 1992, state agencies, local agencies, universities, and businesses began a series of GIS coordination activities, resulting in the formation of the Maryland Local Geographic Information Committee and the Maryland State Government Geographic Information Coordinating Committee. GIS activities and system installations can be found in 22 counties plus Baltimore City, and in most state agencies. Maryland's decision makers rely on a variety of GIS reports and products to conduct business and to communicate complex issues more effectively. This paper presents the status of Maryland's GIS applications for local and state decision making.
Abstract:
Geoscientists are confronted with the challenge of assessing nonlinear phenomena that result from multiphysics coupling across multiple scales, from the quantum level to the scale of the Earth, and from femtoseconds to the 4.5 Ga history of our planet. In this review we neglect electromagnetic modelling of processes in the Earth's core and focus on four types of coupling that underpin fundamental instabilities in the Earth. These are the thermal (T), hydraulic (H), mechanical (M) and chemical (C) processes which are driven and controlled by the transfer of heat to the Earth's surface. Instabilities appear as faults, folds, compaction bands, shear/fault zones, plate boundaries and convective patterns. Convective patterns emerge from buoyancy overcoming viscous drag at a critical Rayleigh number. All other processes emerge from non-conservative thermodynamic forces with a critical dissipative source term, which can be characterised by the modified Gruntfest number Gr. These dissipative processes reach a quasi-steady state when, at maximum dissipation, THMC diffusion (Fourier, Darcy, Biot, Fick) balances the source term. The emerging steady-state dissipative patterns are defined by the respective diffusion length scales. These length scales provide a fundamental thermodynamic yardstick for measuring instabilities in the Earth. The implementation of a fully coupled THMC multiscale theoretical framework into an applied workflow is still in its early stages. This is largely owing to the four fundamentally different lengths of the THMC diffusion yardsticks, spanning micrometres to tens of kilometres, compounded by the additional need to consider microstructural information in the formulation of enriched continua for THMC feedback simulations (i.e., a microstructure-enriched continuum formulation).
Another challenge is the important factor of time, which implies that the geomaterial is often very far from initial yield and flows on a time scale that cannot be accessed in the laboratory. This leads to the requirement of adopting a thermodynamic framework in conjunction with flow theories of plasticity. This framework, unlike consistency plasticity, allows the description of both solid-mechanical and fluid-dynamic instabilities. In the applications we show the similarity of THMC feedback patterns across scales, such as brittle and ductile folds and faults. A particularly interesting case is discussed in detail, in which ductile compaction bands emerge out of the fluid-dynamic solution; these are akin to, and can be confused with, their brittle siblings. The main difference is that they require the factor of time and much lower driving forces to emerge. These low-stress solutions cannot be obtained on short laboratory time scales, and they are therefore much more likely to appear in nature than in the laboratory. We finish with a multiscale description of a seminal structure in the Swiss Alps, the Glarus thrust, which has puzzled geologists for more than 100 years. Along the Glarus thrust, a km-scale package of rocks (nappe) has been pushed 40 km over its footwall as a solid rock body. The thrust itself is a m-wide ductile shear zone, while the centre of the thrust hosts a mm-cm-wide central slip zone experiencing periodic extreme deformation akin to a stick-slip event. The m-wide creeping zone is consistent with the THM feedback length scale of solid mechanics, while the ultralocalised central slip zone is most likely a fluid-dynamic instability.
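The two dimensionless controls invoked above can be stated in generic form (standard definitions with generic symbols, not expressions taken from the review). Convection sets in when buoyancy beats viscous drag and thermal diffusion,

```latex
Ra \;=\; \frac{\rho_0\, g\, \alpha\, \Delta T\, L^{3}}{\eta\, \kappa} \;>\; Ra_c,
\qquad
\underbrace{D\,\nabla^{2} u}_{\text{THMC diffusion}} \;+\; \underbrace{S(u)}_{\text{dissipative source}} \;=\; 0
\;\;\Longrightarrow\;\;
\ell \;\sim\; \sqrt{D\, t^{*}}
```

where the second relation expresses the quasi-steady state in which the relevant diffusion (Fourier, Darcy, Biot or Fick, with diffusivity $D$) balances the dissipative source term, so each coupled process imprints its own diffusion length scale $\ell$ for a process time scale $t^{*}$; these are the "thermodynamic yardsticks" of the review.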
Abstract:
To minimize load shedding in a microgrid during autonomous operation, islanded neighbouring microgrids can be interconnected if they are on a self-healing network and extra generation capacity is available from the Distributed Energy Resources (DERs) of one of the microgrids. In this way, the total load in the system of interconnected microgrids can be shared by all the DERs within those microgrids. For this purpose, however, carefully designed self-healing and supply-restoration control algorithms, protection systems and communication infrastructure are required at the network and microgrid levels. In this chapter, a hierarchical control structure for interconnecting neighbouring autonomous microgrids is first discussed, with the introduced primary control level as the main focus. Through the developed primary control level, the chapter demonstrates how parallel DERs in a system of multiple interconnected autonomous microgrids can properly share the system load. The controller is designed so that the converter-interfaced DERs operate in a voltage-controlled mode following a decentralized power-sharing algorithm based on droop control. Switching in the converters is controlled using linear-quadratic-regulator-based state feedback, which is more stable than conventional proportional-integral controllers and prevents instability among parallel DERs when two microgrids are interconnected. The efficacy of the primary control level of the DERs in the system of multiple interconnected autonomous microgrids is validated through simulations considering detailed dynamic models of the DERs and converters.
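The decentralized sharing property of droop control can be illustrated with a toy steady-state calculation (all ratings, gains and the load value are assumed for illustration, not taken from the chapter):

```python
# Parallel DERs governed by the frequency droop f = f0 - m_i * P_i settle
# at one common system frequency f, so each unit supplies
# P_i = (f0 - f) / m_i and the load is shared in inverse proportion to
# the droop gains m_i -- with no communication between units.
f0 = 50.0                        # nominal frequency, Hz (assumed)
m = [0.01, 0.02, 0.04]           # frequency droop gains, Hz/kW (assumed)
P_load = 140.0                   # total load of the interconnected island, kW

# Power balance sum_i (f0 - f) / m_i = P_load fixes the common frequency.
f = f0 - P_load / sum(1.0 / mi for mi in m)
P = [(f0 - f) / mi for mi in m]  # inverse-gain sharing: [80, 40, 20] kW
```

The unit with the smallest droop gain picks up the largest share, which is exactly why interconnecting two islands lets a microgrid with spare DER capacity relieve its neighbour without a central dispatcher.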
Abstract:
Loop detectors are the oldest and most widely used traffic data source. On urban arterials they are mainly installed for signal control. Recently, state-of-the-art Bluetooth MAC Scanners (BMS) have captured significant interest from stakeholders for area-wide traffic monitoring. Loop detectors provide flow, a fundamental traffic parameter, whereas BMS provide individual vehicle travel times between BMS stations. These two data sources therefore complement each other, and integrating them should increase the accuracy and reliability of traffic state estimation. This paper proposes a model that integrates loop and BMS data for seamless travel time and density estimation on urban signalised networks. The proposed model is validated using both real and simulated data, and the results indicate that its accuracy is over 90%.
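The complementarity of the two sources can be shown with a minimal fusion sketch via the fundamental traffic relation (illustrative numbers and a single homogeneous link, not the paper's model):

```python
# The loop detector contributes flow q; the BMS pair contributes the link
# travel time t. Speed v = L / t and the fundamental relation k = q / v
# then yield the link density -- a quantity neither sensor gives alone.
L_km = 0.8                  # BMS-to-BMS link length, km (assumed)
q = 900.0                   # loop-detector flow, veh/h (assumed)
t_s = 96.0                  # BMS-measured travel time, s (assumed)

v = L_km / (t_s / 3600.0)   # space-mean speed, km/h
k = q / v                   # density, veh/km
```

On a real signalised link the paper's model must additionally handle queues, signal delay and sampling bias in the Bluetooth penetration rate; the sketch only shows why the two measurements are jointly sufficient for density.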
Abstract:
The structures of several carboxy-substituted hexahydro-1,4:5,8-diepoxynaphthalenes have been solved by X-ray crystallography, in some cases confirming previously contentious structures. The compounds of interest are constructed in efficient one-step 2 × [4+2] cycloaddition reactions from furan and acetylene carboxylate derivatives.
Abstract:
In this chapter, we discuss four related areas of cryptology, namely, authentication, hashing, message authentication codes (MACs), and digital signatures. These topics represent active and growing research areas in cryptology. Space limitations allow us to concentrate only on the essential aspects of each topic. The bibliography is intended to supplement our survey. We have selected those items which provide an overview of the current state of knowledge in the above areas. Authentication deals with the problem of providing assurance to a receiver that a communicated message originates from a particular transmitter, and that the received message has the same content as the transmitted message. A typical authentication scenario occurs in computer networks, where the identity of two communicating entities is established by means of authentication. Hashing is concerned with the problem of providing a relatively short digest, or fingerprint, of a much longer message or electronic document. A hashing function must satisfy (at least) the critical requirement that the fingerprints of two distinct messages are distinct. Hashing functions have numerous applications in cryptology. They are often used as primitives to construct other cryptographic functions. MACs are symmetric key primitives that provide message integrity against active spoofing by appending a cryptographic checksum to a message that is verifiable only by the intended recipient of the message. Message authentication is one of the most important ways of ensuring the integrity of information that is transferred by electronic means. Digital signatures provide electronic equivalents of handwritten signatures. They preserve the essential features of handwritten signatures and can be used to sign electronic documents. Digital signatures can potentially be used in legal contexts.
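The MAC idea described above, a keyed checksum verifiable only by a holder of the shared key, can be demonstrated with Python's standard-library HMAC implementation (the key and message are assumed example values):

```python
import hashlib
import hmac

# Sender: compute a keyed checksum (an HMAC tag) and append it to the
# message. Only parties holding the shared symmetric key can recompute it.
key = b"shared-secret-key"             # assumed pre-shared key
msg = b"transfer 100 to account 42"    # assumed message

tag = hmac.new(key, msg, hashlib.sha256).hexdigest()

def verify(key, msg, tag):
    """Receiver: recompute the checksum and compare in constant time,
    which protects the comparison itself against timing attacks."""
    expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)
```

`verify(key, msg, tag)` succeeds, while any active spoofing, e.g. `verify(key, msg + b"!", tag)`, fails because the attacker cannot produce a valid tag without the key.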
Abstract:
Living City 2013 Workshop, part of a school term's design-based curriculum connected to the KGSC/QUT Design Excellence Program and run from 11 February to 1 May 2013, was essentially a three-day place-based urban design immersion workshop program for 25 Year 11 Visual Art and Design students and 2 teachers from Kelvin Grove State College (KGSC), held at both the Queensland University of Technology (QUT) Gardens Point Campus and The Edge, State Library of Queensland. Mentored by 4 design professionals, 2 tertiary design academics, 2 public artists, and 12 QUT tertiary design students, the workshop explored youth-inspired public space design solutions for the active Brisbane City Council redevelopment site of the Queens Wharf Road precinct. Alongside the face-to-face workshops, an interactive web environment was introduced for Living City 2013 to enable students to connect with each other and with program mentors throughout the course of the program. The workshop, framed within notions of ecological, economic, social and cultural sustainability, aimed to raise awareness of the layered complexity and perspectives involved in the design of shared city spaces and to encourage young people to voice their own concerns as future citizens about the shape and direction of their city. The program commenced with an introductory student briefing by stakeholders and mentors at KGSC on 11 February, an introduction to site appraisal and a site visit held at QUT and Queens Wharf Road on 20 February, and a follow-up site analysis session on 6 March. The Day 1 Workshop on 17 April at The Edge, State Library of Queensland, as part of the Design Minds partnership (http://designminds.org.au/kelvin-grove-state-college-excellence-in-art-design/), focused on mentoring team development of a concept design for a range of selected sites. Two workshops on 22 and 23 April at QUT followed, to develop these designs and presentation schemes.
The workshop program culminated in a visual presentation of concept design ideas and discussion with a public audience in the Ideas Gallery on The Deck, King George Square, during the Brisbane City Council City Centre Master Plan Ideas Fiesta on 1 May 2013, as referenced in the Ideas Fiesta Wrap-up Report (http://www.brisbane.qld.gov.au/planning-building/planning-guidelines-tools/city-centre-master-plan/city-centre-master-plan-ideas-fiesta). Students were introduced to design methodology, team-thinking strategies, the scope of design practices and professions, presentation skills and post-secondary pathways, while participating teachers acquired content and design learning strategies transferable to many other contexts. The program was fully documented on the Living City website (http://www.livingcity.net.au/LC2013x/index.html) and has been recognised by the Brisbane City Council Youth Strategy 2014-2019 as a best-practice model for making Brisbane a well-designed, subtropical city.
Abstract:
Transition zones between bridge decks and rail tracks suffer early failure due to poor vehicle-track interaction and sudden changes in stiffness. This has been an ongoing problem for the rail industry, yet no systematic studies appear to have been undertaken to maintain a gradually smoothed transmission of forces between the bridge and its approach. Differential settlement between the bridge deck and the rail track in the transition zone is the fundamental issue; it negatively impacts the rail industry by causing passenger discomfort, early damage to infrastructure and vehicle components, speed reductions, and frequent maintenance cycles. Identifying the mechanisms of track degradation and the factors affecting it is imperative for designing any mitigation method that reduces the rate of track degradation at the bridge transition zone. Unfortunately, this issue is still not well understood, even though a number of reviews have evaluated the key causes and a wide range of mitigation techniques has been introduced. In this study, a comprehensive analysis of the available literature has been carried out with a view to developing either a novel design framework or a mitigation technique for the bridge transition zone. This paper addresses three critical questions in relation to track degradation at the transition zone: (1) what are the causes of track degradation at bridge transitions? (2) what mitigation techniques are available to reduce the rate of track degradation? (3) what factors account for the poor performance of the existing mitigation techniques? It is found that the absence of soil-water response, dynamic loading response, and the long-term behaviour of geotechnical characteristics from existing track transition design frameworks critically influences the failure of existing mitigation techniques. This paper also evaluates some of the existing design frameworks to identify how each addresses track degradation at the bridge transition zone.