135 results for The Well-Tempered Clavier


Abstract:

The well-established under-frequency load shedding (UFLS) is deemed the last effective remedial measure against a severe frequency decline in a power system. With the ever-increasing size of power systems and the extensive penetration of distributed generators (DGs), the problem of developing an optimal UFLS strategy faces some new challenges. Against this background, an optimal UFLS strategy is developed for a distribution system that takes DGs and the static characteristics of loads into consideration. Based on the frequency and the rate of change of frequency, the presented strategy consists of several basic rounds and a special round. In the basic rounds, the frequency emergency is alleviated by quickly shedding some loads. In the special round, frequency security is maintained and the operating parameters of the distribution system are optimized by adjusting the output powers of DGs and some loads. The modified IEEE 37-node test feeder is employed to demonstrate the essential features of the developed optimal UFLS strategy in the MATLAB/Simulink environment.
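
To make the round structure concrete, here is a minimal Python sketch of the shedding logic described above. The thresholds, shed fractions and ROCOF margins are purely illustrative assumptions, and the special round's optimization of DG outputs and adjustable loads is not reproduced.

```python
# Illustrative thresholds (Hz) and shed fractions for the basic rounds;
# the paper's actual settings would come from its optimization model.
BASIC_ROUNDS = [(49.2, 0.10), (49.0, 0.10), (48.8, 0.15)]
SPECIAL_ROUND_FREQ = 49.5  # residual low frequency triggers the special round

def ufls_action(freq, rocof, shed_done):
    """One controller scan: returns (extra fraction of load to shed,
    whether the special round should run).

    freq: measured system frequency (Hz)
    rocof: rate of change of frequency (Hz/s), used to anticipate decline
    shed_done: set of basic-round indices already tripped
    """
    shed = 0.0
    for i, (f_thr, fraction) in enumerate(BASIC_ROUNDS):
        # Trip a round when frequency is below its threshold, or slightly
        # above it but falling fast (ROCOF-based anticipation).
        if i not in shed_done and (freq < f_thr
                                   or (freq < f_thr + 0.2 and rocof < -0.5)):
            shed_done.add(i)
            shed += fraction
    # After the basic rounds, a lingering low frequency would trigger the
    # special round, which re-dispatches DG outputs and adjustable loads
    # rather than tripping more feeders (that optimization is not sketched).
    special = not shed and freq < SPECIAL_ROUND_FREQ
    return shed, special
```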

Abstract:

In recent years, several models have been proposed for fault section estimation and state identification of unobserved protective relays (FSE-SIUPR) under incomplete state information of protective relays. In these models, the temporal alarm information from a faulted power system is not well explored, although it is very helpful for compensating for the incomplete state information of protective relays, quickly reaching definite fault diagnosis results, and evaluating the operating status of protective relays and circuit breakers in complicated fault scenarios. To solve this problem, an integrated optimization model for FSE-SIUPR, which takes full advantage of the temporal characteristics of alarm messages, is developed in the framework of the well-established temporal constraint network. With this model, the fault evolution procedure can be explained and some states of unobserved protective relays identified. The model is solved by means of Tabu search (TS) and finally verified by test results from fault scenarios in a practical power system.
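
The abstract does not give the optimization model itself, so the following is only a generic Tabu search skeleton over 0/1 fault hypotheses; the cost function standing in for the paper's temporal-constraint objective is a hypothetical placeholder.

```python
import random

def tabu_search(cost, n_bits, iters=500, tabu_len=20):
    """Generic Tabu search over 0/1 hypothesis vectors.

    cost: function mapping a tuple of bits (candidate fault sections /
          relay states) to a scalar; in the paper this would score how
          well the hypothesis explains the timed alarm messages.
    """
    current = tuple(random.randint(0, 1) for _ in range(n_bits))
    best, best_cost = current, cost(current)
    tabu = []
    for _ in range(iters):
        # Neighbourhood: flip one bit; skip recently visited (tabu)
        # candidates unless they beat the best found so far (aspiration).
        neighbours = []
        for i in range(n_bits):
            cand = current[:i] + (1 - current[i],) + current[i + 1:]
            c = cost(cand)
            if cand not in tabu or c < best_cost:
                neighbours.append((c, cand))
        if not neighbours:
            break
        c, current = min(neighbours)
        tabu.append(current)
        tabu = tabu[-tabu_len:]  # bounded short-term memory
        if c < best_cost:
            best, best_cost = current, c
    return best, best_cost
```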

Abstract:

This paper describes an architecture for robotic telepresence and teleoperation based on the well known tools ROS and Skype. We discuss how Skype can be used as a framework for robotic communication and can be integrated into a ROS/Linux framework to allow a remote user to not only interact with people near the robot, but to view maps, sensory data, robot pose and to issue commands to the robot’s navigation stack. This allows the remote user to exploit the robot’s autonomy, providing a much more convenient navigation interface than simple remote joysticking.
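
The Skype side of the architecture is not shown here; as an illustration of the final step, issuing commands to the navigation stack, below is a minimal rospy client for the standard move_base action server. This assumes a typical ROS navigation setup, not necessarily the authors' exact configuration.

```python
#!/usr/bin/env python
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def send_goal(x, y):
    """Forward a remote user's command to the robot's navigation stack."""
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0  # identity orientation
    client.send_goal(goal)
    client.wait_for_result()
    return client.get_state()

if __name__ == '__main__':
    rospy.init_node('telepresence_goal_sender')
    send_goal(2.0, 1.5)  # coordinates picked by the remote user on the map
```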

Abstract:

The three genera of smut fungi, Ustilago, Sporisorium and Macalpinomyces, form a complex that has eluded resolution by morphology (Langdon & Fullerton 1975, Vánky 1991, Piepenbring et al. 1998) and molecular phylogenetic analysis (Stoll et al. 2003, 2005). Two suggestions to reconcile the taxonomy of the complex have been proposed. The first was to break up the current taxa into several smaller genera and subgenera, and the second to unify the three genera into a single genus, Ustilago (Vánky 2002, Piepenbring 2004). The former solution is dependent on finding morphological synapomorphies that can delimit the genera, and the latter solution dismisses the wide morphological diversity within the group (McTaggart et al. 2012b). Synapomorphic morphological characters and host plant classification delimited clades in the Ustilago-Sporisorium-Macalpinomyces complex (McTaggart et al. 2012a). The current study defines these synapomorphic characters and proposes a new classification for many species currently placed in Ustilago, Sporisorium and Macalpinomyces. This approach preserves the well-known genera Ustilago, Sporisorium and Macalpinomyces, and enables the classification to reflect morphological diversity in the complex.

Abstract:

Samples of drugs are often given to doctors by pharmaceutical representatives as part of a marketing strategy. Despite the well-described advantages of drug samples, little has been published on the potential adverse outcomes. A series of consumer calls to the Adverse Medicine Events Line has highlighted concerns regarding the quality use of medicines associated with drug samples. The most commonly reported problems were drug samples being supplied to patients with inadequate information regarding dosage, administration, storage and possible adverse effects. In addition, some patients were given excessive quantities of a drug. To reduce such adverse outcomes, the drug industry, health professionals and consumers should be aware of the potential problems associated with starter packs.

Abstract:

A number of security models have been proposed for RFID systems. Recent studies show that current models tend to be limited in the number of properties they capture. Consequently, models are commonly unable to distinguish between protocols with regard to finer privacy properties. This paper proposes a privacy model that introduces previously unavailable expressions of privacy. Based on the well-studied notion of indistinguishability, the model also strives to be simpler, easier to use, and more intuitive compared to previous models.
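
The model itself is not reproduced in the abstract, but a toy version of the underlying indistinguishability experiment may clarify the idea: a challenger secretly activates one of two tags, and an adversary must guess which one it observed. All names here are illustrative.

```python
import hashlib
import secrets

def ind_experiment(transcript, adversary, n_runs=8):
    """Toy tag-privacy indistinguishability game.

    transcript(key): one protocol run's messages for the tag with this key.
    adversary(ref0, ref1, challenge): guesses 0 or 1.
    A protocol offers this flavour of privacy if no adversary wins with
    probability noticeably better than 1/2.
    """
    key0, key1 = secrets.token_bytes(16), secrets.token_bytes(16)
    # Learning phase: the adversary may observe each tag individually.
    ref0 = [transcript(key0) for _ in range(n_runs)]
    ref1 = [transcript(key1) for _ in range(n_runs)]
    # Challenge phase: one tag is chosen at random behind the scenes.
    b = secrets.randbelow(2)
    challenge = transcript(key0 if b == 0 else key1)
    return adversary(ref0, ref1, challenge) == b

def leaky_transcript(key):
    # Deliberately insecure protocol: a deterministic response acts as a
    # fixed identifier, so two tags are trivially distinguishable.
    return hashlib.sha256(key).hexdigest()

def distinguisher(ref0, ref1, challenge):
    return 0 if challenge in ref0 else 1

# Against leaky_transcript this adversary always wins, i.e. its advantage
# |Pr[win] - 1/2| is the maximal 1/2; a private protocol keeps it near 0.
```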

Abstract:

In this chapter we make assumptions about the primary role of education in the lives of its beneficiaries and in society. Undoubtedly, formal education plays an important role in enhancing the likelihood of participation in future social life, including enjoyment and employment, by the student, as well as in the development of the well-being of society in general. Similarly, education is often seen as a main means for the intergenerational transmission of knowledge and culture. However, as Dewey (1916) argues, in liberal societies education has the capacity to enhance democratic participation in society that goes beyond passive participation by its members. One can argue that the achievement of the ideals of democracy demands a free and strong education system. In other words, while education can function as an instrument to integrate students into the present society, it also has the potential to become an instrument for its transformation, by means of which citizens can develop an understanding of how their society functions and a sense of agency towards its transformation. Arguably, this is what Freire (1985) meant when he talked about the role of education to “read and write” the world. A stream of progressive educators (e.g., Apple (2004), Freire (1985), Giroux (2001) and McLaren (2002)) taught us that the reading of the world that is capable of leading to writing the world is a critical reading; i.e., a reading that poses “Why” questions and imagines “What else can be” (Carr & Kemmis, 1987).

Abstract:

The existence of the Macroscopic Fundamental Diagram (MFD), which relates network space-mean density and flow, has been shown in urban networks under homogeneous traffic conditions. Since the MFD represents area-wide network traffic performance, studies on perimeter control strategies and area traffic state estimation utilizing the MFD concept have been reported. The key requirement for a well-defined MFD is homogeneity of the area-wide traffic condition, which cannot be universally expected in the real world. For practical application of the MFD concept, several researchers have identified the factors influencing network homogeneity. However, they did not explicitly take into account drivers' behaviour under real-time information provision, which has a significant impact on the shape of the MFD. This research aims to demonstrate the impact of drivers' route choice behaviour on network performance by employing the MFD as a measurement. A microscopic simulation is chosen as the experimental platform. By changing the ratio of en-route informed drivers to pre-trip informed drivers, as well as by taking different route choice parameters, various scenarios are simulated in order to investigate how drivers' adaptation to traffic congestion influences network performance and the MFD shape. This study confirmed the impact of information provision on the MFD shape and highlighted the significance of the route choice parameter setting as an influencing factor in MFD analysis.
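
As an illustration of how one MFD point is obtained from link detector data, the sketch below uses length-weighted space means, a common definition, though the paper may aggregate differently.

```python
import numpy as np

def mfd_point(flows, densities, lengths):
    """Network-level MFD point from link data for one time slice.

    flows:     vehicles/h measured on each link
    densities: vehicles/km on each link
    lengths:   link lengths (km), used as weights
    Returns (space-mean density, space-mean flow), one MFD scatter point.
    """
    w = np.asarray(lengths)
    k = np.average(np.asarray(densities), weights=w)  # weighted mean density
    q = np.average(np.asarray(flows), weights=w)      # weighted mean flow
    return k, q

# Collecting (k, q) over many time slices traces out the MFD; large scatter
# indicates inhomogeneous congestion, the central issue discussed above.
```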

Abstract:

For the timber industry, the ability to simulate the drying of wood is invaluable for manufacturing high quality wood products. Mathematically, however, modelling the drying of a wet porous material, such as wood, is a difficult task due to its heterogeneous and anisotropic nature, and the complex geometry of the underlying pore structure. The well-developed macroscopic modelling approach involves writing down classical conservation equations at a length scale where physical quantities (e.g., porosity) can be interpreted as averaged values over a small volume (typically containing hundreds or thousands of pores). This averaging procedure produces balance equations that resemble those of a continuum, with the exception that effective coefficients appear in their definitions. Exponential integrators are numerical schemes for initial value problems involving a system of ordinary differential equations. These methods differ from popular Newton-Krylov implicit methods (i.e., those based on the backward differentiation formulae (BDF)) in that they do not require the solution of a system of nonlinear equations at each time step; rather, they require the computation of matrix-vector products involving the exponential of the Jacobian matrix. Although originally appearing in the 1960s, exponential integrators have recently experienced a resurgence of interest due to a greater undertaking of research in Krylov subspace methods for matrix function approximation. One of the simplest examples of an exponential integrator is the exponential Euler method (EEM), which requires, at each time step, the approximation of φ(A)b, where φ(z) = (e^z - 1)/z, A ∈ R^(n×n) and b ∈ R^n. For drying in porous media, the most comprehensive macroscopic formulation is TransPore [Perre and Turner, Chem. Eng. J., 86: 117-131, 2002], which features three coupled, nonlinear partial differential equations. The focus of the first part of this thesis is the use of the exponential Euler method (EEM) for performing the time integration of the macroscopic set of equations featured in TransPore. In particular, a new variable-stepsize algorithm for EEM is presented within a Krylov subspace framework, which allows control of the error during the integration process. The performance of the new algorithm highlights the great potential of exponential integrators, not only for drying applications but across all disciplines of transport phenomena. For example, when applied to well-known benchmark problems involving single-phase liquid flow in heterogeneous soils, the proposed algorithm requires half the number of function evaluations required by an equivalent (sophisticated) Newton-Krylov BDF implementation. Furthermore, for all drying configurations tested, the new algorithm always produces, in less computational time, a solution of higher accuracy than the existing backward Euler module featured in TransPore. Some new results relating to the Krylov subspace approximation of φ(A)b are also developed in this thesis. Most notably, an alternative derivation of the approximation error estimate of Hochbruck, Lubich and Selhofer [SIAM J. Sci. Comput., 19(5): 1552-1574, 1998] is provided, which reveals why it performs well in the error control procedure. Two of the main drawbacks of the macroscopic approach outlined above are that the effective coefficients must be supplied to the model, and that it fails for some drying configurations in which typical dual-scale mechanisms occur.
In the second part of this thesis, a new dual-scale approach for simulating wood drying is proposed that couples the porous medium (macroscale) with the underlying pore structure (microscale). The proposed model is applied to the convective drying of softwood at low temperatures and is valid in the so-called hygroscopic range, where hygroscopically held liquid water is present in the solid phase and water exists only as vapour in the pores. Coupling between scales is achieved by imposing the macroscopic gradient on the microscopic field using suitably defined periodic boundary conditions, which allows the macroscopic flux to be defined as an average of the microscopic flux over the unit cell. This formulation provides a first step for moving from the macroscopic formulation featured in TransPore to a comprehensive dual-scale formulation capable of addressing any drying configuration. Simulation results reported for a sample of spruce highlight the potential and flexibility of the new dual-scale approach. In particular, for a given unit cell configuration it is not necessary to supply the effective coefficients prior to each simulation.
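
As an illustration of the φ(A)b building block, the sketch below evaluates it exactly for small dense systems via the standard augmented-matrix identity and uses it in a fixed-step exponential (Rosenbrock-)Euler step; the thesis's variable-stepsize Krylov machinery for large sparse Jacobians is not reproduced.

```python
import numpy as np
from scipy.linalg import expm

def phi_action(A, b):
    """phi(A) @ b with phi(z) = (e^z - 1)/z, via the identity
    expm([[A, b], [0, 0]]) = [[expm(A), phi(A) @ b], [0, 1]].
    Dense and exact; for large sparse A a Krylov approximation is used."""
    n = A.shape[0]
    M = np.zeros((n + 1, n + 1))
    M[:n, :n] = A
    M[:n, n] = b
    return expm(M)[:n, n]

def exponential_euler(f, jac, y0, t0, t1, h):
    """Fixed-step exponential Rosenbrock-Euler for dy/dt = f(y):
    y_{n+1} = y_n + h * phi(h * J(y_n)) @ f(y_n)."""
    y, t = np.array(y0, dtype=float), t0
    while t < t1 - 1e-12:
        hh = min(h, t1 - t)
        y = y + hh * phi_action(hh * jac(y), f(y))
        t += hh
    return y
```

For a linear problem f(y) = Ay the step reduces to y + (e^(hA) - I)y = e^(hA)y, i.e. it is exact, which is the intuition behind the method's robustness on stiff problems.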

Abstract:

The well-known difficulties students exhibit when learning to program are often characterised as either difficulties in understanding the problem to be solved or difficulties in devising and coding a computational solution. It would therefore be helpful to understand which of these gives students the greatest trouble. Unit testing is a mainstay of large-scale software development and maintenance. A unit test suite serves not only for acceptance testing, but is also a form of requirements specification, as exemplified by agile programming methodologies in which the tests are developed before the corresponding program code. In order to better understand students’ conceptual difficulties with programming, we conducted a series of experiments in which students were required to write both unit tests and program code for non-trivial problems. Their code and tests were then assessed separately for correctness and ‘coverage’, respectively. The results allowed us to directly compare students’ abilities to characterise a computational problem, as a unit test suite, and develop a corresponding solution, as executable code. Since understanding a problem is a prerequisite to solving it, we expected students’ unit testing skills to be a strong predictor of their ability to successfully implement the corresponding program. Instead, however, we found that students’ testing abilities lag well behind their coding skills.
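
As an illustration of the kind of task involved (with a hypothetical problem, run-length encoding), here is a unit test suite in the role of a requirements specification, alongside one possible implementation.

```python
import unittest

def rle(s):
    """Run-length encode a string: 'aabccc' -> 'a2b1c3'.
    In the experiment, students would write this only after the tests."""
    out, i = [], 0
    while i < len(s):
        j = i
        while j < len(s) and s[j] == s[i]:
            j += 1
        out.append(f"{s[i]}{j - i}")
        i = j
    return "".join(out)

class RleSpec(unittest.TestCase):
    """The tests characterise the problem; their 'coverage' of the problem's
    behaviours is what the study assessed separately from code correctness."""

    def test_empty_input(self):
        self.assertEqual(rle(""), "")

    def test_single_run(self):
        self.assertEqual(rle("aaa"), "a3")

    def test_mixed_runs(self):
        self.assertEqual(rle("aabccc"), "a2b1c3")

if __name__ == "__main__":
    unittest.main()
```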

Abstract:

The well-known power system stabilizer (PSS) is used to generate supplementary control signals for the excitation system of a generator so as to damp low-frequency oscillations in the power system concerned. Up to now, various PSS design methods have been proposed and some of them applied in actual power systems to varying degrees. Against this background, small-disturbance eigenvalue analysis and large-disturbance dynamic simulations in the time domain are carried out to evaluate the performance of four different PSS designs: the Conventional PSS (CPSS), Single-Neuron PSS (SNPSS), Adaptive PSS (APSS) and Multi-band PSS (MBPSS). To make the comparisons equitable, the parameters of the four kinds of PSSs are all determined by the steepest descent method. Finally, an 8-unit 24-bus power system is employed to demonstrate the performance of the four kinds of PSSs through the well-established eigenvalue analysis as well as numerous digital simulations, and some useful conclusions are drawn.
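
A minimal sketch of the small-disturbance side of such a comparison: given a linearized state matrix A (obtained elsewhere), compute the frequency and damping ratio of each oscillatory mode; a PSS is then judged by how much it improves the damping of the low-frequency modes. The numerical thresholds in the comments are conventional rules of thumb, not values from the paper.

```python
import numpy as np

def modal_damping(A):
    """Frequency (Hz) and damping ratio of each oscillatory mode of
    dx/dt = A x.  For an eigenvalue sigma + j*omega the damping ratio is
    zeta = -sigma / sqrt(sigma^2 + omega^2); electromechanical modes lie
    roughly in 0.1-2 Hz, and zeta above ~5% is commonly deemed adequate."""
    modes = []
    for lam in np.linalg.eigvals(A):
        if lam.imag > 1e-6:          # keep one of each conjugate pair
            zeta = -lam.real / abs(lam)
            f_hz = lam.imag / (2 * np.pi)
            modes.append((f_hz, zeta))
    return sorted(modes)

# Comparing modal_damping(A_cpss), modal_damping(A_snpss), ... on the same
# operating point mirrors the eigenvalue comparison described above.
```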

Abstract:

Approximate Bayesian computation has become an essential tool for the analysis of complex stochastic models when the likelihood function is numerically unavailable. However, the well-established statistical method of empirical likelihood provides another route to such settings that bypasses simulations from the model and the choices of the approximate Bayesian computation parameters (summary statistics, distance, tolerance), while being convergent in the number of observations. Furthermore, bypassing model simulations may lead to significant time savings in complex models, for instance those found in population genetics. The Bayesian computation with empirical likelihood algorithm we develop in this paper also provides an evaluation of its own performance through an associated effective sample size. The method is illustrated using several examples, including estimation of standard distributions, time series, and population genetics models.
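
A minimal sketch of the idea for the simplest case, a scalar mean parameter: draw from the prior, weight each draw by Owen's empirical likelihood of the data, and report the effective sample size. The paper's algorithm and examples are considerably more general.

```python
import numpy as np

def empirical_likelihood(x, theta):
    """Empirical likelihood ratio for the mean constraint E[X] = theta.

    Maximises prod(p_i) subject to sum p_i = 1 and sum p_i (x_i - theta) = 0;
    the optimum is p_i = 1 / (n (1 + lam (x_i - theta))), with lam found here
    by bisection on g(lam) = sum z_i / (1 + lam z_i), which is decreasing."""
    z = np.asarray(x) - theta
    if z.min() >= 0 or z.max() <= 0:
        return 0.0  # theta lies outside the convex hull of the data
    lo = (-1 + 1e-10) / z.max()   # feasible range keeps 1 + lam*z_i > 0
    hi = (-1 + 1e-10) / z.min()
    for _ in range(100):
        lam = 0.5 * (lo + hi)
        g = np.sum(z / (1 + lam * z))
        lo, hi = (lam, hi) if g > 0 else (lo, lam)
    return float(np.prod(1.0 / (1 + lam * z)))  # normalised EL ratio <= 1

def bc_el(x, prior_draws):
    """Bayesian computation with EL: weight prior draws, report ESS."""
    w = np.array([empirical_likelihood(x, t) for t in prior_draws])
    w = w / w.sum()
    ess = 1.0 / np.sum(w ** 2)   # the self-diagnostic mentioned above
    return w, ess

# e.g.: w, ess = bc_el(data, np.random.normal(0.0, 10.0, size=5000))
```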

Abstract:

Particulate matter research is essential because of the well-known significant adverse effects of aerosol particles on human health and the environment. In particular, identification of the origin or sources of particulate matter emissions is of paramount importance in assisting efforts to control and reduce air pollution in the atmosphere. This thesis aims to: identify the sources of particulate matter; compare pollution conditions at urban, rural and roadside receptor sites; combine information about the sources with meteorological conditions at the sites to locate the emission sources; compare sources based on particle size or mass; and ultimately, provide the basis for control and reduction in particulate matter concentrations in the atmosphere. To achieve these objectives, data was obtained from assorted local and international receptor sites over long sampling periods. The samples were analysed using Ion Beam Analysis and Scanning Mobility Particle Sizer methods to measure the particle mass with chemical composition and the particle size distribution, respectively. Advanced data analysis techniques were employed to derive information from large, complex data sets. Multi-Criteria Decision Making (MCDM), a ranking method, drew on data variability to examine the overall trends, and provided the rank ordering of the sites and years that sampling was conducted. Coupled with the receptor model Positive Matrix Factorisation (PMF), the pollution emission sources were identified and meaningful information pertinent to the prioritisation of control and reduction strategies was obtained. This thesis is presented in the thesis by publication format. It includes four refereed papers which together demonstrate a novel combination of data analysis techniques that enabled particulate matter sources to be identified and sampling site/year ranked. The strength of this source identification process was corroborated when the analysis procedure was expanded to encompass multiple receptor sites. Initially applied to identify the contributing sources at roadside and suburban sites in Brisbane, the technique was subsequently applied to three receptor sites (roadside, urban and rural) located in Hong Kong. The comparable results from these international and national sites over several sampling periods indicated similarities in source contributions between receptor site-types, irrespective of global location, and suggested the need to apply these methods to air pollution investigations worldwide. Furthermore, an investigation into particle size distribution data was conducted to deduce the sources of aerosol emissions based on particle size and elemental composition. Considering the adverse effects on human health caused by small-sized particles, knowledge of particle size distribution and their elemental composition provides a different perspective on the pollution problem. This thesis clearly illustrates that the application of an innovative combination of advanced data interpretation methods to identify particulate matter sources and rank sampling sites/years provides the basis for the prioritisation of future air pollution control measures. Moreover, this study contributes significantly to knowledge based on chemical composition of airborne particulate matter in Brisbane, Australia and on the identity and plausible locations of the contributing sources. Such novel source apportionment and ranking procedures are ultimately applicable to environmental investigations worldwide.
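
PMF can be viewed as an uncertainty-weighted non-negative matrix factorization. As a stand-in, here is a minimal unweighted analogue using scikit-learn's NMF on a samples-by-species concentration matrix; this is illustrative only, not the receptor-modelling code used in the thesis.

```python
import numpy as np
from sklearn.decomposition import NMF

def factor_sources(X, n_sources):
    """Approximate X (samples x chemical species, non-negative
    concentrations) as G @ F with G, F >= 0.

    Rows of F are source profiles (species signatures); rows of G give each
    sample's source contributions. True PMF additionally down-weights each
    entry by its measurement uncertainty, which plain NMF omits."""
    model = NMF(n_components=n_sources, init='nndsvda', max_iter=500)
    G = model.fit_transform(X)   # contributions
    F = model.components_        # profiles
    return G, F

# e.g.: X = np.abs(np.random.randn(200, 15)); G, F = factor_sources(X, 4)
```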

Abstract:

Availability has become a primary goal of information security and is as significant as other goals, in particular confidentiality and integrity. Maintaining the availability of essential services on the public Internet is an increasingly difficult task in the presence of sophisticated attackers. Attackers may abuse the limited computational resources of a service provider, and thus managing computational costs is a key strategy for achieving the goal of availability. In this thesis we focus on cryptographic approaches for managing computational costs, in particular computational effort. We focus on two cryptographic techniques: computational puzzles in cryptographic protocols and secure outsourcing of cryptographic computations. This thesis contributes to the area of cryptographic protocols in the following ways. First, we propose the most efficient puzzle scheme based on modular exponentiations which, unlike previous schemes of the same type, involves only a few modular multiplications for solution verification; our scheme is provably secure. We then introduce a new efficient gradual authentication protocol by integrating a puzzle into a specific signature scheme. Our software implementation results for the new authentication protocol show that our approach is more efficient and effective than the traditional RSA signature-based one and improves the DoS resilience of the Secure Sockets Layer (SSL) protocol, the most widely used security protocol on the Internet. Our next contributions relate to capturing a specific property that enables the secure outsourcing of cryptographic tasks involving partial decryption. We formally define the property of (non-trivial) public verifiability for general encryption schemes, key encapsulation mechanisms (KEMs), and hybrid encryption schemes, encompassing public-key, identity-based, and tag-based encryption flavours. We show that some generic transformations and concrete constructions enjoy this property, and then present a new public-key encryption (PKE) scheme having this property together with a proof of security under standard assumptions. Finally, we combine puzzles with PKE schemes to enable delayed decryption in applications such as e-auctions and e-voting. For this we first introduce the notion of effort-release PKE (ER-PKE), encompassing the well-known timed-release encryption and encapsulated key escrow techniques. We then present a security model for ER-PKE and a generic construction of ER-PKE complying with our security notion.
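
The thesis's own puzzle scheme (cheap verification, provable security) is not reproduced in the abstract; as a generic illustration of a modular-exponentiation puzzle, here is a toy version of the classic Rivest-Shamir-Wagner repeated-squaring construction, in which the setter uses the factorisation of n as a shortcut while the solver must perform t sequential squarings.

```python
import secrets

def make_puzzle(t, bits=512):
    """Toy RSW repeated-squaring puzzle: compute x^(2^t) mod n.

    Knowing p and q, the setter reduces the exponent modulo phi(n)
    (valid when gcd(x, n) = 1, overwhelmingly likely for random x) and
    evaluates one pow(); the solver must square t times in sequence."""
    def prime(b):
        # Toy prime generation with a Fermat test on bases 2 and 3;
        # a real implementation would use a cryptographic library.
        while True:
            c = secrets.randbits(b) | (1 << (b - 1)) | 1
            if pow(2, c - 1, c) == 1 and pow(3, c - 1, c) == 1:
                return c
    p, q = prime(bits // 2), prime(bits // 2)
    n, phi = p * q, (p - 1) * (q - 1)
    x = secrets.randbelow(n - 2) + 2
    answer = pow(x, pow(2, t, phi), n)   # setter's fast path
    return n, x, answer

def solve_puzzle(n, x, t):
    """Inherently sequential work: t modular squarings."""
    y = x
    for _ in range(t):
        y = (y * y) % n
    return y

n, x, ans = make_puzzle(t=1000)
assert solve_puzzle(n, x, 1000) == ans
```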