9 results for instantaneous complex power

in Aston University Research Archive


Relevance:

30.00%

Publisher:

Abstract:

This is a study of police interviewing using an integrated approach, drawing on conversation analysis (CA), critical discourse analysis (CDA) and pragmatics. The study focuses on the balance of power and control, finding that, in particular, the institutional status of the participants, the discursive roles assigned to them by the context, and their relative knowledge are significant factors affecting the dynamics of the discourse. Four discursive features are identified as particularly significant, and a detailed analysis of the complex interplay of these features shows that power and control are constantly under negotiation and always open to challenge and resistance. Further, it is shown that discursive dominance is not necessarily advantageous to participants, due to the specific goals and purposes of the police interview context. A wider consideration of the context illustrates the contribution that linguistics can make to the use of police interview data as evidence in the UK criminal justice system.

Relevance:

30.00%

Publisher:

Abstract:

This article analyses the complex process that deracialised and democratised South African football between the early 1970s and 1990s. Based mainly on archival documents, it argues that growing isolation from world sport, exemplified by South Africa's expulsion from the Olympic movement in 1970 and FIFA in 1976, and the reinvigoration of the liberation struggle with the Soweto youth uprising triggered a process of gradual desegregation in the South African professional game. While Pretoria viewed such changes as a potential bulwark against rising black militancy, white football and big business had their own reasons for eventually supporting racial integration, as seen in the founding of the National Soccer League. As negotiations for a new democratic South Africa began in earnest between the African National Congress (ANC) and the National Party (NP) in the latter half of the 1980s, transformations in football and politics paralleled and informed each other. Previously antagonistic football associations began a series of 'unity talks' between 1985 and 1986 that eventually culminated in the formation of a single, non-racial South African Football Association in December 1991, just a few days before the Convention for a Democratic South Africa (CODESA) opened the process of writing a new post-apartheid constitution. Finally, three decades of isolation came to an end as FIFA welcomed South Africa back into world football in 1992 - a powerful example of the seemingly boundless potential of a liberated and united South Africa ahead of the first democratic elections in 1994.

Relevance:

30.00%

Publisher:

Abstract:

Experiments combining different groups or factors and which use ANOVA are a powerful method of investigation in applied microbiology. ANOVA enables not only the effect of individual factors to be estimated but also their interactions; information which cannot be obtained readily when factors are investigated separately. In addition, combining different treatments or factors in a single experiment is more efficient and often reduces the sample size required to estimate treatment effects adequately. Because of the treatment combinations used in a factorial experiment, the degrees of freedom (DF) of the error term in the ANOVA are a more important indicator of the ‘power’ of the experiment than the number of replicates. A good method is to ensure, where possible, that sufficient replication is present to achieve 15 DF for the error term of the ANOVA testing effects of particular interest. Finally, it is important always to consider the design of the experiment, because this determines the appropriate ANOVA to use. Hence, it is necessary to be able to identify the different forms of ANOVA appropriate to different experimental designs and to recognise when a design is a split-plot or incorporates a repeated measure. If there is any doubt about which ANOVA to use in a specific circumstance, the researcher should seek advice from a statistician with experience of research in applied microbiology.
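As a concrete illustration of the error-DF point above, the following sketch runs a two-way factorial ANOVA on hypothetical data (the factor names, cell sizes and response values are invented for the example) and reports the residual degrees of freedom so they can be checked against the ~15 DF guideline.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical 2 x 3 factorial design: two growth media, three temperatures,
# n_rep replicates per cell. The response values are placeholders.
rng = np.random.default_rng(0)
n_rep = 4
media = np.repeat(["A", "B"], 3 * n_rep)
temp = np.tile(np.repeat(["low", "mid", "high"], n_rep), 2)
growth = rng.normal(10, 1, size=media.size)

df = pd.DataFrame({"medium": media, "temp": temp, "growth": growth})
model = ols("growth ~ C(medium) * C(temp)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))      # main effects and interaction

# Error (residual) DF = N - number of cells; here 24 - 6 = 18,
# which is above the ~15 DF guideline discussed above.
print("error DF:", int(model.df_resid))
```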

Relevance:

30.00%

Publisher:

Abstract:

A theoretical model is developed which characterizes the intracavity pulse evolution in high-power fiber lasers. It is shown that experimentally observed dynamics of the key pulse parameters can be described by a reduced model of ordinary differential equations. Critical in driving the intracavity dynamics are the amplitude and phase modulations generated by the discrete elements in the laser. The theory gives a simple geometrical description of the intracavity dynamics and possible operation modes of the laser cavity. Furthermore, it provides a simple and efficient method for optimizing the performance of complex multiparametric laser systems.
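The abstract does not give the reduced equations, so the sketch below is purely illustrative: it iterates a generic round-trip map for two pulse parameters (energy and chirp) with placeholder gain-saturation and coupling terms, only to show how a low-dimensional reduced model of this kind is typically evolved numerically. None of the coefficients or functional forms come from the paper.

```python
import numpy as np

# Illustrative round-trip map for two reduced pulse parameters (energy E, chirp C).
# All terms are placeholders standing in for the amplitude/phase modulation of
# the cavity elements, not the paper's equations.
def round_trip(E, C, g0=2.0, E_sat=1.0, loss=0.3, kappa=0.1):
    g = g0 / (1.0 + E / E_sat)          # saturated gain (amplitude modulation)
    E_next = E * np.exp(g - loss)       # net gain/loss per pass
    C_next = C + kappa * (E_next - E)   # phase modulation coupled to energy change
    return E_next, C_next

E, C = 0.1, 0.0
for _ in range(200):                    # iterate the map over cavity round trips
    E, C = round_trip(E, C)
print("steady-state energy ~", round(E, 4), " chirp ~", round(C, 4))
```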

Relevance:

30.00%

Publisher:

Abstract:

The deoxidation of steel with complex deoxidisers was studied at 1550°C and compared with silicon, aluminium and silicon/aluminium alloys as standards. The deoxidation alloy systems Ca/Si/Al, Mg/Si/Al and Mn/Si/Al were chosen for the low liquidus temperatures of many of their oxide mixtures and the potential deoxidising power of their constituent elements. Product separation rates and compositional relationships following deoxidation were examined. Silicon/aluminium alloy deoxidation resulted in the product compositions and residual oxygen contents expected from equilibrium and stoichiometric considerations, but with the Ca/Si/Al and Mg/Si/Al alloys the volatility of calcium and magnesium prevented them from participating in the final solute equilibrium, despite their reported solubility in liquid iron. Electron-probe microanalysis of the products showed various concentrations of lime and magnesia, possibly resulting from reaction between the metal vapours and dissolved oxygen. The consequent reduction of silica activity in the products due to the presence of CaO and MgO produced an indirect effect of calcium and magnesium on the residual oxygen content. Product separation rates, indicated by vacuum fusion analyses, were not significantly influenced by calcium and magnesium, but the rapid separation of products having a high Al2O3/SiO2 ratio was confirmed. Manganese participated in deoxidation when present either as an alloying element in the steel or as a deoxidation alloy constituent. The compositions of initial oxide products were related to deoxidation alloy compositions. Separated products which were not alumina-saturated dissolved crucible material to achieve saturation. The melt equilibrated with this slag and the crucible by diffusion, determining the residual oxygen content. MnO and SiO2 activities were calculated, and approximate MnO activity values were deduced for the compositions obtained. Separation rates were greater for products of high interfacial tension. The rates calculated from a model based on Stokes' law showed qualitative agreement with experimental data when corrected for coalescence effects.
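As a rough companion to the Stokes' law separation model mentioned above, the sketch below computes the terminal rise velocity of a spherical oxide inclusion in liquid steel. The density and viscosity values are typical literature figures, not data from this work, and no coalescence correction is applied.

```python
# Stokes' law rise velocity of a buoyant spherical oxide inclusion in liquid steel.
g = 9.81                 # m/s^2
rho_steel = 7000.0       # kg/m^3, liquid steel near 1550 degC (approximate)
rho_oxide = 3000.0       # kg/m^3, typical alumina-silicate inclusion (approximate)
mu_steel = 0.006         # Pa.s, dynamic viscosity of liquid steel (approximate)

def stokes_velocity(radius_m):
    """Terminal rise velocity (m/s) of a buoyant sphere under Stokes' law."""
    return 2.0 * (rho_steel - rho_oxide) * g * radius_m**2 / (9.0 * mu_steel)

for r_um in (5, 20, 50):
    v = stokes_velocity(r_um * 1e-6)
    print(f"r = {r_um:3d} um -> v = {v * 1e3:.3f} mm/s")
```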

Relevance:

30.00%

Publisher:

Abstract:

The accurate identification of T-cell epitopes remains a principal goal of bioinformatics within immunology. As the immunogenicity of peptide epitopes is dependent on their binding to major histocompatibility complex (MHC) molecules, the prediction of binding affinity is a prerequisite to the reliable prediction of epitopes. The iterative self-consistent (ISC) partial-least-squares (PLS)-based additive method is a recently developed bioinformatic approach for predicting class II peptide-MHC binding affinity. The ISC-PLS method overcomes many of the conceptual difficulties inherent in the prediction of class II peptide-MHC affinity, such as the binding of a mixed population of peptide lengths due to the open-ended class II binding site. The method has applications in both the accurate prediction of class II epitopes and the manipulation of affinity for heteroclitic and competitor peptides. The method is applied here to six class II mouse alleles (I-Ab, I-Ad, I-Ak, I-As, I-Ed, and I-Ek), including peptides up to 25 amino acids in length. A series of regression equations highlighting the quantitative contributions of individual amino acids at each peptide position was established. The initial model for each allele exhibited only moderate predictivity. Once the set of selected peptide subsequences had converged, the final models exhibited satisfactory predictive power. Convergence was reached between the 4th and 17th iterations, and the leave-one-out cross-validation statistical terms - q2, SEP, and NC - ranged between 0.732 and 0.925, 0.418 and 0.816, and 1 and 6, respectively. The non-cross-validated statistical terms r2 and SEE ranged between 0.98 and 0.995 and 0.089 and 0.180, respectively. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, will be made freely available online (http://www.jenner.ac.uk/MHCPred).
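The sketch below illustrates the general flavour of PLS regression with leave-one-out cross-validation and the q2/SEP statistics quoted above, using scikit-learn on synthetic data. It is not the ISC-PLS additive method itself (there is no iterative subsequence selection), and the feature encoding, sample sizes and component count are invented for the example.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Toy data standing in for a peptide x amino-acid-position indicator matrix
# and measured binding affinities; purely illustrative.
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(60, 40)).astype(float)
y = X @ rng.normal(0, 0.3, size=40) + rng.normal(0, 0.2, size=60)

pls = PLSRegression(n_components=4)                     # NC, chosen arbitrarily here
y_loo = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()

press = np.sum((y - y_loo) ** 2)                        # predictive residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)
q2 = 1.0 - press / ss_tot                               # cross-validated q^2
sep = np.sqrt(press / len(y))                           # standard error of prediction
print(f"q2 = {q2:.3f}, SEP = {sep:.3f}")
```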

Relevance:

30.00%

Publisher:

Abstract:

Since wind at the earth's surface has an intrinsically complex and stochastic nature, accurate wind power forecasts are necessary for the safe and economic use of wind energy. In this paper, we investigated a combination of numeric and probabilistic models: a Gaussian process (GP) combined with a numerical weather prediction (NWP) model was applied to wind-power forecasting up to one day ahead. First, the wind-speed data from the NWP model were corrected by a GP; then, as there is always a defined limit on the power generated by a wind turbine due to the turbine control strategy, wind power forecasts were obtained by modeling the relationship between the corrected wind speed and the power output using a censored GP. To validate the proposed approach, three real-world datasets were used for model training and testing. The empirical results were compared with several classical wind forecast models; based on the mean absolute error (MAE), the proposed model provides around 9% to 14% improvement in forecasting accuracy compared to an artificial neural network (ANN) model, and nearly 17% improvement on a third dataset from a newly built wind farm for which there is only a limited amount of training data. © 2013 IEEE.
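As a loose illustration of the two-stage idea described above, the sketch below fits a GP (scikit-learn's GaussianProcessRegressor) to learn a correction from NWP wind speed to observed wind speed, then maps the corrected speed through an idealised turbine power curve with a hard upper limit. The clipping is a crude stand-in for the paper's censored-GP treatment, and all data and turbine parameters are invented.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Stage 1: GP correction of NWP wind speed (synthetic data).
rng = np.random.default_rng(2)
nwp_speed = rng.uniform(0, 20, size=200)[:, None]
obs_speed = 0.9 * nwp_speed.ravel() + 1.0 + rng.normal(0, 0.8, size=200)

gp = GaussianProcessRegressor(kernel=RBF(5.0) + WhiteKernel(0.5), normalize_y=True)
gp.fit(nwp_speed, obs_speed)

# Stage 2: map corrected speed to power, capped at the turbine's rated power.
def power_curve(v, rated_kw=2000.0, cut_in=3.0, rated_speed=12.0):
    """Idealised curve: cubic between cut-in and rated speed, then flat."""
    p = rated_kw * np.clip((v - cut_in) / (rated_speed - cut_in), 0, 1) ** 3
    return np.minimum(p, rated_kw)

test_nwp = np.array([[4.0], [9.0], [15.0]])
corrected = gp.predict(test_nwp)
print(power_curve(corrected))          # forecast power for three NWP inputs
```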

Relevance:

30.00%

Publisher:

Abstract:

Since wind has an intrinsically complex and stochastic nature, accurate wind power forecasts are necessary for the safety and economics of wind energy utilization. In this paper, we investigate a combination of numeric and probabilistic models: one-day-ahead wind power forecasts were made with Gaussian processes (GPs) applied to the outputs of a numerical weather prediction (NWP) model. First, the wind speed data from the NWP model were corrected by a GP. Then, as there is always a defined limit on the power generated by a wind turbine due to the turbine control strategy, a censored GP was used to model the relationship between the corrected wind speed and the power output. To validate the proposed approach, two real-world datasets were used for model construction and testing. The simulation results were compared with the persistence method and artificial neural networks (ANNs); the proposed model achieves about 11% improvement in forecasting accuracy (mean absolute error) compared to the ANN model on one dataset, and nearly 5% improvement on another.
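The comparison against the persistence baseline mentioned above reduces to a mean-absolute-error calculation; the toy sketch below shows that calculation on a synthetic hourly series (none of the numbers relate to the paper's datasets).

```python
import numpy as np

# Mean absolute error of a stand-in forecast versus a naive persistence
# baseline (tomorrow = today) on a synthetic 48-hour power series.
rng = np.random.default_rng(3)
actual = np.abs(rng.normal(500, 150, size=48))      # kW, hourly
forecast = actual + rng.normal(0, 40, size=48)      # stand-in model forecast
persistence = np.roll(actual, 24)                   # yesterday's values

mae = lambda y, yhat: np.mean(np.abs(y - yhat))
print("model MAE:      ", mae(actual[24:], forecast[24:]))
print("persistence MAE:", mae(actual[24:], persistence[24:]))
```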

Relevance:

30.00%

Publisher:

Abstract:

The increase in renewable energy generators introduced into the electricity grid is putting pressure on its stability and management, as renewable energy sources cannot be predicted accurately or fully controlled. This, together with the additional pressure of fluctuations in demand, presents a problem more complex than the one the current methods of controlling electricity distribution were designed for. A global, approximate and distributed optimisation method for power allocation that accommodates uncertainties and volatility is suggested and analysed. It is based on a probabilistic method known as message passing [1], which has deep links to statistical physics methodology. This principled method of optimisation is based on local calculations and inherently accommodates uncertainties; it is of modest computational complexity and provides good approximate solutions. We consider uncertainty and fluctuations drawn from a Gaussian distribution and incorporate them into the message-passing algorithm. We examine the effect that increasing uncertainty has on the transmission cost and how the placement of volatile nodes within a grid, such as renewable generators or consumers, affects it.
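The sketch below does not implement the message-passing algorithm itself; it only illustrates, on a toy three-consumer radial feeder with a quadratic transmission cost, how Gaussian demand fluctuations of increasing width raise the expected cost. The network, demands and cost function are all invented for the example.

```python
import numpy as np

# Single generator at node 0 feeds three consumers along a line.
# The flow on each edge equals the total downstream demand, and the
# transmission cost is quadratic in the edge flows.
rng = np.random.default_rng(4)
mu = np.array([1.0, 1.0, 1.0])          # mean consumer demands

def expected_cost(sigma, n_samples=20000):
    d = rng.normal(mu, sigma, size=(n_samples, 3))        # Gaussian fluctuations
    flows = np.cumsum(d[:, ::-1], axis=1)[:, ::-1]        # downstream sums per edge
    return np.mean(np.sum(flows ** 2, axis=1))            # Monte Carlo expected cost

for sigma in (0.0, 0.2, 0.5, 1.0):
    print(f"sigma = {sigma:.1f} -> expected quadratic cost = {expected_cost(sigma):.2f}")
```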