965 results for Separability Criterion
Abstract:
Market-based transmission expansion planning gives investors information on where the most cost-efficient place to invest is and brings benefits to those who invest in the grid. However, both market issues and power system adequacy problems are the system planners' concern. In this paper, a hybrid probabilistic criterion, the Expected Economical Loss (EEL), is proposed as an index to evaluate a system's overall expected economic losses during operation in a competitive market. It stands on both the investors' and the planner's points of view and further improves on the traditional reliability cost. By applying the EEL, system planners can obtain a clear idea of the transmission network's bottleneck and of the amount of loss arising from this weak point. Consequently, it enables planners to assess the worth of providing reliable services. The EEL also contains valuable information for investors to guide their investment decisions. This index can truly reflect the random behavior of power systems and the uncertainties of the electricity market. The performance of the EEL index is enhanced by applying a Normalized Coefficient of Probability (NCP), so it can be utilized in large real power systems. A numerical example is carried out on the IEEE Reliability Test System (RTS), which shows how the EEL can predict the current system bottleneck under future operational conditions and how to use the EEL as one of the planning objectives to determine future optimal plans. A well-known simulation method, Monte Carlo simulation, is employed to capture the probabilistic characteristics of the electricity market, and Genetic Algorithms (GAs) are used as a multi-objective optimization tool.
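As a rough illustration of how a Monte Carlo estimate of such an expected-loss index can be organised (a hedged sketch only, not the paper's EEL formulation: the two-bus system, the outage probability and the value of lost load below are invented for the example), one can sample random outages and load levels, price unserved energy, and average:

import random

random.seed(1)
LINE_CAPACITY_MW = 100.0   # transfer limit of the single corridor (assumed)
OUTAGE_PROB = 0.02         # probability the corridor is unavailable (assumed)
VOLL = 4000.0              # value of lost load in $/MWh (assumed)

def sample_loss() -> float:
    """Economic loss, in $, for one randomly sampled operating hour."""
    load = random.gauss(90.0, 15.0)                        # uncertain demand in MW
    capacity = 0.0 if random.random() < OUTAGE_PROB else LINE_CAPACITY_MW
    curtailed = max(0.0, load - capacity)                  # demand that cannot be served
    return curtailed * VOLL

samples = [sample_loss() for _ in range(100_000)]
eel_estimate = sum(samples) / len(samples)
print(f"estimated expected economic loss: ${eel_estimate:,.0f} per hour")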
Abstract:
Objective: To evaluate whether including children with onset of symptoms between ages 7 and 12 years in the ADHD diagnostic category would: (a) increase the prevalence of the disorder at age 12, and (b) change the clinical and cognitive features, impairment profile, and risk factors for ADHD compared with findings in the literature based on the DSM-IV definition of the disorder. Method: A birth cohort of 2,232 British children was prospectively evaluated at ages 7 and 12 years for ADHD using information from mothers and teachers. The prevalence of diagnosed ADHD at age 12 was evaluated with and without the inclusion of individuals who met the DSM-IV age-of-onset criterion through mothers' or teachers' reports of symptoms at age 7. Children with onset of ADHD symptoms before versus after age 7 were compared on their clinical and cognitive features, impairment profile, and risk factors for ADHD. Results: Extending the age-of-onset criterion to age 12 resulted in a negligible increase in ADHD prevalence by age 12 years of 0.1%. Children who first manifested ADHD symptoms between ages 7 and 12 did not present correlates or risk factors that were significantly different from those of children who manifested symptoms before age 7. Conclusions: Results from this prospective birth cohort might suggest that adults who are able to report symptom onset by age 12 also had symptoms by age 7, even if they are not able to report them. The data suggest that the prevalence estimate, correlates, and risk factors of ADHD will not be affected if the new diagnostic scheme extends the age-of-onset criterion to age 12. J. Am. Acad. Child Adolesc. Psychiatry, 2010;49(3):210-216.
Abstract:
A remarkable feature of quantum entanglement is that an entangled state of two parties, Alice (A) and Bob (B), may be more disordered locally than globally. That is, S(A) > S(A, B), where S() is the von Neumann entropy. It is known that satisfaction of this inequality implies that a state is nonseparable. In this paper we prove the stronger result that for separable states the vector of eigenvalues of the density matrix of system AB is majorized by the vector of eigenvalues of the density matrix of system A alone. This gives a strong sense in which a separable state is more disordered globally than locally and a new necessary condition for separability of bipartite states in arbitrary dimensions.
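The majorization condition stated above is straightforward to check numerically. The following is a minimal sketch (the dimensions and the test state are illustrative): it compares the sorted partial sums of the eigenvalues of the reduced state rho_A with those of the global state rho_AB; a maximally entangled two-qubit state fails the check and is therefore flagged as non-separable.

import numpy as np

def majorizes(p, q):
    """True if the probability vector p majorizes q (partial sums of p dominate)."""
    p, q = np.sort(p)[::-1], np.sort(q)[::-1]
    n = max(len(p), len(q))
    p = np.pad(p, (0, n - len(p)))      # pad the shorter vector with zeros
    q = np.pad(q, (0, n - len(q)))
    return bool(np.all(np.cumsum(p) >= np.cumsum(q) - 1e-12))

def partial_trace_B(rho_AB, dA, dB):
    """Reduced density matrix of subsystem A from a (dA*dB x dA*dB) state."""
    return np.trace(rho_AB.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

# Example: the maximally entangled two-qubit state violates the necessary condition.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_AB = np.outer(psi, psi)
rho_A = partial_trace_B(rho_AB, 2, 2)
print("eigenvalues of rho_A majorize those of rho_AB:",
      majorizes(np.linalg.eigvalsh(rho_A), np.linalg.eigvalsh(rho_AB)))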
Abstract:
Frame rate upconversion (FRUC) is an important post-processing technique to enhance the visual quality of low frame rate video. A major recent advance in this area is FRUC based on trilateral filtering, whose novelty mainly derives from the combination of an edge-based motion estimation block matching criterion with the trilateral filter. However, there is still room for improvement, notably towards reducing the size of the uncovered regions in the initial estimated frame, that is, the estimated frame before trilateral filtering. In this context, an improved motion estimation block matching criterion is proposed in which a combined luminance and edge error metric is weighted according to the motion vector components, notably to regularise the motion field. Experimental results confirm that significant improvements are achieved for the final interpolated frames, reaching PSNR gains of up to 2.73 dB on average over recent alternative solutions, for video content with varied motion characteristics.
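A hedged sketch of such a combined, motion-regularised matching cost is given below; the particular error terms and weights are illustrative assumptions, not the exact metric proposed in the paper. The cost adds an edge-map error to the usual luminance SAD and penalises large motion vector components to regularise the motion field.

import numpy as np

def block_cost(prev_y, next_y, prev_e, next_e, block, mv,
               edge_weight=0.5, mv_weight=2.0):
    """Matching cost of one block displaced by mv = (dy, dx); the caller is assumed
    to keep the displaced block inside the frame."""
    y0, x0, h, w = block
    dy, dx = mv
    cur = prev_y[y0:y0 + h, x0:x0 + w].astype(float)
    ref = next_y[y0 + dy:y0 + dy + h, x0 + dx:x0 + dx + w].astype(float)
    cur_e = prev_e[y0:y0 + h, x0:x0 + w].astype(float)
    ref_e = next_e[y0 + dy:y0 + dy + h, x0 + dx:x0 + dx + w].astype(float)
    sad_luma = np.abs(cur - ref).sum()                 # luminance error
    sad_edge = np.abs(cur_e - ref_e).sum()             # edge-map error
    penalty = mv_weight * (abs(dy) + abs(dx))          # discourage erratic vectors
    return sad_luma + edge_weight * sad_edge + penalty

# Tiny usage example: a frame shifted by 2 px is matched perfectly at mv = (0, 2),
# so only the motion penalty remains.
rng = np.random.default_rng(0)
prev_y = rng.integers(0, 256, (64, 64))
next_y = np.roll(prev_y, shift=2, axis=1)
prev_e = np.abs(np.diff(prev_y, axis=1, prepend=0))    # crude gradient "edge" maps
next_e = np.abs(np.diff(next_y, axis=1, prepend=0))
print(block_cost(prev_y, next_y, prev_e, next_e, (16, 16, 8, 8), (0, 2)))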
Abstract:
Research on cluster analysis for categorical data continues to develop, with new clustering algorithms being proposed. However, in this context, the determination of the number of clusters is rarely addressed. We propose a new approach in which clustering and the estimation of the number of clusters are done simultaneously for categorical data. We assume that the data originate from a finite mixture of multinomial distributions and use a minimum message length (MML) criterion to select the number of clusters (Wallace and Bolton, 1986). For this purpose, we implement an EM-type algorithm (Silvestre et al., 2008) based on the approach of Figueiredo and Jain (2002). The novelty of the approach rests on the integration of model estimation and selection of the number of clusters in a single algorithm, rather than selecting this number from a set of pre-estimated candidate models. The performance of our approach is compared with the use of the Bayesian Information Criterion (BIC) (Schwarz, 1978) and the Integrated Completed Likelihood (ICL) (Biernacki et al., 2000) on synthetic data. The results illustrate the capacity of the proposed algorithm to attain the true number of clusters while outperforming BIC and ICL in speed, which is especially relevant when dealing with large data sets.
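For contrast, the conventional strategy that the proposed algorithm avoids can be sketched in a few lines: pre-estimate one candidate model per number of clusters and keep the one with the smallest BIC. The sketch below uses scikit-learn's GaussianMixture on continuous toy data purely to illustrate the select-by-BIC loop; the paper itself works with multinomial mixtures on categorical data.

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# toy data with 3 well-separated clusters (illustrative, not the paper's data)
X = np.vstack([rng.normal(loc, 0.5, size=(200, 2)) for loc in (-3, 0, 3)])

bic_scores = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
              for k in range(1, 8)}
best_k = min(bic_scores, key=bic_scores.get)   # smallest BIC wins
print("BIC-selected number of clusters:", best_k)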
Abstract:
We assess the performance of Gaussianity tests, namely the Anscombe-Glynn, Lilliefors, Cramér-von Mises, and Giannakis-Tsatsanis (G-T) tests, for the purpose of detecting narrowband and wideband interference in GNSS signals. Simulations show that the G-T test outperforms the others, making it suitable as a benchmark for comparison with different types of interference detection algorithms. © 2014 EURASIP.
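As a hedged illustration of the detection principle (the signal model, tone amplitude and sample size below are assumptions, and only the Cramér-von Mises test is shown), interference-free baseband samples can be modelled as Gaussian noise, while an added narrowband tone makes the empirical distribution depart from Gaussianity:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 4096
noise = rng.normal(0.0, 1.0, n)                          # interference-free samples
tone = 1.5 * np.cos(2 * np.pi * 0.01 * np.arange(n))     # narrowband CW interferer (assumed)

for label, x in [("clean", noise), ("interfered", noise + tone)]:
    z = (x - x.mean()) / x.std(ddof=1)                   # standardize before testing
    res = stats.cramervonmises(z, "norm")                # Cramér-von Mises Gaussianity test
    print(f"{label}: statistic={res.statistic:.3f}, p-value={res.pvalue:.3g}")
# A small p-value rejects Gaussianity, which is taken as evidence of interference.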
Abstract:
Risk management is of paramount importance to the success of tunnelling works and is linked to the tunnelling method and to the constraints of the works. The Sequential Excavation Method (SEM) and the Tunnel Boring Machine (TBM) method have been competing for years. This article, part of a wider study on the influence of the "Safety and Health" criterion in the choice of method, reviews the existing literature on the criteria usually employed to choose the tunnelling method and on the "Safety and Health" criterion. This criterion is particularly important due to the financial impacts of work accidents and occupational diseases. This article is especially useful to the scientific and technical community, since it synthesizes the relevance of each of the choice criteria used and shows why "Safety and Health" must be a criterion in the decision-making process for choosing the tunnelling method.
Abstract:
We consider the problem of allocating an infinitely divisible commodity among a group of agents with single-peaked preferences. A rule that has played a central role in the analysis of the problem is the so-called uniform rule. Chun (2001) proves that the uniform rule is the only rule satisfying Pareto optimality, no-envy, separability, and continuity (with respect to the social endowment). We obtain an alternative characterization by using a weak replication-invariance condition, called duplication-invariance, instead of continuity. Furthermore, we prove that Pareto optimality, equal division lower bound, and separability imply no-envy. Using this result, we strengthen one of Chun's (2001) characterizations of the uniform rule by showing that the uniform rule is the only rule satisfying Pareto optimality, equal division lower bound, separability, and either continuity or duplication-invariance.
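For reference, a minimal sketch of the uniform rule itself as it is standardly defined (the peaks and endowment in the example are illustrative): in case of excess demand every agent receives the minimum of her peak and a common bound lambda, in case of excess supply the maximum, with lambda chosen so that the allocation exactly exhausts the endowment.

from typing import List

def uniform_rule(peaks: List[float], endowment: float, tol: float = 1e-10) -> List[float]:
    """Allocate `endowment` among agents whose single-peaked preferences are summarized by `peaks`."""
    excess_demand = sum(peaks) >= endowment

    def alloc(lam: float) -> List[float]:
        return [min(p, lam) if excess_demand else max(p, lam) for p in peaks]

    lo, hi = 0.0, max(max(peaks), endowment)
    while hi - lo > tol:                      # bisection on the common bound lambda
        mid = (lo + hi) / 2
        if sum(alloc(mid)) < endowment:
            lo = mid
        else:
            hi = mid
    return alloc((lo + hi) / 2)

# Example: endowment 10 with peaks summing to 12 (excess demand), so large demands are capped.
print(uniform_rule([2.0, 4.0, 6.0], 10.0))    # approximately [2.0, 4.0, 4.0]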
Abstract:
We study the profinite topology on discrete groups and in particular the property of cyclic subgroup separability. We investigate the class of quasi-potent, cyclic subgroup separable groups, producing many examples and showing how it behaves with respect to certain group constructions.
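For readers unfamiliar with the terminology, a standard definition (not quoted from the paper) is: a subgroup H ≤ G is separable if it is closed in the profinite topology on G, equivalently, for every g ∈ G \ H there is a homomorphism φ from G onto a finite group with φ(g) ∉ φ(H); the group G is cyclic subgroup separable when every cyclic subgroup of G is separable.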
Abstract:
We prove a criterion for the irreducibility of an integral group representation ρ over the fraction field of a noetherian domain R in terms of suitably defined reductions of ρ at prime ideals of R. As applications, we give irreducibility results for universal deformations of residual representations, with special attention to universal deformations of residual Galois representations associated with modular forms of weight at least 2.
Abstract:
This study presents a classification criterion for two-class Cannabis seedlings. As the cultivation of drug-type cannabis is forbidden in Switzerland, law enforcement authorities regularly ask laboratories to determine a cannabis plant's chemotype from seized material in order to ascertain whether the plantation is legal or not. In this study, the classification analysis is based on data obtained from the relative proportions of three major leaf compounds measured by gas chromatography interfaced with mass spectrometry (GC-MS). The aim is to discriminate between drug-type (illegal) and fiber-type (legal) cannabis at an early stage of growth. A Bayesian procedure is proposed: a Bayes factor is computed and classification is performed on the basis of the decision maker's specifications (i.e., prior probability distributions on cannabis type and consequences of classification measured by losses). Classification rates are computed with two statistical models and the results are compared. A sensitivity analysis is then performed to assess the robustness of the classification criteria.
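A hedged sketch of the decision rule described above is given below. The Gaussian class-conditional densities for a single summary feature, the default prior and the loss values are illustrative assumptions, not the models fitted in the study; only the structure of the Bayes-factor comparison is kept.

from scipy.stats import norm

def classify(x, prior_drug=0.5, loss_false_drug=1.0, loss_false_fiber=1.0):
    """Return 'drug' or 'fiber' for a measured summary feature x (assumed log THC/CBD ratio)."""
    # assumed class-conditional densities of the summary feature (illustrative)
    like_drug = norm.pdf(x, loc=1.5, scale=0.5)
    like_fiber = norm.pdf(x, loc=-1.5, scale=0.5)
    bayes_factor = like_drug / like_fiber
    prior_odds = prior_drug / (1.0 - prior_drug)
    posterior_odds = bayes_factor * prior_odds
    # decide 'drug' when the posterior odds exceed the ratio of misclassification losses
    return "drug" if posterior_odds > loss_false_drug / loss_false_fiber else "fiber"

print(classify(0.8), classify(-2.0))   # -> drug fiber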
Abstract:
In their safety evaluations of bisphenol A (BPA), the U.S. Food and Drug Administration (FDA) and a counterpart in Europe, the European Food Safety Authority (EFSA), have given special prominence to two industry-funded studies that adhered to standards defined by Good Laboratory Practices (GLP). These same agencies have given much less weight in risk assessments to a large number of independently replicated non-GLP studies conducted with government funding by the leading experts in various fields of science from around the world. OBJECTIVES: We reviewed differences between industry-funded GLP studies of BPA conducted by commercial laboratories for regulatory purposes and non-GLP studies conducted in academic and government laboratories to identify hazards and molecular mechanisms mediating adverse effects. We examined the methods and results in the GLP studies that were pivotal in the draft decision of the U.S. FDA declaring BPA safe in relation to findings from studies that were competitive for U.S. National Institutes of Health (NIH) funding, peer-reviewed for publication in leading journals, subject to independent replication, but rejected by the U.S. FDA for regulatory purposes. DISCUSSION: Although the U.S. FDA and EFSA have deemed two industry-funded GLP studies of BPA to be superior to hundreds of studies funded by the U.S. NIH and NIH counterparts in other countries, the GLP studies on which the agencies based their decisions have serious conceptual and methodologic flaws. In addition, the U.S. FDA and EFSA have mistakenly assumed that GLP yields valid and reliable scientific findings (i.e., "good science"). Their rationale for favoring GLP studies over hundreds of publicly funded studies ignores the central factor in determining the reliability and validity of scientific findings, namely, independent replication, and use of the most appropriate and sensitive state-of-the-art assays, neither of which is an expectation of industry-funded GLP research. CONCLUSIONS: Public health decisions should be based on studies using appropriate protocols with appropriate controls and the most sensitive assays, not GLP. Relevant NIH-funded research using state-of-the-art techniques should play a prominent role in safety evaluations of chemicals.
Abstract:
The second differential of the entropy is used for analysing the stability of a thermodynamic climatic model. A delay time for the heat flux is introduced, whereby it becomes an independent variable. Two different expressions for the second differential of the entropy are used: one follows classical irreversible thermodynamics theory; the second is related to the introduction of the response time and is due to extended irreversible thermodynamics theory. The second differential of the classical entropy leads to unstable solutions for high values of the delay time. The extended expression always implies stable states for an ice-free earth. When the ice-albedo feedback is included, a discontinuous distribution of stable states is found for high response times. Following the thermodynamic analysis of the model, the maximum rates of entropy production at the steady state are obtained. A latitudinally isothermal earth produces the extremum in global entropy production. The material contribution to entropy production (by which we mean the production of entropy by the material transport of heat) is a maximum when the latitudinal distribution of temperatures becomes less homogeneous than present values.
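For orientation, the classical stability criterion alluded to above can be stated as follows (Glansdorff-Prigogine form; a hedged summary, not the paper's exact expressions): around a steady state the second differential of the entropy satisfies δ²S ≤ 0, and the state is stable when d(δ²S)/dt ≥ 0, so that δ²S plays the role of a Lyapunov function; in the extended irreversible thermodynamics version, the entropy on which δ²S is evaluated also depends on the dissipative fluxes, here the delayed heat flux.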