1000 results for balanced testing
Abstract:
A novel optical add-drop multiplexer (OADM) based on the Mach-Zehnder interferometer (MZI) and the fiber Bragg grating (FBG) is proposed, for the first time to the authors' knowledge. In the structure, the Mach-Zehnder interferometer acts as an optical switch. The principle of the OADM is analyzed in this paper. The OADM can add/drop one of the multi-input channels or pass the channel through directly by adjusting the path-length difference between the two arms of the interferometer. The channel isolation is more than 20 dB.
Abstract:
The motivation for the present work stems from a project sanctioned by TSRO. The work involved the development of a quick and reliable test procedure using microwaves for the inspection of cured propellant samples, and of a method to monitor the curing conditions of a propellant mix undergoing the curing process. Normal testing of propellant samples involves cutting a piece from each carton and testing it for tensile strength. The values are then compared with standard ones, and on this basis the sample is accepted or rejected. The tensile strength is a measure of the degree of cure of the propellant mix. But this measurement is a destructive procedure, as it involves cutting the sample. Moreover, it does not guard against non-uniform curing due to power failure, hot air-line failure, operator error, etc. This necessitated the development of a quick and reliable non-destructive test procedure.
Abstract:
A graph G is strongly distance-balanced if for every edge uv of G and every i ≥ 0 the number of vertices x with d(x, u) = d(x, v) − 1 = i equals the number of vertices y with d(y, v) = d(y, u) − 1 = i. It is proved that the strong product of graphs is strongly distance-balanced if and only if both factors are strongly distance-balanced. It is also proved that connected components of the direct product of two bipartite graphs are strongly distance-balanced if and only if both factors are strongly distance-balanced. Additionally, a new characterization of distance-balanced graphs and an algorithm of time complexity O(mn) for their recognition, where m is the number of edges and n the number of vertices of the graph in question, are given.
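As an illustration of the definitions above, here is a minimal Python sketch (not the paper's O(mn) algorithm; the adjacency-dict representation, vertex labels, and example graphs are my own) that checks the weaker distance-balanced property via BFS distances, counting for each edge uv the vertices strictly closer to u versus strictly closer to v:

```python
from collections import deque

def bfs_dist(adj, s):
    # BFS distances from s in an unweighted, connected graph
    # given as a dict {vertex: list of neighbours}.
    dist = {s: 0}
    q = deque([s])
    while q:
        x = q.popleft()
        for y in adj[x]:
            if y not in dist:
                dist[y] = dist[x] + 1
                q.append(y)
    return dist

def is_distance_balanced(adj):
    # G is distance-balanced iff for every edge uv:
    # #{x : d(x,u) < d(x,v)} == #{x : d(x,v) < d(x,u)}.
    dist = {v: bfs_dist(adj, v) for v in adj}
    for u in adj:
        for v in adj[u]:
            if u < v:  # visit each edge once
                closer_u = sum(1 for x in adj if dist[x][u] < dist[x][v])
                closer_v = sum(1 for x in adj if dist[x][v] < dist[x][u])
                if closer_u != closer_v:
                    return False
    return True

# The 4-cycle is distance-balanced; the path on 3 vertices is not.
c4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
p3 = {0: [1], 1: [0, 2], 2: [1]}
print(is_distance_balanced(c4), is_distance_balanced(p3))  # True False
```

The brute-force check runs one BFS per vertex and then scans every edge, so it is O(nm) in the worst case as well, but without the bookkeeping that makes the paper's recognition algorithm efficient in practice.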
Abstract:
Given a graph G and a set X ⊆ V(G), the relative Wiener index of X in G is defined as W_X(G) = ∑_{{u,v} ∈ (X choose 2)} d_G(u, v). The graphs G (of even order) in which for every partition V(G) = V1 ∪ V2 of the vertex set V(G) such that |V1| = |V2| we have W_{V1}(G) = W_{V2}(G) are called equal opportunity graphs. In this note we prove that a graph G of even order is an equal opportunity graph if and only if it is a distance-balanced graph. The latter graphs are known by several characteristic properties; for instance, they are precisely the graphs G in which all vertices u ∈ V(G) have the same total distance D_G(u) = ∑_{v ∈ V(G)} d_G(u, v). Some related problems are posed along the way, and the so-called Wiener game is introduced.
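The relative Wiener index lends itself to a short computational illustration. The sketch below (graph representation and example are mine; it assumes an unweighted, connected graph) computes W_X(G) by BFS and checks one balanced bipartition of the 6-cycle, consistent with even cycles being distance-balanced and hence equal opportunity graphs:

```python
from collections import deque
from itertools import combinations

def bfs_dist(adj, s):
    # BFS distances from s; adj maps each vertex to its neighbour list.
    dist = {s: 0}
    q = deque([s])
    while q:
        x = q.popleft()
        for y in adj[x]:
            if y not in dist:
                dist[y] = dist[x] + 1
                q.append(y)
    return dist

def relative_wiener(adj, X):
    # W_X(G): sum of d_G(u, v) over all 2-element subsets {u, v} of X.
    return sum(bfs_dist(adj, u)[v] for u, v in combinations(sorted(X), 2))

# C6, split into two antipodal halves of equal size.
c6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
print(relative_wiener(c6, {0, 1, 2}), relative_wiener(c6, {3, 4, 5}))  # 4 4
```

An equal opportunity graph must pass this check for *every* balanced bipartition; the example verifies only one of them.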
Abstract:
Directions arise as observations in a number of situations. The first inferential question to answer when dealing with such data is, "Are they isotropic, or uniformly distributed?" The answer to this question goes back in history, which we retrace a bit before providing an exact and an approximate solution to the so-called "Pearson's random walk" problem.
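One standard answer to the isotropy question for circular data is the Rayleigh test. The following is a minimal sketch (my own illustration; the exp(−Z) tail is the classical large-sample approximation, not the exact solution the abstract discusses):

```python
import math

def rayleigh_test(angles):
    # Rayleigh test of uniformity for directions on the circle.
    # Under H0 (isotropy), Z = n * R_bar^2 is asymptotically Exp(1),
    # so p ~ exp(-Z) for large samples.
    n = len(angles)
    c = sum(math.cos(a) for a in angles) / n
    s = sum(math.sin(a) for a in angles) / n
    z = n * (c * c + s * s)
    return z, math.exp(-z)

# Tightly clustered directions: large Z, tiny p, isotropy rejected.
z, p = rayleigh_test([0.1 * k for k in range(-5, 6)])
print(p < 0.001)  # True
```

Here R_bar is the length of the mean resultant vector of the unit steps, the same quantity at the heart of Pearson's random walk.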
Abstract:
Relativistic density functional theory is widely applied in molecular calculations with heavy atoms, where relativistic and correlation effects are treated on the same footing. Variational stability of the Dirac Hamiltonian has been an important field of research since the beginning of relativistic molecular calculations, alongside efforts towards accuracy, efficiency, and density functional formulation. One- or two-component approximations and the search for suitable basis sets are the two major means of obtaining good projection power against the negative continuum. The minimax two-component spinor linear combination of atomic orbitals (LCAO) is applied in the present work to both light and super-heavy one-electron systems, providing good approximations over the whole energy spectrum: the results are close to the benchmark minimax finite element method (FEM) values and free of the spurious and contaminated states that appear in the traditional four-component spinor LCAO. Variational stability assures that the minimax LCAO is bounded from below. New balanced basis sets, kinetic and potential defect balanced (TVDB), following the minimax idea, are applied with the Dirac Hamiltonian. Their performance on the same super-heavy one-electron quasi-molecules also shows very good projection capability against variational collapse, with the minimax LCAO taken as the best projection for comparison. The TVDB method has twice as many basis coefficients as the four-component spinor LCAO; it is now linear and overcomes the disadvantage of the great time consumption of the minimax method. Calculations with both the TVDB method and the traditional LCAO method for the dimers of the group-11 elements of the periodic table investigate their difference. Basis sets larger than in previous research are constructed, achieving high accuracy within the functionals involved.
Their difference in total energy is much smaller than the basis-incompleteness error, showing that the traditional four-spinor LCAO retains enough projection power from the numerical atomic orbitals and is suitable for research in relativistic quantum chemistry. In scattering investigations carried out for the same comparison purpose, the failure of the traditional LCAO method to provide a stable spectrum with increasing basis-set size is contrasted with the TVDB method, which contains no spurious states even without pre-orthogonalization of the basis sets. Keeping the same conditions, including the accuracy of the matrix elements, shows that the variational instability prevails over the linear dependence of the basis sets. The success of the TVDB method demonstrates its capability not only in relativistic quantum chemistry but also for scattering problems and under the influence of strong external electric and magnetic fields. The good accuracy in total energy with large basis sets and the good projection property encourage wider research on different molecules, with better functionals, and on small effects.
Abstract:
In standard multivariate statistical analysis, common hypotheses of interest concern changes in mean vectors and subvectors. In compositional data analysis it is now well established that compositional change is most readily described in terms of the simplicial operation of perturbation, and that subcompositions replace the marginal concept of subvectors. To motivate the statistical developments of this paper we present two challenging compositional problems from food production processes. Against this background the relevance of perturbations and subcompositions can be clearly seen. Moreover, we can identify a number of hypotheses of interest involving the specification of particular perturbations or differences between perturbations, as well as hypotheses of subcompositional stability. We identify the two problems as the counterparts, in the jargon of standard multivariate analysis, of paired-comparison or split-plot experiments and of separate-sample comparative experiments. We then develop appropriate estimation and testing procedures for a complete lattice of relevant compositional hypotheses.
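For readers unfamiliar with the simplicial operation of perturbation, a minimal illustration (the closure/perturb helpers are a textbook-style sketch of the standard definitions, not the paper's code):

```python
def closure(x):
    # Rescale a vector of positive parts so its components sum to 1.
    s = sum(x)
    return [xi / s for xi in x]

def perturb(x, p):
    # Simplicial perturbation x (+) p: closure of the componentwise product.
    return closure([xi * pi for xi, pi in zip(x, p)])

# Perturbing a 3-part composition, doubling the weight of the first part
# (numbers invented for illustration):
print(perturb([0.2, 0.3, 0.5], [2.0, 1.0, 1.0]))
```

Perturbation plays the role on the simplex that translation plays in ordinary Euclidean space, which is why compositional change between two states is naturally expressed as a perturbation.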
Abstract:
The main instrument used in psychological measurement is the self-report questionnaire. One of its major drawbacks, however, is its susceptibility to response biases. A known strategy to control these biases has been the use of so-called ipsative items. Ipsative items require the respondent to make between-scale comparisons within each item; the selected option determines to which scale the weight of the answer is attributed. Consequently, in questionnaires consisting only of ipsative items, every respondent is allotted an equal amount, i.e. the total score, which each can distribute differently over the scales. This type of response format therefore yields data that are compositional from their inception. Methodologically oriented psychologists have heavily criticized this item format, since the resulting data are also marked by the associated unfavourable statistical properties. Nevertheless, clinicians have kept using these questionnaires to their satisfaction. This investigation therefore aims to evaluate both positions and addresses the similarities and differences between the two data collection methods. The ultimate objective is to formulate a guideline on when to use which type of item format. The comparison is based on data obtained with both an ipsative and a normative version of three psychological questionnaires, administered to 502 first-year students in psychology according to a balanced within-subjects design. Previous research only compared the direct ipsative scale scores with the derived ipsative scale scores. The use of compositional data analysis techniques also enables one to compare derived normative score ratios with direct normative score ratios. The addition of this second comparison not only offers the advantage of a better-balanced research strategy; in principle it also allows for parametric testing in the evaluation.
Abstract:
Abstract taken from the publication.
Abstract:
Abstract taken from the publication.
Abstract:
Abstract taken from the publication.
Abstract:
I test for the presence of hidden information and hidden action in the automobile insurance market using a data set from several Colombian insurers. To identify the presence of hidden information, I use a common-knowledge variable that provides information on the policyholder's risk type, is related to both experienced risk and insurance demand, and was excluded from the pricing mechanism. This unused variable is the record of the policyholder's traffic offenses. I find evidence of adverse selection in six of the nine insurance companies for which the test is performed. On the hidden-action side, I develop a dynamic model of effort in accident prevention given an insurance contract with a bonus experience-rating scheme, and I show that the individual accident probability decreases with previous accidents. This result yields a testable implication for the empirical identification of hidden action, and based on it I estimate an econometric model of the time spans between the purchase of the insurance and the first claim, between the first claim and the second one, and so on. I find strong evidence of unobserved heterogeneity that obscures the testable implication. Once the unobserved heterogeneity is controlled for, I find conclusive statistical grounds supporting the presence of moral hazard in the Colombian insurance market.
Abstract:
PowerPoint slides for hypothesis testing. The examples are taken from the medical literature.
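As a flavour of the kind of test such slides typically cover, here is a hand-rolled Welch two-sample t statistic on hypothetical blood-pressure data (all numbers invented for illustration; the formula is the standard Welch–Satterthwaite one):

```python
import math
from statistics import mean, variance

def welch_t(x, y):
    # Welch's two-sample t statistic and Welch-Satterthwaite
    # degrees of freedom (unequal variances allowed).
    nx, ny = len(x), len(y)
    vx, vy = variance(x) / nx, variance(y) / ny
    t = (mean(x) - mean(y)) / math.sqrt(vx + vy)
    df = (vx + vy) ** 2 / (vx ** 2 / (nx - 1) + vy ** 2 / (ny - 1))
    return t, df

# Hypothetical systolic blood pressure (mmHg), treatment vs control.
treated = [118, 122, 120, 115, 119, 121]
control = [128, 131, 126, 130, 127, 129]
t, df = welch_t(treated, control)
print(round(t, 2), round(df, 1))  # -7.35 9.3
```

The large negative t on so few degrees of freedom corresponds to a very small p-value, i.e. strong evidence of a treatment effect in this toy example.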