16 results for confidence level

at Indian Institute of Science - Bangalore - India


Relevance: 70.00%

Abstract:

The questions one should answer in engineering computations - deterministic, probabilistic/randomized, as well as heuristic - are (i) how good the computed results/outputs are and (ii) what the cost is, in terms of the amount of computation and the amount of storage used to obtain them. The absolutely error-free quantities, as well as the completely errorless computations done in a natural process, can never be captured by any means at our disposal. While the computations in nature/natural processes, including the input real quantities, are exact, the computations we perform on a digital computer or in embedded form are never exact. The input data for such computations are also never exact, because any measuring instrument has an inherent error of a fixed order associated with it, and this error, as a matter of hypothesis and not of assumption, is not less than 0.005 per cent. Here, by error we mean relative error bounds. The fact that the exact error is never known under any circumstances and in any context implies that the term error is nothing but error bounds. Further, in engineering computations, it is the relative error or, equivalently, the relative error bounds (and not the absolute error) that is supremely important in providing information about the quality of the results/outputs. Another important fact is that inconsistency and/or near-inconsistency in nature, i.e., in problems created from nature, is completely nonexistent, while in our modelling of natural problems we may introduce inconsistency or near-inconsistency due to human error, due to the inherent non-removable error associated with any measuring device, or due to assumptions introduced to make the problem solvable or more easily solvable in practice. Thus, if we discover any inconsistency, or possibly any near-inconsistency, in a mathematical model, it is certainly due to any or all of the three foregoing factors. We do, however, go ahead and solve such inconsistent/near-inconsistent problems and obtain results that can be useful in real-world situations. The talk considers several deterministic, probabilistic, and heuristic algorithms in numerical optimisation, other numerical and statistical computations, and PAC (probably approximately correct) learning models. It highlights the quality of the results/outputs by specifying relative error bounds along with the associated confidence level, and the cost, viz. the amount of computation and of storage, through complexity. It points out the limitations of error-free computation (wherever possible, i.e., where the number of arithmetic operations is finite and known a priori) as well as of the use of interval arithmetic. Further, the interdependence among the error, the confidence, and the cost is discussed.
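As a minimal illustration of the distinction the abstract draws between absolute and relative error bounds (the only numerical value used is the 0.005 per cent figure quoted above):

```latex
% Relative vs. absolute error bound: if x is the exact value and \tilde{x} its
% computed or measured approximation, the relative error bound \delta satisfies
\[
  \frac{|\tilde{x} - x|}{|x|} \;\le\; \delta ,
  \qquad \text{whereas an absolute error bound is } \; |\tilde{x} - x| \le \epsilon .
\]
% The hypothesis stated above, that the inherent measurement error is never
% below 0.005 per cent, corresponds to
\[
  \delta \;\ge\; 0.005\% \;=\; 5 \times 10^{-5}.
\]
```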

Relevance: 70.00%

Abstract:

Biocards are formal descriptions of biological phenomena and their underlying functional principles. They are used in bio-inspired design (BID) to document search results and to communicate the findings for use in the further design process. The present study explored the effect of the abstraction level used in biocards. This was done in two workshops conducted with design students in Denmark and India. Students were given a design assignment and instructions on how to perform the BID ideation work. Half of the students were given biocards with abstract descriptions, while the other half received biocards with concrete descriptions. The novelty of the solutions found was evaluated by the students, who rated each solution on a scale from 1 to 5. Mean values for abstract descriptions were 0.3 higher than for concrete descriptions, indicating that more innovative solutions were found when students used biocards with abstract rather than concrete descriptions. The difference in mean values is significant at a confidence level better than 1%. It seems likely that more abstract descriptions in biocards help avoid design fixation in biomimetic design work.
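A minimal sketch of how such a comparison of mean novelty ratings could be tested, assuming an unpaired two-sample t-test (the abstract does not state which test was used, and the rating arrays below are placeholders, not the study's data):

```python
# Hypothetical significance check: do abstract-description biocards yield higher
# mean novelty ratings (1-5 scale) than concrete-description biocards?
# NOTE: both the choice of test and the data are illustrative assumptions only.
import numpy as np
from scipy import stats

abstract_ratings = np.array([4, 3, 5, 4, 4, 3, 5, 4])   # placeholder ratings
concrete_ratings = np.array([3, 3, 4, 3, 4, 3, 4, 3])   # placeholder ratings

t_stat, p_value = stats.ttest_ind(abstract_ratings, concrete_ratings)
print(f"mean difference: {abstract_ratings.mean() - concrete_ratings.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p < 0.01 would match the reported level
```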

Relevance: 60.00%

Abstract:

We utilize top polarization in the process e+ e- -> t tbar at the International Linear Collider (ILC) with transverse beam polarization to probe interactions of the scalar and tensor type beyond the standard model and to disentangle their individual contributions. Ninety percent confidence level limits on the interactions with realistic integrated luminosity are presented and are found to improve by an order of magnitude compared to the case when the spin of the top quark is not measured. Sensitivities of the order of a few times 10^-3 TeV^-2 for the real and imaginary parts of both scalar and tensor couplings at sqrt(s) = 500 and 800 GeV, with an integrated luminosity of 500 fb^-1 and completely polarized beams, are shown to be possible. A powerful model-independent framework for inclusive measurements is employed to describe the spin-momentum correlations, and their C, P, and T properties are presented in a technical appendix.

Relevance: 60.00%

Abstract:

This work describes an online handwritten character recognition system working in combination with an offline recognition system. The online input data are also converted into an offline image and recognized in parallel by both online and offline strategies. Features are proposed for offline recognition, and a disambiguation step is employed in the offline system for samples for which the confidence level of the classifier is low. The outputs are then combined probabilistically, resulting in a classifier that outperforms both individual systems. Experiments are performed for Kannada, a South Indian language, over a database of 295 classes. The accuracy of the online recognizer improves by 11% when the combination with the offline system is used.
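A minimal sketch of one way two classifiers' outputs can be combined probabilistically, assuming a simple weighted product rule over posterior probabilities (the paper does not specify its exact combination formula; the function name, the weight alpha, and the random inputs are illustrative only):

```python
# Hypothetical fusion of online and offline classifier outputs over 295 classes.
# p_online and p_offline are posterior probability vectors; the product rule and
# the weight `alpha` are assumptions for illustration, not the paper's method.
import numpy as np

def combine_posteriors(p_online: np.ndarray, p_offline: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Weighted product-rule fusion of two posterior distributions."""
    fused = (p_online ** alpha) * (p_offline ** (1.0 - alpha))
    return fused / fused.sum()   # renormalize to a valid probability distribution

# Example with two random posteriors over 295 classes
rng = np.random.default_rng(0)
p_on = rng.dirichlet(np.ones(295))
p_off = rng.dirichlet(np.ones(295))
predicted_class = int(np.argmax(combine_posteriors(p_on, p_off)))
print("predicted class index:", predicted_class)
```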

Relevance: 60.00%

Abstract:

We address the problem of allocating a single divisible good to a number of agents. The agents have concave valuation functions parameterized by a scalar type. The agents report only the type. The goal is to find allocatively efficient, strategy-proof, nearly budget-balanced mechanisms within the Groves class. Near budget balance is attained by returning as much of the received payments as rebates to the agents. Two performance criteria are of interest: the maximum ratio of budget surplus to efficient surplus, and the expected budget surplus, within the class of linear rebate functions. The goal is to minimize them. Assuming that the valuation functions are known, we show that both problems reduce to convex optimization problems, where the convex constraint sets are characterized by a continuum of half-plane constraints parameterized by the vector of reported types. We then propose a randomized relaxation of these problems by sampling constraints. The relaxed problem is a linear programming problem (LP). We then identify the number of samples needed for "near-feasibility" of the relaxed constraint set. Under some conditions on the valuation function, we show that the value of the approximate LP is close to the optimal value. Simulation results show significant improvements of our proposed method over the Vickrey-Clarke-Groves (VCG) mechanism without rebates. In the special case of indivisible goods, the mechanisms in this paper fall back to those proposed by Moulin, by Guo and Conitzer, and by Gujar and Narahari, without any need for randomization. Extensions of the proposed mechanisms to situations in which the valuation functions are not known to the central planner are also discussed. Note to Practitioners: Our results will be useful in all resource allocation problems that involve gathering information privately held by strategic users, where the utilities are any concave function of the allocations, and where the resource planner is interested not in maximizing revenue but in efficient sharing of the resource. Such situations arise quite often in fair sharing of internet resources, fair sharing of funds across departments within the same parent organization, auctioning of public goods, etc. We study methods to achieve near budget balance by first collecting payments according to the celebrated VCG mechanism and then returning as much of the collected money as rebates. Our focus on linear rebate functions allows for easy implementation. The resulting convex optimization problem is solved via relaxation to a randomized linear programming problem, for which several efficient solvers exist. This relaxation is enabled by constraint sampling. Keeping practitioners in mind, we identify the number of samples that assures a desired level of "near-feasibility" with the desired confidence level. Our methodology will occasionally require a subsidy from outside the system. We demonstrate via simulation, however, that if the mechanism is repeated several times over independent instances, then past surplus can support the subsidy requirements. We also extend our results to situations where the strategic users' utility functions are not known to the allocating entity, a common situation in the context of internet users and other problems.
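A minimal sketch of the constraint-sampling idea described above: the continuum of type-parameterized feasibility constraints is replaced by constraints at K sampled type vectors, and the result is solved as an LP. The payment model, the rebate parameterization, and all numbers below are simplified placeholders for illustration, not the paper's mechanism:

```python
# Sketch: sample K reported-type vectors, impose feasibility (total rebate <= total
# payment) at each sample, and minimize an upper bound t on the sampled budget surplus.
# Rebate to agent i is taken as linear in the reports: r_i = c0 + sum_{j != i} c_j * theta_j.
# The "payments" here are a placeholder standing in for the true VCG payments.
import numpy as np
from scipy.optimize import linprog

n_agents = 4
K = 200                                              # number of sampled type vectors
rng = np.random.default_rng(1)
thetas = rng.uniform(0.0, 1.0, size=(K, n_agents))   # sampled reported types

def payments(theta):
    """Placeholder payment rule (illustration only, not VCG)."""
    return theta

# Decision variables x = [c0, c_1..c_n, t]; minimize t.
n_vars = n_agents + 2
c_obj = np.zeros(n_vars); c_obj[-1] = 1.0

A_ub, b_ub = [], []
for theta in thetas:
    pay_total = payments(theta).sum()
    # Total rebate as a linear function of (c0, c): n*c0 + (n-1) * sum_j c_j * theta_j
    row_rebate = np.zeros(n_vars)
    row_rebate[0] = n_agents
    row_rebate[1:1 + n_agents] = (n_agents - 1) * theta
    # Feasibility: total rebate <= total payment
    A_ub.append(row_rebate.copy()); b_ub.append(pay_total)
    # Surplus bound: pay_total - total rebate <= t   <=>   -rebate - t <= -pay_total
    row_t = -row_rebate; row_t[-1] = -1.0
    A_ub.append(row_t); b_ub.append(-pay_total)

res = linprog(c_obj, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(None, None)] * n_vars)
print("worst sampled budget surplus bound:", res.x[-1])
```

The sampled LP only guarantees the constraints at the drawn type vectors; the abstract's "near-feasibility" result is precisely about how many samples are needed so that the remaining violation is small with a prescribed confidence level.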

Relevance: 60.00%

Abstract:

Two new statistics, namely Delta chi^2 and Delta chi, based on extreme value theory, were derived by Gupta et al. We use these statistics to study the direction dependence in the HST Key Project data, which provides one of the most precise measurements of the Hubble constant. We also study the non-Gaussianity in this data set using these statistics. Our results for Delta chi^2 show that the significance of direction-dependent systematics is restricted to well below the 1 sigma confidence limit; however, the presence of non-Gaussian features is subtle. On the other hand, the Delta chi statistic, which is more sensitive to direction dependence, shows the direction-dependent systematics to be at a slightly higher confidence level, and the presence of non-Gaussian features at a level similar to the Delta chi^2 statistic.

Relevance: 60.00%

Abstract:

We revisit the process e+ e- -> gamma Z at the ILC with transverse beam polarization in the presence of the anomalous CP-violating gamma ZZ coupling lambda_1 and gamma gamma Z coupling lambda_2. We point out that if the final-state spins are resolved, then it becomes possible to fingerprint the anomalous coupling Re lambda_1. The 90% confidence level limit on Re lambda_1 achievable at the ILC with a center-of-mass energy of 500 GeV or 800 GeV, with realistic initial beam polarization and integrated luminosity, is of the order of a few times 10^-2 when the helicity of the Z is used and 10^-3 when the helicity of the gamma is used. The resulting corrections at quadratic order to the cross section and their influence on these limits are also evaluated and are shown to be small. The benefits of such polarization programmes at the ILC are compared and contrasted for the process at hand. We also discuss possible methods by which one can isolate events with a definite helicity for one of the final-state particles.

Relevance: 60.00%

Abstract:

Detecting and quantifying the presence of human-induced climate change in regional hydrology is important for studying the impacts of such changes on water resources systems, as well as for reliable future projections and policy making for adaptation. In this article, a formal fingerprint-based detection and attribution analysis has been attempted to study the changes in the observed monsoon precipitation and streamflow in the rain-fed Mahanadi River Basin in India, considering the variability across different climate models. This is achieved through the use of observations, several climate model runs, a principal component analysis and regression based statistical downscaling technique, and a Genetic Programming based rainfall-runoff model. It is found that the decreases in the observed hydrological variables across the second half of the 20th century lie outside the range expected from natural internal variability of climate alone at the 95% statistical confidence level, for most of the climate models considered. For several climate models, such changes are consistent with those expected from anthropogenic emissions of greenhouse gases. However, unequivocal attribution to human-induced climate change cannot be claimed across all the climate models, and uncertainties in our detection procedure, arising from various sources including the use of models, cannot be ruled out. Changes in solar irradiance and volcanic activity are considered as other plausible natural external causes of climate change. The time evolution of the anthropogenic climate change "signal" in the hydrological observations, above the natural internal climate variability "noise", shows that detection of the signal is achieved earlier in streamflow than in precipitation for most of the climate models, suggesting larger impacts of human-induced climate change on streamflow than on precipitation at the river basin scale.
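A minimal sketch of the detection step described above, assuming the common approach of comparing an observed change against the spread of changes produced by natural internal variability alone (e.g., control-run segments); all series, lengths, and thresholds below are synthetic placeholders, not the study's data or its exact procedure:

```python
# Hypothetical fingerprint-style detection: is the observed trend outside the
# 95% range expected from natural internal variability alone?
import numpy as np

rng = np.random.default_rng(42)

def linear_trend(series: np.ndarray) -> float:
    """Least-squares slope of a time series (units per time step)."""
    t = np.arange(series.size)
    return np.polyfit(t, series, 1)[0]

# Observed streamflow anomaly series (synthetic placeholder, 50 "years")
observed = -0.02 * np.arange(50) + rng.normal(0.0, 0.5, 50)

# Trends from many segments representing internal variability only (placeholder noise)
control_trends = np.array([linear_trend(rng.normal(0.0, 0.5, 50)) for _ in range(1000)])

obs_trend = linear_trend(observed)
lo, hi = np.percentile(control_trends, [2.5, 97.5])   # 95% natural-variability range
detected = obs_trend < lo or obs_trend > hi
print(f"observed trend = {obs_trend:.4f}, 95% natural range = [{lo:.4f}, {hi:.4f}], detected = {detected}")
```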

Relevance: 60.00%

Abstract:

A generalized top-spin analysis, proposed some time ago in the context of the standard model and subsequently studied in varying contexts, is now applied primarily to the case of e+ e- -> t tbar with transversely polarized beams. This extends our recent work with new physics couplings of scalar (S) and tensor (T) type. We carry out a comprehensive analysis assuming only the electron beam to be transversely polarized, which is sufficient to probe these interactions and also eliminates any azimuthal angular dependence due to standard model or new physics interactions of the vector (V) and axial-vector (A) type. We then consider new physics of the general four-Fermi type, of V and A type, with both beams transversely polarized, and discuss implications with longitudinal polarization as well. The generalized spin bases are all investigated in the presence of either longitudinal or transverse beam polarization to look for appreciable deviations from the SM prediction in the case of new physics. 90% confidence level limits are obtained on the interactions for the generalized spin bases with realistic integrated luminosity. In order to achieve this, we present a general discussion based on helicity amplitudes and derive a general transformation matrix that enables us to treat the spin basis. We find that the beamline basis combined with transverse polarization provides an excellent window of opportunity for both S, T and V, A new physics, followed by the off-diagonal basis. The helicity basis is shown to be the best in the case of longitudinal polarization for new physics effects due to V and A. DOI: 10.1103/PhysRevD.86.114019

Relevance: 60.00%

Abstract:

The top polarization at the International Linear Collider (ILC) with transverse beam polarization is utilized in the process e+ e- -> t tbar to probe interactions of the scalar and tensor type beyond the Standard Model and to disentangle their individual contributions. Confidence level limits of 90% on the interactions, with realistic integrated luminosity, are presented and are found to improve by an order of magnitude compared to the case when the spin of the top quark is not measured. Sensitivities of the order of a few times 10^-3 TeV^-2 for the real and imaginary parts of both scalar and tensor couplings at sqrt(s) = 500 and 800 GeV, with an integrated luminosity of 500 fb^-1 and completely polarized beams, are shown to be possible.

Relevance: 60.00%

Abstract:

Thiolases are enzymes involved in lipid metabolism. Thiolases remove the acetyl-CoA moiety from 3-ketoacyl-CoAs in the degradative reaction. They can also catalyze the reverse Claisen condensation reaction, which is the first step of biosynthetic processes such as the biosynthesis of sterols and ketone bodies. In humans, six distinct thiolases have been identified. Each of these thiolases differs from the others with respect to sequence, oligomeric state, substrate specificity and subcellular localization. Four sequence fingerprints, identifying the catalytic loops of thiolases, have been described. In this study, genome searches of two mycobacterial species (Mycobacterium tuberculosis and Mycobacterium smegmatis) were carried out, using the six human thiolase sequences as queries. Eight and thirteen different thiolase sequences were identified in M. tuberculosis and M. smegmatis, respectively. In addition, thiolase-like proteins (one encoded in the M. tuberculosis genome and two in the M. smegmatis genome) were found. The purpose of this study is to classify these mostly uncharacterized thiolases and thiolase-like proteins. Several other sequences obtained by searches of genome databases of bacteria, mammals and the parasitic protist family Trypanosomatidae were included in the analysis. Thiolase-like proteins were also found in the trypanosomatid genomes, but not in those of mammals. In order to study the phylogenetic relationships at a high confidence level, additional thiolase sequences were included, such that a total of 130 thiolase and thiolase-like protein sequences were used for the multiple sequence alignment. The resulting phylogenetic tree identifies 12 classes of sequences, each possessing a characteristic set of sequence fingerprints for the catalytic loops. From this analysis it is now possible to assign the mycobacterial thiolases to corresponding homologues in other kingdoms of life. The results of this bioinformatics analysis also show interesting differences between the distributions of the M. tuberculosis and M. smegmatis thiolases over the 12 different classes.
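A minimal sketch of the tree-building step of such an analysis, assuming a Biopython distance/neighbour-joining workflow purely for illustration (the paper does not state which alignment or tree software was used; the input file name is a placeholder):

```python
# Hypothetical workflow: build a neighbour-joining tree from a multiple sequence
# alignment of thiolase and thiolase-like protein sequences.
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

# Placeholder file: an aligned FASTA with the thiolase sequences of interest
alignment = AlignIO.read("thiolases_aligned.fasta", "fasta")

calculator = DistanceCalculator("blosum62")        # pairwise distances from the alignment
distance_matrix = calculator.get_distance(alignment)

constructor = DistanceTreeConstructor()
tree = constructor.nj(distance_matrix)             # neighbour-joining tree

Phylo.draw_ascii(tree)                             # quick text rendering of the topology
```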

Relevance: 60.00%

Abstract:

The present work presents the results of an experimental investigation of semi-solid rheocasting of A356 Al alloy using a cooling slope. The experiments were carried out following the Taguchi method of parameter design (an L9 orthogonal array of experiments). Four key process variables (slope angle, pouring temperature, wall temperature, and length of travel of the melt) at three different levels were considered. Regression analysis and analysis of variance (ANOVA) were also performed, to develop a mathematical model for the degree of sphericity of the primary alpha-Al phase and to find the significance and percentage contribution of each process variable towards the final degree of sphericity, respectively. The best processing condition for the optimum degree of sphericity (0.83) was identified as A3, B3, C2, D1, i.e., a slope angle of 60 degrees, a pouring temperature of 650 degrees C, a wall temperature of 60 degrees C, and a 500 mm length of travel of the melt, based on the mean response and the signal-to-noise ratio (SNR). The ANOVA results show that the length of travel has the greatest impact on the degree of sphericity. The sphericity predicted by the developed regression model and the values obtained experimentally are found to be in good agreement. The sphericity values obtained from the confirmation experiment, performed at the 95% confidence level, confirm that the optimum result is correct and lie within permissible limits.
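A minimal sketch of the Taguchi analysis step, assuming the standard "larger-is-better" signal-to-noise ratio is appropriate for sphericity (the sphericity values below are placeholders, not the paper's measurements, and the factor-level coding is illustrative):

```python
# Hypothetical Taguchi L9 analysis: compute the larger-is-better S/N ratio per run
# and the mean S/N ratio at each level of each factor.
import numpy as np

# L9 orthogonal array: 9 runs x 4 factors (A: slope angle, B: pouring temperature,
# C: wall temperature, D: length of travel), each at 3 levels coded 0, 1, 2.
L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])

# Placeholder degree-of-sphericity result for each run (replicates omitted)
sphericity = np.array([0.61, 0.66, 0.70, 0.68, 0.74, 0.71, 0.76, 0.79, 0.83])

# Larger-is-better S/N ratio: SNR = -10 * log10( mean(1 / y^2) )
snr = -10.0 * np.log10(1.0 / sphericity ** 2)

# Mean S/N ratio at each level of each factor; the level with the highest mean SNR
# is the preferred setting for that factor.
for f, name in enumerate(["A", "B", "C", "D"]):
    level_means = [snr[L9[:, f] == lvl].mean() for lvl in range(3)]
    best = int(np.argmax(level_means))
    print(f"factor {name}: level means = {np.round(level_means, 2)}, best level = {best + 1}")
```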

Relevance: 60.00%

Abstract:

We report the results of extensive follow-up observations of the gamma-ray pulsar J1732-3131, which has recently been detected at decametre wavelengths, and the results of deep searches for the counterparts of nine other radio-quiet gamma-ray pulsars at 34 MHz, using the Gauribidanur radio telescope. No periodic signal from J1732-3131 could be detected above a detection threshold of 8 sigma, even with an effective integration time of more than 40 h. However, the average profile obtained by combining data from several epochs, at a dispersion measure of 15.44 pc cm^-3, is found to be consistent with that from the earlier detection of this pulsar at a confidence level of 99.2 per cent. We present this consistency between the two profiles as evidence that J1732-3131 is a faint radio pulsar with an average flux density of 200-400 mJy at 34 MHz. Despite the extremely bright sky background at such low frequencies, the detection sensitivity of our deep searches is generally comparable to that of higher frequency searches for these pulsars, when scaled using reasonable assumptions about the underlying pulsar spectrum. We provide details of our deep searches and put stringent upper limits on the decametre-wavelength flux densities of several radio-quiet gamma-ray pulsars.

Relevance: 60.00%

Abstract:

Atomic force microscopy (AFM) has become a versatile tool in biology due to its ability to image biological samples at high resolution, close to their native condition. Apart from imaging, AFM can also measure the local mechanical properties of surfaces. In this study, we explore the possibility of using AFM to quantify the rough eye phenotype of Drosophila melanogaster through mechanical properties. We have measured the adhesion force, stiffness and elastic modulus of the corneal lens using AFM. Various parameters affecting these measurements, such as cantilever stiffness and tip geometry, are systematically studied and the measurement procedures are standardized. Results show that the mean adhesion force of the ommatidial surface varies between 36 nN and 16 nN depending on the location. The mean stiffness is 483 +/- 5 N/m, and the elastic modulus is 3.4 +/- 0.05 GPa (95% confidence level) at the center of the ommatidia. These properties are found to be different in the corneal lens of eyes expressing the human mutant tau gene (mutant): the adhesion force, stiffness and elastic modulus are all decreased in the mutant. We conclude that the measurement of surface and mechanical properties of D. melanogaster using AFM can be used for quantitative evaluation of the 'rough eye' surface.

Relevance: 60.00%

Abstract:

This paper considers decentralized spectrum sensing, i.e., detection of occupancy of the primary users' spectrum by a set of Cognitive Radio (CR) nodes, under a Bayesian set-up. The nodes use energy detection to make their individual decisions, which are combined at a Fusion Center (FC) using the K-out-of-N fusion rule. The channel from the primary transmitter to the CR nodes is assumed to undergo fading, while that from the nodes to the FC is assumed to be error-free. In this scenario, a novel concept termed the Error Exponent with a Confidence Level (EECL) is introduced to evaluate and compare the performance of different detection schemes. Expressions for the EECL under general fading conditions are derived. As a special case, it is shown that the conventional error exponent, both at the individual sensors and at the FC, is zero. Further, closed-form lower bounds on the EECL are derived under Rayleigh fading and lognormal shadowing. As an example application, this answers the question of whether to use pilot-signal based narrowband sensing, where the signal undergoes Rayleigh fading, or to sense over the entire bandwidth of a wideband signal, where the signal undergoes lognormal shadowing. The theoretical results are validated using Monte Carlo simulations.
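A minimal Monte Carlo sketch of the sensing setup described above: energy detection at each CR node over a Rayleigh-faded channel, followed by K-out-of-N fusion at the FC. All parameters (SNR, number of samples, threshold, K, N) are illustrative placeholders, not the paper's settings:

```python
# Hypothetical simulation of decentralized spectrum sensing with energy detection
# and K-out-of-N fusion under Rayleigh fading (miss probability under "signal present").
import numpy as np

rng = np.random.default_rng(7)

def simulate_miss_probability(n_nodes=5, k=3, snr_db=0.0, n_samples=20,
                              threshold=30.0, n_trials=20_000):
    """Probability that fewer than K of N nodes detect a present primary signal."""
    snr = 10 ** (snr_db / 10)
    misses = 0
    for _ in range(n_trials):
        # Rayleigh fading: each node's channel power gain is exponentially distributed
        gains = rng.exponential(1.0, n_nodes)
        # Received samples: faded primary signal plus unit-variance Gaussian noise
        signal = np.sqrt(snr * gains)[:, None] * rng.standard_normal((n_nodes, n_samples))
        noise = rng.standard_normal((n_nodes, n_samples))
        energy = np.sum((signal + noise) ** 2, axis=1)   # energy detector statistic per node
        local_decisions = energy > threshold             # local decision: 1 = "primary present"
        if local_decisions.sum() < k:                    # K-out-of-N fusion rule at the FC
            misses += 1
    return misses / n_trials

print("estimated miss probability:", simulate_miss_probability())
```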