50 results for Missing Covariates


Relevance:

20.00%

Abstract:

We propose a novel second-order cone programming formulation for designing robust classifiers that can handle uncertainty in observations. Similar formulations are also derived for designing regression functions that are robust to uncertainties in the regression setting. The proposed formulations are independent of the underlying distribution, requiring only the existence of second-order moments. These formulations are then specialized to the case of missing values in observations, for both classification and regression problems. Experiments show that the proposed formulations outperform imputation-based approaches.
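
A minimal sketch of this family of formulations (not the paper's exact program): a chance-constrained robust SVM whose margin constraints become second-order cone constraints, assuming each training point x_i carries an uncertainty covariance Sigma_i and a confidence parameter kappa; all names and values here are illustrative.

```python
# Sketch of a robust SVM in second-order cone form (illustrative, not the
# paper's exact formulation). Sigmas[i] is an assumed per-point covariance
# of the uncertain observation x_i; kappa sets the confidence level.
import numpy as np
import cvxpy as cp

def robust_svm(X, y, Sigmas, kappa=1.0, C=1.0):
    n, d = X.shape
    w, b = cp.Variable(d), cp.Variable()
    xi = cp.Variable(n, nonneg=True)              # slack variables
    cons = []
    for i in range(n):
        L = np.linalg.cholesky(Sigmas[i] + 1e-9 * np.eye(d))
        # Margin must survive a kappa-scaled ellipsoidal perturbation of x_i:
        cons.append(y[i] * (X[i] @ w + b) >= 1 - xi[i] + kappa * cp.norm(L.T @ w, 2))
    prob = cp.Problem(cp.Minimize(0.5 * cp.sum_squares(w) + C * cp.sum(xi)), cons)
    prob.solve()
    return w.value, b.value
```

Setting kappa = 0 recovers the ordinary soft-margin SVM; larger kappa demands that the margin hold under larger ellipsoidal perturbations, which is where the second-order moment assumption enters.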

Relevance:

20.00%

Abstract:

In this paper, we address the reconstruction problem from laterally truncated helical cone-beam projections. Although the lateral-truncation problem resembles the interior Radon problem, it differs from it, as well as from local (lambda) tomography and pseudo-local tomography, in that we aim to reconstruct the entire object being scanned from region-of-interest (ROI) scan data. The method proposed in this paper is a projection-data completion approach, followed by the use of any standard, accurate FBP-type reconstruction algorithm. In particular, we explore a windowed linear prediction (WLP) approach for data completion and compare the quality of reconstruction with the linear prediction (LP) technique proposed earlier.
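
A sketch of the data-completion idea with plain least-squares linear prediction (the paper's windowed variant is not reproduced): each laterally truncated detector row is extrapolated by an AR model fitted to its measured samples. The model order p and the lengths are illustrative choices.

```python
# Extend a truncated projection row with an AR (linear prediction) model.
import numpy as np

def lp_extrapolate(row, n_missing, p=10):
    """Fit an order-p AR model to `row` and extend it by n_missing samples."""
    n = len(row)
    # Regression row[t] ~ sum_k a[k] * row[t - 1 - k], for t = p .. n-1:
    A = np.column_stack([row[p - 1 - k : n - 1 - k] for k in range(p)])
    a, *_ = np.linalg.lstsq(A, row[p:], rcond=None)
    out = list(row)
    for _ in range(n_missing):                    # recursive extrapolation
        out.append(float(np.dot(a, out[: -p - 1 : -1])))
    return np.asarray(out)
```

The completed sinogram rows would then be fed to a standard FBP-type algorithm; the WLP variant additionally windows the predicted data so the extrapolation decays smoothly to zero.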

Relevance:

20.00%

Abstract:

The Himalayas presently hold the largest ice masses outside the polar regions and thus (temporarily) store important freshwater resources. Compared with glaciers, the role of runoff from snow cover has received little attention in the past, although (i) its contribution is thought to be at least as important as, or even more important than, that of ice melt in many Himalayan catchments, and (ii) climate change is expected to have widespread and significant consequences for snowmelt runoff. Here, we show that assessing changes in snowmelt runoff and its timing is not as straightforward as often postulated, mainly because larger partial pressures of H2O, CO2, CH4, and other greenhouse gases might increase the net long-wave input for snowmelt quite significantly in a future atmosphere. In addition, changes in the short-wave energy balance, such as pollution of the snow cover by black carbon, or the sensible or latent heat contributions to snowmelt, are likely to alter future snowmelt and runoff characteristics as well. For the assessment of snow cover extent and depletion, but also for its monitoring over the extremely large areas of the Himalayas, remote sensing has been used in the past and is likely to become even more important in the future. However, for the calibration and validation of remotely sensed data, and even more so in light of possible changes in the snow-cover energy balance, we strongly call for more in-situ measurements across the Himalayas, in particular daily data on new snow and snow cover water equivalent, or the respective energy balance components. Moreover, data should be made accessible to the scientific community, so that climate change impacts on Himalayan snow cover, and possible consequences thereof for runoff, can be estimated more accurately.

Relevance:

20.00%

Abstract:

Resonant sensors and crystal oscillators for mass detection need to be excited at very high natural frequencies (MHz). Using such systems to measure the mass of biological materials compromises measurement accuracy because of the materials' viscous and/or viscoelastic properties: the limitation of such sensor systems is the difficulty of accounting for the ``missing mass'' of the biological specimen in question. In this work, a sensor system has been developed that operates in the stiffness-controlled region, at frequencies far below its fundamental natural frequency. The resulting reduction in sensitivity due to this non-resonant mode of operation is compensated by the high resolution of the sensor. The mass of Drosophila melanogaster (fruit fly) specimens of different ages is measured, and the discrepancy in mass measured in the resonant mode of operation is also presented. That viscosity effects do not affect the working of this non-resonant mass sensor is clearly established by direct comparison.
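
One way to see the trade-off, using an idealized single-degree-of-freedom model of our own (not the paper's device model), with stiffness k, effective mass m, viscous damping c, natural frequency omega_n and damping ratio zeta:

```latex
% Receptance of a single-degree-of-freedom oscillator driven far below
% resonance (an illustrative model, not taken from the paper).
\left|\frac{X}{F}\right|
  = \frac{1}{\sqrt{\left(k - m\omega^{2}\right)^{2} + (c\omega)^{2}}}
  \approx \frac{1}{k}\left[1 + \left(1 - 2\zeta^{2}\right)\left(\frac{\omega}{\omega_{n}}\right)^{2}\right],
\qquad \omega \ll \omega_{n},\quad
\omega_{n} = \sqrt{k/m},\quad \zeta = \frac{c}{2\sqrt{km}}.
```

In this sketch an added mass enters through the (omega/omega_n)^2 term, so off-resonance sensitivity is reduced but nonzero, while viscous damping appears only through zeta^2; for lightly damped operation this is consistent with the claimed insensitivity to viscosity.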

Relevance:

20.00%

Abstract:

The electromagnetic articulography (EMA) technique is used to record the kinematics of different articulators while one speaks. EMA data often contain missing segments due to sensor failure. In this work, we propose a maximum a posteriori (MAP) estimation with a continuity constraint to recover the missing samples in articulatory trajectories recorded using EMA. This approach combines the benefits of statistical MAP estimation with the temporal continuity of articulatory trajectories. Experiments on an articulatory corpus with different missing-segment durations show that the proposed continuity constraint yields a 30% reduction in average root mean squared estimation error over statistical estimation of the missing segments without any continuity constraint.
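
A minimal sketch of MAP imputation with a continuity penalty (our illustrative variant, not the paper's exact estimator): a Gaussian prior (mu, Sigma) over the trajectory is combined with a second-difference smoothness term, and the missing entries are recovered in closed form; lam and the penalty form are assumptions.

```python
# MAP-style imputation with a continuity term (illustrative variant).
import numpy as np

def map_impute(y, missing, mu, Sigma, lam=10.0):
    """y: trajectory with unreliable values at `missing` (boolean mask)."""
    n = len(y)
    P = np.linalg.inv(Sigma + 1e-9 * np.eye(n))   # prior precision
    D = np.diff(np.eye(n), n=2, axis=0)           # second-difference operator
    Q = P + lam * D.T @ D                         # precision plus continuity term
    m, o = missing, ~missing
    x = y.astype(float).copy()
    # Minimize (x - mu)^T Q (x - mu) over missing entries, observed fixed:
    rhs = Q[np.ix_(m, o)] @ (x[o] - mu[o])
    x[m] = mu[m] - np.linalg.solve(Q[np.ix_(m, m)], rhs)
    return x
```

With lam = 0 this reduces to plain conditional-Gaussian (statistical) estimation; the D-term is what enforces smooth trajectories across the gap.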

Relevance:

20.00%

Abstract:

We explore the effect of modifications to Einstein's gravity in white dwarfs, to the best of our knowledge for the first time in the literature. This leads to significantly sub- and super-Chandrasekhar limiting masses for white dwarfs, determined by a single model parameter. Type Ia supernovae (SNeIa), a key to unraveling the evolutionary history of the universe, are believed to be triggered in white dwarfs with masses close to the Chandrasekhar limit. However, observations of several peculiar under- and over-luminous SNeIa argue for exploding masses widely different from this limit. We argue that explosions of the sub- and super-Chandrasekhar limiting-mass white dwarfs induced by modified gravity result in under- and over-luminous SNeIa, respectively, thus unifying these two apparently disjoint sub-classes and hence serving as a missing link. Our discovery raises two fundamental questions. Is the Chandrasekhar limit unique? Is Einstein's gravity the ultimate theory for understanding astronomical phenomena? Both answers appear to be no!

Relevance:

10.00%

Abstract:

This paper presents two approximate analytical expressions for the nonlinear electric field in the principal direction of axially symmetric (3D) and two-dimensional (2D) ion trap mass analysers with apertures (holes in the case of 3D traps, slits in the case of 2D traps) on the electrodes. Considered together (3D and 2D), these form composite approximations for the principal unidirectional nonlinear electric fields in such ion traps. The composite electric field E has the form E = E_noaperture + E_aperture, where E_noaperture is the field within an imagined trap that is identical to the practical trap except that the apertures are missing, and E_aperture is the field contribution due to the apertures on the two trap electrodes. The field along the principal axis of the trap can in this way be well approximated for any aperture that is not too large. To derive E_aperture, classical results of electrostatics have been extended to electrodes with finite thickness and different aperture shapes. E_noaperture is a modified truncated multipole expansion for the imagined trap with no apertures. The first several terms in the multipole expansion are in principle exact (though numerically determined using the BEM), while the last term is chosen to match the field at the electrode. This expansion, once computed, works with any aperture in the practical trap. The composite field approximation for axially symmetric (3D) traps is checked for three geometries: the Paul trap, the cylindrical ion trap (CIT), and an arbitrary other trap. The approximation for 2D traps is verified using two geometries: the linear ion trap (LIT) and the rectilinear ion trap (RIT). In each case, for two aperture sizes (10% and 50% of the trap dimension), highly satisfactory fits are obtained. These composite approximations may be used in more detailed nonlinear ion dynamics studies than have hitherto been attempted.
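
A minimal sketch of the composite structure E = E_noaperture + E_aperture on the trap axis; the coefficients and the aperture-correction form below are placeholders for illustration, not the paper's fitted expressions.

```python
# Composite axial field: truncated multipole series plus aperture correction.
import numpy as np

def composite_axial_field(z, multipole_coeffs, aperture_correction):
    """z: axial coordinate normalized by the trap dimension.
    multipole_coeffs: truncated multipole series for the aperture-free trap.
    aperture_correction: callable giving the aperture contribution E_aperture(z)."""
    E_noaperture = np.polynomial.polynomial.polyval(z, multipole_coeffs)
    return E_noaperture + aperture_correction(z)

# Example with placeholder terms: a quadrupole-dominated series plus a
# correction that grows toward the aperture at z = 1.
E = composite_axial_field(
    np.linspace(0.0, 0.9, 10),
    multipole_coeffs=[0.0, 1.0, 0.0, -0.05],
    aperture_correction=lambda z: 0.02 / (1.0 + (1.0 - z) ** 2),
)
```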

Relevance:

10.00%

Abstract:

A fuzzy logic system (FLS) with a new sliding window defuzzifier is proposed for structural damage detection using modal curvatures. Changes in the modal curvatures due to damage are fuzzified using Gaussian fuzzy sets and mapped to damage location and size using the FLS. The first four modal vectors obtained from finite element simulations of a cantilever beam are used for identifying the location and size of damage. Parametric studies show that modal curvatures can be used to accurately locate the damage; however, quantifying the size of damage is difficult. Tests with noisy simulated data show that the method detects damage very accurately at different noise levels and when some modal data are missing.
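
A minimal sketch of the fuzzification and defuzzification steps (a standard Gaussian-membership, centroid scheme; the paper's sliding-window defuzzifier and rule base are not reproduced here):

```python
# Gaussian fuzzification of a modal-curvature change plus centroid defuzzification.
import numpy as np

def gaussian_membership(x, center, sigma):
    return np.exp(-0.5 * ((x - center) / sigma) ** 2)

def centroid_defuzzify(activations, output_centers):
    """Weighted average of rule-output centers (a standard centroid scheme)."""
    a = np.asarray(activations, dtype=float)
    return np.dot(a, output_centers) / max(a.sum(), 1e-12)

# Example: a normalized curvature change fuzzified into small/medium/large
# sets, then mapped to an illustrative damage-size scale (% stiffness loss).
delta_curvature = 0.4
acts = [gaussian_membership(delta_curvature, c, 0.15) for c in (0.1, 0.5, 0.9)]
damage_size = centroid_defuzzify(acts, output_centers=np.array([5.0, 20.0, 40.0]))
```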

Relevance:

10.00%

Abstract:

Hydrologic impacts of climate change are usually assessed by downscaling General Circulation Model (GCM) output of large-scale climate variables to local-scale hydrologic variables. Such an assessment is characterized by uncertainty resulting from the ensemble of projections generated with multiple GCMs, known as intermodel or GCM uncertainty. Ensemble averaging, with weights assigned to GCMs based on model evaluation, is one method of addressing such uncertainty and is used in the present study for regional-scale impact assessment. GCM outputs of large-scale climate variables are downscaled to subdivisional-scale monsoon rainfall. Weights are assigned to the GCMs on the basis of model performance and model convergence, which are evaluated with the Cumulative Distribution Functions (CDFs) generated from the downscaled GCM output (for both the 20th-century [20C3M] and future scenarios) and observed data. The ensemble averaging approach, with weights assigned to GCMs, is characterized by uncertainty caused by partial ignorance, which stems from the nonavailability of the outputs of some GCMs for a few scenarios (in the Intergovernmental Panel on Climate Change [IPCC] data distribution center for Assessment Report 4 [AR4]). This uncertainty is modeled with imprecise probability, i.e., probability represented as an interval gray number. Furthermore, the CDF generated with one GCM is entirely different from that generated with another, and the use of multiple GCMs therefore results in a band of CDFs. Representing this band of CDFs with a single-valued weighted-mean CDF may be misleading; such a band can only be represented by an envelope that contains all the CDFs generated with the available GCMs. An imprecise CDF represents such an envelope, which not only contains the CDFs generated with all the available GCMs but also, to an extent, accounts for the uncertainty resulting from the missing GCM output. This concept of imprecise probability is also validated in the present study. The imprecise CDFs of monsoon rainfall are derived for three 30-year time slices, the 2020s, 2050s and 2080s, under the A1B, A2 and B1 scenarios. The model is demonstrated with the prediction of monsoon rainfall in the Orissa meteorological subdivision, which shows a possible decreasing trend in the future.
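
A minimal sketch of the band-of-CDFs idea: empirical CDFs from several downscaled GCM rainfall samples, a performance-weighted mean CDF, and the lower/upper envelope that an imprecise CDF would bound. The weights and samples are illustrative, not the study's evaluated values.

```python
# Empirical CDFs per GCM, their weighted mean, and the min/max envelope.
import numpy as np

def empirical_cdf(sample, grid):
    sample = np.sort(np.asarray(sample))
    return np.searchsorted(sample, grid, side="right") / len(sample)

def cdf_band(samples_by_gcm, weights, grid):
    cdfs = np.array([empirical_cdf(s, grid) for s in samples_by_gcm])
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    weighted_mean = w @ cdfs                  # single-valued weighted CDF
    return cdfs.min(axis=0), weighted_mean, cdfs.max(axis=0)

# Illustrative data: three "GCMs" with shifted rainfall distributions (mm).
rng = np.random.default_rng(0)
gcm_samples = [rng.normal(900 + 40 * k, 120, size=50) for k in range(3)]
grid = np.linspace(500, 1400, 200)
lower, mean_cdf, upper = cdf_band(gcm_samples, weights=[0.5, 0.3, 0.2], grid=grid)
```

The gap between `lower` and `upper` is what the single weighted-mean CDF hides, which is the motivation for reporting the interval-valued (imprecise) CDF instead.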

Relevance:

10.00%

Abstract:

Uncertainty plays an important role in water quality management problems. The major sources of uncertainty in a water quality management problem are the random nature of hydrologic variables and the imprecision (fuzziness) associated with the goals of the dischargers and the pollution control agencies (PCA). Many Waste Load Allocation (WLA) problems are solved by considering these two sources of uncertainty. Apart from randomness and fuzziness, missing data in the time series of a hydrologic variable may result in additional uncertainty due to partial ignorance. These uncertainties render the input parameters imprecise in water quality decision making. In this paper, an Imprecise Fuzzy Waste Load Allocation Model (IFWLAM) is developed for water quality management of a river system subject to uncertainty arising from partial ignorance. In a WLA problem, both randomness and imprecision can be addressed simultaneously through the fuzzy risk of low water quality. A methodology is developed for computing the imprecise fuzzy risk of low water quality when the parameters are characterized by uncertainty due to partial ignorance. A Monte Carlo simulation is performed to evaluate the imprecise fuzzy risk of low water quality by treating the input variables as imprecise. Fuzzy multiobjective optimization is used to formulate the multiobjective model. The model developed is based on a fuzzy multiobjective optimization problem with max-min as the operator, which usually does not yield a unique solution but gives multiple solutions. Two optimization models are developed to capture all the decision alternatives, or multiple solutions. The objective of the two optimization models is to obtain a range of fractional removal levels for the dischargers such that the resultant fuzzy risk lies within acceptable limits. Specifying a range for the fractional removal levels enhances flexibility in decision making. The methodology is demonstrated with a case study of the Tunga-Bhadra river system in India.
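
A minimal sketch of the max-min operator on a generic two-discharger problem (not the IFWLAM itself): linear memberships encode the dischargers' preference for low fractional removal levels and the PCA's preference for high ones, and the smallest membership lambda is maximized. The aspiration bounds L and U are illustrative.

```python
# Max-min fuzzy optimization as a linear program over [x_1..x_n, lam].
import numpy as np
from scipy.optimize import linprog

def max_min_fuzzy(L, U):
    """Maximize lam s.t. both conflicting memberships are >= lam."""
    L, U = np.asarray(L, float), np.asarray(U, float)
    n = len(L)
    c = np.zeros(n + 1)
    c[-1] = -1.0                                  # linprog minimizes -lam
    span = (U - L).reshape(-1, 1)
    # Discharger goal (U_i - x_i)/(U_i - L_i) >= lam  and
    # PCA goal        (x_i - L_i)/(U_i - L_i) >= lam, both linearized:
    A_ub = np.vstack([np.hstack([np.eye(n), span]),
                      np.hstack([-np.eye(n), span])])
    b_ub = np.concatenate([U, -L])
    bounds = [(L[i], U[i]) for i in range(n)] + [(0.0, 1.0)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[:-1], res.x[-1]                  # removal levels, lam*

removals, lam = max_min_fuzzy(L=[0.3, 0.35], U=[0.9, 0.95])  # lam* = 0.5 here
```

Because max-min ties the optimum only to the worst-satisfied goal, many removal-level vectors can share the same lam*, which is why the paper derives two further models to bound the range of alternatives.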

Relevance:

10.00%

Abstract:

Discoveries at the LHC will soon set the physics agenda for future colliders. This report of a CERN Theory Institute includes the summaries of Working Groups that reviewed the physics goals and prospects of LHC running with 10 to 300 fb^-1 of integrated luminosity, of the proposed sLHC luminosity upgrade, of the ILC, of CLIC, of the LHeC, and of a muon collider. The four Working Groups considered possible scenarios for the first 10 fb^-1 of data at the LHC in which (i) a state with properties compatible with a Higgs boson is discovered, (ii) no such state is discovered, either because the Higgs properties make it difficult to detect or because no Higgs boson exists, (iii) a missing-energy signal beyond the Standard Model is discovered, as in some supersymmetric models, and (iv) some other exotic signature of new physics is discovered. In the context of these scenarios, the Working Groups reviewed the capabilities of the future colliders to study in more detail whatever new physics may be discovered by the LHC. Their reports provide the particle physics community with some tools for reviewing the scientific priorities for future colliders after the LHC produces its first harvest of new physics from multi-TeV collisions.

Relevance:

10.00%

Abstract:

RECONNECT is a Network-on-Chip (NoC) using a honeycomb topology. In this paper we focus on general rules applicable to a variety of routing algorithms for this NoC, rules that take into account the links a honeycomb is missing when compared to a mesh. We also extend the original proposal [5] and show a method to insert and extract data to and from the network. Access Routers at the boundary of the execution fabric establish connections to multiple periphery modules and create a torus to decrease node distances. Our approach is scalable and ensures homogeneity among the compute elements in the NoC. We synthesized and evaluated the proposed enhancement in terms of power dissipation and area. Our results indicate that the impact of the necessary alterations to the fabric is negligible and affects the data transfer between the fabric and the periphery only marginally.
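
A minimal sketch of routing around the missing links (our illustration, not RECONNECT's actual routing rules): a honeycomb is modeled here as a mesh that keeps only every other vertical link, so each node has at most three ports, and a path is found by BFS over the remaining links.

```python
# Honeycomb-as-brick-wall adjacency and a BFS route over it.
from collections import deque

def honeycomb_neighbors(x, y, w, h):
    nbrs = [(x - 1, y), (x + 1, y)]            # horizontal links as in a mesh
    if (x + y) % 2 == 0:                       # only every other vertical link
        nbrs.append((x, y + 1))
    else:
        nbrs.append((x, y - 1))
    return [(a, b) for a, b in nbrs if 0 <= a < w and 0 <= b < h]

def route(src, dst, w, h):
    """BFS shortest path; returns the hop sequence from src to dst."""
    prev, q = {src: None}, deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            break
        for v in honeycomb_neighbors(*u, w, h):
            if v not in prev:
                prev[v] = u
                q.append(v)
    path, u = [], dst
    while u is not None:
        path.append(u)
        u = prev[u]
    return path[::-1]

print(route((0, 0), (3, 3), w=4, h=4))
```

Compared with a mesh, the dropped vertical links lengthen some routes, which is the motivation for the boundary Access Routers forming a torus to cut node distances.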

Relevance:

10.00%

Abstract:

Background: Trypanosoma evansi infections, commonly called 'surra', cause significant economic losses to the livestock industry. While this infection is mainly restricted to large animals such as camels, donkeys and equines, recent reports indicate its ability to infect humans. There are no World Animal Health Organization (WAHO)-prescribed diagnostic tests or vaccines available against this disease, and the available drugs show significant toxicity. There is an urgent need to develop improved methods of diagnosis and control measures for this disease. Unlike its related human parasites T. brucei and T. cruzi, whose genomes have been fully sequenced, the T. evansi genome sequence remains unavailable, and very little effort is being made to develop improved methods of prevention, diagnosis and treatment. With a view to identifying potential diagnostic markers and drug targets, we have studied the clinical proteome of T. evansi infection using mass spectrometry (MS). Methodology/Principal Findings: Using a shotgun proteomic approach involving nano-LC quadrupole time-of-flight (QTOF) mass spectrometry, we have identified over 160 proteins expressed by T. evansi in mice infected with a camel isolate. Homology-driven searches for protein identification from MS/MS data led to most of the matches arising from related Trypanosoma species. The identified proteins belonged to various functional categories, including metabolic enzymes; DNA metabolism; transcription; translation; and cell-cell communication and signal transduction. TCA-cycle enzymes were strikingly missing, possibly suggesting their low abundance. The clinical proteome revealed the presence of known and potential drug targets such as oligopeptidases, kinases, cysteine proteases and more. Conclusions/Significance: Previous proteomic studies on trypanosomal infections, including the human parasites T. brucei and T. cruzi, have been carried out on lab-grown cultures. This is the first proteomic study of T. evansi infection reported thus far. In addition to providing a glimpse into the biology of this neglected disease, our study is a first step towards the identification of diagnostic biomarkers, novel drug targets and potential vaccine candidates to fight T. evansi infections.

Relevance:

10.00%

Abstract:

This paper evaluates methods of multiclass support vector machine (SVM) classification for effective use in distance relay coordination. It also describes a strategy of supportive systems that aid the conventional protection philosophy in situations where protection systems have maloperated and/or information is missing, and that provide selective and secure coordination. SVMs have considerable potential as zone classifiers for distance relay coordination. This typically requires a multiclass SVM classifier to effectively learn the underlying relationship between the reach of different zones and the apparent-impedance trajectory during a fault. Several methods have been proposed for multiclass classification, in which several binary SVM classifiers are typically combined; some authors have extended binary SVM classification to a one-step single-optimization operation considering all classes at once. In this paper, the one-step multiclass classification, one-against-all, and one-against-one multiclass methods are compared with respect to accuracy, number of iterations, number of support vectors, and training and testing time. The performance analysis of these three methods is presented on three data sets belonging to the training and testing patterns of three supportive systems for a region and part of a network, an equivalent 526-bus system of the practical Indian Western grid.
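
A minimal sketch of two of the compared decompositions using scikit-learn on synthetic placeholder data (the one-step single-optimization method has no off-the-shelf scikit-learn implementation and is not shown; the features and labels below merely stand in for apparent-impedance patterns and relay zones):

```python
# Compare one-against-one and one-against-all SVM decompositions.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

# Placeholder 3-class data standing in for zone-labeled impedance features.
X, y = make_classification(n_samples=600, n_features=2, n_informative=2,
                           n_redundant=0, n_classes=3, n_clusters_per_class=1,
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

for name, clf in [("one-against-one", OneVsOneClassifier(SVC(kernel="rbf"))),
                  ("one-against-all", OneVsRestClassifier(SVC(kernel="rbf")))]:
    clf.fit(Xtr, ytr)
    print(name, "accuracy:", clf.score(Xte, yte))
```

One-against-one trains k(k-1)/2 binary classifiers on pairwise subsets, while one-against-all trains k classifiers on the full data, which is the root of the accuracy/support-vector/training-time trade-offs the paper measures.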