30 results for Geographic Regression Discontinuity


Relevance: 20.00%

Publisher:

Abstract:

We address the problem of local-polynomial modeling of smooth time-varying signals with unknown functional form, in the presence of additive noise. The problem is formulated in the time domain, and the polynomial coefficients are estimated in the pointwise minimum mean square error (PMMSE) sense. The choice of the window length for local modeling introduces a bias-variance tradeoff, which we solve optimally by using the intersection-of-confidence-intervals (ICI) technique. The combination of the local polynomial model and the ICI technique gives rise to an adaptive signal model equipped with a time-varying, PMMSE-optimal window length whose performance is superior to that obtained with a fixed window length. We also evaluate the sensitivity of the ICI technique with respect to the confidence interval width. Simulation results on electrocardiogram (ECG) signals show that, at 0 dB signal-to-noise ratio (SNR), one can achieve about 12 dB improvement in SNR. Monte Carlo performance analysis shows that the performance is comparable to that of basic wavelet techniques. For 0 dB SNR, the adaptive window technique yields about 2-3 dB higher SNR than wavelet regression techniques, and for SNRs greater than 12 dB, the wavelet techniques yield about 2 dB higher SNR.
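The window-selection step can be sketched in a few lines: for each candidate window, fit a local polynomial and form a confidence interval around the estimate; the ICI rule keeps enlarging the window while the running intersection of intervals stays non-empty. This is a generic illustration, not the paper's PMMSE estimator; the noise level `sigma`, the confidence scale `gamma`, and the test signal are all assumptions.

```python
import numpy as np

def local_poly_estimate(t, x, center, h, degree=1):
    """Fit a polynomial to samples within +/-h of `center`; return the fit at center."""
    mask = np.abs(t - center) <= h
    coeffs = np.polyfit(t[mask] - center, x[mask], degree)
    return np.polyval(coeffs, 0.0)

def ici_window(t, x, center, windows, sigma, gamma=2.0):
    """Pick the largest window whose confidence intervals still intersect (ICI rule)."""
    lo, hi = -np.inf, np.inf
    best = windows[0]
    for h in windows:                              # windows in increasing order
        est = local_poly_estimate(t, x, center, h)
        n = np.sum(np.abs(t - center) <= h)
        half = gamma * sigma / np.sqrt(n)          # CI half-width shrinks with window
        lo, hi = max(lo, est - half), min(hi, est + half)
        if lo > hi:                                # intervals stopped intersecting
            break
        best = h
    return best

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 400)
x = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(t.size)
h = ici_window(t, x, center=0.5, windows=[0.02, 0.05, 0.1, 0.2, 0.4], sigma=0.1)
```

In a full implementation this window search is repeated at every time instant, giving the time-varying window length described above.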

Relevance: 20.00%

Publisher:

Abstract:

Seismic hazard assessment and microzonation of cities make it possible to characterize the potentially hazardous seismic areas that must be taken into account when designing new structures or retrofitting existing ones. We have studied seismic hazard and prepared geotechnical microzonation maps using a Geographical Information System (GIS). GIS provides an effective means of integrating different layers of information, yielding a useful input for city planning and, in particular, for earthquake-resistant design of structures in an area. Seismic hazard is the study of expected earthquake ground motions at any point on the earth. Microzonation is the subdivision of a region into a number of zones based on earthquake effects at the local scale. Seismic microzonation is the process of estimating the response of soil layers under earthquake excitation and, thus, the variation of ground motion characteristics on the ground surface. For seismic microzonation, geotechnical site characterization needs to be assessed at the local (micro) scale, which is then used to assess the site response and liquefaction susceptibility of the sites. A seismotectonic atlas of the area within a 350 km radius of Bangalore has been prepared, containing all the seismogenic sources and historic earthquake events (a catalogue of about 1400 events since 1906). We have carried out site characterization of Bangalore by collating conventional geotechnical borehole data (about 900 boreholes with depth information) integrated in GIS. A 3-D subsurface model of Bangalore prepared using GIS is shown in Figure 1. Further, shear wave velocity surveys based on geophysical methods have been carried out at about 60 locations over a 220 sq. km area of the city. Site response and local site effects have been evaluated using 1-dimensional ground response analysis.
The spatial variability of soil overburden depths, ground surface peak ground acceleration (PGA), spectral acceleration at different frequencies, and liquefaction susceptibility have been mapped over the 220 sq. km area using GIS; ArcInfo software has been used for this purpose. These maps can be used for city planning and for risk and vulnerability studies. Figure 2 shows a map of peak ground acceleration at rock level for Bangalore city. Microtremor experiments were jointly carried out with NGRI scientists at about 55 locations in the city, and the predominant frequencies of the overburden soil columns were evaluated.

Relevance: 20.00%

Publisher:

Abstract:

Fully structured and mature open source spatial and temporal analysis technology appears to be the vehicle of the future for planning of natural resources, especially in developing nations. This technology has gained enormous momentum because of its technical superiority, affordability, and ability to draw on expertise from all sections of society. Sustainable development of a region depends on the integrated planning approaches adopted in decision making, which require timely and accurate spatial data. With increased developmental programmes, the need for appropriate decision support systems has grown, in order to analyse and visualise decisions associated with the spatial and temporal aspects of natural resources. In this regard, Geographic Information Systems (GIS) along with remote sensing data support applications that involve spatial and temporal analysis of digital thematic maps and remotely sensed images. Open source GIS would enable wide-scale applications involving decisions at various hierarchical levels (for example, from village panchayat to planning commission) on economic viability and social acceptance, apart from technical feasibility. GRASS (Geographic Resources Analysis Support System, http://wgbis.ces.iisc.ernet.in/grass) is an open source GIS that works on the Linux platform (freeware), but most of its applications are driven by command-line arguments, necessitating a user-friendly and cost-effective graphical user interface (GUI). Keeping these aspects in mind, the Geographic Resources Decision Support System (GRDSS) has been developed with functionality such as raster analysis, topological vector analysis, image processing, statistical analysis, geographical analysis, and graphics production. It operates through a GUI developed in Tcl/Tk (Tool Command Language / Toolkit) under Linux, as well as through a shell in X-Windows.
GRDSS includes options such as import/export of different data formats, display, digital image processing, map editing, raster analysis, vector analysis, point analysis, and spatial query, which are required for regional planning tasks such as watershed analysis and landscape analysis. It is customised to the Indian context with an option to extract individual bands from IRS (Indian Remote Sensing Satellites) data, which is in BIL (Band Interleaved by Lines) format. The integration of PostgreSQL (a freeware) into GRDSS provides an efficient database management system.

Relevance: 20.00%

Publisher:

Abstract:

In this paper we propose a novel, scalable, clustering-based ordinal regression formulation, which is an instance of a Second Order Cone Program (SOCP) with one Second Order Cone (SOC) constraint. The main contribution of the paper is a fast algorithm, CB-OR, which solves the proposed formulation more efficiently than general-purpose solvers. Another main contribution is to pose the problem of focused crawling as a large-scale ordinal regression problem and solve it using the proposed CB-OR. Focused crawling is an efficient mechanism for discovering resources of interest on the web. Posing focused crawling as an ordinal regression problem avoids the need for a negative class and a topic hierarchy, which are the main drawbacks of existing focused crawling methods. Experiments on large synthetic and benchmark datasets show the scalability of CB-OR. Experiments also show that the proposed focused crawler outperforms the state-of-the-art.

Relevance: 20.00%

Publisher:

Abstract:

This paper presents a method for partial automation of specification-based regression testing, which we call ESSE (Explicit State Space Enumeration). The first step in the ESSE method is the extraction of a finite state model of the system, making use of an already tested version of the system under test (SUT). Thereafter, the finite state model thus obtained is used to compute good test sequences that can be used to regression-test subsequent versions of the system. We present two new algorithms for test sequence computation, both based on the finite state model generated by the above method. We also provide the details and results of an experimental evaluation of the ESSE method. Comparison with a practically used random-testing algorithm shows substantial improvements.
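As a rough illustration of computing test sequences from an extracted finite state model, the sketch below derives one event sequence per transition using shortest paths from the start state. This is a generic transition-coverage heuristic, not either of the paper's two algorithms; the toy FSM and its encoding as nested dicts are assumptions.

```python
from collections import deque

def transition_cover(fsm, start):
    """Return event sequences from `start` that jointly cover every transition.

    `fsm`: dict state -> dict event -> next_state.
    """
    # Shortest event path from `start` to every reachable state (BFS).
    paths = {start: []}
    q = deque([start])
    while q:
        s = q.popleft()
        for ev, nxt in fsm[s].items():
            if nxt not in paths:
                paths[nxt] = paths[s] + [ev]
                q.append(nxt)
    # One sequence per transition: reach its source state, then fire it.
    return [paths[s] + [ev] for s in fsm for ev in fsm[s] if s in paths]

fsm = {
    "idle":   {"open": "active"},
    "active": {"close": "idle", "ping": "active"},
}
seqs = transition_cover(fsm, "idle")
```

Each returned sequence can then be replayed against a new version of the SUT and the observed states compared with the model.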

Relevance: 20.00%

Publisher:

Abstract:

This paper proposes a novel approach to solving the ordinal regression problem using Gaussian processes. The proposed approach, probabilistic least squares ordinal regression (PLSOR), obtains the probability distribution over ordinal labels using a particular likelihood function. It performs model selection (hyperparameter optimization) using the leave-one-out cross-validation (LOO-CV) technique. PLSOR has the conceptual simplicity and ease of implementation of the least squares approach. Unlike existing Gaussian process ordinal regression (GPOR) approaches, PLSOR does not use any approximation techniques for inference. We compare the proposed approach with the state-of-the-art GPOR approaches on synthetic and benchmark data sets. Experimental results show the competitiveness of the proposed approach.
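LOO-CV for Gaussian process regression need not refit the model n times: with K the kernel matrix, the leave-one-out residuals are available in closed form as alpha_i / [K^-1]_ii, where alpha = K^-1 y (Rasmussen and Williams, ch. 5). The sketch below uses this identity to pick an RBF lengthscale; it is plain GP regression, not PLSOR, and the kernel, noise level, and data are illustrative assumptions.

```python
import numpy as np

def loo_residuals(X, y, lengthscale, noise=1e-2):
    """Closed-form GP leave-one-out residuals: alpha_i / [K^-1]_ii."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-0.5 * d2 / lengthscale**2) + noise * np.eye(len(X))  # RBF + jitter
    Kinv = np.linalg.inv(K)
    return (Kinv @ y) / np.diag(Kinv)   # equals y_i minus the LOO predictive mean

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)

# Model selection: pick the lengthscale with the smallest LOO squared error.
scores = {ell: np.mean(loo_residuals(X, y, ell) ** 2) for ell in [0.01, 0.5, 1.0, 5.0]}
best = min(scores, key=scores.get)
```

The same O(n^3)-once trick underlies efficient LOO-CV model selection in GP methods generally.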

Relevance: 20.00%

Publisher:

Abstract:

This paper proposes a sparse modeling approach to solve ordinal regression problems using Gaussian processes (GP). Designing a sparse GP model is important from training time and inference time viewpoints. We first propose a variant of the Gaussian process ordinal regression (GPOR) approach, leave-one-out GPOR (LOO-GPOR). It performs model selection using the leave-one-out cross-validation (LOO-CV) technique. We then provide an approach to design a sparse model for GPOR. The sparse GPOR model reduces computational time and storage requirements. Further, it provides faster inference. We compare the proposed approaches with the state-of-the-art GPOR approach on some benchmark data sets. Experimental results show that the proposed approaches are competitive.

Relevance: 20.00%

Publisher:

Abstract:

Multiple-input multiple-output (MIMO) systems with a large number of antennas have been gaining wide attention as they enable very high throughputs. A major impediment is the complexity at the receiver needed to detect the transmitted data. To this end we propose a new receiver, called LRR (Linear Regression of MMSE Residual), which improves on the MMSE receiver by learning a linear regression model for the error of the MMSE receiver. The LRR receiver uses pilot data to estimate the channel, and then uses locally generated training data (not transmitted over the channel) to find the linear regression parameters. The proposed receiver is suitable for applications where the channel remains constant for a long period (slow-fading channels) and performs quite well: at a bit error rate (BER) of 10^-3, the SNR gain over the MMSE receiver is about 7 dB for a 16 x 16 system; for a 64 x 64 system the gain is about 8.5 dB. For large coherence time, the complexity order of the LRR receiver is the same as that of the MMSE receiver, and in simulations we find that it needs about 4 times as many floating point operations. We also show that a further gain of about 4 dB is obtained by local search around the estimate given by the LRR receiver.
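The core idea (correct the MMSE output with a linear model of its own error, trained on locally generated data) can be sketched as follows. Everything here is an assumption for illustration: BPSK symbols, an 8 x 8 Rayleigh channel, and plain least-squares fitting of the error model; the paper's exact training procedure and its local-search refinement are not reproduced.

```python
import numpy as np

def mmse_detect(H, y, sigma2):
    """Linear MMSE estimate of x from y = Hx + n (unit-power symbols assumed)."""
    n_t = H.shape[1]
    return np.linalg.solve(H.conj().T @ H + sigma2 * np.eye(n_t), H.conj().T @ y)

rng = np.random.default_rng(2)
n_t = n_r = 8
H = (rng.standard_normal((n_r, n_t)) + 1j * rng.standard_normal((n_r, n_t))) / np.sqrt(2)
sigma2 = 0.1

# Locally generated training data: random BPSK vectors pushed through the
# known channel estimate; the regression target is the MMSE error.
X_train = rng.choice([-1.0, 1.0], size=(n_t, 2000))
N = np.sqrt(sigma2 / 2) * (rng.standard_normal((n_r, 2000))
                           + 1j * rng.standard_normal((n_r, 2000)))
Z = mmse_detect(H, H @ X_train + N, sigma2)   # raw MMSE outputs
E = X_train - Z                               # residual to be regressed
W = E @ np.linalg.pinv(Z)                     # least-squares linear error model

# Detection: correct the MMSE output by the predicted residual, then slice.
x = rng.choice([-1.0, 1.0], size=(n_t, 1))
n = np.sqrt(sigma2 / 2) * (rng.standard_normal((n_r, 1))
                           + 1j * rng.standard_normal((n_r, 1)))
z = mmse_detect(H, H @ x + n, sigma2)
x_hat = np.sign((z + W @ z).real)
```

Because `W` is learned once per channel realization, the per-symbol cost stays at the MMSE order for long coherence times, matching the complexity claim above.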

Relevance: 20.00%

Publisher:

Abstract:

An important question in kernel regression is one of estimating the order and bandwidth parameters from available noisy data. We propose to solve the problem within a risk estimation framework. Considering an independent and identically distributed (i.i.d.) Gaussian observations model, we use Stein's unbiased risk estimator (SURE) to estimate a weighted mean-square error (MSE) risk, and optimize it with respect to the order and bandwidth parameters. The two parameters are thus spatially adapted in such a manner that noise smoothing and fine structure preservation are simultaneously achieved. On the application side, we consider the problem of image restoration from uniform/non-uniform data, and show that the SURE approach to spatially adaptive kernel regression results in better quality estimation compared with its spatially non-adaptive counterparts. The denoising results obtained are comparable to those obtained using other state-of-the-art techniques, and in some scenarios, superior.
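For a linear smoother y_hat = S y under i.i.d. Gaussian noise of variance sigma^2, SURE takes the form ||y - Sy||^2 - n*sigma^2 + 2*sigma^2*tr(S), which can be minimized over the smoothing parameters. The sketch below applies this to a fixed (non-adaptive) Nadaraya-Watson bandwidth on a 1-D signal; the kernel, test signal, and candidate bandwidths are assumptions, and the paper's spatially adaptive order/bandwidth selection is not reproduced.

```python
import numpy as np

def nw_smoother_matrix(t, h):
    """Row-normalized Gaussian-kernel (Nadaraya-Watson) smoother matrix."""
    W = np.exp(-0.5 * ((t[:, None] - t[None, :]) / h) ** 2)
    return W / W.sum(axis=1, keepdims=True)

def sure(y, S, sigma2):
    """Stein's unbiased risk estimate for a linear smoother y_hat = S @ y."""
    resid = y - S @ y
    return resid @ resid - len(y) * sigma2 + 2 * sigma2 * np.trace(S)

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 200)
sigma2 = 0.05 ** 2
y = np.sin(4 * np.pi * t) + np.sqrt(sigma2) * rng.standard_normal(t.size)

# Bandwidth selection: evaluate SURE on a small candidate grid.
risks = {h: sure(y, nw_smoother_matrix(t, h), sigma2) for h in [0.002, 0.01, 0.05, 0.2]}
h_best = min(risks, key=risks.get)
```

Spatial adaptation, as in the paper, would repeat this selection locally rather than once globally.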

Relevance: 20.00%

Publisher:

Abstract:

Elastic Net Regularizers have shown much promise in designing sparse classifiers for linear classification. In this work, we propose an alternating optimization approach to solve the dual problems of elastic net regularized linear classification Support Vector Machines (SVMs) and logistic regression (LR). One of the sub-problems turns out to be a simple projection. The other sub-problem can be solved using dual coordinate descent methods developed for non-sparse L2-regularized linear SVMs and LR, without altering their iteration complexity and convergence properties. Experiments on very large datasets indicate that the proposed dual coordinate descent - projection (DCD-P) methods are fast and achieve comparable generalization performance after the first pass through the data, with extremely sparse models.
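The non-sparse building block reused above, dual coordinate descent for a linear SVM in the style of Hsieh et al., can be sketched in a few lines: sweep over the dual variables, take a one-variable Newton step, and clip to the box [0, C] while maintaining w = sum_i alpha_i y_i x_i. This sketch omits the elastic-net projection sub-problem; the data and constants are illustrative assumptions.

```python
import numpy as np

def dcd_linear_svm(X, y, C=1.0, epochs=20):
    """Dual coordinate descent for an L1-loss linear SVM (no bias term)."""
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    sqnorm = np.einsum("ij,ij->i", X, X)            # Q_ii = ||x_i||^2
    for _ in range(epochs):
        for i in range(n):
            g = y[i] * (w @ X[i]) - 1.0             # gradient w.r.t. alpha_i
            a_new = min(max(alpha[i] - g / sqnorm[i], 0.0), C)
            w += (a_new - alpha[i]) * y[i] * X[i]   # keep w = sum_i alpha_i y_i x_i
            alpha[i] = a_new
    return w

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(+2, 1, (50, 2)), rng.normal(-2, 1, (50, 2))])
y = np.hstack([np.ones(50), -np.ones(50)])
w = dcd_linear_svm(X, y)
acc = np.mean(np.sign(X @ w) == y)
```

The DCD-P methods of the abstract alternate such coordinate sweeps with the projection sub-problem to enforce the elastic-net structure.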

Relevance: 20.00%

Publisher:

Abstract:

Using a realistic nonlinear mathematical model of melanoma dynamics and the technique of optimal dynamic inversion (exact feedback linearization with static optimization), a multimodal automatic drug dosage strategy is proposed in this paper for complete regression of melanoma cancer in humans. The proposed strategy computes different drug dosages and gives a nonlinear state feedback solution for driving the number of cancer cells to zero. However, it is observed that once the tumor has regressed to a certain size, no external drug dosages are needed, as the immune system and other therapeutic states are able to regress the tumor at a faster-than-exponential rate. As the model has three different drug dosages, after applying the dynamic inversion philosophy, the drug dosages can be selected in an optimized manner without crossing their toxicity limits. The combination of drug dosages is decided by appropriately selecting the control design parameter values based on physical constraints. The process is automated for all possible combinations of the chemotherapy and immunotherapy drug dosages, with preferential emphasis on having the maximum possible variety of drug inputs at any given point of time. A simulation study with a standard patient model shows that tumor cells are regressed from 2 x 10^7 to the order of 10^5 cells by external drug dosages in 36.93 days. After this, no external drug dosages are required, as the immune system and other therapeutic states regress the tumor at a faster-than-exponential rate; hence, the tumor goes to zero (fewer than 0.01 cells) in 48.77 days and the healthy immune system of the patient is restored. A study with different chemotherapy drug resistance values is also carried out. (C) 2014 Elsevier Ltd. All rights reserved.
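Dynamic inversion itself is simple to illustrate on a one-state toy model: given dynamics x' = f(x) + g(x)u, solve algebraically for the u that imposes desired stable dynamics x' = -lam*x. The tumor-like model, rate constants, and desired decay rate below are invented for illustration and are far simpler than the paper's three-drug melanoma model.

```python
# Toy one-state tumor model with exponential growth and a log-kill drug term:
#   x' = r*x - k*u*x
# Dynamic inversion picks u so the closed loop obeys x' = -lam*x.
r, k, lam, dt = 0.1, 1.0, 0.5, 0.01   # illustrative constants, not the paper's

def inversion_control(x):
    """Solve r*x - k*u*x = -lam*x for u (exact feedback linearization)."""
    return (r * x + lam * x) / (k * x)   # reduces to (r + lam)/k for this model

x = 2e7                                  # initial tumor burden (cells)
for _ in range(2000):                    # simulate 20 time units by Euler steps
    u = inversion_control(x)
    x += dt * (r * x - k * u * x)        # closed loop decays like exp(-lam*t)
```

In the paper the same inversion step is applied to a three-input model, which leaves free parameters that are then chosen by static optimization subject to toxicity limits.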

Relevance: 20.00%

Publisher:

Abstract:

The present work reports the results of an experimental investigation of semi-solid rheocasting of A356 Al alloy using a cooling slope. The experiments have been carried out following the Taguchi method of parameter design (an L9 orthogonal array of experiments). Four key process variables (slope angle, pouring temperature, wall temperature, and length of travel of the melt) at three different levels have been considered. Regression analysis and analysis of variance (ANOVA) have also been performed, respectively, to develop a mathematical model for the degree of sphericity of the primary alpha-Al phase and to find the significance and percentage contribution of each process variable towards the final degree of sphericity. Based on mean response and signal-to-noise ratio (SNR), the best processing condition for optimum degree of sphericity (0.83) has been identified as A3, B3, C2, D1, i.e., a slope angle of 60 degrees, pouring temperature of 650 degrees C, wall temperature of 60 degrees C, and 500 mm length of travel of the melt. The ANOVA results show that the length of travel has the maximum impact on the degree of sphericity. The sphericity predicted by the developed regression model and the values obtained experimentally are found to be in good agreement with each other. The sphericity values obtained from a confirmation experiment, performed at the 95% confidence level, ensure that the optimum result is correct and lie within permissible limits. (c) 2014 Elsevier Ltd. All rights reserved.

Relevance: 20.00%

Publisher:

Abstract:

In this paper, we present a novel algorithm for piecewise linear regression which can learn continuous as well as discontinuous piecewise linear functions. The main idea is to repeatedly partition the data and learn a linear model in each partition. The proposed algorithm is similar in spirit to the k-means clustering algorithm. We show that our algorithm can also be viewed as a special case of an EM algorithm for maximum likelihood estimation under a reasonable probability model. We empirically demonstrate the effectiveness of our approach by comparing its performance with that of state-of-the-art algorithms on various datasets. (C) 2014 Elsevier Inc. All rights reserved.
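The repeat-partition-and-refit idea can be sketched directly: alternate between fitting a least-squares line in each partition and reassigning every point to the line with the smallest residual, just as k-means alternates between centroids and assignments. The two-piece demo below, including its deterministic median-split initialization, is an assumption for illustration, not the paper's exact algorithm.

```python
import numpy as np

def piecewise_linear_fit(x, y, iters=10):
    """Alternate between (1) least-squares line fits within each partition and
    (2) reassigning each point to the line with the smallest residual.
    Two pieces only; initialized by splitting at the median of x."""
    A = np.column_stack([x, np.ones_like(x)])      # design matrix [x, 1]
    labels = (x > np.median(x)).astype(int)
    coeffs = np.zeros((2, 2))                      # one (slope, intercept) per piece
    for _ in range(iters):
        for j in range(2):
            if np.any(labels == j):
                coeffs[j], *_ = np.linalg.lstsq(A[labels == j], y[labels == j],
                                                rcond=None)
        # Reassign each point to whichever line predicts it best.
        labels = np.argmin((A @ coeffs.T - y[:, None]) ** 2, axis=1)
    return coeffs, labels

x = np.linspace(0, 2, 100)
y = np.where(x < 1, 2 * x, 2 - x)   # discontinuous two-piece target
coeffs, labels = piecewise_linear_fit(x, y)
```

Because assignments are by residual rather than by an x-threshold, the same loop handles discontinuous targets, which is the point the abstract emphasizes.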

Relevance: 20.00%

Publisher:

Abstract:

Climate change in response to a change in external forcing can be understood in terms of a fast response to the imposed forcing and a slow feedback associated with surface temperature change. Previous studies have investigated the characteristics of fast response and slow feedback for different forcing agents. Here we examine to what extent fast response and slow feedback derived from time-mean results of climate model simulations can be used to infer total climate change. To achieve this goal, we develop a multivariate regression model of climate change, in which the change in a climate variable is represented by a linear combination of its sensitivity to CO2 forcing, its sensitivity to solar forcing, and the change in global mean surface temperature. We derive the parameters of the regression model using time-mean results from a set of HadCM3L climate model step-forcing simulations, and then use the regression model to emulate HadCM3L-simulated transient climate change. Our results show that the regression model emulates well the HadCM3L-simulated temporal evolution and spatial distribution of climate change, including surface temperature, precipitation, runoff, soil moisture, cloudiness, and radiative fluxes, under transient CO2 and/or solar forcing scenarios. Our findings suggest that the temporal and spatial patterns of total change for the climate variables considered here can be represented well by the sum of fast response and slow feedback. Furthermore, by using a simple 1-D heat-diffusion climate model, we show that the temporal and spatial characteristics of climate change under transient forcing scenarios can be emulated well using information from step-forcing simulations alone.
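The regression model has a simple structure: for each climate variable, fit its change as a linear combination of CO2 forcing, solar forcing, and global-mean temperature change. The sketch below does this by least squares on synthetic data standing in for step-forcing simulations; all coefficients, noise levels, and the choice of "precipitation" as the emulated variable are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 120
F_co2 = rng.uniform(0, 4, n)     # CO2 forcing (W m^-2), synthetic
F_sol = rng.uniform(0, 2, n)     # solar forcing (W m^-2), synthetic
dT = 0.8 * F_co2 + 0.7 * F_sol + 0.05 * rng.standard_normal(n)  # global-mean dT

# "True" change of an emulated variable: fast responses to each forcing
# plus a slow feedback proportional to dT, with small noise.
dP = -0.5 * F_co2 + 0.1 * F_sol + 2.0 * dT + 0.02 * rng.standard_normal(n)

# Fit the three regression coefficients by least squares.
A = np.column_stack([F_co2, F_sol, dT])
coef, *_ = np.linalg.lstsq(A, dP, rcond=None)
dP_pred = A @ coef
```

In the actual study this fit is done per grid point using time-mean step-forcing output, and the fitted coefficients are then driven with transient forcing and temperature series to emulate the full simulation.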

Relevance: 20.00%

Publisher:

Abstract:

Naturally occurring compounds are considered attractive candidates for cancer treatment and prevention. Quercetin and ellagic acid are naturally occurring polyphenols found abundantly in several fruits and vegetables. In the present study, we evaluate and compare the antitumor efficacies of quercetin and ellagic acid in animal models and cancer cell lines in a comprehensive manner. We found that quercetin induced cytotoxicity in leukemic cells in a dose-dependent manner, while ellagic acid showed only limited toxicity. Besides leukemic cells, quercetin also induced cytotoxicity in breast cancer cells; however, its effect on normal cells was limited or absent. Further, quercetin caused S phase arrest during cell cycle progression in the tested cancer cells. Quercetin induced tumor regression in mice at a concentration 3-fold lower than ellagic acid. Importantly, administration of quercetin led to an approximately 5-fold increase in the life span of tumor-bearing mice compared with untreated controls. Further, we found that quercetin interacts with DNA directly, which could be one of the mechanisms for inducing apoptosis in both cancer cell lines and tumor tissues, by activating the intrinsic pathway. Thus, our data suggest that quercetin can be further explored for its potential use in cancer therapeutics and combination therapy.