63 results for non-parametric technique


Relevance: 80.00%

Publisher:

Abstract:

Magnetoencephalography (MEG), a non-invasive technique for characterizing brain electrical activity, is gaining popularity as a tool for assessing group-level differences between experimental conditions. One method for assessing task-condition effects involves beamforming, where a weighted sum of field measurements is used to estimate activity on a voxel-by-voxel basis. However, this method has been shown to produce inhomogeneous smoothness differences as a function of signal-to-noise across a volumetric image, which can then produce false positives at the group level. Here we describe a novel method for group-level analysis with MEG beamformer images that utilizes the peak locations within each participant's volumetric image to assess group-level effects. We compared our peak-clustering algorithm with SnPM using simulated data. We found that our method was immune to artefactual group effects that can arise as a result of inhomogeneous smoothness differences across a volumetric image. We also used our peak-clustering algorithm on experimental data and found that regions were identified that corresponded with task-related regions identified in the literature. These findings suggest that our technique is a robust method for group-level analysis with MEG beamformer images.
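The permutation logic underlying SnPM and related non-parametric group tests can be sketched in a few lines. This illustrative stdlib-Python example (a generic label-exchange test, not the authors' peak-clustering algorithm) shuffles condition labels and recomputes a mean difference:

```python
import random

def permutation_test(group_a, group_b, n_perm=5000, seed=0):
    """Two-sided permutation test on the difference of group means.

    Condition labels are repeatedly exchanged at random; the p-value is
    the fraction of relabellings whose mean difference is at least as
    extreme as the observed one.
    """
    rng = random.Random(seed)
    n_a, n_b = len(group_a), len(group_b)
    observed = sum(group_a) / n_a - sum(group_b) / n_b
    pooled = list(group_a) + list(group_b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # exchange condition labels
        diff = sum(pooled[:n_a]) / n_a - sum(pooled[n_a:]) / n_b
        if abs(diff) >= abs(observed):
            count += 1
    return (count + 1) / (n_perm + 1)  # add-one correction keeps p > 0
```

Because no distributional form is assumed, tests of this family remain valid where parametric assumptions fail, which is why the abstract benchmarks against SnPM.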

Relevance: 80.00%

Publisher:

Abstract:

The standard reference clinical score quantifying average Parkinson's disease (PD) symptom severity is the Unified Parkinson's Disease Rating Scale (UPDRS). At present, UPDRS is determined by the subjective clinical evaluation of the patient's ability to adequately cope with a range of tasks. In this study, we extend recent findings that UPDRS can be objectively assessed to clinically useful accuracy using simple, self-administered speech tests, without requiring the patient's physical presence in the clinic. We apply a wide range of known speech signal processing algorithms to a large database (approx. 6000 recordings from 42 PD patients, recruited to a six-month, multi-centre trial) and propose a number of novel, nonlinear signal processing algorithms which reveal pathological characteristics in PD more accurately than existing approaches. Robust feature selection algorithms select the optimal subset of these algorithms, which is fed into non-parametric regression and classification algorithms, mapping the signal processing algorithm outputs to UPDRS. We demonstrate rapid, accurate replication of the UPDRS assessment with clinically useful accuracy (about 2 UPDRS points difference from the clinicians' estimates, p < 0.001). This study supports the viability of frequent, remote, cost-effective, objective, accurate UPDRS telemonitoring based on self-administered speech tests. This technology could facilitate large-scale clinical trials into novel PD treatments.
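As a toy illustration of non-parametric regression of the kind used to map speech features to UPDRS, here is a minimal k-nearest-neighbour regressor. This is an assumed stand-in for exposition only; the study's actual feature-selection and regression pipeline is richer than this:

```python
def knn_regress(train_x, train_y, query, k=3):
    """Predict the target for `query` as the mean target of its k nearest
    training points (Euclidean distance): a minimal non-parametric
    regressor that assumes no functional form for the mapping.
    """
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    nearest = sorted(zip(train_x, train_y), key=lambda p: dist(p[0], query))[:k]
    return sum(y for _, y in nearest) / len(nearest)
```

Here `train_x` would hold per-recording speech features and `train_y` the clinician-rated UPDRS scores; the names and structure are illustrative.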

Relevance: 80.00%

Publisher:

Abstract:

This paper investigates the role of absorptive capacity in the diffusion of global technology with sector and firm heterogeneity. We construct the FDI-intensity weighted global R&D stock for each industry and link it to Chinese firm-level panel data relating to 53,981 firms over the period 2001-2005. Non-parametric frontier analysis is employed to explore how absorptive capacity affects technical change and catch-up in the presence of global knowledge spillovers. We find that R&D activities and training at individual firms serve as an effective source of absorptive capability. The contribution of absorptive capacity varies according to the type of FDI and the extent of openness.
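Non-parametric frontier analysis can be illustrated with the simplest such estimator, the free disposal hull (FDH). This sketch assumes a single input and single output (the paper's actual setting is multi-dimensional) and scores a firm by the largest proportional input contraction that a dominating observed peer would permit:

```python
def fdh_input_efficiency(firms, target):
    """Input-oriented free-disposal-hull efficiency of `target`, a pair
    (input, output), relative to the observed firms: the input of the
    thriftiest peer producing at least as much output, divided by the
    target's own input. A score of 1.0 means the firm is on the frontier.
    `target` should itself appear in `firms` so a peer always exists.
    """
    x0, y0 = target
    peers = [x for x, y in firms if y >= y0]  # peers weakly dominating on output
    return min(peers) / x0
```

The frontier here is built only from observed firms, with no assumed production function, which is what makes the approach non-parametric.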

Relevance: 80.00%

Publisher:

Abstract:

This paper analyzes the performance of Dutch drinking water utilities before and after the introduction of sunshine regulation, which involves publication of the performance of utilities but no formal price regulation. By decomposing profit change into its economic drivers, our results suggest that, in the Dutch political and institutional context, sunshine regulation was effective in improving the productivity of publicly organised services. Nevertheless, while sunshine regulation did bring about a moderate reduction in water prices, sustained and substantial economic profits suggest that it may not have the potential to fully align output prices with economic costs in the long run. In methodological terms, the DEA based profit decomposition is extended to robust and conditional non-parametric efficiency measures, so as to account better for both uncertainty and differences in operating environment between utilities.
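One standard way to decompose a value change into economic drivers, as the abstract describes for profit, is the Bennet indicator, which splits the change into a quantity effect and a price effect at average weights. A minimal sketch (illustrative of the principle only, not the paper's DEA-based, robust conditional decomposition):

```python
def bennet_decomposition(p0, q0, p1, q1):
    """Split the change in total value between periods 0 and 1 into a
    quantity effect (quantity changes valued at average prices) and a
    price effect (price changes valued at average quantities). The two
    effects sum exactly to the total value change.
    """
    qty = sum((a + b) / 2 * (y - x) for a, b, x, y in zip(p0, p1, q0, q1))
    price = sum((x + y) / 2 * (b - a) for a, b, x, y in zip(p0, p1, q0, q1))
    return qty, price
```

Applying this separately to revenues and costs attributes profit change to output growth, input growth, and price movements, the kind of drivers the paper examines under sunshine regulation.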

Relevance: 80.00%

Publisher:

Abstract:

This study examines the selectivity and timing performance of 218 UK investment trusts over the period July 1981 to June 2009. We estimate the Treynor and Mazuy (1966) and Henriksson and Merton (1981) models augmented with the size, value, and momentum factors, either under the OLS method adjusted with the Newey-West procedure or under the GARCH(1,1)-in-mean method following the specification of Glosten et al. (1993; hereafter GJR-GARCH-M). We find that the OLS method provides little evidence in favour of selectivity and timing ability, consistent with previous studies. Interestingly, the GJR-GARCH-M method reverses this result, showing relatively strong evidence of favourable selectivity ability, particularly for international funds, as well as favourable timing ability, particularly for domestic funds. We conclude that the GJR-GARCH-M method performs better in evaluating fund performance than the OLS method and the non-parametric approach, as it accounts for the time-varying character of factor loadings and hence yields more reliable results, in particular when high-frequency data, such as daily returns, are used in the analysis. Our results are robust to various in-sample and out-of-sample tests and have valuable implications for practitioners making asset allocation decisions across different fund styles.
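The Treynor and Mazuy (1966) specification adds a quadratic market term to the market model, so a positive coefficient on the squared excess market return signals timing ability. A minimal unaugmented OLS sketch in pure Python (the study's factor-augmented and GJR-GARCH-M estimation is considerably more involved):

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y,
    solved by Gaussian elimination with partial pivoting."""
    k, n = len(X[0]), len(X)
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(k)]
         for r in range(k)]
    b = [sum(X[i][r] * y[i] for i in range(n)) for r in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            A[r] = [x - f * c for x, c in zip(A[r], A[col])]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

def treynor_mazuy(excess_fund, excess_mkt):
    """Fit r_p = alpha + beta * r_m + gamma * r_m**2; a positive gamma is
    the Treynor-Mazuy signal of market-timing ability."""
    X = [[1.0, m, m * m] for m in excess_mkt]
    return ols(X, excess_fund)  # returns [alpha, beta, gamma]
```

The Henriksson-Merton variant replaces the quadratic term with max(r_m, 0), but the estimation mechanics are the same.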

Relevance: 80.00%

Publisher:

Abstract:

The Kolmogorov-Smirnov (KS) test is a non-parametric test which can be used in two different circumstances. First, it can be used as an alternative to chi-square (χ²) as a ‘goodness-of-fit’ test to compare whether a given ‘observed’ sample of observations conforms to an ‘expected’ distribution of results (KS, one-sample test). An example of the use of the one-sample test to determine whether a sample of observations was normally distributed was described previously. Second, it can be used as an alternative to the Mann-Whitney test to compare two independent samples of observations (KS, two-sample test). Hence, this statnote describes the use of the KS test with reference to two scenarios: (1) to compare the observed frequency (Fo) of soil samples containing cysts of the protozoan Naegleria collected each month for a year with an expected equal frequency (Fe) across months (one-sample test), and (2) to compare the abundance of bacteria on cloths and sponges sampled in a domestic kitchen environment (two-sample test).
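The two-sample KS statistic itself is simple to compute: it is the largest vertical gap between the two empirical cumulative distribution functions. A minimal sketch:

```python
def ks_two_sample(sample1, sample2):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical
    distance between the two empirical CDFs. The ECDFs can only jump at
    observed data points, so those are the only x values checked.
    """
    s1, s2 = sorted(sample1), sorted(sample2)
    n1, n2 = len(s1), len(s2)
    d = 0.0
    for x in s1 + s2:
        f1 = sum(v <= x for v in s1) / n1  # ECDF of sample 1 at x
        f2 = sum(v <= x for v in s2) / n2  # ECDF of sample 2 at x
        d = max(d, abs(f1 - f2))
    return d
```

In practice the statistic is then referred to the Kolmogorov distribution (or an exact table) for a p-value; this sketch stops at the statistic.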

Relevance: 80.00%

Publisher:

Abstract:

This paper examines problems in the definition of General Non-Parametric Corporate Performance (GNCP) and introduces a multiplicative linear programming model as an alternative measure of corporate performance. We verified and tested a statistically significant difference between the two models in an application to 27 UK industries using six performance ratios. Our new model is found to be a more robust performance model than the standard Data Envelopment Analysis (DEA) model.

Relevance: 80.00%

Publisher:

Abstract:

Productivity at the macro level is a complex concept but also arguably the most appropriate measure of economic welfare. Currently, there is limited research available on the various approaches that can be used to measure it, and especially on the relative accuracy of those approaches. This thesis has two main objectives: firstly, to detail some of the most common productivity measurement approaches and assess their accuracy under a number of conditions; and secondly, to present an up-to-date application of productivity measurement and provide some guidance on selecting between sometimes conflicting productivity estimates. With regard to the first objective, the thesis provides a discussion of the issues specific to macro-level productivity measurement and of the strengths and weaknesses of the three main types of approaches available, namely index-number approaches (represented by Growth Accounting), non-parametric distance functions (DEA-based Malmquist indices) and parametric production functions (COLS- and SFA-based Malmquist indices). The accuracy of these approaches is assessed through simulation analysis, which provided some interesting findings. Probably the most important were that deterministic approaches are quite accurate even when the data are moderately noisy; that no approach was accurate when noise was more extensive; that functional-form misspecification has a severe negative effect on the accuracy of the parametric approaches; and that increased volatility in inputs and prices from one period to the next adversely affects all approaches examined. The application was based on the EU KLEMS (2008) dataset and revealed that the different approaches do in fact result in different productivity change estimates, at least for some of the countries assessed.
To assist researchers in selecting between conflicting estimates, a new three-step selection framework is proposed, based on the findings of the simulation analyses and on established diagnostics/indicators. An application of this framework is also provided, based on the EU KLEMS dataset.
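Of the approaches named above, the index-number (growth-accounting) one reduces to a single formula: the Solow residual, i.e. output growth minus share-weighted input growth. A minimal sketch with assumed illustrative log growth rates:

```python
def tfp_growth(dy, dk, dl, capital_share):
    """Solow residual: log TFP growth equals output growth minus the
    share-weighted growth of capital and labour inputs, assuming constant
    returns to scale so the two input shares sum to one. All growth
    rates are log differences between periods.
    """
    return dy - capital_share * dk - (1 - capital_share) * dl
```

For example, 5% output growth with 4% capital growth, 2% labour growth and a 0.3 capital share leaves a residual of 2.4 percentage points attributed to TFP.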

Relevance: 80.00%

Publisher:

Abstract:

We use non-parametric procedures to identify breaks in the underlying series of UK household sector money demand functions. Money demand functions are estimated using cointegration techniques and by employing both the Simple Sum and Divisia measures of money. P-star models are also estimated for out-of-sample inflation forecasting. Our findings suggest that the presence of breaks affects both the estimation of cointegrated money demand functions and the inflation forecasts. P-star forecast models based on Divisia measures appear more accurate at longer horizons and the majority of models with fundamentals perform better than a random walk model.
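A Divisia monetary aggregate, in its discrete Tornqvist-Theil form, weights each component's log growth by its average expenditure share, in contrast to the Simple Sum measure, which just adds the components. A minimal sketch (illustrative component data, not the paper's UK household-sector series):

```python
import math

def divisia_growth(m_prev, m_curr, s_prev, s_curr):
    """Tornqvist-Theil approximation to the Divisia index: the aggregate's
    log growth is the sum of component log growth rates weighted by the
    average of each component's expenditure share in the two periods.
    """
    shares = [(a + b) / 2 for a, b in zip(s_prev, s_curr)]
    return sum(w * math.log(c / p) for w, p, c in zip(shares, m_prev, m_curr))
```

When all components grow at the same rate, the Divisia and Simple Sum growth rates coincide; they diverge when components with different user costs grow at different rates, which is why the two measures can imply different money demand functions.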

Relevance: 80.00%

Publisher:

Abstract:

The sign test is a simple non-parametric test which can be used on paired data, i.e., two related samples, matched samples, or repeated measurements on the same sample. It was developed by Wilcoxon before the more powerful and familiar ‘Wilcoxon signed-rank test’ described in a previous statnote. This statnote describes the use of the sign test with reference to two scenarios: (1) to compare the cleanliness of two hospital wards as assessed by a sample of observers, and (2) to compare bacterial contamination on cloths and sponges from a domestic kitchen.
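The sign test reduces to an exact binomial calculation on the signs of the paired differences. A minimal sketch:

```python
from math import comb

def sign_test_p(pos, neg):
    """Exact two-sided sign test p-value. Under the null hypothesis the
    sign of each paired difference is equally likely to be + or -, so the
    count of the rarer sign follows a Binomial(n, 0.5) distribution; tied
    pairs are discarded before calling.
    """
    n, k = pos + neg, min(pos, neg)
    tail = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)  # double the one-sided tail, capped at 1
```

For example, 8 positive differences against 1 negative gives p ≈ 0.039, just below the conventional 0.05 threshold.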

Relevance: 80.00%

Publisher:

Abstract:

The predictive accuracy of competing crude-oil price forecast densities is investigated for the 1994–2006 period. Moving beyond standard ARCH type models that rely exclusively on past returns, we examine the benefits of utilizing the forward-looking information that is embedded in the prices of derivative contracts. Risk-neutral densities, obtained from panels of crude-oil option prices, are adjusted to reflect real-world risks using either a parametric or a non-parametric calibration approach. The relative performance of the models is evaluated for the entire support of the density, as well as for regions and intervals that are of special interest for the economic agent. We find that non-parametric adjustments of risk-neutral density forecasts perform significantly better than their parametric counterparts. Goodness-of-fit tests and out-of-sample likelihood comparisons favor forecast densities obtained by option prices and non-parametric calibration methods over those constructed using historical returns and simulated ARCH processes. © 2010 Wiley Periodicals, Inc. Jrl Fut Mark 31:727–754, 2011

Relevance: 80.00%

Publisher:

Abstract:

Bio-impedance analysis (BIA) provides a rapid, non-invasive technique for body composition estimation. BIA offers a convenient alternative to standard techniques such as MRI, CT scan or DEXA scan for selected types of body composition analysis. The accuracy of BIA is limited because it is an indirect method of composition analysis: it relies on linear relationships between measured impedance and morphological parameters such as height and weight to derive estimates. To overcome these underlying limitations of BIA, a multi-frequency segmental bio-impedance device was constructed through a series of iterative enhancements and improvements of existing BIA instrumentation. Key features of the design included an easy-to-construct current source and a compact PCB design. The final device was trialled with 22 human volunteers, and measured impedance was compared against body composition estimates obtained by DEXA scan. This enabled the development of newer techniques for making BIA predictions. To add a ‘visual aspect’ to BIA, volunteers were scanned in 3D using an inexpensive scattered-light device (Xbox Kinect controller), and 3D volumes of their limbs were compared with BIA measurements to further improve BIA predictions. A three-stage digital filtering scheme was also implemented to enable extraction of heart-rate data from recorded bio-electrical signals. Additionally, modifications were introduced to measure changes in bio-impedance with motion; these could be adapted to further improve the accuracy of limb composition analysis. The findings in this thesis aim to give new direction to the prediction of body composition using BIA. The design development and refinement applied to BIA in this research programme suggest new opportunities to enhance the accuracy and clinical utility of BIA for the prediction of body composition.
In particular, the use of bio-impedance to predict limb volumes would provide an additional metric for body composition measurement and help distinguish between fat and muscle content.
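The linear relationships that BIA relies on come from modelling a body segment as a uniform cylindrical conductor, for which R = ρL/A implies a volume of V = ρL²/R (the familiar 'impedance index' L²/Z). A minimal sketch with assumed illustrative values:

```python
def cylinder_volume_from_impedance(resistivity, length_cm, impedance_ohm):
    """Uniform-cylinder model of a body segment: resistance R = rho*L/A,
    so volume V = A*L = rho * L**2 / R (resistivity in ohm*cm, length in
    cm, impedance in ohm, volume in cm**3). The ratio L**2/Z underlies
    most BIA prediction equations; real limbs are not uniform cylinders,
    which is one source of BIA's limited accuracy.
    """
    return resistivity * length_cm ** 2 / impedance_ohm
```

With an assumed tissue resistivity of 100 Ω·cm, a 40 cm segment measuring 200 Ω models as 800 cm³; the numbers are purely illustrative.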

Relevance: 80.00%

Publisher:

Abstract:

This paper estimates the importance of (tariff-mediated) network effects and the impact of a consumer's social network on her choice of mobile phone provider. The study uses network data obtained from surveys of students in several European and Asian countries. We use the Quadratic Assignment Procedure, a non-parametric permutation test, to adjust for the particular error structure of network data. We find that respondents strongly coordinate their choice of mobile phone providers, but only if their provider induces network effects. This suggests that this coordination depends on network effects rather than on information contagion or pressure to conform to the social environment.
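The Quadratic Assignment Procedure tests association between two matrices by jointly permuting the rows and columns of one of them, which preserves its internal dependence structure. A minimal sketch on 0/1 adjacency matrices (illustrative of the permutation scheme, not the paper's estimation):

```python
import random

def _offdiag(m, order):
    """Flatten the off-diagonal entries of matrix m with its rows and
    columns jointly reordered by `order`."""
    n = len(order)
    return [m[order[i]][order[j]] for i in range(n) for j in range(n) if i != j]

def _pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def qap_pvalue(a, b, n_perm=2000, seed=0):
    """Correlate a and b, then compare against correlations obtained after
    jointly permuting the rows and columns of b: unlike an ordinary
    permutation test, this respects the row/column dependence of dyadic
    network data.
    """
    rng = random.Random(seed)
    ident = list(range(len(a)))
    observed = _pearson(_offdiag(a, ident), _offdiag(b, ident))
    count = 0
    for _ in range(n_perm):
        order = ident[:]
        rng.shuffle(order)  # relabel the nodes of b
        if abs(_pearson(_offdiag(a, ident), _offdiag(b, order))) >= abs(observed):
            count += 1
    return (count + 1) / (n_perm + 1)
```

In the paper's setting, one matrix would encode social ties and the other shared provider choice; here both inputs are generic adjacency matrices.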

Relevance: 80.00%

Publisher:

Abstract:

Lyophilisation, or freeze drying, is the preferred dehydrating method for pharmaceuticals liable to thermal degradation. Most biologics are unstable in aqueous solution, and freeze drying may be used to prolong their shelf life. Lyophilisation is, however, expensive, and much work has been aimed at reducing its cost. This thesis is motivated by the potential cost savings foreseen with the adoption of a cost-efficient bulk drying approach for large and small molecules. Initial studies identified ideal formulations that adapted well to bulk drying and to further powder handling requirements downstream in production. Low-cost techniques were used to disrupt large dried cakes into powder, while the effects of carrier agent concentration on powder flowability were investigated using standard pharmacopoeia methods. This revealed the superiority of crystalline mannitol over amorphous sucrose matrices and established that the cohesive and very poor flow nature of freeze-dried powders was a potential barrier to success. Powder characterisation studies showed that increased powder densification was mainly responsible for significant improvements in flow behaviour, and an initial bulking agent concentration of 10-15 %w/v was recommended. Further optimisation studies evaluated the effects of freezing rates and thermal treatment on powder flow behaviour. Slow cooling (0.2 °C/min) with a -25 °C annealing hold (2 hrs) provided adequate mechanical strength and densification at 0.5-1 M mannitol concentrations. Stable bulk powders require powder transfer into either final vials or intermediate storage closures. The targeted dosing of powder formulations using volumetric and gravimetric powder dispensing systems was evaluated using Immunoglobulin G (IgG), Lactate Dehydrogenase (LDH) and Beta Galactosidase models. Final protein content uniformity in dosed vials was assessed using activity and protein recovery assays to draw conclusions from deviations and pharmacopeia acceptance values.
A correlation between very poor flowability (p<0.05), solute concentration, dosing time and accuracy was revealed. LDH and IgG lyophilised in 0.5 M and 1 M mannitol passed pharmacopeia acceptance-value criteria (0.1-4), while formulations with micro collapse showed the best dose accuracy (0.32-0.4% deviation). Bulk mannitol content above 0.5 M provided no additional benefits to dosing accuracy or content uniformity of dosed units. This study identified relevant considerations, including the type of protein, annealing, the cake disruption process, the physical form of the phases present and humidity control, and recommended gravimetric transfer as optimal for dispensing powder. Dosing lyophilised powders from bulk was demonstrated to be practical, time-efficient and economical, and met regulatory requirements in the cases examined. Finally, the use of a new non-destructive technique, X-ray microcomputer tomography (MCT), was explored for cake and particle characterisation. Studies demonstrated good correlation with traditional gas porosimetry (R² = 0.93) and with morphology studies using microscopy. Flow characterisation from sample sizes of less than 1 mL was demonstrated using three-dimensional X-ray quantitative image analyses. A platinum-mannitol dispersion model revealed a relationship between freezing rate, ice nucleation sites and variations in homogeneity between the top and bottom segments of a formulation.

Relevance: 80.00%

Publisher:

Abstract:

Cochran's Q-test is a non-parametric analysis which can be applied to a two-way design in which the data are binary, taking only one of two possible outcomes, e.g., 0 or 1, alive or dead, present or absent, clean or dirty, infected or non-infected; it is an extension of the binomial tests introduced in Statnote 39. This statnote describes the application of the test to the analysis of the changes which occur in the fungal flora of forestry nursery beds after two different sterilization procedures.
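Cochran's Q reduces to a single formula on the row (block) and column (treatment) totals of the 0/1 table. A minimal sketch:

```python
def cochran_q(table):
    """Cochran's Q statistic for a blocks-by-treatments table of 0/1
    outcomes. Under the null hypothesis of equal treatment effects, Q is
    approximately chi-squared distributed with k - 1 degrees of freedom,
    where k is the number of treatments (columns).
    """
    k = len(table[0])
    col = [sum(row[j] for row in table) for j in range(k)]  # treatment totals
    row_tot = [sum(row) for row in table]                   # block totals
    num = (k - 1) * (k * sum(g * g for g in col) - sum(col) ** 2)
    den = k * sum(row_tot) - sum(r * r for r in row_tot)
    return num / den
```

Blocks whose outcomes are identical across all treatments contribute nothing to Q, which the totals-based formula handles automatically.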