987 results for Statistical efficiency


Relevance: 30.00%

Abstract:

A simple efficiency model is developed on scale-free networks with community structure to study how communities in complex networks affect efficiency dynamics. For some parameter values, we find that the state of the system transitions from a stagnant phase to a growing phase as the community strength decreases.

Relevance: 30.00%

Abstract:

The effects of the process variables (pH of the aqueous phase, rate of addition of the organic, polymeric, drug-containing phase to the aqueous phase, organic:aqueous phase volume ratio, and aqueous phase temperature) on the entrapment of propranolol hydrochloride in ethylcellulose (N4) microspheres prepared by the solvent evaporation method were examined using a factorial design. The observed range of drug entrapment was 1.43 ± 0.02% w/w (pH 6, 25 °C, phase volume ratio 1:10, fast rate of addition) to 16.63 ± 0.92% w/w (pH 9, 33 °C, phase volume ratio 1:10, slow rate of addition), corresponding to mean entrapment efficiencies of 2.86 and 33.26, respectively. Increased pH, increased temperature, and a decreased rate of addition significantly enhanced entrapment efficiency, whereas the organic:aqueous phase volume ratio did not significantly affect drug entrapment. Statistical interactions were observed between pH and rate of addition, pH and temperature, and temperature and rate of addition. The interactions involving pH are suggested to arise because increased temperature and a slow rate of addition sufficiently enhance the solubility of dichloromethane in the aqueous phase, which at pH 9, but not pH 6, allows partial polymer precipitation before the drug partitions into the aqueous phase. The interaction between temperature and rate of addition is due to the relative lack of effect of increased temperature on drug entrapment when the organic phase is added slowly. Compared with the effects of pH on drug entrapment, the contributions of the other physical factors examined were limited.
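To illustrate how main effects are read off a factorial design like the one described above, the snippet below computes them for a hypothetical 2^3 design in coded units. The response values and the sign conventions for the three factors are invented for illustration and are not the paper's data.

```python
import numpy as np

# Coded levels (-1 = low, +1 = high) for pH, temperature, and rate of addition
# in a full 2^3 factorial; the entrapment responses (% w/w) are illustrative only.
levels = np.array([[-1, -1, -1], [1, -1, -1], [-1, 1, -1], [1, 1, -1],
                   [-1, -1, 1], [1, -1, 1], [-1, 1, 1], [1, 1, 1]])
response = np.array([3.1, 18.0, 5.2, 25.4, 2.0, 12.5, 3.9, 20.1])

# Main effect of a factor = mean response at its high level minus
# mean response at its low level.
effects = {name: response[levels[:, i] == 1].mean()
                 - response[levels[:, i] == -1].mean()
           for i, name in enumerate(["pH", "temperature", "rate_of_addition"])}
```

With these invented numbers, pH shows the dominant positive effect and a faster rate of addition a negative one, mirroring the direction of the reported findings.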

Relevance: 30.00%

Abstract:

Cognitive radio has been proposed as a means of improving spectrum utilisation and increasing the spectrum efficiency of wireless systems. This is achieved by allowing cognitive radio terminals to monitor their spectral environment and opportunistically access unoccupied frequency channels. Because of this opportunistic nature, the overall performance of such networks depends on spectrum occupancy and availability patterns. Appropriate knowledge of channel availability can optimise sensing performance in terms of spectrum and energy efficiency. This work proposes a statistical framework for channel availability in the polarization domain. A Gaussian (normal) approximation is used to model real-world occupancy data obtained through a measurement campaign in the cellular frequency bands under a realistic scenario.
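As a minimal sketch of the Gaussian (normal) approximation the abstract describes, the snippet below fits a normal model to synthetic channel duty-cycle samples. The `occupancy` array, its beta-distributed generator, and the 0.5 occupancy threshold are assumptions standing in for the measured cellular-band data.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
# Hypothetical duty-cycle samples (fraction of time a channel is occupied),
# standing in for the measured occupancy data.
occupancy = rng.beta(2.0, 5.0, size=1000)

# Gaussian (normal) approximation: match the first two sample moments.
mu, sigma = occupancy.mean(), occupancy.std(ddof=1)

def availability_prob(threshold, mu, sigma):
    """P(occupancy < threshold) under the fitted normal model."""
    return 0.5 * (1.0 + erf((threshold - mu) / (sigma * sqrt(2.0))))

# Probability that a channel is available (occupied less than half the time).
p_available = availability_prob(0.5, mu, sigma)
```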

Relevance: 30.00%

Abstract:

The goal of this thesis is to estimate the effect of the form of knowledge representation on the efficiency of knowledge sharing. The objectives include designing an experimental framework to establish this effect, collecting data, and statistically analysing the collected data. The study follows an experimental quantitative design. The experimental questionnaire features three sample forms of knowledge representation: text, mind maps, and concept maps. In the interview, these forms are presented to an interviewee, after which knowledge sharing time and knowledge sharing quality are measured. According to the statistical analysis of 76 interviews, text performs worse than the visualized forms of knowledge representation in both knowledge sharing time and quality. Mind maps and concept maps, however, do not differ in knowledge sharing time or quality, as the difference between them is not statistically significant. Since visualized, structured forms of knowledge outperform unstructured text in knowledge sharing, companies are advised to foster the use of these forms in their internal knowledge sharing processes. Apart from their performance in knowledge sharing, the visualized structured forms are preferable because they can be used in a system of ontological knowledge management within an enterprise.
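A hedged sketch of the kind of comparison the thesis reports: simulated sharing times for the three forms (the actual interview data are not reproduced; group sizes, means, and spreads are invented), with Welch t-tests contrasting text against the pooled visualized forms and the two visualized forms against each other.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical knowledge-sharing times (minutes) for the three forms.
text_t = rng.normal(12.0, 2.0, 26)
mind_map_t = rng.normal(9.0, 2.0, 25)
concept_map_t = rng.normal(9.0, 2.0, 25)

# Text vs. pooled visualized forms: the thesis reports a significant difference.
t1, p1 = stats.ttest_ind(text_t, np.concatenate([mind_map_t, concept_map_t]),
                         equal_var=False)
# Mind maps vs. concept maps: the thesis reports no significant difference.
t2, p2 = stats.ttest_ind(mind_map_t, concept_map_t, equal_var=False)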

Relevance: 30.00%

Abstract:

In this paper we propose exact likelihood-based mean-variance efficiency tests of the market portfolio in the context of the Capital Asset Pricing Model (CAPM), allowing for a wide class of error distributions that includes normality as a special case. These tests are developed in the framework of multivariate linear regressions (MLR). It is well known, however, that despite their simple statistical structure, standard asymptotically justified MLR-based tests are unreliable. In financial econometrics, exact tests have been proposed for a few specific hypotheses [Jobson and Korkie (Journal of Financial Economics, 1982), MacKinlay (Journal of Financial Economics, 1987), Gibbons, Ross and Shanken (Econometrica, 1989), Zhou (Journal of Finance, 1993)], most of which depend on normality. For the Gaussian model, our tests correspond to the mean-variance efficiency tests of Gibbons, Ross and Shanken. In non-Gaussian contexts, we reconsider mean-variance efficiency tests allowing for multivariate Student-t and Gaussian mixture errors. Our framework allows us to cast more light on whether the normality assumption is too restrictive when testing the CAPM. We also propose exact multivariate diagnostic checks (including tests for multivariate GARCH and a multivariate generalization of the well-known variance ratio tests) and goodness-of-fit tests, as well as a set estimate for the intervening nuisance parameters. Our results, over five-year subperiods, show the following: (i) multivariate normality is rejected in most subperiods; (ii) residual checks reveal no significant departures from the multivariate i.i.d. assumption; and (iii) mean-variance efficiency of the market portfolio is not rejected as frequently once the possibility of non-normal errors is allowed for.
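For the Gaussian case the paper's tests reduce to the Gibbons-Ross-Shanken (GRS) statistic; the sketch below computes it on simulated returns that satisfy the CAPM by construction. All sample sizes and distribution parameters are assumptions for illustration, not estimates from the paper's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
T, N = 240, 5  # months and number of test portfolios (hypothetical)
mkt = rng.normal(0.006, 0.04, T)          # market excess returns
betas = rng.uniform(0.8, 1.2, N)
# Portfolio excess returns generated under the CAPM null (all alphas = 0).
R = mkt[:, None] * betas[None, :] + rng.normal(0.0, 0.02, (T, N))

# OLS of each portfolio on the market: intercepts and residual covariance.
X = np.column_stack([np.ones(T), mkt])
coef, *_ = np.linalg.lstsq(X, R, rcond=None)
alpha = coef[0]
E = R - X @ coef
Sigma = (E.T @ E) / (T - 2)

# GRS statistic: F(N, T-N-1) under the null of mean-variance efficiency.
sr2 = (mkt.mean() / mkt.std(ddof=1)) ** 2
grs = ((T - N - 1) / N) * (alpha @ np.linalg.solve(Sigma, alpha)) / (1 + sr2)
p_value = 1 - stats.f.cdf(grs, N, T - N - 1)
```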

Relevance: 30.00%

Abstract:

For the analysis of productivity, capacity utilisation and profitability, data on the manufacturing central public sector enterprises in Kerala have been collected from the published annual reports of the companies, the public enterprises surveys of the Bureau of Public Enterprises (BPE), the Economic Review of the State Planning Board (SPB), and the statistical review of central government enterprises by the Centre for Monitoring Indian Economy (CMIE). Primary data have been collected through personal interviews with high- and middle-level executives.

Relevance: 30.00%

Abstract:

Post-transcriptional gene silencing by RNA interference is mediated by small interfering RNAs (siRNAs). This gene silencing mechanism can be exploited therapeutically against a wide variety of disease-associated targets, especially in AIDS, neurodegenerative diseases, cholesterol regulation and cancer in mice, with the hope of extending these approaches to treat humans. Over the recent past, a significant amount of work has been undertaken to understand the gene silencing mediated by exogenous siRNA. Designing efficient exogenous siRNA sequences is challenging because of many issues related to siRNA. While designing efficient siRNA, target mRNAs must be selected such that their corresponding siRNAs are likely to be efficient against that target and unlikely to accidentally silence other transcripts due to sequence similarity. Before carrying out gene silencing with siRNAs, it is therefore essential to analyze their off-target effects in addition to their inhibition efficiency against a particular target. Hence, designing exogenous siRNA with good knock-down efficiency and target specificity is an area of concern to be addressed. Some methods have already been developed that consider both the inhibition efficiency and the off-target potential of an siRNA against a gene, but only a few of them achieve good inhibition efficiency, specificity and sensitivity. The main focus of this thesis is to develop computational methods to optimize the efficiency of siRNA in terms of inhibition capacity and off-target possibility against target mRNAs with improved efficacy, which may be useful in gene silencing and drug design for tumor development. This study aims to investigate the currently available siRNA prediction approaches and to devise a better computational approach to the problem of siRNA efficacy in terms of inhibition capacity and off-target possibility.
The strengths and limitations of the available approaches are investigated and taken into consideration in designing an improved solution. The approaches proposed in this study therefore extend some of the best-scoring previous state-of-the-art techniques by incorporating machine learning and statistical approaches and thermodynamic features, such as whole stacking energy, to improve prediction accuracy, inhibition efficiency, sensitivity and specificity. We propose one Support Vector Machine (SVM) model and two Artificial Neural Network (ANN) models for siRNA efficiency prediction. The SVM model classifies an siRNA as efficient or inefficient in silencing a target gene. The first ANN model, named siRNA Designer, is used to optimize the inhibition efficiency of siRNA against target genes. The second ANN model, named Optimized siRNA Designer (OpsiD), produces efficient siRNAs with high inhibition efficiency against target genes with improved sensitivity and specificity, and identifies the off-target knockdown possibility of an siRNA against non-target genes. The models are trained and tested against a large data set of siRNA sequences, and the validations are conducted using the Pearson correlation coefficient, the Matthews correlation coefficient, receiver operating characteristic analysis, prediction accuracy, sensitivity and specificity. OpsiD is found to predict the inhibition capacity of an siRNA against a target mRNA with improved results over the state-of-the-art techniques, and the analysis also clarifies the influence of whole stacking energy on siRNA efficiency. The model is further improved by including the ability to identify the off-target possibility of a predicted siRNA on non-target genes.
The proposed model, OpsiD, can thus predict optimized siRNA by considering both inhibition efficiency on target genes and off-target possibility on non-target genes, with improved inhibition efficiency, specificity and sensitivity. Since efforts have been taken to optimize siRNA efficacy in terms of inhibition efficiency and off-target possibility, we hope that the risk of off-target effects in gene silencing across various bioinformatics applications can be overcome to a great extent. These findings may provide new insights into cancer diagnosis, prognosis and therapy by gene silencing, and the approach may prove useful for designing exogenous siRNA for therapeutic applications and gene silencing techniques in different areas of bioinformatics.
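The thesis's SVM/ANN models and training data are not reproduced here; as a loose sketch of the SVM classification step, the toy model below labels random 19-mers as efficient by a simple GC-content rule (an assumption for illustration, not the thesis's criterion) and trains an RBF-kernel SVM on trivial sequence features.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

def featurize(seq):
    """Toy features: GC content and per-base frequencies of a 19-nt siRNA."""
    gc = (seq.count("G") + seq.count("C")) / len(seq)
    return [gc] + [seq.count(b) / len(seq) for b in "AUGC"]

# Hypothetical data set: random 19-mers labelled efficient when GC content
# falls in an assumed 30-55% design band.
seqs = ["".join(rng.choice(list("AUGC"), 19)) for _ in range(400)]
X = np.array([featurize(s) for s in seqs])
y = np.array([1 if 0.30 <= x[0] <= 0.55 else 0 for x in X])

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf", C=10.0).fit(Xtr, ytr)  # efficient vs. inefficient
accuracy = clf.score(Xte, yte)
```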

Relevance: 30.00%

Abstract:

We study the role of natural resource windfalls in explaining the efficiency of public expenditures. Using a rich dataset of expenditures and public good provision for 1,836 municipalities in Peru over the period 2001-2010, we estimate a non-monotonic relationship between the efficiency of public good provision and the level of natural resource transfers. Local governments that were strongly favored by the boom in mineral prices were more efficient in using fiscal windfalls, whereas those that received only modest transfers were less efficient. These results can be explained by the increase in political competition associated with the boom. However, the fact that increases in efficiency were accompanied by reductions in public good provision casts doubt on the beneficial effects of political competition in promoting efficiency.

Relevance: 30.00%

Abstract:

A synbiotic is a formulation containing both probiotics and prebiotics. This study aims to evaluate the effect of supplementation with a synbiotic containing Enterococcus faecium strain E1707 (NCIMB 10415) in preventing or controlling diarrhoea and other gastrointestinal signs in boarded canine radiotherapy patients. A double-blind, randomized, placebo-controlled clinical trial was carried out in 21 adult dogs undergoing radiotherapy and boarded for 2 to 3 weeks while their cancers were treated. Dogs were randomly divided between two groups, A and B, receiving the synbiotic and the placebo, respectively. The content of the sachets was added to the food once daily. Faecal score was assessed daily, and dogs were also monitored for the development of diarrhoea and other gastrointestinal signs such as weight loss, reduced appetite and vomiting. The descriptive statistics appear to favour group B; however, these findings could not be validated with inferential statistics because of insufficient statistical power. It is therefore not possible to draw conclusions about the benefits of the synbiotic as supportive treatment for dogs undergoing radiotherapy, and all results should be considered preliminary until further animals are included.
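To make the insufficient-power point concrete, here is a normal-approximation power calculation for a two-sample comparison; the effect size of 0.8 and the group sizes are hypothetical, not estimates from the trial.

```python
from math import sqrt
from statistics import NormalDist

def power_two_sample(effect_size, n_per_group, alpha=0.05):
    """Approximate power of a two-sided two-sample test (normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z = effect_size * sqrt(n_per_group / 2) - z_alpha
    return NormalDist().cdf(z)

# With ~10 animals per group, even a fairly large standardized effect is
# underpowered, consistent with the trial's report.
power_small_trial = power_two_sample(effect_size=0.8, n_per_group=10)
power_adequate = power_two_sample(effect_size=0.8, n_per_group=40)
```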

Relevance: 30.00%

Abstract:

Technical efficiency is estimated and examined for a cross-section of Australian dairy farms using several frontier methodologies: Bayesian and classical stochastic frontiers, and Data Envelopment Analysis. The results indicate that technical inefficiency is present in the sample data. Statistical differences are also identified between the point estimates of technical efficiency generated by the various methodologies. However, the ranking of farm-level technical efficiency is statistically invariant to the estimation technique employed. Finally, when confidence/credible intervals of technical efficiency are compared, significant overlap is found across many of the farms' intervals for all frontier methods employed. The results indicate that the choice of estimation methodology may matter, but that the explanatory power of all frontier methods is considerably weaker when interval estimates of technical efficiency are examined.

Relevance: 30.00%

Abstract:

This article explores how data envelopment analysis (DEA), along with a smoothed bootstrap method, can be used in applied analysis to obtain more reliable efficiency rankings for farms. The main focus is the smoothed homogeneous bootstrap procedure introduced by Simar and Wilson (1998) to implement statistical inference for the original efficiency point estimates. Two main model specifications, constant and variable returns to scale, are investigated along with various choices regarding data aggregation. The coefficient of separation (CoS), a statistic that indicates the degree of statistical differentiation within the sample, is used to demonstrate the findings. The CoS suggests a substantive dependency of the results on the methodology and assumptions employed. Accordingly, some observations are made on how to conduct DEA in order to get more reliable efficiency rankings, depending on the purpose for which they are to be used. In addition, attention is drawn to the ability of the SLICE MODEL, implemented in GAMS, to enable researchers to overcome the computational burdens of conducting DEA (with bootstrapping).
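The Simar-Wilson smoothed bootstrap itself is involved, but the DEA point estimates it resamples can be sketched compactly. The snippet below solves the input-oriented, constant-returns (CCR) envelopment LP for each decision-making unit, using an invented one-input/one-output farm data set; it is a sketch of plain DEA scoring, not of the bootstrap procedure.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o (rows of X: inputs, Y: outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                  # minimize theta
    A_in = np.hstack([-X[o][:, None], X.T])      # sum_j lam_j x_ij <= theta * x_io
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])  # sum_j lam_j y_rj >= y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

# Hypothetical farm data: one input (cost) and one output (production).
X = np.array([[2.0], [4.0], [3.0], [5.0]])
Y = np.array([[2.0], [4.0], [1.5], [2.5]])
scores = [ccr_efficiency(X, Y, o) for o in range(len(X))]
```

Under constant returns, a unit's score here is simply its output/input ratio relative to the best ratio in the sample, which makes the LP result easy to sanity-check by hand.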

Relevance: 30.00%

Abstract:

The present study aimed to identify key parameters influencing N utilization and to develop prediction equations for manure N output (MN), feces N output (FN), and urine N output (UN). Data were obtained from a series of digestibility trials with nonpregnant dry cows fed fresh grass at maintenance level. Grass was cut from 8 different ryegrass swards measured from early to late maturity in 2007 and 2008 (2 primary growth, 3 first regrowth, and 3 second regrowth) and from 2 primary growth early-maturity swards in 2009. Each grass was offered to a group of 4 cows, and 2 groups were used in each of the 8 swards in 2007 and 2008 for daily measurements over 6 wk; the first group (first 3 wk) and the second group (last 3 wk) assessed early- and late-maturity grass, respectively. Average values of continuous 3-d data of N intake (NI) and output for individual cows (n = 464) and grass nutrient contents (n = 116) were used in the statistical analysis. Grass N content was positively related to GE and ME contents but negatively related to grass water-soluble carbohydrate (WSC), NDF, and ADF contents (P < 0.01), indicating that accounting for nutrient interrelations is a crucial aspect of N mitigation. Significantly greater ratios of UN:FN, UN:MN, and UN:NI were found with increased grass WSC contents and ratios of N:WSC, N:digestible OM in total DM (DOMD), and N:ME (P < 0.01). Greater NI, animal BW, and grass N contents and lower grass WSC, NDF, ADF, DOMD, and ME concentrations were significantly associated with greater MN, FN, and UN (P < 0.05). The present study highlighted that, in animals fed solely fresh grass at maintenance level, using grass lower in N and greater in fermentable energy can improve N utilization, reduce N outputs, and shift part of N excretion toward feces rather than urine. These outcomes are highly desirable in mitigation strategies to reduce nitrous oxide emissions from livestock.
Equations predicting N output from BW and grass N content explained a similar amount of variability as equations using NI and grass chemical composition (excluding DOMD and ME), implying that parameters easily measurable in practice could be used to estimate N outputs. In a research environment, where grass DOMD and ME are likely to be available, their use to predict N outputs is highly recommended because they strongly improved the prediction equations in the current study.
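A sketch of the "easily measurable predictors" idea: an ordinary least squares fit of manure N output on body weight and grass N content, with synthetic data generated from an assumed linear relation. The coefficients and units below are illustrative, not those fitted in the study.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 120
bw = rng.normal(600, 50, n)       # cow body weight, kg (hypothetical)
grass_n = rng.normal(25, 5, n)    # grass N content, g/kg DM (hypothetical)
# Manure N output generated from an assumed linear relation plus noise.
mn = 0.2 * bw + 4.0 * grass_n + rng.normal(0, 10, n)

# OLS with the two easily measured predictors.
X = np.column_stack([np.ones(n), bw, grass_n])
beta, *_ = np.linalg.lstsq(X, mn, rcond=None)
pred = X @ beta
r2 = 1 - ((mn - pred) ** 2).sum() / ((mn - mn.mean()) ** 2).sum()
```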

Relevance: 30.00%

Abstract:

The pipe sizing of water networks via evolutionary algorithms is of great interest because it allows the selection of alternative, economical solutions that meet a set of design requirements. However, the available evolutionary methods are numerous, and methodologies to compare their performance beyond obtaining a minimal solution for a given problem are currently lacking. A methodology to compare algorithms based on an efficiency rate (E) is presented here and applied to the pipe-sizing problem of four medium-sized benchmark networks (Hanoi, New York Tunnel, GoYang and R-9 Joao Pessoa). E numerically characterizes the performance of a given algorithm while also considering the quality of the obtained solution and the required computational effort. From the wide range of available evolutionary algorithms, four were selected to implement the methodology: a Pseudo-Genetic Algorithm (PGA), Particle Swarm Optimization (PSO), Harmony Search (HS) and a modified Shuffled Frog Leaping Algorithm (SFLA). After more than 500,000 simulations, a statistical analysis was performed based on the specific parameters each algorithm requires to operate, and finally, E was analyzed for each network and algorithm. The efficiency measure indicated that PGA is the most efficient algorithm for problems of greater complexity and that HS is the most efficient algorithm for less complex problems. The main contribution of this work, however, is that the proposed efficiency rate provides a neutral strategy for comparing optimization algorithms and may be useful in the future for selecting the most appropriate algorithm for different types of optimization problems.
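The abstract does not give the exact definition of E, so the sketch below shows one assumed form combining solution quality (relative to the best known cost) with computational effort; the numbers are hypothetical, and the formula is an illustration of the idea rather than the paper's definition.

```python
# Assumed efficiency rate: quality of the found solution discounted by the
# fraction of the evaluation budget spent. Both factors lie in (0, 1].
def efficiency_rate(best_known_cost, found_cost, evaluations, max_evaluations):
    quality = best_known_cost / found_cost        # 1.0 when the optimum is found
    effort = 1.0 - evaluations / max_evaluations  # 1.0 when effort is negligible
    return quality * effort

# Hypothetical results for two algorithms on the same benchmark network.
e_pga = efficiency_rate(6.081, 6.081, 40_000, 500_000)
e_pso = efficiency_rate(6.081, 6.350, 120_000, 500_000)
```

Under this form, an algorithm that reaches a slightly worse solution while spending more of the budget scores strictly lower, which is the ranking behaviour the efficiency rate is meant to capture.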

Relevance: 30.00%

Abstract:

One of the top ten most influential data mining algorithms, k-means, is known for being simple and scalable. However, it is sensitive to the initialization of prototypes and requires that the number of clusters be specified in advance. This paper shows that evolutionary techniques conceived to guide the application of k-means can be more computationally efficient than systematic (i.e., repetitive) approaches that try to get around these drawbacks by repeatedly running the algorithm with different numbers of clusters and different initial prototype positions. To do so, a modified version of a (k-means based) fast evolutionary algorithm for clustering is employed. Theoretical complexity analyses for the systematic and evolutionary algorithms of interest are provided. Computational experiments and statistical analyses of the results are presented for artificial and text mining data sets.
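The systematic baseline the paper argues against can be sketched directly: rerun k-means over a range of candidate k values and initializations, and keep the configuration with the best validity score. The data set and the silhouette criterion below are illustrative choices, not the paper's experimental setup.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(5)
# Three well-separated synthetic clusters in the plane.
data = np.vstack([rng.normal(c, 0.3, (50, 2)) for c in ((0, 0), (4, 0), (2, 4))])

# Systematic (repetitive) approach: one k-means run per candidate k, each with
# several restarts, keeping the best silhouette. This is the cost an
# evolutionary search aims to undercut.
best_k, best_score = None, -1.0
for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=5, random_state=0).fit_predict(data)
    score = silhouette_score(data, labels)
    if score > best_score:
        best_k, best_score = k, score
```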

Relevance: 30.00%

Abstract:

This paper tackles the problem of showing that evolutionary algorithms for fuzzy clustering can be more efficient than systematic (i.e. repetitive) approaches when the number of clusters in a data set is unknown. To do so, a fuzzy version of an Evolutionary Algorithm for Clustering (EAC) is introduced. A fuzzy cluster validity criterion and a fuzzy local search algorithm are used instead of their hard counterparts employed by EAC. Theoretical complexity analyses for both the systematic and evolutionary algorithms under interest are provided. Examples with computational experiments and statistical analyses are also presented.
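A minimal fuzzy c-means routine of the kind the fuzzy EAC variant builds on: this is the standard alternating membership/center update, not the paper's evolutionary algorithm, and the two-cluster data set is synthetic.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns memberships U (rows sum to 1) and centers V."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))   # random initial memberships
    for _ in range(iters):
        W = U ** m
        V = (W.T @ X) / W.sum(axis=0)[:, None]   # fuzzily weighted centers
        D = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2) + 1e-12
        # Membership update: u_ik proportional to d_ik^(-2/(m-1)).
        P = D ** (-2.0 / (m - 1.0))
        U = P / P.sum(axis=1, keepdims=True)
    return U, V

rng = np.random.default_rng(6)
X = np.vstack([rng.normal(0, 0.2, (40, 2)), rng.normal(3, 0.2, (40, 2))])
U, V = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)
```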