40 results for Uniformly
Abstract:
The emergence of uncorrelated growing networks is proved when nodes are removed either uniformly or under the preferential survival rule recently observed in the World Wide Web evolution. To this aim, the rate equation for the joint probability of degrees is derived, and stationary symmetrical solutions are obtained by passing to the continuum limit. When a uniformly random removal of extant nodes and linear preferential attachment of new nodes are at work, we prove that the only stationary solution corresponds to uncorrelated networks for any removal rate r ∈ (0,1). In the more general case of preferential survival of nodes, uncorrelated solutions are also obtained. These results generalize the uncorrelatedness displayed by the (undirected) Barabási-Albert network model to models with uniformly random and selective (against low degrees) removal of nodes.
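A minimal simulation sketch of the mechanism described above (not the paper's rate-equation analysis): grow a network by linear preferential attachment while removing a uniformly chosen extant node with probability r per step, then check degree-degree correlations. The use of networkx and the values of m, r and n_steps are illustrative assumptions.

```python
import random
import networkx as nx

def grow_with_uniform_removal(n_steps=20000, m=2, r=0.3, seed=0):
    rng = random.Random(seed)
    G = nx.complete_graph(m + 1)                 # small seed graph
    next_id = m + 1
    for _ in range(n_steps):
        nodes, degs = zip(*G.degree())
        n_pos = sum(d > 0 for d in degs)         # nodes eligible as attachment targets
        targets = set()
        while len(targets) < min(m, n_pos):      # linear preferential attachment
            targets.add(rng.choices(nodes, weights=degs, k=1)[0])
        G.add_node(next_id)
        G.add_edges_from((next_id, t) for t in targets)
        next_id += 1
        if rng.random() < r and G.number_of_nodes() > m + 1:
            G.remove_node(rng.choice(list(G.nodes())))   # uniformly random removal
    return G

G = grow_with_uniform_removal()
# a degree assortativity close to zero is consistent with an uncorrelated network
print("degree assortativity:", nx.degree_assortativity_coefficient(G))
```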
Abstract:
Introduction and objectives. It has been reported that, in hypertrophic cardiomyopathy (HCM), regional fibre disarray gives rise to segments in which deformation is absent or severely reduced, and that these segments are distributed non-uniformly throughout the left ventricle (LV). This contrasts with other types of hypertrophy, such as the athlete's heart or hypertensive left ventricular hypertrophy (H-LVH), in which cardiac deformation may be abnormal but is never so reduced that an absence of deformation is observed. We therefore propose using the distribution of strain values to study deformation in HCM. Methods. Using tagged magnetic resonance imaging, we reconstructed the systolic deformation of the LV in 12 control subjects, 10 athletes, 12 patients with HCM and 10 patients with H-LVH. Deformation was quantified with a non-rigid registration algorithm by determining peak systolic radial and circumferential strain values in 16 LV segments. Results. Patients with HCM showed significantly lower mean strain values than the other groups. However, whereas the deformation observed in healthy individuals and in patients with H-LVH was concentrated around the mean value, in HCM segments with normal contraction coexisted with segments showing absent or markedly reduced deformation, producing greater heterogeneity of strain values. Some segments without deformation were observed even in the absence of fibrosis or hypertrophy. Conclusions. The strain distribution characterises the specific patterns of myocardial deformation in patients with different aetiologies of LVH. Patients with HCM showed a significantly lower mean strain value and greater strain heterogeneity (compared with controls, athletes and patients with H-LVH), and had regions without deformation.
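A minimal sketch of the per-subject summary statistics implied above: the mean of peak systolic strain across 16 LV segments, with the standard deviation used here as one simple heterogeneity measure (an assumption; the paper analyses the full strain distribution). The segmental values are placeholders, not patient data.

```python
import numpy as np

# e.g. circumferential peak systolic strain (%) in 16 segments of one subject
strain = np.array([-18, -20, -17, -19, -2, -21, -16, 0,
                   -18, -1, -19, -17, -22, -18, -3, -20], dtype=float)

print(f"mean strain: {strain.mean():.1f} %")
print(f"heterogeneity (SD across segments): {strain.std(ddof=1):.1f} %")
```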
Abstract:
The aim of this paper is to test formally the classical business cycle hypothesis, using data from industrialized countries for the time period since 1960. The hypothesis is characterized by the view that the cyclical structure in GDP is concentrated in the investment series: fixed investment typically has a long cycle, while the cycle in inventory investment is shorter. To check the robustness of our results, we subject the data for 15 OECD countries to a variety of detrending techniques. While the hypothesis is not confirmed uniformly for all countries, there is a considerable number for which the data display the predicted pattern. None of the countries shows a pattern which can be interpreted as a clear rejection of the classical hypothesis.
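A minimal sketch of one commonly used detrending technique (the Hodrick-Prescott filter via statsmodels); the abstract does not list the specific techniques used, and the series below is synthetic, standing in for a quarterly GDP or investment series.

```python
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

rng = np.random.default_rng(0)
t = np.arange(160)                                    # 40 years of quarterly data
series = 0.01 * t + 0.05 * np.sin(2 * np.pi * t / 24) + 0.01 * rng.standard_normal(160)

cycle, trend = hpfilter(series, lamb=1600)            # standard lambda for quarterly data
print("std. dev. of the cyclical component:", cycle.std())
```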
Abstract:
We address the problem of scheduling a multiclass $M/M/m$ queue with Bernoulli feedback on $m$ parallel servers to minimize time-average linear holding costs. We analyze the performance of a heuristic priority-index rule, which extends Klimov's optimal solution to the single-server case: servers select preemptively customers with larger Klimov indices. We present closed-form suboptimality bounds (approximate optimality) for Klimov's rule, which imply that its suboptimality gap is uniformly bounded above with respect to (i) external arrival rates, as long as they stay within system capacity; and (ii) the number of servers. It follows that its relative suboptimality gap vanishes in a heavy-traffic limit, as external arrival rates approach system capacity (heavy-traffic optimality). We obtain simpler expressions for the special no-feedback case, where the heuristic reduces to the classical $c \mu$ rule. Our analysis is based on comparing the expected cost of Klimov's rule to the value of a strong linear programming (LP) relaxation of the system's region of achievable performance of mean queue lengths. In order to obtain this relaxation, we derive and exploit a new set of work decomposition laws for the parallel-server system. We further report on the results of a computational study on the quality of the $c \mu$ rule for parallel scheduling.
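A minimal sketch of the dispatching decision in the no-feedback special case (the $c \mu$ rule): the m servers preemptively work on customers of the classes with the largest index c_k * mu_k. The class data in the usage line are illustrative assumptions, not the paper's computational study.

```python
def cmu_assignment(queue_counts, c, mu, m):
    """Return how many customers of each class the m servers should serve now.

    queue_counts[k]: customers of class k present; c[k]: holding cost rate;
    mu[k]: service rate. Classes are served in decreasing order of c[k]*mu[k].
    """
    order = sorted(range(len(c)), key=lambda k: c[k] * mu[k], reverse=True)
    served = [0] * len(c)
    free = m
    for k in order:
        served[k] = min(queue_counts[k], free)
        free -= served[k]
        if free == 0:
            break
    return served

# three classes, two servers: the class with the largest c*mu gets both servers
print(cmu_assignment(queue_counts=[3, 1, 2], c=[4.0, 1.0, 2.0], mu=[1.0, 2.0, 1.5], m=2))
```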
Abstract:
This paper describes an optimized model to support QoS by means of congestion minimization on LSPs (Label Switched Paths). To build this model, we start from a CFA (Capacity and Flow Allocation) model. Since that model does not consider the buffer size when calculating the capacity cost, our model, named BCA (Buffer Capacity Allocation), takes this issue into account and improves on the CFA performance. To test our proposal, we ran several simulations; the results show that the BCA model minimizes LSP congestion and distributes flows uniformly over the network.
Abstract:
The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e. it is an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
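A minimal numerical sketch of the classical ML-EM iteration started, as the abstract prescribes, from a uniform initial image. The tiny 2-pixel / 3-detector system matrix is an illustrative assumption, and the entropy-prior FMAPE algorithm itself is not reproduced here.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])               # toy system (projection) matrix
x_true = np.array([4.0, 1.0])
y = A @ x_true                           # noiseless "counts" for the sketch

x = np.full(2, y.sum() / A.sum())        # uniform initial image
for _ in range(200):
    # multiplicative ML-EM update: x <- x * A^T(y / Ax) / A^T 1
    x = x * (A.T @ (y / (A @ x))) / A.T.sum(axis=1)

print(x)                                 # approaches x_true
```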
Abstract:
Interdisciplinary frameworks for studying natural hazards and their temporal trends have an important potential in data generation for risk assessment, land use planning, and therefore the sustainable management of resources. This paper focuses on the adjustments required because of the wide variety of scientific fields involved in the reconstruction and characterisation of flood events for the past 1000 years. The aim of this paper is to describe various methodological aspects of the study of flood events in their historical dimension, including the critical evaluation of old documentary and instrumental sources, flood-event classification and hydraulic modelling, and homogeneity and quality control tests. Standardized criteria for flood classification have been defined and applied to the Isère and Drac floods in France, from 1600 to 1950, and to the Ter, the Llobregat and the Segre floods in Spain, from 1300 to 1980. The analysis of the Drac and Isère data series from 1600 to the present day showed that extraordinary and catastrophic floods were not distributed uniformly in time. However, the largest floods (general catastrophic floods) were homogeneously distributed in time within the period 1600-1900. No major flood occurred during the 20th century in these rivers. From 1300 to the present day, no homogeneous behaviour was observed for extraordinary floods in the Spanish rivers. The largest floods were uniformly distributed in time within the period 1300-1900, for the Segre and Ter rivers.
Abstract:
Contamination of weather radar echoes by anomalous propagation (anaprop) mechanisms remains a serious issue in quality control of radar precipitation estimates. Although significant progress has been made identifying clutter due to anaprop, there is no unique method that solves the question of data reliability without removing genuine data. The work described here relates to the development of a software application that uses a numerical weather prediction (NWP) model to obtain the temperature, humidity and pressure fields needed to calculate the three-dimensional structure of the atmospheric refractive index, from which a physically based prediction of the incidence of clutter can be made. This technique can be used in conjunction with existing methods for clutter removal by modifying parameters of detectors or filters according to the physical evidence for anomalous propagation conditions. The parabolic equation method (PEM) is a well-established technique for solving the equations for beam propagation in a non-uniformly stratified atmosphere, but although intrinsically very efficient, it is not sufficiently fast to be practicable for near real-time modelling of clutter over the entire area observed by a typical weather radar. We demonstrate a fast hybrid PEM technique that is capable of providing acceptable results in conjunction with a high-resolution terrain elevation model, using a standard desktop personal computer. We discuss the performance of the method and approaches for the improvement of the model profiles in the lowest levels of the troposphere.
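A very simplified sketch of one narrow-angle split-step Fourier parabolic-equation step in a vertically stratified refractive index; the paper's fast hybrid method, terrain model, absorbing boundaries and wide-angle corrections are not reproduced, and all numerical values (k, dx, dz, the refractivity profile) are illustrative assumptions.

```python
import numpy as np

def pe_step(u, n_profile, k, dx, dz):
    """Advance the reduced field u(z) by one range step dx (narrow-angle PE)."""
    kz = 2 * np.pi * np.fft.fftfreq(u.size, d=dz)       # transverse wavenumbers
    # diffraction half-step, applied in the spectral domain
    u = np.fft.ifft(np.exp(-1j * kz**2 * dx / (2 * k)) * np.fft.fft(u))
    # refraction half-step, applied in the spatial domain
    return u * np.exp(1j * k * (n_profile - 1.0) * dx)

z = np.linspace(0.0, 1000.0, 1024)                       # height grid (m)
u = np.exp(-((z - 100.0) / 20.0) ** 2).astype(complex)   # initial Gaussian field
n = 1.0 + 300e-6 - 40e-9 * z                             # crude linear refractivity profile
k = 2 * np.pi / 0.05                                     # ~5 cm radar wavelength
for _ in range(200):                                     # propagate 200 x 50 m = 10 km
    u = pe_step(u, n, k, dx=50.0, dz=z[1] - z[0])
print("peak field magnitude after 10 km:", np.abs(u).max())
```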
Abstract:
We consider an infinite number of noninteracting lattice random walkers with the goal of determining statistical properties of the time, out of a total time T, that a single site has been occupied by n random walkers. Initially the random walkers are assumed uniformly distributed on the lattice except for the target site at the origin, which is unoccupied. The random-walk model is taken to be a continuous-time random walk, and the pausing-time density at the target site is allowed to differ from the pausing-time density at other sites. We calculate the dependence of the mean time of occupancy by n random walkers as a function of n and the observation time T. We also find the variance for the cumulative time during which the site is unoccupied. The large-T behavior of the variance differs according to whether the random walk is transient or recurrent. It is shown that the variance is proportional to T at large T in three or more dimensions, to $T^{3/2}$ in one dimension, and to $T \ln T$ in two dimensions.
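A minimal discrete-time simplification of the setup above: independent walkers on a 1D ring, initially one per site except the empty origin, recording how long the origin is occupied by exactly n walkers. The paper treats continuous-time random walks with a distinct pausing-time density at the target site; this sketch is purely illustrative and its parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
L, T = 200, 5000                          # ring size and observation time
pos = np.arange(1, L)                     # one walker per site, origin (0) empty
occupancy_time = np.zeros(10)             # time spent with exactly n walkers at the origin

for _ in range(T):
    pos = (pos + rng.choice((-1, 1), size=pos.size)) % L
    n = int(np.sum(pos == 0))
    if n < occupancy_time.size:
        occupancy_time[n] += 1

print("fraction of time the origin holds n = 0..9 walkers:", occupancy_time / T)
```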
Abstract:
A general dynamical model for the first-order optical Fréedericksz transition incorporating spatial transverse inhomogeneities and hydrodynamic effects is discussed in the framework of a time-dependent Ginzburg-Landau model. The motion of an interface between two coexisting states with different director orientations is considered. A uniformly translating front solution of the dynamical equations for the motion of that interface is described.
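A minimal 1D sketch of a uniformly translating front in relaxational (time-dependent Ginzburg-Landau) dynamics with an asymmetric double-well potential; the transverse inhomogeneities and hydrodynamic effects discussed above are omitted, and all coefficients are illustrative assumptions.

```python
import numpy as np

npts, dx, dt, eps = 400, 0.5, 0.01, 0.1
x = np.arange(npts) * dx
phi = np.where(x < x.mean(), 1.0, -1.0)    # interface between the two coexisting states

for _ in range(10000):                     # integrate to t = 100
    lap = (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx**2
    phi = phi + dt * (lap + phi - phi**3 + eps)   # relaxational TDGL dynamics

# the interface (first sign change), initially at x ~ 100, has translated at a
# roughly constant speed set by the asymmetry eps
print("front position at t = 100:", x[np.argmax(phi < 0)])
```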
Differences in the evolutionary history of disease genes affected by dominant or recessive mutations
Abstract:
Background: Global analyses of human disease genes by computational methods have yielded important advances in the understanding of human diseases. Generally these studies have treated the group of disease genes uniformly, thus ignoring the type of disease-causing mutations (dominant or recessive). In this report we present a comprehensive study of the evolutionary history of autosomal disease genes separated by mode of inheritance. Results: We examine differences in protein and coding sequence conservation between dominant and recessive human disease genes. Our analysis shows that disease genes affected by dominant mutations are more conserved than those affected by recessive mutations. This could be a consequence of the fact that recessive mutations remain hidden from selection while heterozygous. Furthermore, we employ functional annotation analysis and investigations into disease severity to support this hypothesis. Conclusion: This study elucidates important differences between dominantly- and recessively-acting disease genes in terms of protein and DNA sequence conservation, paralogy and essentiality. We propose that the division of disease genes by mode of inheritance will enhance both understanding of the disease process and prediction of candidate disease genes in the future.
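A minimal sketch of the kind of conservation comparison described above: testing whether per-gene conservation scores are higher for dominant than for recessive disease genes. The score lists are placeholders, not data from the study, and the one-sided Mann-Whitney test is one reasonable choice rather than the paper's stated method.

```python
from scipy.stats import mannwhitneyu

dominant_scores  = [0.92, 0.88, 0.95, 0.81, 0.90]   # e.g. per-gene conservation scores
recessive_scores = [0.70, 0.85, 0.66, 0.74, 0.79]

stat, p = mannwhitneyu(dominant_scores, recessive_scores, alternative="greater")
print(f"Mann-Whitney U = {stat:.1f}, one-sided p = {p:.3f}")
```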
Abstract:
The restricted maximum likelihood is preferred by many to the full maximum likelihood for estimation with variance component and other random coefficient models, because the variance estimator is unbiased. It is shown that this unbiasedness is accompanied in some balanced designs by an inflation of the mean squared error. An estimator of the cluster-level variance that is uniformly more efficient than the full maximum likelihood is derived. Estimators of the variance ratio are also studied.
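A minimal simulation sketch of the trade-off discussed above for a balanced one-way random-effects design: the unbiased REML/ANOVA estimator of the cluster-level variance can have a larger mean squared error than the (downward-biased) ML estimator. This is not the paper's derivation of its uniformly more efficient estimator, and all design parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 10, 5                      # clusters and units per cluster
sigma2_a, sigma2_e = 1.0, 1.0     # true cluster-level and residual variances

ml, reml = [], []
for _ in range(20000):
    a = rng.normal(0.0, np.sqrt(sigma2_a), size=(m, 1))
    y = a + rng.normal(0.0, np.sqrt(sigma2_e), size=(m, n))
    msw = np.sum((y - y.mean(axis=1, keepdims=True)) ** 2) / (m * (n - 1))
    msb = n * np.sum((y.mean(axis=1) - y.mean()) ** 2) / (m - 1)
    reml.append((msb - msw) / n)                    # ANOVA = REML estimator (balanced case)
    ml.append(((m - 1) / m * msb - msw) / n)        # unconstrained ML estimator

for name, est in (("REML", np.array(reml)), ("ML", np.array(ml))):
    print(f"{name}: bias = {est.mean() - sigma2_a:+.3f}, "
          f"MSE = {np.mean((est - sigma2_a) ** 2):.3f}")
```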
Abstract:
This paper studies fiscal federalism when regions differ in voters' ability to monitor public officials. We develop a model of political agency in which rent-seeking politicians provide public goods to win support from heterogeneously informed voters. In equilibrium, voter information increases government accountability but displays decreasing returns. Therefore, political centralization reduces aggregate rent extraction when voter information varies across regions. It increases welfare as long as the central government is required to provide public goods uniformly across regions. The need for uniformity implies an endogenous trade-off between reducing rents through centralization and matching idiosyncratic preferences through decentralization. We find that a federal structure with overlapping levels of government can be optimal only if regional differences in accountability are sufficiently large. The model predicts that less informed regions should reap greater benefits when the central government sets a uniform policy. Consistent with our theory, we present empirical evidence that less informed states enjoyed faster declines in pollution after the 1970 Clean Air Act centralized environmental policy at the federal level.
Abstract:
A recent methodology for inferring past precipitation based on the analysis of the carbon isotope composition (δ13C) of archaeobotanical remains is described. After describing the physiological basis of the technique, the applicability of δ13C is illustrated with an example from the north-east of the Iberian Peninsula. The aim is to provide a quantitative estimate, based on δ13C, of the evolution of seasonal (spring) and annual precipitation over the last four thousand years. The samples analysed comprise charcoal (Aleppo pine) and charred seeds (wheat and barley), and the rainfall estimates obtained are higher for the past than for the present, with a gradual trend towards progressively more arid conditions. This trend is not uniform, however: two phases of higher precipitation (1800-900 BC; 300 BC - AD 300) are detected, alternating with relatively dry periods (900-300 BC; AD 900 - present). The results also show that the relative importance of spring rainfall in the past was variable. From approximately 300 BC onwards, the spring period supplied a greater proportion of annual rainfall than it does today. In contrast, during the period 1800-800 BC its contribution was lower, followed by a transitional phase (800-300 BC) showing a sudden recovery in the spring contribution. After this phase, the synchrony of the changes in δ13C in grains and charcoal suggests the arrival of the Mediterranean climate in the region.