936 results for Chemical etching method combining static etching and dynamic etching


Relevance:

100.00%

Publisher:

Abstract:

Quantitatively assessing the importance or criticality of each link in a network is of practical value to operators, as it can help them increase the network's resilience, provide more efficient services, or improve other aspects of the service. Betweenness is a graph-theoretical measure of centrality that can be applied to communication networks to evaluate link importance. However, as we illustrate in this paper, the basic definition of betweenness centrality produces inaccurate estimates because it does not take into account aspects relevant to networking, such as heterogeneity in link capacity or differences between node pairs in their contribution to the total traffic. This paper proposes a new algorithm for estimating link centrality in transport networks. It requires only static or semi-static network and topology attributes, yet produces estimates of good accuracy, as verified through extensive simulations. Its potential value is demonstrated by an example application in which the simple shortest-path routing algorithm is improved so that it outperforms other, more advanced algorithms in terms of blocking ratio.
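As an illustration of the underlying idea only (not the algorithm proposed in the paper), the following minimal Python sketch computes an edge-betweenness score in which each node pair contributes its own traffic volume rather than a uniform unit; the `demands` dictionary and the use of networkx are assumptions made for the example.

```python
import networkx as nx

def traffic_weighted_edge_betweenness(G, demands):
    """Edge betweenness where each node pair contributes its own traffic volume.

    G        -- undirected networkx Graph
    demands  -- dict mapping (src, dst) -> offered traffic (hypothetical input)
    """
    score = {tuple(sorted(e)): 0.0 for e in G.edges()}
    for (s, t), volume in demands.items():
        paths = list(nx.all_shortest_paths(G, s, t))
        share = volume / len(paths)          # split traffic evenly over equal-cost paths
        for path in paths:
            for u, v in zip(path, path[1:]):
                score[tuple(sorted((u, v)))] += share
    return score

# Example: a 4-node ring with one heavy and one light demand
G = nx.cycle_graph(4)
demands = {(0, 2): 10.0, (1, 3): 1.0}
print(traffic_weighted_edge_betweenness(G, demands))
```

A further refinement along the lines discussed in the abstract would be to normalise each edge's score by its capacity, so that heavily scored but high-capacity links rank lower than equally scored low-capacity ones.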

Relevance:

100.00%

Publisher:

Abstract:

The Networks and Complexity in Social Systems course commences with an overview of the nascent field of complex networks, dividing it into three related but distinct strands: statistical description of large-scale networks, viewed as static objects; the dynamic evolution of networks, where the structure of the network is understood in terms of a growth process; and dynamical processes that take place on fixed networks, that is, "networked dynamical systems". (A fourth area of potential research ties the previous three strands together under the rubric of co-evolution of networks and dynamics, but very little research has been done in this vein, so it is omitted.) The remainder of the course treats each of the three strands in greater detail, introducing technical knowledge as required, summarizing the research papers that have introduced the principal ideas, and pointing out directions for future development. With regard to networked dynamical systems, the course treats in detail the more specific topic of information propagation in networks, partly because this topic is of great relevance to social science and partly because it has received the most attention in the literature to date.
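As a toy illustration of one kind of networked dynamical system mentioned above (information propagation on a fixed network), the following Python sketch simulates an independent-cascade spread; the transmission probability `p` and the adjacency-list representation are assumptions for the example, not part of the course material.

```python
import random

def independent_cascade(adj, seeds, p=0.1, rng=None):
    """Simulate independent-cascade information spread on a fixed network.

    adj   -- dict mapping node -> list of neighbours
    seeds -- iterable of initially informed nodes
    p     -- probability an informed node passes the information to a neighbour (assumed)
    """
    rng = rng or random.Random(0)
    informed = set(seeds)
    frontier = list(seeds)
    while frontier:
        newly = []
        for u in frontier:
            for v in adj.get(u, []):
                if v not in informed and rng.random() < p:
                    informed.add(v)
                    newly.append(v)
        frontier = newly                 # only fresh adopters attempt to spread next round
    return informed

# Example: a small line network seeded at one end
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(independent_cascade(adj, seeds=[0], p=0.5))
```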

Relevance:

100.00%

Publisher:

Abstract:

Introduction. Fractal geometry measures the irregularity of abstract and natural objects through the fractal dimension. Fractal calculations have been applied to structures of the human body and to quantifications in physiology based on the theory of dynamical systems. Materials and Methods. The fractal dimension, the number of occupied boxes at the border of the box-counting space, and the area were calculated for two groups of red blood cells, 7 normal (group A) and 7 abnormal (group B), obtained from patients and from transfusion bags, using the box-counting method and software developed for this purpose. The measures obtained were compared, looking for differences between normal and abnormal red blood cells, with the purpose of differentiating samples. Results. Abnormality is characterized by a number of occupied boxes in the fractal space greater than or equal to 180; area values between 25.117 and 33.548 correspond to normality. If the evaluation based on the number of boxes indicates normality, it must be confirmed with the area value applied to adjacent red blood cells within the sample; values outside the established range and/or 180 or more occupied boxes suggest abnormality of the sample. Conclusions. The methodology developed is effective for differentiating red blood cell alterations and is probably useful in the analysis of transfusion bags for clinical use.
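As a hedged sketch of the box-counting procedure described above (not the authors' software), the following Python function estimates the fractal dimension of a binary image, e.g. a segmented red-blood-cell contour; the box sizes used are illustrative choices.

```python
import numpy as np

def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a binary image by box counting."""
    counts = []
    for s in sizes:
        # Pad so the image splits evenly into s x s boxes
        h = int(np.ceil(mask.shape[0] / s)) * s
        w = int(np.ceil(mask.shape[1] / s)) * s
        padded = np.zeros((h, w), dtype=bool)
        padded[:mask.shape[0], :mask.shape[1]] = mask
        boxes = padded.reshape(h // s, s, w // s, s)
        occupied = boxes.any(axis=(1, 3)).sum()   # boxes containing any foreground pixel
        counts.append(occupied)
    # Fractal dimension = slope of log N(s) versus log(1/s)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope
```

The occupied-box count at a fixed box size is the quantity compared against the 180 threshold in the abstract, while the slope gives the fractal dimension itself.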

Relevance:

100.00%

Publisher:

Abstract:

Abstract taken from the publication. With financial support from the MIDE department of UNED.

Relevance:

100.00%

Publisher:

Abstract:

The characteristics of service independence and flexibility of ATM networks make the control problems of such networks very critical. One of the main challenges in ATM networks is to design traffic control mechanisms that enable both economically efficient use of the network resources and the desired quality of service for higher-layer applications. Window flow control mechanisms of traditional packet-switched networks are not well suited to real-time services at the speeds envisaged for future networks. In this work, the use of the Probability of Congestion (PC) as a bandwidth decision parameter is presented. The validity of using PC is compared with QoS parameters in bufferless environments, where only the cell loss ratio (CLR) is relevant. The convolution algorithm is a good solution for connection admission control (CAC) in ATM networks with small buffers. If the source characteristics are known, the actual CLR can be estimated very well. Furthermore, this estimate is always conservative, allowing the network performance guarantees to be retained. Several experiments have been carried out and investigated to explain the deviation between the proposed method and the simulation. Time parameters for burst length and different buffer sizes have been considered. Experiments confining the limits of the burst length with respect to the buffer size conclude that a minimum buffer size is necessary to achieve adequate cell contention. Note that propagation delay cannot be dismissed for long-distance and interactive communications, so small buffers must be used to minimise delay. Under these premises, the convolution approach is the most accurate method for bandwidth allocation, giving sufficient accuracy in both homogeneous and heterogeneous networks. However, the convolution approach has a considerable computational cost and a high number of accumulated calculations. To overcome these drawbacks, a new evaluation method is analysed: the Enhanced Convolution Approach (ECA). In ECA, traffic is grouped into classes of identical parameters. By using the multinomial distribution function instead of the formula-based convolution, a partial state corresponding to each traffic class is obtained. Finally, the global state probabilities are evaluated by multi-convolution of the partial results. This method avoids accumulated calculations and saves storage, especially in complex scenarios. Sorting is the dominant factor for the formula-based convolution, whereas cost evaluation is the dominant factor for the enhanced convolution. A set of cut-off mechanisms is introduced to reduce the complexity of the ECA evaluation. The ECA also computes the CLR for each class j of traffic (CLRj), and an expression for evaluating CLRj is presented. We conclude that, by combining the ECA method with cut-off mechanisms, using ECA in real-time CAC environments as a single-level scheme is always possible.
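To make the formula-based convolution step concrete, here is a minimal Python sketch, under stated assumptions, of the basic convolution approach for bufferless CAC (not the thesis implementation and not the ECA): the rate distributions of independent on/off sources are convolved and the probability that the aggregate rate exceeds the link capacity is taken as the congestion estimate. Rates are assumed to be discretised into integer bandwidth units.

```python
import numpy as np

def aggregate_rate_distribution(sources):
    """Convolve the rate distributions of independent on/off sources.

    sources -- list of (peak_rate_in_units, prob_active) tuples
    Returns an array p where p[r] = P(aggregate rate == r units).
    """
    dist = np.array([1.0])               # with no sources, rate 0 with probability 1
    for peak, p_on in sources:
        src = np.zeros(peak + 1)
        src[0] = 1.0 - p_on               # source silent
        src[peak] = p_on                  # source emitting at its peak rate
        dist = np.convolve(dist, src)
    return dist

def congestion_probability(sources, link_capacity):
    """Bufferless estimate: probability the aggregate rate exceeds the link capacity."""
    dist = aggregate_rate_distribution(sources)
    return dist[link_capacity + 1:].sum()

# Example: 20 homogeneous sources, peak rate 2 units, 30% activity, link of 10 units
print(congestion_probability([(2, 0.3)] * 20, 10))
```

The ECA described above would replace the source-by-source convolution with a multinomial evaluation per traffic class, convolving only the per-class partial states, which is what saves accumulated calculations in heterogeneous scenarios.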