907 results for Time-invariant Wavelet Analysis


Relevance:

100.00%

Abstract:

The TRMM Microwave Imager (TMI) is reported to be a useful sensor for measuring atmospheric and oceanic parameters even under cloudy conditions. Vertically integrated specific humidity, i.e. Total Precipitable Water (TPW), retrieved from the 22 GHz water vapour absorption channel, together with TMI-derived 10 m wind speed and rain rate, is used to investigate moisture variation over the North Indian Ocean. Intraseasonal Oscillations (ISO) of TPW during the summer monsoon seasons of 1998, 1999 and 2000 over the North Indian Ocean are explored using wavelet analysis. The dominant waves in TPW during the monsoon periods and the differences in ISO between the Arabian Sea and the Bay of Bengal are investigated. The northward propagation of the TPW anomaly and its coherence with coastal rainfall are also studied. For the diagnostic study of heavy rainfall spells over the west coast, the intrusion of TPW over the North Arabian Sea is found to be a useful indicator.
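As a rough illustration of the wavelet technique the abstract refers to (not the authors' actual code), a minimal Morlet continuous-wavelet power estimate for a daily TPW-like series might look like this; the w0 = 6 mother-wavelet parameter and the period-to-scale conversion follow the common Torrence-and-Compo convention, which is an assumption here:

```python
import numpy as np

def morlet_power(x, dt, periods, w0=6.0):
    """Continuous wavelet power of series x at the requested periods,
    using a Morlet mother wavelet (frequency-domain implementation)."""
    n = len(x)
    x = x - x.mean()
    freqs = np.fft.fftfreq(n, d=dt)          # Fourier frequencies
    xhat = np.fft.fft(x)
    # Period T relates to scale s by T = 4*pi*s / (w0 + sqrt(2 + w0^2))
    scales = np.asarray(periods) * (w0 + np.sqrt(2.0 + w0**2)) / (4.0 * np.pi)
    power = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        # Fourier transform of the Morlet wavelet at scale s (analytic: f > 0)
        psi_hat = np.pi**-0.25 * np.exp(-0.5 * (2 * np.pi * s * freqs - w0)**2)
        psi_hat *= (freqs > 0)
        w = np.fft.ifft(xhat * np.conj(psi_hat) * np.sqrt(2 * np.pi * s / dt))
        power[i] = np.abs(w)**2
    return power
```

For a daily series with a 40-day intraseasonal signal, the power at a 40-day period would dominate the power at a 10-day period.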

Relevance:

100.00%

Abstract:

Active microwave imaging is explored as a modality for early detection of breast cancer. When exposed to microwaves, breast tumors exhibit electrical properties significantly different from those of healthy breast tissue. Two approaches to active microwave imaging are addressed here: the confocal microwave technique, which uses measured reflected signals, and microwave tomographic imaging, which uses measured scattered signals. Normal and malignant breast tissue samples from the same patient are studied within 30 minutes of mastectomy. Corn syrup is used as the coupling medium, since its dielectric parameters match those of the normal breast tissue samples well. Because transmitter bandwidth is an important aspect of the time-domain confocal approach, a wideband bowtie antenna with a 2:1 VSWR bandwidth of 46% is designed for the transmission and reception of microwave signals. The same antenna is also used for microwave tomographic imaging, at 3000 MHz. Experimentally obtained time-domain results are substantiated by finite-difference time-domain (FDTD) analysis. 2-D tomographic images are reconstructed from the collected scattered data using the distorted Born iterative method. Variations of dielectric permittivity in the breast samples are distinguishable in the reconstructed permittivity profiles.
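The FDTD analysis mentioned above can be sketched, under heavy simplification, as a 1-D finite-difference time-domain update in normalized units. This toy version (Gaussian soft source, Courant number 0.5, no absorbing boundaries) is an illustration of the method only, not the authors' 2-D imaging code:

```python
import numpy as np

def fdtd_1d(eps_r, n_steps, src_pos=10):
    """Minimal 1-D FDTD in normalized units: propagate a Gaussian pulse
    through a relative-permittivity profile eps_r on a staggered grid."""
    n = len(eps_r)
    ez = np.zeros(n)          # electric field
    hy = np.zeros(n)          # magnetic field
    c = 0.5                   # Courant number (stable for c <= 1 in 1-D)
    for t in range(n_steps):
        hy[:-1] += c * (ez[1:] - ez[:-1])                 # update H from curl E
        ez[1:] += (c / eps_r[1:]) * (hy[1:] - hy[:-1])    # update E from curl H
        ez[src_pos] += np.exp(-((t - 30) / 10.0) ** 2)    # soft Gaussian source
    return ez
```

A higher-permittivity region placed in `eps_r` slows and partially reflects the pulse, which is the physical effect the imaging methods exploit.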

Relevance:

100.00%

Abstract:

The thesis entitled "Studies on improved practices of prawn farming for higher production in central Kerala" describes the various practices prevailing in the study area in order to elucidate their relative merits. A study of semi-intensive farming at Mundapuram, Kannur was also carried out and included in the thesis for comparison. The author felt it important to make a critical study of the existing culture practices in central Kerala, a region where prawn farming has existed since time immemorial. Careful analysis of the accrued data helped identify the strengths, weaknesses, opportunities and threats confronting shrimp farming. As a result it was possible to evolve an appropriate management technology taking into consideration the ecological (location-specific), social and economic conditions prevalent in the vast study area.

Relevance:

100.00%

Abstract:

Reducing fishing pressure in coastal waters is the need of the day in the Indian marine fisheries sector, which is fast changing from a mere vocational activity into a capital-intensive industry. This requires continuous monitoring of resource exploitation through a scientifically acceptable methodology: data on the production of each species stock, the number and characteristics of the fishing gears of the fleet, the biological characteristics of each stock, the impact of fishing on the environment, and fishery-independent information on availability and abundance. Besides this, there are issues relating to capabilities in stock assessment, taxonomy research, biodiversity, conservation and fisheries management. Generation of a reliable database over a fixed time frame, and its analysis and interpretation, are necessary before drawing conclusions on stock size, maximum sustainable yield and maximum economic yield, and before implementing fishing regulatory measures. India, being a signatory to several treaties and conventions, is obliged to assess the exploited stocks and manage them at sustainable levels. The nation is also bound by its obligations of protein food security to its people and livelihood security to those engaged in marine-fishing-related activities, and there are regional variabilities in fishing technology and fishery resources. All this makes it mandatory for India to continue and strengthen its marine capture fisheries research in general and deep-sea fisheries research in particular. Against this background, an attempt is made to strengthen knowledge of deep-sea fish biodiversity, to generate data on the distribution, abundance and catch per unit effort of the fishery resources available beyond 200 m in the EEZ off the southwest coast of India, and to unravel aspects of the life-history traits of potentially important non-conventional fish species inhabiting depths beyond 200 m.
This study was carried out as part of the Project on Stock Assessment and Biology of Deep Sea Fishes of Indian EEZ (MoES, Govt. of India).
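The maximum sustainable yield mentioned above can be made concrete with the classic Schaefer surplus-production model, in which surplus production r·B·(1 − B/K) peaks at a biomass of K/2, giving MSY = rK/4. The r and K values below are purely hypothetical, for illustration only:

```python
def schaefer_msy(r, K):
    """Maximum sustainable yield of the Schaefer surplus-production model
    dB/dt = r*B*(1 - B/K) - catch.  Surplus production is maximal at
    B = K/2, where it equals r*K/4."""
    return r * K / 4.0, K / 2.0

# Hypothetical intrinsic growth rate r and carrying capacity K (tonnes)
msy, b_msy = schaefer_msy(r=0.8, K=10000.0)
```

With these made-up parameters the sustainable harvest would be 2000 tonnes per year at a standing stock of 5000 tonnes.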

Relevance:

100.00%

Abstract:

The marine atmospheric boundary layer (MABL) plays a vital role in the transport of momentum and heat from the ocean surface into the atmosphere. A detailed study of MABL characteristics was carried out using high-resolution surface-wind data measured by the QuikSCAT (Quick Scatterometer) satellite. Spatial variations in surface wind, frictional velocity, roughness parameter and drag coefficient for the different seasons were studied. The surface wind was strong during the southwest monsoon season due to the modulation induced by the Low Level Jetstream. The drag coefficient was larger during this season, due to the strong winds, and was lower during the winter months. The spatial variation in frictional velocity over the seas was small during the post-monsoon season (~0.2 m s⁻¹). The maximum spatial variation in frictional velocity was found over the south Arabian Sea (0.3 to 0.5 m s⁻¹) during the southwest monsoon period, followed by the pre-monsoon period over the Bay of Bengal (0.1 to 0.25 m s⁻¹). The mean wind-stress curl during winter was positive over the equatorial region, with a maximum value of 1.5×10⁻⁷ N m⁻³, while on either side of the equatorial belt a negative wind-stress curl dominated. Area averages of the frictional velocity and drag coefficient over the Arabian Sea and the Bay of Bengal were also studied. The frictional velocity shows variability similar to the intraseasonal oscillation (ISO), and this was confirmed via wavelet analysis. In the case of the drag coefficient, the prominent oscillations were the ISO and the quasi-biweekly mode (QBM). The interrelationship of the drag coefficient and the frictional velocity with wind speed in both the Arabian Sea and the Bay of Bengal was also studied.
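The bulk-formula relationship between wind speed, drag coefficient and frictional velocity that underlies such scatterometer studies can be sketched as follows. The Large and Pond (1981) neutral drag coefficient is one common parameterization, used here as an assumption rather than as the formula the study itself used:

```python
import numpy as np

def drag_coefficient(u10):
    """Neutral-stability 10-m drag coefficient, Large & Pond (1981):
    Cd = 1.2e-3 for U10 < 11 m/s, (0.49 + 0.065*U10)*1e-3 above."""
    u10 = np.asarray(u10, dtype=float)
    return np.where(u10 < 11.0, 1.2e-3, (0.49 + 0.065 * u10) * 1e-3)

def friction_velocity(u10):
    """u* = sqrt(Cd) * U10, from the bulk stress tau = rho * Cd * U10**2,
    i.e. tau = rho * u***2."""
    return np.sqrt(drag_coefficient(u10)) * np.asarray(u10, dtype=float)
```

A monsoon-strength wind of 8 m/s gives u* of roughly 0.28 m/s, consistent in magnitude with the 0.3 to 0.5 m/s range quoted above for the south Arabian Sea.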

Relevance:

100.00%

Abstract:

The basic concepts of digital signal processing are taught to students in engineering and science, with the focus on linear, time-invariant systems. The question of what happens when the system is governed by a quadratic or cubic equation remains unanswered in the vast majority of the signal processing literature. Light was shed on this problem when John V. Mathews and Giovanni L. Sicuranza published the book Polynomial Signal Processing, which opened up an unseen vista of polynomial systems for signal and image processing. The book presents the theory and implementation of both adaptive and non-adaptive FIR and IIR quadratic systems, which offer improved performance over conventional linear systems. The theory of quadratic systems is a largely unexplored area of research, and one that is computationally intensive. Once the area of research was selected, the next issue was the choice of software tool. Conventional languages like C and C++ were eliminated because they are not interpreted and lack good plotting libraries. MATLAB proved to be very slow, as did SCILAB and Octave. The search for a scientific computing language as fast as C but with a good plotting library ended with Python, a distant relative of LISP, which proved ideal for scientific computing. An account of the use of Python, its scientific computing package SciPy and the plotting library pylab is given in the appendix. Initially, the work focused on designing predictors that exploit the polynomial nonlinearities inherent in speech generation mechanisms. The work then moved into medical image processing, which offered more potential for quadratic methods. The major focus in this area is on quadratic edge-detection methods for retinal images and fingerprints, as well as on de-noising raw MRI signals.
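A minimal example of the quadratic (second-order Volterra) FIR systems the thesis builds on, written in Python with NumPy since Python is the thesis' own tool chain; the filter kernels here are arbitrary illustrations, not any kernel from the thesis:

```python
import numpy as np

def quadratic_fir(x, h1, h2):
    """Second-order Volterra FIR filter with memory N = len(h1):
    y[n] = sum_i h1[i]*x[n-i] + sum_{i,j} h2[i,j]*x[n-i]*x[n-j].
    The h2 term is what distinguishes it from an ordinary linear FIR."""
    N = len(h1)
    y = np.zeros(len(x))
    xp = np.concatenate([np.zeros(N - 1), x])   # zero pre-padding
    for n in range(len(x)):
        v = xp[n:n + N][::-1]                   # [x[n], x[n-1], ..., x[n-N+1]]
        y[n] = h1 @ v + v @ h2 @ v              # linear + quadratic kernels
    return y
```

Setting h1 to zero and h2 to a single diagonal tap reduces the filter to a memoryless squarer, a handy sanity check on the indexing.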

Relevance:

100.00%

Abstract:

The study of variable stars is an important topic in modern astrophysics. With the advent of powerful telescopes and high-resolution CCDs, variable star data are accumulating on the order of petabytes. This huge amount of data needs automated methods as well as human experts. This thesis is devoted to data analysis of variable stars' astronomical time series and hence belongs to the interdisciplinary field of Astrostatistics. For an observer on Earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various mechanisms. In some cases the variation is due to internal thermonuclear processes; such stars are generally known as intrinsic variables. In other cases it is due to external processes, like eclipses or rotation; these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospheric variables. Pulsating variables are further classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena; most other variations are periodic in nature. Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as a light curve. If the time series is folded on a period, the plot of apparent magnitude against phase is known as a phased light curve, whose unique shape is a characteristic of each type of variable star.

One way to identify the type of a variable star and to classify it is for an expert to visually inspect the phased light curve. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into stages: observation, data reduction, data analysis, modeling and classification. Modeling helps determine short-term and long-term behaviour, construct theoretical models (e.g. the Wilson-Devinney model for eclipsing binaries) and derive stellar properties like mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires determining basic parameters like period, amplitude and phase, as well as some derived parameters. Of these, period is the most important, since a wrong period leads to sparse light curves and misleading information. Time series analysis applies mathematical and statistical tests to data in order to quantify the variation, understand the nature of the time-varying phenomenon, gain physical understanding of the system and predict its future behavior. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of large gaps. This is due to daily varying daylight and weather conditions for ground-based observations; observations from space may suffer from the impact of cosmic-ray particles. Many large-scale astronomical surveys, such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS, provide variable star time series data, even though their primary intention is not variable star observation.

The Center for Astrostatistics, Pennsylvania State University, was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis. Many period-search algorithms exist for astronomical time series analysis; they can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, such as the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and Significant Spectrum (SigSpec) by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them fully recovers the true periods. Wrong period detection can have several causes: power leakage to other frequencies, due to the finite total interval, finite sampling interval and finite amount of data; aliasing, due to the influence of regular sampling; spurious periods, due to long gaps; and power flow to harmonic frequencies, an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data is still a difficult problem for huge databases subjected to automation. As Matthew Templeton, AAVSO, states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state, "The processing of huge amounts of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification".

It will benefit the variable star astronomical community if basic parameters such as period, amplitude and phase can be obtained more accurately when huge time series databases are subjected to automation. In the present thesis, the theories of four popular period-search methods are studied, their strengths and weaknesses are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry in the "General Catalogue of Variable Stars" or other databases like the "Variable Star Index", the characteristics of the variability have to be quantified in terms of variable star parameters.
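Of the period-search methods named above, Phase Dispersion Minimisation (Stellingwerf 1978) is the easiest to sketch: fold the data on a trial period and prefer the period that minimizes the within-bin variance of the folded light curve relative to the overall variance. A compact illustration, in which the bin count and the trial-period grid are arbitrary choices of this sketch:

```python
import numpy as np

def pdm_theta(t, m, period, n_bins=10):
    """PDM statistic: mean within-bin variance of the phase-folded data
    divided by the overall variance. Smaller theta => better period."""
    phase = (t / period) % 1.0
    overall = np.var(m, ddof=1)
    s2, ndof = 0.0, 0
    for b in range(n_bins):
        sel = (phase >= b / n_bins) & (phase < (b + 1) / n_bins)
        if sel.sum() > 1:
            s2 += np.var(m[sel], ddof=1) * (sel.sum() - 1)
            ndof += sel.sum() - 1
    return (s2 / ndof) / overall

def pdm_best_period(t, m, trial_periods):
    """Return the trial period with the smallest PDM statistic."""
    thetas = [pdm_theta(t, m, p) for p in trial_periods]
    return trial_periods[int(np.argmin(thetas))]
```

Because PDM only bins the folded magnitudes, it makes no Gaussian or sinusoidal assumption, which is why it is classed among the non-parametric methods and copes with the uneven sampling described above.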

Relevance:

100.00%

Abstract:

We present a theory which permits, for the first time, a detailed analysis of the dependence of the absorption spectrum on atomic structure and cluster size. We thereby determine the development of the collective excitations in small clusters and show that their broadening depends sensitively on the atomic structure, in particular at the surface. Results for Hg_n^+ clusters show that the plasmon energy is close to its jellium value in the case of spherical-like structures, but lies in general between ω_p/√3 and ω_p/√2 for compact clusters. A particular success of our theory is the identification of the excitations contributing to the absorption peaks.

Relevance:

100.00%

Abstract:

This paper analyzes a proposed release control methodology, WIPLOAD Control (WIPLCtrl), using a transfer line case modeled as a Markov process. The performance of WIPLCtrl is compared with that of CONWIP under 13 system configurations in terms of throughput, average inventory level and average cycle time. As a supplement to the analytical model, a simulation model of the transfer line is used to observe the performance of the release control methodologies with respect to the standard deviation of cycle time. From the analysis, we identify the system configurations in which the advantages of WIPLCtrl can be observed.
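The CONWIP policy used as the benchmark above can be illustrated with a simple recursion on departure times for a serial line: under CONWIP with cap W, job i is released to the first station only when job i−W leaves the line. This sketch, with caller-supplied service times, is an illustration of the release rule only, not the paper's Markov model:

```python
import numpy as np

def conwip_departures(service, wip_cap):
    """Departure times D[i, k] of job i from station k in a serial line
    under CONWIP release: job i may enter station 0 only after job
    i - wip_cap has left the last station. service[i, k] is the service
    time of job i at station k."""
    n_jobs, n_stations = service.shape
    D = np.zeros((n_jobs, n_stations))
    for i in range(n_jobs):
        release = D[i - wip_cap, -1] if i >= wip_cap else 0.0
        for k in range(n_stations):
            prev_job = D[i - 1, k] if i > 0 else 0.0    # station frees up
            prev_stn = D[i, k - 1] if k > 0 else release  # job arrives
            D[i, k] = max(prev_job, prev_stn) + service[i, k]
    return D
```

Throughput is then n_jobs / D[-1, -1]; raising the WIP cap can only keep or increase throughput, at the price of higher inventory, which is exactly the trade-off such release-control comparisons study.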

Relevance:

100.00%

Abstract:

This paper considers a connection between the deterministic and noisy behavior of nonlinear networks. Specifically, a particular bridge circuit is examined which has two possibly nonlinear energy storage elements. By proper choice of the constitutive relations for the network elements, the deterministic terminal behavior reduces to that of a single linear resistor. This reduction of the deterministic terminal behavior, in which a natural frequency of a linear circuit does not appear in the driving-point impedance, has been shown in classical circuit theory books (e.g. [1, 2]). The paper shows that, in addition to the reduction of the deterministic behavior, the thermal noise at the terminals of the network, arising from the usual Nyquist-Johnson noise model associated with each resistor in the network, is also exactly that of a single linear resistor. While this result for the linear time-invariant (LTI) case is a direct consequence of a well-known result for RLC circuits, the nonlinear result is novel. We show that the terminal noise current is precisely that predicted by the Nyquist-Johnson model for R if the driving voltage is zero or constant, but not if the driving voltage is time-dependent or the inductor and capacitor are time-varying.
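For reference, the Nyquist-Johnson noise model invoked above assigns to every resistor $R$ at absolute temperature $T$ a white thermal noise source, conventionally written as a one-sided power spectral density:

```latex
S_v(f) = 4 k_B T R \quad \text{(series voltage source)}, \qquad
S_i(f) = \frac{4 k_B T}{R} \quad \text{(equivalent parallel current source)},
```

where $k_B$ is Boltzmann's constant. The paper's claim is that the bridge's terminal noise matches the $S_i$ of a single linear resistor under the stated driving conditions.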

Relevance:

100.00%

Abstract:

The figure of the peasant, first in Europe and later throughout the world, has generally been tied to stereotypes of marginality, poverty and backwardness. These representations have gradually taken hold in our society and seem unchangeable over time. What makes their analysis interesting is that in most cases the representations do not correspond to reality; they are contradictory and arise from unobjective readings constructed by a few.

Relevance:

100.00%

Abstract:

Piecewise linear systems arise as mathematical models in many practical applications, often from the linearization of nonlinear systems. There are two main approaches to dealing with these systems, according to their continuous-time or discrete-time aspects. We propose an approach based on state transformation, more particularly on partitioning the phase portrait into regions, where each subregion is modeled as a two-dimensional linear time-invariant system. The Takagi-Sugeno model, a combination of the local models, is then calculated. Simulation results show that the Alpha partition is well suited for dealing with such systems.
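A Takagi-Sugeno model of the kind described blends local linear time-invariant models with normalized membership weights. A minimal discrete-time sketch follows; the Gaussian memberships on the first state variable are an arbitrary choice of this illustration, not the abstract's Alpha partition:

```python
import numpy as np

def ts_step(x, u, models, centers, width=1.0):
    """One step of a Takagi-Sugeno blend of local linear models:
    x_next = sum_j w_j(x) * (A_j @ x + B_j @ u),
    with Gaussian memberships w_j centered on `centers` along x[0].
    `models` is a list of (A_j, B_j) pairs, one per region."""
    w = np.array([np.exp(-((x[0] - c) / width) ** 2) for c in centers])
    w = w / w.sum()                       # normalized memberships
    x_next = np.zeros_like(x)
    for wj, (A, B) in zip(w, models):
        x_next = x_next + wj * (A @ x + B @ u)
    return x_next
```

When the state sits deep inside one region, its membership dominates and the blend reduces to that region's LTI model, which is the sense in which the global model interpolates the local ones.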

Relevance:

100.00%

Abstract:

The paper analyzes innovative psychiatric practices that took place in Argentina during the sixties and seventies at the Hospital Jose Esteves in the province of Buenos Aires. Objective: to present the coexistence of different paradigms related to mental health in the same institution and to analyze the complexities generated by this scenario. Methodology: this study uses primary sources in the form of medical records of patients admitted to the hospital between 1960 and 1979, cross-referenced with newspaper and magazine publications of the time. Results: the analysis shows that the political environment during the era of military dictatorship, characterized by ideological persecution and the inhibition of political expression, influenced the development of innovative psychiatric practices. At the same time, instances of anti-Semitism and ideological persecution among health workers affected therapeutic approaches. Conclusions: while the introduction of innovative practices in mental health met with some resistance among the more orthodox psychiatrists, the presence of different paradigms reveals a plan, both political and professional, to transform psychiatry and admission policy in Argentina.

Relevance:

100.00%

Abstract:

Malignant gliomas are among the most aggressive tumors of the central nervous system (CNS). According to the World Health Organization (WHO) classification of brain tumors, astrocytomas are categorized into four grades, determined by the underlying pathology. Malignant (high-grade) gliomas thus include anaplastic glioma (grade III) and glioblastoma multiforme (GBM, grade IV), the latter being the most aggressive with the worst prognosis (1). Therapeutic management of CNS tumors is based on surgery, radiotherapy and chemotherapy, depending on tumor characteristics, clinical stage and age (2),(3); however, none of the standard treatments is completely safe and compatible with an acceptable quality of life (3),(4). In general, chemotherapy is the first option in disseminated tumors, such as invasive glioblastoma and high-risk or multiply metastatic medulloblastoma, but the prognosis in these patients is very poor (2),(3). Only new targeted therapies (2), such as anti-angiogenic therapies (4) or gene therapies, show a real benefit in limited groups of patients with known specific molecular defects (4). The development of new pharmacological therapies against brain tumors is therefore necessary. Malignant gliomas are frequently chemoresistant, and this resistance appears to depend on at least two mechanisms. First, many anticancer drugs penetrate poorly across the blood-brain barrier (BBB), the blood-cerebrospinal fluid barrier (BCSFB) and the blood-tumor barrier (BTB).

This resistance is due to the interaction of the drug with several ABC (ATP-binding cassette) drug-efflux transporters, or pumps, that are overexpressed in the endothelial or epithelial cells of these barriers. Second, ABC efflux transporters in the tumor cells themselves confer a phenotype known as multidrug resistance (MDR), which is characteristic of several solid tumors. This phenotype is also present in CNS tumors, and its role in gliomas is under investigation (5). Drug delivery across the BBB is therefore one of the vital problems in targeted therapy. Recent studies have shown that some small molecules used in these therapies are substrates of P-glycoprotein (Pgp), as well as of other efflux pumps such as the multidrug resistance-related proteins (MRPs) or the breast cancer resistance protein (BCRP), which prevent drugs of this type from reaching the tumor (1). One substrate of Pgp and BCRP is DOXOrubicin (DOXO), an anticancer drug that is very effective against brain tumor cells in vitro but whose clinical use is limited by poor delivery across the BBB and by intrinsic tumor resistance. On the other hand, BBB cells and brain tumor cells also carry surface proteins, such as the low-density lipoprotein receptor (LDLR), that could be used as a therapeutic target in the BBB and in brain tumors.

The importance of this study thus lies in generating therapeutic strategies that promote the passage of drugs across the blood-brain and tumor barriers, while also identifying cellular mechanisms that increase the expression of ABC transporters, so that these can be used as therapeutic targets. This study showed that a new "Trojan horse" strategy, in which DOXOrubicin is enclosed within a liposome, shields the drug from recognition by the ABC transporters of both the BBB and the tumor cells. The liposome construction exploited the cells' LDLR receptor, ensuring entry across the BBB and into the tumor cells by endocytosis. This mechanism was combined with the use of statins (anticholesterol drugs), which increased LDLR expression and decreased the activity of the ABC transporters through their nitration, increasing the efficiency of our Trojan horse. We therefore showed that this new strategy, or formulation, termed ApolipoDOXO, plus the use of statins favors drug delivery across the BBB, overcoming tumor resistance and reducing the dose-dependent side effects of DOXOrubicin. Moreover, this "Trojan horse" strategy is a new therapeutic approach that may increase the efficacy of different drugs in several brain tumors, and it guarantees high efficiency even in a hypoxic environment, characteristic of cancer cells, where Pgp expression was found to be increased.

Considering the relationship between certain signaling pathways known to modulate Pgp activity, this study presents not only the Trojan horse strategy but also another therapeutic proposal, based on temozolomide plus DOXOrubicin. This strategy showed that temozolomide penetrates the BBB because it acts on the Wnt/GSK3/β-catenin signaling pathway, which modulates Pgp expression. TMZ was shown to decrease Wnt3 protein and mRNA, supporting the hypothesis that, by decreasing transcription of the Wnt3 gene in BBB cells, the drug increases activation of the pathway, phosphorylating β-catenin, reducing nuclear β-catenin and hence its binding to the mdr1 gene promoter. Based on these results, the study identified three basic mechanisms related to ABC transporter expression and associated with the strategies employed. The first was the use of statins, which led to nitration of the transporters and decreased their activity via the NFκB transcription factor pathway. The second was the use of temozolomide, which methylates the Wnt3 gene, reducing the activity of the β-catenin signaling pathway and thereby decreasing Pgp expression. The third was the determination of the role of the RhoA/RhoA-kinase axis as a modulator of the (non-canonical) GSK3/β-catenin pathway: RhoA kinase was shown to promote activation of the protein PTB1, which, by phosphorylating GSK3, induced phosphorylation of β-catenin, leading to its destruction by the proteasome, preventing its binding to the mdr1 promoter and thus reducing Pgp expression.

In conclusion, the strategies proposed in this work increased tumor cell cytotoxicity by increasing the permeability not only of the blood-brain barrier but also of the tumor barrier itself. The "Trojan horse" strategy could likewise be useful for treating other diseases of the central nervous system. Furthermore, these studies indicate that identifying mechanisms associated with ABC transporter expression could be a key tool in the development of new anticancer therapies.

Relevance:

100.00%

Abstract:

New network technologies allow us to transport ever larger volumes of information and network traffic with different priority levels. In this scenario, where a better quality of service is offered, the consequences of a link or node failure become more important. Multiprotocol Label Switching (MPLS), together with its extension to Generalized MPLS (GMPLS), provides fast failure-recovery mechanisms by establishing redundant Label Switched Paths (LSPs) to be used as alternative paths; in case of failure, traffic can be redirected onto these paths. The main objective of this thesis has been to improve some of the current MPLS/GMPLS failure-recovery mechanisms, in order to support the protection requirements of the services provided by the new Internet. This evaluation takes into account protection-quality parameters such as failure recovery time, packet loss and resource consumption. We present a complete review and comparison of the main MPLS-based failure-recovery methods. The analysis covers path-protection methods (global backups, reverse backups and 1+1 protection), local protection methods and segment protection methods. The extension of these mechanisms to optical networks through the GMPLS control plane is also considered. In a first phase of this work, each failure-recovery method is analyzed without considering resource or topology constraints. This analysis yields a first ranking of the best protection mechanisms in terms of packet loss and recovery time, but it is not applicable to real networks.

To take this more realistic scenario into account, in a second phase we analyze routing algorithms that do consider these network limitations and constraints. Some of the main quality-of-service routing algorithms, and some of the main routing proposals for MPLS networks, are presented. Most current routing algorithms do not take the establishment of alternative routes into account, or they use the same objectives to select the working and protection paths. To improve the level of protection we introduce and formalize two new concepts: the network failure probability and the failure impact. An analysis of the network at the physical level provides a first element for evaluating the protection level in terms of network reliability and availability. We formalize the impact of a failure as the degradation of quality of service (in terms of delay and packet loss), and we explain our proposal for reducing both the failure probability and the failure impact. Finally, we give a new definition and classification of network services according to their required failure-probability and impact values. One highlight of the results of this thesis is that global path-protection mechanisms maximize network reliability, while local or segment protection techniques minimize the failure impact. Minimum impact and maximum reliability can therefore be achieved by applying local protection to the whole network, but this approach does not scale in terms of resource consumption. We propose an intermediate mechanism: segment protection combined with our failure-probability evaluation model. In summary, this thesis presents several mechanisms for analyzing the protection level of a network; the proposed models and mechanisms improve reliability and minimize the impact of a failure in the network.
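The failure-probability concept introduced above can be illustrated with elementary availability arithmetic: an LSP is up only if every link on it is up, and a working path with a link-disjoint backup suffers an outage only if both paths are down. The link availabilities below are hypothetical, and independence of link failures is an assumption of this sketch:

```python
import math

def path_unavailability(link_avail):
    """Unavailability of a path that needs every link up:
    A_path = product of link availabilities; U = 1 - A_path.
    Assumes independent link failures."""
    return 1.0 - math.prod(link_avail)

def protected_unavailability(working, backup):
    """Working path plus a link-disjoint backup: an outage requires
    both paths to be down simultaneously."""
    return path_unavailability(working) * path_unavailability(backup)
```

With two 99%-available links, the unprotected LSP is down about 2% of the time, while adding a single-link 99% backup pushes the outage probability down by two orders of magnitude, which is the kind of reliability gain the path-protection comparison above quantifies.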