926 results for Ammassi, Galassie, emissioni, non termiche, cluster, relitti, radio


Relevance: 100.00%

Publisher:

Abstract:

A better understanding of the limiting step in a first-order phase transition, the nucleation process, is of major importance to a variety of scientific fields ranging from atmospheric sciences to nanotechnology and even to cosmology. This is because in most phase transitions the new phase is separated from the mother phase by a free energy barrier, which is crossed in a process called nucleation. Nowadays a significant fraction of all atmospheric particles is considered to be produced by vapor-to-liquid nucleation. In atmospheric sciences, as in other scientific fields, the theoretical treatment of nucleation is mostly based on Classical Nucleation Theory. However, Classical Nucleation Theory is known to have only limited success in predicting the rate at which vapor-to-liquid nucleation takes place under given conditions. This thesis studies unary homogeneous vapor-to-liquid nucleation from a statistical mechanics viewpoint. We apply Monte Carlo simulations of molecular clusters to calculate the free energy barrier separating the vapor and liquid phases, and compare our results against laboratory measurements and Classical Nucleation Theory predictions. According to our results, the work of adding a monomer to a cluster in equilibrium vapor is accurately described by the liquid drop model applied by Classical Nucleation Theory once the clusters are larger than some threshold size. The threshold cluster sizes contain from a few to a few tens of molecules, depending on the interaction potential and temperature. However, the error made in modeling the smallest clusters as liquid drops results in an erroneous absolute value for the cluster work of formation throughout the size range, as predicted by the McGraw-Laaksonen scaling law.
By calculating correction factors to Classical Nucleation Theory predictions for the nucleation barriers of argon and water, we show that the corrected predictions produce nucleation rates in good agreement with experiments. For the smallest clusters, the deviation between the simulation results and the liquid drop values is accurately modeled by the low-order virial coefficients at modest temperatures and vapor densities, in other words, within the validity range of the non-interacting cluster theory of Frenkel, Band, and Bijl. Our results do not indicate a need for a size-dependent replacement free energy correction. The results also indicate that Classical Nucleation Theory predicts the size of the critical cluster correctly. We also present a new method for calculating the equilibrium vapor density, the size dependence of the surface tension, and the planar surface tension directly from cluster simulations. We also show that the size dependence of the cluster surface tension at the equimolar surface is a function of the virial coefficients, a result confirmed by our cluster simulations.
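The liquid drop model discussed above can be made concrete. In the dimensionless form commonly used with Classical Nucleation Theory, the work of forming an n-molecule cluster is W(n)/kT = -n ln S + θ n^(2/3), where S is the supersaturation and θ a reduced surface energy. A minimal sketch, with illustrative parameter values rather than the thesis's fitted ones:

```python
import math

def cnt_work_of_formation(n, ln_S, theta):
    """Liquid drop model: W(n)/kT = -n ln S + theta * n**(2/3).
    theta is a reduced surface energy, S the supersaturation ratio."""
    return -n * ln_S + theta * n ** (2.0 / 3.0)

def critical_cluster(ln_S, theta):
    """Size maximizing W: dW/dn = 0  =>  n* = (2 theta / (3 ln S))**3."""
    n_star = (2.0 * theta / (3.0 * ln_S)) ** 3
    return n_star, cnt_work_of_formation(n_star, ln_S, theta)

# Illustrative numbers (not fitted to the argon or water data):
n_star, barrier = critical_cluster(ln_S=math.log(3.0), theta=10.0)
```

At the critical size the barrier reduces analytically to θ n*^(2/3)/3, a convenient consistency check on any implementation.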


The methods of secondary wood processing are assumed to evolve over time and to affect the requirements set for the wood material and its suppliers. The study aimed to analyse the industrial operating modes applied by joinery and furniture manufacturers as sawnwood users. Industrial operating mode was defined as a pattern of important decisions and actions taken by a company, describing the company's level of adjustment in the late-industrial transition. A non-probabilistic sample of 127 companies was interviewed, including companies from Denmark, Germany, the Netherlands, and Finland. Fifty-two of the firms were furniture manufacturers and the other 75 produced windows and doors. Variables related to business philosophy, production operations, and supplier choice criteria were measured and used as the basis for a customer typology; variables related to wood usage and perceived sawmill performance were measured and used to profile the customer types. Factor analysis was used to determine the latent dimensions of industrial operating mode. Canonical correlation analysis was applied in developing the final basis for classifying the observations. Non-hierarchical cluster analysis was employed to build a five-group typology of secondary wood processing firms, ranging from traditional mass producers to late-industrial flexible manufacturers. There is a clear connection between the number of late-industrial elements in a company and the share of special and customised sawnwood it uses. Joinery or furniture manufacturers that are more late-industrial are also likely to use more component-type wood material and to appreciate customer-oriented technical precision. The results show that the change is towards the use of late-industrial sawnwood materials and late-industrial supplier relationships.
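The non-hierarchical clustering step described above is typically a k-means-style procedure run on factor or canonical scores. A minimal sketch on hypothetical two-dimensional "operating mode" scores (the real study used 127 firms and five clusters; the data and starting centroids here are illustrative):

```python
def kmeans(points, centroids, iterations=20):
    """Plain k-means on 2-D points with fixed initial centroids."""
    groups = [[] for _ in centroids]
    for _ in range(iterations):
        # Assign each point to its nearest centroid (squared distance).
        groups = [[] for _ in centroids]
        for p in points:
            d = [(p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centroids]
            groups[d.index(min(d))].append(p)
        # Recompute each centroid as the mean of its group.
        centroids = [
            (sum(p[0] for p in g) / len(g), sum(p[1] for p in g) / len(g)) if g else c
            for g, c in zip(groups, centroids)
        ]
    return centroids, groups

# Two hypothetical scores per firm (e.g. "flexibility" and "customisation"):
firms = [(0.1, 0.2), (0.0, 0.1), (0.9, 1.0), (1.0, 0.8), (0.5, 0.5)]
centers, assignment = kmeans(firms, centroids=[(0.0, 0.0), (1.0, 1.0)])
```

With real survey data one would first standardize the variables and derive the scores via factor and canonical correlation analysis, as the study does.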


I. The 3.7 Å Crystal Structure of Horse Heart Ferricytochrome C.

The crystal structure of horse heart ferricytochrome c has been determined to a resolution of 3.7 Å using the multiple isomorphous replacement technique. Two isomorphous derivatives were used in the analysis, leading to a map with a mean figure of merit of 0.458. The quality of the resulting map was extremely high, even though the derivative data did not appear to be of high quality.

Although it was impossible to fit the known amino acid sequence to the calculated structure in an unambiguous way, many important features of the molecule could still be determined from the 3.7 Å electron density map. Among these was the fact that cytochrome c contains little or no α-helix. The polypeptide chain appears to be wound about the heme group in such a way as to form a loosely packed hydrophobic core in the molecule.

The heme group is located in a cleft on the molecule with one edge exposed to the solvent. The fifth coordinating ligand is His 18 and the sixth coordinating ligand is probably neither His 26 nor His 33.

The high resolution analysis of cytochrome c is now in progress and should be completed within the next year.

II. The Application of the Karle-Hauptman Tangent Formula to Protein Phasing.

The Karle-Hauptman tangent formula has been shown to be applicable to the refinement of previously determined protein phases. Tests were made with both the cytochrome c data from Part I and a theoretical structure based on the myoglobin molecule. The refinement process was found to be highly dependent upon the manner in which the tangent formula was applied. Iterative procedures did not work well, at least at low resolution.
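For reference, the tangent formula applied here has the standard form, where the E are normalized structure factors and the φ their phases:

```latex
\tan\varphi_{\mathbf{h}} \approx
\frac{\sum_{\mathbf{k}} \lvert E_{\mathbf{k}} E_{\mathbf{h}-\mathbf{k}} \rvert
      \,\sin\!\left(\varphi_{\mathbf{k}} + \varphi_{\mathbf{h}-\mathbf{k}}\right)}
     {\sum_{\mathbf{k}} \lvert E_{\mathbf{k}} E_{\mathbf{h}-\mathbf{k}} \rvert
      \,\cos\!\left(\varphi_{\mathbf{k}} + \varphi_{\mathbf{h}-\mathbf{k}}\right)}
```

Each new phase estimate for reflection h is thus a weighted average over the phase sums of contributing reflection pairs, which is why the manner of application (weighting, iteration) matters so much at low resolution.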

The tangent formula worked very well in selecting the true phase from the two possible phase choices resulting from a single isomorphous replacement phase analysis. The only restriction on this application is that the heavy atoms form a non-centric cluster in the unit cell.

Pages 156 through 284 in this Thesis consist of previously published papers relating to the above two sections. References to these papers can be found on page 155.


This study evaluated the tuberculosis situation in Brazil from 2001 to 2003, according to indicators of the implementation of the National Tuberculosis Control Program (PNCT), and estimated the effects of determinants of the disease's incidence rate. For the evaluation, non-hierarchical cluster analysis was used to group Brazilian municipalities according to tuberculosis (TB) and AIDS morbidity and PNCT performance. These clusters were mapped, comparing their distribution across municipalities, metropolitan regions, and priority municipalities, and by population size. Pearson's chi-square test was used to test associations among the categories. Longitudinal multilevel modeling was used to identify and estimate the effects of the determinants of the disease, with years, municipalities, and metropolitan regions as the levels, in a random intercept and slope model. Variables capable of reducing the variance at each level were retained, as they explain the hierarchical variability of the disease; income, population density, cure rate, AIDS incidence rate, and the major Brazilian regions were included. The evaluation showed that the most worrying epidemiological situation occurred in municipalities with Low TB and High AIDS, and with High TB and AIDS. The Very Low TB and AIDS cluster concentrated 50% of the municipalities, which may indicate notification problems. Six program-performance clusters were found: Good and Good with low DOTS predominated in small, non-priority municipalities outside metropolitan regions; Moderate had a higher proportion of priority municipalities; Regular and Weak concentrated 10% of the municipalities, with high treatment abandonment and very low cure rates; and Very Weak was characterized by missing data in the performance indicators.
The multilevel model identified AIDS as a factor with an impact on tuberculosis not previously found in other studies, along with an interaction between income and AIDS, and an important contribution of metropolitan regions to the distribution of tuberculosis, which manifests heterogeneously across the country's major regions. The analysis discriminated among municipalities and showed no association between higher morbidity and better PNCT performance, reflecting a mismatch between surveillance and the epidemiological reality of Brazil. The program needs to be strengthened so as to take AIDS into account when establishing its control strategies. Moreover, the population's low income and the population density, already analyzed in several studies, also appeared prominently in these results.
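A three-level random intercept-and-slope model of the kind described (years t nested in municipalities i nested in metropolitan regions j) can be written, in illustrative notation of our own rather than the thesis's, as:

```latex
y_{tij} = \beta_0 + \boldsymbol{\beta}^{\top}\mathbf{x}_{tij}
        + u_{0ij} + u_{1ij}\,t + v_{0j} + \varepsilon_{tij},
\qquad
(u_{0ij}, u_{1ij}) \sim \mathcal{N}(\mathbf{0}, \Sigma_u),\quad
v_{0j} \sim \mathcal{N}(0, \sigma_v^2),\quad
\varepsilon_{tij} \sim \mathcal{N}(0, \sigma^2)
```

Here y is the TB incidence rate and x collects the retained covariates (income, population density, cure rate, AIDS incidence, region); the variance components quantify how much of the variability the municipal and metropolitan levels explain.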


We present the Pan-STARRS1 discovery of the long-lived and blue transient PS1-11af, which was also detected by the Galaxy Evolution Explorer with coordinated observations in the near-ultraviolet (NUV) band. PS1-11af is associated with the nucleus of an early-type galaxy at redshift z = 0.4046 that exhibits no evidence for star formation or active galactic nucleus activity. Four epochs of spectroscopy reveal a pair of transient broad absorption features in the UV on otherwise featureless spectra. Despite the superficial similarity of these features to P-Cygni absorptions of supernovae (SNe), we conclude that PS1-11af is not consistent with the properties of known types of SNe. Blackbody fits to the spectral energy distribution are inconsistent with the cooling, expanding ejecta of a SN, and the velocities of the absorption features are too high to represent material in homologous expansion near a SN photosphere. However, the constant blue colors and slow evolution of the luminosity are similar to previous optically selected tidal disruption events (TDEs). The shape of the optical light curve is consistent with models for TDEs, but the minimum accreted mass necessary to power the observed luminosity is only 0.002 M☉, which points to a partial disruption model. A full disruption model predicts higher bolometric luminosities, which would require most of the radiation to be emitted in a separate component at high energies where we lack observations. In addition, the observed temperature is lower than that predicted by pure accretion disk models for TDEs and requires reprocessing to a constant, lower temperature. Three deep non-detections in the radio with the Very Large Array over the first two years after the event set strict limits on the production of any relativistic outflow comparable to Swift J1644+57, even if off-axis.
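The accreted-mass argument rests on simple energetics: the total radiated energy cannot exceed E = η M c² for accreted mass M and radiative efficiency η. A back-of-the-envelope sketch, with η = 0.1 assumed for illustration (the paper's exact efficiency assumption is not restated here):

```python
# Energy released by accreting a given mass, E = eta * M * c**2.
M_SUN_KG = 1.989e30   # solar mass in kg
C = 2.998e8           # speed of light in m/s

def radiated_energy(accreted_solar_masses, eta=0.1):
    """Total radiated energy in joules, assuming efficiency eta."""
    return eta * accreted_solar_masses * M_SUN_KG * C ** 2

# 0.002 solar masses, as quoted in the abstract:
energy_j = radiated_energy(0.002)  # ~3.6e43 J
```

Integrating the observed light curve gives the radiated energy directly, so inverting this relation yields the minimum accreted mass quoted in the text.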


Several projects in the recent past have aimed at promoting Wireless Sensor Networks as an infrastructure technology, where several independent users can submit applications that execute concurrently across the network. Concurrent multiple applications cause significant energy-usage overhead on sensor nodes, which cannot be eliminated by traditional schemes optimized for single-application scenarios. In this paper, we outline two main optimization techniques for reducing power consumption across applications. First, we describe a compiler-based approach that identifies redundant sensing requests across applications and eliminates them. Second, we cluster radio transmissions together by concatenating packets from independent applications based on Rate-Harmonized Scheduling.
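The batching idea behind the second technique can be sketched as follows: application periods are rounded down to harmonics of the shortest period so that packet release times align, and all packets due at the same instant are concatenated into one radio transmission. This is a hedged illustration of the harmonization principle, not the paper's implementation; the periods and harmonization rule used here are assumptions:

```python
def harmonize(periods):
    """Round each period down to the largest power-of-two multiple of the
    shortest period that does not exceed it (one common harmonization)."""
    base = min(periods)
    out = []
    for p in periods:
        h = base
        while h * 2 <= p:
            h *= 2
        out.append(h)
    return out

def batches(periods, horizon):
    """Group packet releases: every app whose harmonized period divides
    time t transmits at t, so one radio wake-up serves several packets."""
    hp = harmonize(periods)
    schedule = {}
    for t in range(0, horizon, min(hp)):
        sending = [i for i, p in enumerate(hp) if t % p == 0]
        schedule[t] = sending
    return schedule

# Three hypothetical applications with 10, 25, and 40 time-unit periods:
sched = batches([10, 25, 40], horizon=80)
```

At t = 0 and t = 40 all three applications transmit together, so their packets can share one frame, which is where the per-transmission energy overhead is amortized.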


This article presents the results of a study conducted among the civil servants of the Palma de Mallorca City Council (Ayuntamiento de Palma de Mallorca). Its aim is to characterize the senior and middle management positions in a large local public administration, and to obtain homogeneous groups of professionals with management responsibilities based on self-assessed competencies. To this end, a descriptive cross-sectional study was carried out, based on a self-administered survey. The 126 people who met the condition of holding management responsibilities were selected. A broad set of variables centered on professional competencies is analyzed, and various descriptive analyses are performed, including a factor analysis of the self-assessed competencies, preparatory to the non-hierarchical cluster analysis. The results indicate the existence of three distinct and consistent clusters with respect to gender, age, and route of access to the management role.


Today, approximately 29% of the world population uses the Internet, compared with 38% in Brazil, which shows its importance in people's routines both in Brazil and worldwide. As the Internet is a communication medium, this research evaluates the influence of interactivity as a factor in increasing the memorization of Internet sites. According to the literature, multiway, immediacy, and contingency factors increase interactivity, and sites that provide one or more of these factors influence memorization. Twenty in-depth personal interviews were conducted to improve understanding of the issue, identify leads, and elaborate our hypotheses, followed by a quantitative survey of 300 people. Hypotheses were tested using chi-square tests and hierarchical and non-hierarchical cluster analyses. Results showed that the smaller the number of leads on a specific website, the greater its memorization and access. The theoretical contribution of this investigation is that websites offering fewer leads are more interactive, which causes them to be better remembered. The managerial implication is that websites with a clear position and a small quantity of information or leads tend to be better remembered and accessed by Internet users.


Background: Severe dengue virus (DENV) disease is associated with extensive immune activation, characterized by a cytokine storm. Previously, elevated lipopolysaccharide (LPS) levels in dengue were found to correlate with clinical disease severity. In the present cross-sectional study we identified markers of microbial translocation and immune activation that are associated with severe manifestations of DENV infection. Methods: Serum samples from DENV-infected patients were collected during the 2010 outbreak in the State of São Paulo, Brazil. Levels of LPS, lipopolysaccharide binding protein (LBP), soluble CD14 (sCD14), and IgM and IgG endotoxin core antibodies were determined by ELISA. Thirty cytokines were quantified using a multiplex Luminex system. Patients were classified according to the 2009 WHO classification and the occurrence of plasma leakage/shock and hemorrhage. Moreover, an unsupervised cluster analysis based on the expression of the quantified cytokines was applied to identify groups of patients with similar cytokine profiles. Markers of microbial translocation were linked to groups with similar clinical disease severity and to clusters with similar cytokine profiles. Results: Cluster analysis indicated that LPS levels were significantly increased in patients with a profound pro-inflammatory cytokine profile. LBP and sCD14 showed significantly increased levels in patients with severe disease in the clinical classification and in patients with severe inflammation in the cluster analysis. With both the clinical classification and the cluster analysis, levels of IL-6, IL-8, sIL-2R, MCP-1, RANTES, HGF, G-CSF and EGF were associated with severe disease. Conclusions: The present study provides evidence that both microbial translocation and extensive immune activation occur during severe DENV infection and may play an important role in its pathogenesis.


In recent years, industry interest has grown in developing technologies that are alternatives to traditional food treatments. Among the various non-thermal technologies is gas plasma. Plasma is a neutral ionized gas composed of various particles. The main agents responsible for its bactericidal action appear to be reactive oxygen and nitrogen species, which cause damage to microbial cells. "Plasma-activated water" has recently been under study. The general objective of this thesis was to verify whether plasma treatment of saline solutions (NaCl 0.9%) can "activate" them, endowing them with bactericidal activity against a strain of Listeria monocytogenes (strain 56 Ly), and to establish whether the material from which the electrodes of a DBD-type plasma generator are built can influence the efficacy of the treated solutions. Saline solutions were therefore plasma-treated using electrodes of different materials: glass, brass, steel, and silver. The resulting solutions were analyzed chemically, and their decontaminating action against Listeria monocytogenes 56 Ly was evaluated in the same model system and, preliminarily, in a real system consisting of julienned carrots deliberately contaminated with L. monocytogenes. The results show that the sensitivity of L. monocytogenes 56 Ly to plasma-treated aqueous solutions is influenced both by the electrode material and by the exposure time. Steel proved to be the most effective material. As for the real system, washing with plasma-treated water for 60 minutes produced an inactivation level of about 1 log cycle, similar to that obtained with a hypochlorite solution.
In conclusion, the results showed a lower efficacy of plasma treatments when applied to real systems, but gas plasma nevertheless has good potential for the decontamination of fruit and vegetable products.


An integrated approach for multi-spectral segmentation of MR images is presented. This method is based on fuzzy c-means (FCM) and includes bias field correction, contextual constraints over the spatial intensity distribution, and allowance for non-spherical cluster shapes in the feature space. The bias field is modeled as a linear combination of smooth polynomial basis functions for fast computation in the clustering iterations. Regularization terms for the neighborhood continuity of intensity are added to the FCM cost functions. To reduce the computational complexity, the contextual regularizations are separated from the clustering iterations. Since the feature space is not isotropic, the distance measure adopted in the Gustafson-Kessel (G-K) algorithm is used instead of the Euclidean distance, to account for the non-spherical shape of the clusters in the feature space. These algorithms are quantitatively evaluated on MR brain images using similarity measures.
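The core FCM iteration that the method above extends alternates between a membership update and a center update. A minimal one-dimensional sketch with fuzzifier m = 2, deliberately omitting the bias field, contextual regularization, and G-K distance of the full method (the data are illustrative intensities, not MR values):

```python
def fcm(xs, centers, m=2.0, iters=50, eps=1e-12):
    """Plain fuzzy c-means on 1-D data; eps guards divisions by zero."""
    u = []
    for _ in range(iters):
        # Membership update: u_ik = 1 / sum_j (d_ik / d_ij)**(2/(m-1))
        u = []
        for x in xs:
            d = [abs(x - c) + eps for c in centers]
            u.append([1.0 / sum((d[k] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(len(centers)))
                      for k in range(len(centers))])
        # Center update: v_k = sum_i u_ik**m x_i / sum_i u_ik**m
        centers = [sum(u[i][k] ** m * xs[i] for i in range(len(xs))) /
                   sum(u[i][k] ** m for i in range(len(xs)))
                   for k in range(len(centers))]
    return centers, u

data = [0.0, 0.1, 0.2, 0.9, 1.0, 1.1]  # two intensity "tissues"
centers, memberships = fcm(data, centers=[0.0, 1.0])
```

The full method replaces the absolute-difference distance with the G-K (Mahalanobis-like) distance and adds the bias field and neighborhood terms to the cost function being minimized.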


Intensity non-uniformity (bias field) correction, contextual constraints over the spatial intensity distribution, and non-spherical cluster shapes in the feature space are incorporated into fuzzy c-means (FCM) for the segmentation of three-dimensional multi-spectral MR images. The bias field is modeled by a linear combination of smooth polynomial basis functions for fast computation in the clustering iterations. Regularization terms for the neighborhood continuity of either intensity or membership are added to the FCM cost functions. Since the feature space is not isotropic, distance measures other than the Euclidean distance are used to account for the shape and volumetric effects of clusters in the feature space. The performance of segmentation is improved by combining the adaptive FCM scheme with the criteria used in the Gustafson-Kessel (G-K) and Gath-Geva (G-G) algorithms through the inclusion of a cluster scatter measure. The performance of this integrated approach is quantitatively evaluated on normal MR brain images using similarity measures. The improvement in segmentation quality obtained with our method is also demonstrated by comparing our results with those produced by FSL (FMRIB Software Library), a software package commonly used for tissue classification.


El estándar LTE se ha posicionado como una de las claves para que los operadores de telecomunicación puedan abordar de manera eficiente en costes el crecimiento de la demanda de tráfico móvil que se prevé para los próximos años, al ser una tecnología más escalable en el núcleo de la red y más flexible en la interfaz radio que sus predecesoras. En este sentido, es necesario también que los reguladores garanticen un acceso al espectro radioeléctrico adecuado, equitativo y no discriminatorio, que permita un entorno estable para el despliegue de redes de comunicaciones móviles avanzadas. Además de la flexibilización del marco regulador del espectro radioeléctrico en Europa, que ha permitido el despliegue de nuevas tecnologías en las bandas de frecuencia históricas de GSM, se ha puesto a disposición espectro adicional para sistemas IMT en nuevas bandas de frecuencia, lo que ha planteando a su vez nuevos retos para la tecnología y la regulación. La fragmentación del espectro disponible para comunicaciones móviles ha impulsado el desarrollo de técnicas de agregación de portadoras en las nuevas versiones del estándar LTE, que permiten explotar mejor los recursos radio en su conjunto. No obstante, el espectro inferior a 1 GHz sigue siendo escaso, ya que el tráfico móvil aumenta y la banda de 900 MHz aún se utiliza para servicios GSM, lo que no ha conseguido sino agravar la disputa entre los servicios de radiodifusión terrestre y de comunicaciones móviles por la parte superior de la banda UHF. En concreto, la banda de 700 MHz se perfila como una de las próximas para aumentar el espectro disponible para los servicios en movilidad, si bien su liberación por parte de las actuales redes de Televisión Digital Terrestre presenta no pocas dificultades en los Estados miembros en los que ésta es la principal plataforma audiovisual de acceso gratuito, abriendo un debate sobre el modelo audiovisual a largo plazo en Europa. 
Por otro lado, las políticas públicas de promoción del acceso a la banda ancha rápida y ultrarrápida de la presente década han establecido objetivos ambiciosos para el año 2020, tanto en el ámbito europeo como en los diferentes Estados miembros. La universalización del acceso a redes de banda ancha de al menos 30 Mbps constituye uno de los principales retos. Las expectativas generadas por la tecnología LTE y la puesta a disposición de nuevas bandas de frecuencia hace posible que los servicios de acceso fijo inalámbrico adquieran especial relevancia ante los objetivos de política pública establecidos que, como ha sido reconocido en diversas ocasiones, no podrán lograrse sino con un compendio de diferente tecnologías. Para esta Tesis Doctoral se han desarrollado una serie modelos tecnoeconómicos con el objetivo de realizar un análisis prospectivo que evalúa tres casos de especial relevancia en el despliegue de redes LTE: en primer lugar, la valoración económica de la banda de 700 MHz; en segundo lugar, la evaluación de modelos de negocio y reducción de costes considerando tecnologías femtocelulares; y finalmente, la viabilidad de las redes LTE de acceso fijo inalámbrico para el cierre de la brecha digital en el acceso a la banda ancha de 30 Mbps. En relación con la aplicación del análisis tecnoeconómico para la valoración del espectro de 700 MHz, los resultados obtenidos ponen de manifiesto dos cuestiones fundamentales. En primer lugar, la necesidad de asignar a los operadores más espectro para satisfacer las previsiones de demanda de tráfico móvil a medio plazo. En segundo, existe una diferencia notable en los costes de despliegue de una red LTE cuando se dispone de espectro en frecuencias inferiores a 1 GHz y cuando no, pero esta diferencia de costes disminuye a medida que se añade nuevo espectro sub-1GHz. 
De esta manera, la atribución de la banda de 700 MHz a servicios de comunicaciones móviles supone una reducción relevante en los costes de despliegue si el operador no dispone de espectro en la banda de 800 MHz, pero no así si ya dispone de espectro en bandas bajas para el despliegue. En este sentido, puede concluirse que el precio que los operadores estarán dispuestos a pagar por el espectro de la banda de 700 MHz dependerá de si ya tienen disponible espectro en la banda de 800 MHz. Sin embargo, dado que la competencia por ese espectro será menor, los ingresos esperables en las licitaciones de esta nueva banda serán en general menores, a pesar de que para algunos operadores este espectro sería tan valioso como el de 800 MHz. En segundo lugar, en relación con el despliegue de femtoceldas pueden extraerse algunas conclusiones en términos de ahorro de costes de despliegue y también de cara a la viabilidad de los modelos de negocio que posibilitan. El ahorro que supone la introducción de femtoceldas en el despliegue de una red LTE frente al caso de un despliegue exclusivamente macrocelular se ha demostrado que es mayor cuanto menor es el ancho de banda disponible para la red macrocelular. En esta línea, para un operador convergente el despliegue de femtoceldas tiene sentido económico si el ancho de banda disponible es escaso (en torno a 2x10 MHz), que, en el caso de España, puede reflejar el caso de los operadores del segmento fijo que son nuevos entrantes en el móvil. Por otro lado, los modelos de acceso abierto son interesantes para operadores exclusivamente móviles, porque consiguen flexibilizar los costes sustituyendo estaciones base macrocelulares por el despliegue de femtoceldas, pero necesitan desplegarse en zonas con una densidad de población relativamente elevada para que éstas descarguen tráfico de varios usuarios de la red macrocelular simultáneamente. 
No obstante, las femtoceldas son beneficiosas en todo caso si es el usuario quien asume los costes de la femtocelda y el backhaul, lo que sólo parece probable si se integran en el modelo de negocio de comercialización de nuevos servicios. Por tanto, el despliegue de femtoceldas en buena parte de la casuística estudiada sólo tiene sentido si consiguen aumentar los ingresos por usuario comercializando servicios de valor añadido que necesiten calidad de servicio garantizada y exploten a la vez de esa forma su principal ventaja competitiva respecto a la tecnología WiFi. Finalmente, en relación con el papel de la tecnología LTE para la provisión de servicios de acceso fijo inalámbrico para la banda ancha de 30 Mbps, se ha desarrollado un modelo TD-LTE y mediante la metodología de análisis tecnoeconómico se ha realizado un estudio prospectivo para el caso de España. Los resultados obtenidos preciden una huella de cobertura de FTTH del 74% para 2020, y demuestran que una red TD-LTE en la banda de 3,5 GHz resulta viable para aumentar la cobertura de servicios de 30 Mbps en 14 puntos porcentuales. Junto con la consideración de la cobertura de otras redes, la cobertura de 30 Mbps de acuerdo a la viabilidad de los despliegues alcanzaría el 95% en España en el año 2020. Como resumen, los resultados obtenidos muestran en todos los casos la capacidad de la tecnología LTE para afrontar nuevos desafíos en relación con el aumento del tráfico móvil, especialmente crítico en las zonas más urbanas, y el cierre de la brecha digital en el acceso a la banda ancha rápida en las zonas más rurales. ABSTRACT The LTE standard has been pointed out as one of the keys for telecom operators to address the demand growth in mobile traffic foreseen for the next years in a cost-efficient way, since its core network is more scalable and its radio interface more flexible than those of its predecessor technologies. 
On the other hand, regulators need to guarantee an adequate, equitable and non-discriminatory access to radio spectrum, which enable a favorable environment for the deployment of advanced mobile communication networks. Despite the reform of the spectrum regulatory framework in Europe, which allowed for the deployment of new technologies in the historic GSM bands, additional spectrum has been allocated to IMT systems in new frequency bands, what in turn has set out new challenges for technology and regulation. The current fragmentation of available spectrum in very different frequency bands has boosted the development of carrier aggregation techniques in most recent releases of the LTE standard, which permit a better exploitation of radio resources as a whole. Nonetheless, spectrum below 1 GHz is still scarce for mobile networks, since mobile traffic increases at a more rapid pace than spectral efficiency and spectrum resources. The 900 MHz frequency band is still being used for GSM services, what has worsen the dispute between mobile communication services and terrestrial broadcasting services for the upper part of the UHF band. Concretely, the 700 MHz frequency band has been pointed out as one of the next bands to be allocated to mobile in order to increase available spectrum. However, its release by current Digital Terrestrial Television networks is challenging in Member States where it constitutes the main free access audiovisual platform, opening up a new debate around the audiovisual model in the long term in Europe. On the other hand, public policies of the present decade to promote fast and ultrafast broadband access has established very ambitious objectives for the year 2020, both at European and national levels. Universalization of 30 Mbps broadband access networks constitutes one of the main challenges. 
Expectations raised by LTE technology and the allocation of new frequency bands has lead fixed wireless access (FWA) services to acquire special relevance in light of public policy objectives, which will not be met but with a compendium of different technologies, as different involved stakeholders have acknowledged. This PhD Dissertation develops techno-economic models to carry out a prospective analysis for three cases of special relevance in LTE networks’ deployment: the spectrum pricing of the 700 MHz frequency band, an assessment of new business models and cost reduction considering femtocell technologies, and the feasibility of LTE fixed wireless access networks to close the 30 Mbps broadband access gap in rural areas. In the first place and regarding the application of techno-economic analysis for 700 MHz spectrum pricing, obtained results reveal two core issues. First of all, the need to allocate more spectrum for operators in order to fulfill mobile traffic demand in the mid-term. Secondly, there is a substantial difference in deployment costs for a LTE network when there is sub-1GHz spectrum available and when there is not, but this difference decreases as additional sub-1GHz spectrum is added. Thus, the allocation of 700 MHz band to mobile communication services would cause a relevant reduction in deployment costs if the operator does not count on spectrum in the 800 MHz, but not if it already has been assigned spectrum in low frequencies for the deployment. In this regard, the price operators will be willing to pay for 700 MHz spectrum will depend on them having already spectrum in the 800 MHz frequency band or not. However, since competition for the new spectrum will not be so strong, expected incomes from 700 MHz spectrum awards will be generally lower than those from the digital dividend, despite this spectrum being as valuable as 800 MHz spectrum for some operators. 
In the second place, regarding femtocell deployment, conclusions can be drawn both in terms of deployment cost savings and with reference to the business models they enable. The savings provided by a joint macro-femto LTE network, as compared to an exclusively macrocellular deployment, increase as the bandwidth available to the macrocells decreases. Therefore, for a convergent operator the deployment of femtocells only makes economic sense if the available bandwidth is scarce (around 2x10 MHz), which may be the case for fixed-market operators that are new entrants in the mobile market. Besides, open access models are interesting for mobile-only operators, since they make costs more flexible by replacing macrocell base stations with femtocells; however, the femtocells need to be deployed in relatively densely populated areas so that they can offload traffic from several macrocell users simultaneously. Nonetheless, femtocells are beneficial in all cases if the user assumes both femtocell and backhaul costs, which only seems probable if they are integrated into a business model commercializing new services. Therefore, in many of the cases analyzed, femtocell deployment only makes sense if it increases revenues per user through new value-added services that require guaranteed quality of service, thus exploiting its main competitive advantage over WiFi. Finally, regarding the role of LTE technology in the provision of fixed wireless access services for 30 Mbps broadband, a TD-LTE model has been developed and a prospective study has been carried out using techno-economic methodology for the Spanish case. The results foresee an FTTH coverage footprint of 74% of households by 2020, and show that a TD-LTE network in the 3.5 GHz band is a feasible means of extending 30 Mbps service coverage by an additional 14 percentage points.
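The macro-femto trade-off described above can be sketched with a toy capacity model (again with invented numbers, not the dissertation's calibrated inputs): the macrocell count is capacity-limited, so offloading a fraction of the traffic onto femtocells removes macro sites, and the saving grows as the macro bandwidth shrinks.

```python
# Toy capacity-limited macro vs. macro+femto comparison; all figures are
# hypothetical placeholders, not the dissertation's inputs.
import math

def macro_sites(traffic_mbps, bw_mhz, spectral_eff=1.5, sectors=3):
    """Capacity-limited macro site count for an aggregate traffic demand."""
    site_capacity = bw_mhz * spectral_eff * sectors  # Mbps per site
    return math.ceil(traffic_mbps / site_capacity)

def joint_cost(traffic_mbps, bw_mhz, offload,
               macro_cost=100e3, femto_cost=200, n_femtos=5000):
    """Cost of a macro layer serving the non-offloaded traffic, plus femtos."""
    macro = macro_sites(traffic_mbps * (1 - offload), bw_mhz)
    return macro * macro_cost + n_femtos * femto_cost

TRAFFIC = 200_000  # Mbps aggregate demand (hypothetical)

for bw in (20, 10):  # 2x20 MHz vs. 2x10 MHz available to the macro layer
    macro_only = macro_sites(TRAFFIC, bw) * 100e3
    with_femtos = joint_cost(TRAFFIC, bw, offload=0.3)
    print(bw, macro_only - with_femtos)  # the saving is larger at 10 MHz
```

In this sketch halving the macro bandwidth roughly doubles the macro site count, so the same 30% offload removes twice as many sites, which is why femtocells pay off mainly for bandwidth-constrained operators.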
To sum up, the results demonstrate the capability of LTE technology to address new challenges regarding both mobile traffic growth, which is particularly critical in urban areas, and the current digital divide in fast broadband access in most rural areas.

Relevância:

100.00%

Publicador:

Resumo:

The aim of the present study was to trace the mortality profile of the elderly in Brazil using two neighboring age groups: 60 to 69 years (the young-old) and 80 years or more (the oldest-old). To do so, we sought to characterize the trends and distinctions of the different mortality profiles, as well as the quality of the data and associations with socioeconomic and sanitary conditions in the micro-regions of Brazil. Data were collected from the Mortality Information System (SIM) and the Brazilian Institute of Geography and Statistics (IBGE). Based on these data, mortality coefficients were calculated for the chapters of the International Classification of Diseases (ICD-10). A polynomial regression model was used to ascertain the trend for the main chapters. Non-hierarchical cluster analysis (K-Means) was used to obtain the profiles of the different Brazilian micro-regions. Factorial analysis of the contextual variables was used to obtain socioeconomic and sanitary deprivation indices (IPSS). The trend of the CMId, and of the ratio of its values in the two age groups, confirmed a decrease in most of the indicators, particularly for ill-defined causes among the oldest-old. Among the young-old, the following profiles emerged: the Development Profile, the Modernity Profile, the Epidemiological Paradox Profile and the Ignorance Profile. Among the oldest-old, the latter three profiles were confirmed, in addition to the Low Mortality Rates Profile. When comparing the mean IPSS values in global terms, all of the groups differed in both age groups. The Ignorance Profile was compared with the other profiles using orthogonal contrasts; it differed from all of the others, both in isolation and in clusters. However, the mean IPSS was similar for the Low Mortality Rates Profile among the oldest-old.
Furthermore, associations were found between the data quality indicators, the CMId for ill-defined causes, the general mortality coefficient for each age group (CGMId) and the IPSS of the micro-regions. The worst rates were recorded in areas with the greatest socioeconomic and sanitary deprivation. The findings of the present study show that, despite the decrease in the mortality coefficients, there are notable differences among the profiles related to contextual conditions, including regional differences in data quality. These differences increase the vulnerability of the age groups studied and the health inequities that are already present.
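The non-hierarchical clustering step used above can be sketched with a minimal pure-Python K-Means. The "micro-regions" below are synthetic two-dimensional points (a circulatory-cause coefficient and an ill-defined-causes coefficient), not SIM/IBGE data; the real analysis clusters full vectors of cause-specific mortality coefficients.

```python
# Minimal K-Means sketch of the non-hierarchical clustering of micro-regions
# by their mortality coefficients; the data are synthetic, not SIM/IBGE data.
import random

def kmeans(points, k, iters=100, seed=0):
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from the data points
    for _ in range(iters):
        # Assign each point to its nearest centroid (squared Euclidean).
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k),
                      key=lambda i: sum((a - b) ** 2
                                        for a, b in zip(p, centroids[i])))
            clusters[idx].append(p)
        # Recompute centroids; keep the old one if a cluster emptied.
        new_centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
        if new_centroids == centroids:  # converged
            break
        centroids = new_centroids
    return centroids, clusters

# Synthetic "micro-regions": (circulatory CMI, ill-defined-causes CMI)
regions = [(30.0, 2.0), (32.0, 1.5), (31.0, 2.5),     # low ill-defined causes
           (18.0, 12.0), (20.0, 11.0), (19.0, 13.0)]  # high ill-defined causes
centroids, clusters = kmeans(regions, k=2)
print(sorted(len(c) for c in clusters))  # → [3, 3]
```

With well-separated groups the algorithm recovers the two synthetic profiles regardless of the random initialization; on real data the number of clusters k would be chosen to match the profiles sought.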

Relevância:

70.00%

Publicador: