920 results for diffusive viscoelastic model, global weak solution, error estimate
Abstract:
The trace gases NOx (nitric oxide, NO, and nitrogen dioxide, NO2) have a major influence on the production of OH (the hydroxyl radical) and ozone (O3) in the troposphere. The soil emissions of these gases are largely unknown. The aim of this work was to investigate the processes relevant to NO soil emissions through laboratory and field measurements and to estimate these emissions by means of model simulations for two regions, a tropical rainforest area in Rondônia (Brazil) and subtropical savannas in Zimbabwe. Using the measured NO values, simulations with a modified process-oriented model showed that deforestation in the tropics leads, after a short-lived increase, to a long-term decrease in soil emissions. Up-scaling the model results, starting from the region's original forest cover, yielded a doubling of NO soil emissions by 1999. For both the nutrient-poor tropical soils and the nutrient-rich savanna soils, land use and soil moisture were the most important factors regulating the emissions. Over the course of one year, the emission rates of the tropics (0.49 kg N ha⁻¹ yr⁻¹) were roughly half those of the subtropical savannas (0.86 kg N ha⁻¹ yr⁻¹). As long as deforestation of the rainforests continues, the tropics will exert a strong influence on tropospheric chemistry.
Abstract:
Introduction: Nocturnal frontal lobe epilepsy (NFLE) is a distinct syndrome of partial epilepsy whose clinical features comprise a spectrum of paroxysmal motor manifestations of variable duration and complexity, arising from sleep. Cardiovascular changes during NFLE seizures have previously been observed; however, the extent of these modifications and their relationship with seizure onset have not been analyzed in detail. Objective: The aim of the present study is to evaluate NFLE seizure-related changes in heart rate (HR) and in sympathetic/parasympathetic balance through wavelet analysis of HR variability (HRV). Methods: We evaluated the whole-night digitally recorded video-polysomnography (VPSG) of 9 patients diagnosed with NFLE, with no history of cardiac disorders and normal cardiac examinations. Events with features of NFLE seizures were selected independently by three examiners and included in the study only if a consensus was reached. Heart rate was evaluated by measuring the interval between two consecutive R-waves of QRS complexes (RRi). RRi series were digitally calculated for a period of 20 minutes including the seizures, and resampled at 10 Hz using cubic spline interpolation. A multiresolution analysis was performed (Daubechies-16 wavelet), and the squared level-specific amplitude coefficients were summed across the appropriate decomposition levels in order to compute total band powers in the bands of interest (LF: 0.039062–0.156248 Hz, HF: 0.156248–0.624992 Hz). A general linear model was then applied to estimate changes in RRi and in LF and HF powers during three different periods: a basal period (Basal; 30 sec, at least 30 sec before seizure onset, during which no movements occurred and autonomic conditions were stationary), a pre-seizure period (preSP; the 10 sec preceding seizure onset), and the seizure period (SP), corresponding to the clinical manifestations. For one of the patients (patient 9), three seizures associated with ictal asystole (IA) were recorded, hence he was treated separately. Results: Group analysis performed on 8 patients (41 seizures) showed that RRi remained unchanged during the preSP, while a significant tachycardia was observed in the SP. A significant increase in the LF component was instead observed during both the preSP and the SP (p<0.001), while the HF component decreased only in the SP (p<0.001). For patient 9, during the preSP and in the first part of the SP a significant tachycardia was observed, associated with increased sympathetic activity (increased LF absolute values and LF%). In the second part of the SP, a progressive decrease in HR that gradually exceeded basal values occurred before IA. Bradycardia was associated with an increase in parasympathetic activity (increased HF absolute values and HF%), counteracted by a further increase in LF until the occurrence of IA. Conclusions: These data suggest that changes in autonomic balance toward a sympathetic prevalence always preceded clinical seizure onset in NFLE, even when HR changes were not yet evident, confirming that wavelet analysis is a sensitive technique for detecting sudden variations in autonomic balance occurring during transient phenomena. Finally, we demonstrated that epileptic asystole is associated with a parasympathetic hypertonus counteracted by a marked sympathetic activation.
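The band-power computation described above (cubic-spline resampling of the RRi series at 10 Hz, Daubechies-16 multiresolution decomposition, and summation of squared detail coefficients over the levels falling inside each band) can be sketched as follows. This is a minimal illustration, not the authors' code; the function name, the number of decomposition levels, and the use of PyWavelets/SciPy are assumptions.

```python
# Minimal sketch of the described HRV band-power procedure (assumed libraries:
# NumPy, SciPy, PyWavelets). t_s are beat times in seconds, rr_s the RR intervals.
import numpy as np
import pywt
from scipy.interpolate import CubicSpline

def wavelet_band_powers(t_s, rr_s, fs=10.0, wavelet="db16", levels=7):
    # Resample the unevenly sampled RRi series at 10 Hz with cubic splines.
    t_uniform = np.arange(t_s[0], t_s[-1], 1.0 / fs)
    rr_uniform = CubicSpline(t_s, rr_s)(t_uniform)

    # Multiresolution (discrete wavelet) decomposition with Daubechies-16.
    coeffs = pywt.wavedec(rr_uniform, wavelet, level=levels)
    details_fine_to_coarse = coeffs[:0:-1]  # cD_1 (finest) ... cD_levels (coarsest)

    # Detail level j covers roughly fs/2**(j+1) .. fs/2**j Hz, so with fs = 10 Hz
    # the LF band (~0.039-0.156 Hz) maps to levels 6-7 and HF (~0.156-0.625 Hz) to 4-5.
    band_levels = {"LF": (6, 7), "HF": (4, 5)}
    return {band: float(sum(np.sum(details_fine_to_coarse[j - 1] ** 2) for j in lev))
            for band, lev in band_levels.items()}
```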
Abstract:
A main objective of human movement analysis is the quantitative description of joint kinematics and kinetics. This information has great potential to address clinical problems in both orthopaedics and motor rehabilitation. Previous studies have shown that assessing kinematics and kinetics from stereophotogrammetric data requires a setup phase, special equipment and expertise to operate. Moreover, this procedure may make subjects feel uneasy and may interfere with their walking. The general aim of this thesis is the implementation and evaluation of new 2D markerless techniques, in order to contribute to the development of an alternative to traditional stereophotogrammetric techniques. First, the focus of the study was the estimation of ankle-foot complex kinematics during the stance phase of gait. Two particular cases were considered: subjects barefoot and subjects wearing ankle socks. The use of socks was investigated in view of the development of the hybrid method proposed in this work. Different algorithms were analyzed, evaluated and implemented in order to obtain a 2D markerless solution for estimating the kinematics in both cases. The proposed technique was validated against a traditional stereophotogrammetric system. The implementation of the technique leads towards an easy-to-configure (and more comfortable for the subject) alternative to the traditional stereophotogrammetric system. The abovementioned technique was then improved so that knee flexion/extension could also be measured with a 2D markerless technique. The main changes to the implementation concerned occlusion handling and background segmentation. With the additional constraints, the proposed technique was applied to the estimation of knee flexion/extension and compared with a traditional stereophotogrammetric system. Results showed that the knee flexion/extension estimates from the traditional stereophotogrammetric system and the proposed markerless system were highly comparable, making the latter a potential alternative for clinical use. A contribution was also made to the estimation of lower limb kinematics in children with cerebral palsy (CP). For this purpose, a hybrid technique was proposed that uses high-cut underwear and ankle socks as “segmental markers” in combination with a markerless methodology. The proposed hybrid technique differs from the abovementioned markerless technique in the algorithm chosen. Results showed that the proposed hybrid technique can become a simple and low-cost alternative to traditional stereophotogrammetric systems.
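Background segmentation is named above as one of the key implementation changes. Purely as an illustration of that step (not the algorithm developed in the thesis), a foreground silhouette can be extracted from a fixed camera with a standard background-subtraction model; the OpenCV calls below exist, but the parameter values and function name are arbitrary assumptions.

```python
# Illustrative background-subtraction step for a 2D markerless pipeline
# (assumed library: OpenCV). Parameter values are arbitrary, not tuned.
import cv2

def extract_silhouettes(video_path):
    cap = cv2.VideoCapture(video_path)
    bg = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25,
                                            detectShadows=True)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    silhouettes = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = bg.apply(frame)                                      # foreground mask
        _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)  # drop shadow labels
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)       # remove speckle
        silhouettes.append(mask)                                    # binary subject silhouette
    cap.release()
    return silhouettes
```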
Abstract:
English: The assessment of safety in existing bridges and viaducts led the Ministry of Public Works of the Netherlands to finance a specific campaign aimed at studying the response of the elements of these infrastructures. This activity therefore focuses on the investigation of the behaviour of reinforced concrete slabs under concentrated loads, adopting finite element modelling and comparison with experimental results. These elements are characterized by shear behaviour and shear failure, whose modelling is, from a computational point of view, a hard challenge, due to the brittle behaviour combined with three-dimensional effects. The numerical modelling of the failure is studied through Sequentially Linear Analysis (SLA), an alternative finite element method with respect to traditional incremental and iterative approaches. The comparison between the two numerical techniques represents one of the first such works and comparisons in a three-dimensional setting, and it is carried out using one of the experimental tests performed on reinforced concrete slabs. The advantage of the SLA is that it avoids the well-known convergence problems of typical non-linear analyses by directly specifying a damage increment, in terms of a reduction of stiffness and strength in a particular finite element, instead of load or displacement increments on the whole structure. For the first time, particular attention has been paid to specific aspects of the slabs, such as accurate modelling of the constraints and the sensitivity of the solution with respect to the mesh density. This detailed analysis of the main parameters showed a strong influence of the tensile fracture energy, mesh density and chosen model on the solution in terms of the force-displacement diagram, the distribution of the crack patterns and the shear failure mode. The SLA showed great potential, but it requires further development regarding two modelling aspects, load conditions (constant and proportional loads) and the softening behaviour of brittle materials (like concrete) in the three-dimensional setting, in order to widen its horizons in these new contexts of study.
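Conceptually, SLA replaces the incremental-iterative solution with a sequence of scaled linear analyses: each step runs a linear analysis under a unit reference load, identifies the critical element (the one reaching its strength at the lowest load multiplier), and reduces that element's stiffness and strength according to a saw-tooth approximation of the softening law. The loop below is a schematic sketch of that idea; `linear_solve` and the element data structure are placeholders, not a finite element implementation.

```python
# Schematic sketch of the Sequentially Linear Analysis (SLA) loop.
# linear_solve(elements) is a placeholder returning, per element, the tensile stress
# produced by a unit reference load; the saw-tooth reduction factor is illustrative.

def sla(elements, linear_solve, n_steps, sawtooth_factor=0.5):
    """elements: dict id -> {"E": Young's modulus, "ft": tensile strength}."""
    history = []  # (critical load multiplier, critical element id) per step
    for _ in range(n_steps):
        unit_stress = linear_solve(elements)           # stress per element for unit load
        # Load multiplier at which each element would just reach its strength.
        ratios = {eid: e["ft"] / unit_stress[eid]
                  for eid, e in elements.items() if unit_stress[eid] > 0.0}
        if not ratios:
            break
        crit = min(ratios, key=ratios.get)              # most critical element
        history.append((ratios[crit], crit))
        # Saw-tooth damage increment: reduce stiffness and strength of that element only,
        # instead of incrementing load or displacement on the whole structure.
        elements[crit]["E"] *= sawtooth_factor
        elements[crit]["ft"] *= sawtooth_factor
    return history
```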
Abstract:
In this work I report recent results in the field of equilibrium statistical mechanics, and in particular in spin glass models and monomer-dimer models. We start by giving the mathematical background and the general formalism for (disordered) spin models, with some of their applications to physical and mathematical problems. Next we move on to general aspects of the theory of spin glasses, in particular the Sherrington-Kirkpatrick model, which is of fundamental interest for this work. In Chapter 3, we introduce the multi-species Sherrington-Kirkpatrick model (MSK), and we prove the existence of the thermodynamic limit and Guerra's bound for the quenched pressure, together with a detailed analysis of the annealed and replica symmetric regimes. The result is a multidimensional generalization of Parisi's theory. Finally we briefly illustrate the strategy of Panchenko's proof of the lower bound. In Chapter 4 we discuss the Aizenman-Contucci and the Ghirlanda-Guerra identities for a wide class of spin glass models. As an example of application, we discuss the role of these identities in the proof of the lower bound. In Chapter 5 we introduce the basic mathematical formalism of monomer-dimer models. We introduce a Gaussian representation of the partition function that will be fundamental in the rest of the work. In Chapter 6, we introduce an interacting monomer-dimer model. Its exact solution is derived and a detailed study of its analytical properties and related physical quantities is performed. In Chapter 7, we introduce quenched randomness in the monomer-dimer model and show that, under suitable conditions, the pressure is a self-averaging quantity. The main result is that, if we consider randomness only in the monomer activity, the model is exactly solvable.
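For orientation, the multi-species Sherrington-Kirkpatrick Hamiltonian and the quenched pressure studied in Chapter 3 can be written in a standard (though not necessarily the thesis's exact) notation as
\[
H_N(\sigma) \;=\; -\frac{1}{\sqrt{N}} \sum_{1\le i<j\le N} J_{ij}\,\sigma_i \sigma_j,
\qquad
\mathbb{E}\,J_{ij}^{2} \;=\; \Delta^{2}_{s(i)\,s(j)},
\]
where $s(i)\in\{1,\dots,K\}$ is the species of spin $i$ and $\Delta^{2}$ is the symmetric matrix of inter-species coupling variances; the quenched pressure whose thermodynamic limit and Guerra-type bound are established is
\[
p_N(\beta) \;=\; \frac{1}{N}\,\mathbb{E}\,\log \sum_{\sigma\in\{-1,+1\}^{N}} e^{-\beta H_N(\sigma)}.
\]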
Abstract:
Many developing countries are facing a crisis in water management due to population growth, water scarcity, water contamination and the effects of the world economic crisis. Water distribution systems in developing countries face many challenges in efficient repair and rehabilitation, since information on the water network is very limited, which makes rehabilitation assessment plans very difficult. In developed countries, sufficient information and advanced technology make the assessment for rehabilitation easy. Developing countries have many difficulties in assessing the water network, leading to system failure, deterioration of mains and poor water quality in the network due to pipe corrosion and deterioration. The limited information brought into focus the urgent need to develop an economical assessment approach for the rehabilitation of water distribution systems adapted to water utilities. The Gaza Strip is the first case study; it suffers from a severe shortage of water supply, environmental problems and contamination of groundwater resources. This research focuses on improving the water supply network to reduce water losses, based on a limited database, using ArcGIS and commercial water network software (WaterCAD). A new approach for the rehabilitation of water pipes is presented in the Gaza City case study. An integrated rehabilitation assessment model has been developed for water pipes, comprising three components: a hydraulic assessment model, a physical assessment model and a structural assessment model. A WaterCAD model integrated with ArcGIS has been developed to produce the hydraulic assessment model for the water network. The model has been designed around a pipe condition assessment with 100 score points as the maximum for pipe condition. The results of this model indicate that 40% of the water pipelines score less than 50 points and about 10% of the total pipe length scores less than 30 points. Using this model, rehabilitation plans for each region of Gaza City can be drawn up based on the available budget and the condition of the pipes. The second case study is Kuala Lumpur, representing semi-developed countries; it has been used to develop an approach to improve the water network under critical conditions using advanced statistical and GIS techniques. Kuala Lumpur (KL) has water losses of about 40% and a high failure rate, which constitute a severe problem. This case can represent cases in South Asian countries. Kuala Lumpur has faced big challenges in reducing water losses in its network over the last 5 years. One of these challenges is the severe deterioration of asbestos cement (AC) pipes: more than 6500 km of AC pipes need to be replaced, which requires a huge budget. Asbestos cement is subject to deterioration due to various chemical processes that either leach out the cement material or penetrate the concrete to form products that weaken the cement matrix. This case presents a geo-statistical approach for modelling pipe failures in a water distribution network. The database of the Syabas Company (Kuala Lumpur's water company) has been used in developing the model. The statistical models have been calibrated, verified and used to predict failures for both networks and individual pipes. The mathematical formulation developed for failure frequency in Kuala Lumpur was based on different pipeline characteristics, reflecting several factors such as pipe diameter, length, pressure and failure history.
Generalized linear models have been applied to predict pipe failures at District Meter Zone (DMZ) and individual pipe levels. Based on the Kuala Lumpur case study, several outputs and implications have been achieved. Correlations between spatial and temporal intervals of pipe failures have also been analysed using ArcGIS software. A Water Pipe Assessment Model (WPAM) has been developed using the analysis of historical pipe failures in Kuala Lumpur; it prioritizes pipe rehabilitation candidates based on a ranking system. The Frankfurt water network in Germany is the third main case study. It provides an overview of survival analysis and neural network methods used in water networks. Rehabilitation strategies for water pipes have been developed for the Frankfurt water network in cooperation with Mainova (Frankfurt's water company). This thesis also presents a methodology for the technical condition assessment of plastic pipes based on simple analysis. This thesis aims to contribute to improving the prediction of pipe failures in water networks using Geographic Information Systems (GIS) and Decision Support Systems (DSS). The output from the technical condition assessment model can be used to estimate future budget needs for rehabilitation and to define pipes with high priority for replacement based on poor condition.
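As an illustration of the kind of generalized linear model referred to above, failure counts per pipe can be regressed on pipe characteristics with pipe length as the exposure term. This is a generic Poisson-GLM sketch (assumed library: statsmodels; column names are hypothetical), not the model actually calibrated on the Syabas database.

```python
# Hedged sketch of a Poisson GLM for pipe failure counts (assumed library: statsmodels).
# Column names (diameter_mm, age_yr, pressure_m, length_m, failures) are hypothetical.
import pandas as pd
import statsmodels.api as sm

def fit_failure_glm(pipes: pd.DataFrame):
    X = sm.add_constant(pipes[["diameter_mm", "age_yr", "pressure_m"]])
    model = sm.GLM(
        pipes["failures"],              # observed failure count per pipe
        X,
        family=sm.families.Poisson(),   # log link; counts proportional to exposure
        exposure=pipes["length_m"],     # longer pipes accumulate more failures
    )
    result = model.fit()
    # Predicted failure frequency per pipe over the observation period.
    pipes["predicted_failures"] = result.predict(X, exposure=pipes["length_m"])
    return result
```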
Abstract:
We propose a method that robustly combines color and feature buffers to denoise Monte Carlo renderings. On one hand, feature buffers, such as per-pixel normals, textures, or depth, are effective in determining denoising filters because features are highly correlated with rendered images. Filters based solely on features, however, are prone to blurring image details that are not well represented by the features. On the other hand, color buffers represent all details, but they may be less effective for determining filters because they are contaminated by the very noise that is supposed to be removed. We propose to obtain filters using a combination of color and feature buffers in an NL-means and cross-bilateral filtering framework. We determine a robust weighting of colors and features using a SURE-based error estimate. We show significant improvements in subjective and quantitative errors compared to the previous state of the art. We also demonstrate adaptive sampling and space-time filtering for animations.
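One plausible way to write such a combined filter (a sketch of the general idea, not the paper's exact formulation or its SURE-based bandwidth selection) is a weighted average whose weight is the more conservative of an NL-means color weight and a cross-bilateral feature weight:
\[
\hat{c}_p \;=\; \frac{\sum_{q \in \mathcal{N}(p)} w(p,q)\, c_q}{\sum_{q \in \mathcal{N}(p)} w(p,q)},
\qquad
w(p,q) \;=\; \min\!\big(w_c(p,q),\, w_f(p,q)\big),
\]
with an NL-means weight $w_c(p,q)=\exp\!\big(-d^2(P_p,P_q)/k_c^2\big)$ based on patch distances of the noisy color buffer, and a cross-bilateral weight $w_f(p,q)=\exp\!\big(-\sum_j \|f_{j,p}-f_{j,q}\|^2/k_{f,j}^2\big)$ based on the feature buffers $f_j$ (normals, textures, depth); the SURE-based error estimate then governs how strongly the color and feature terms are trusted.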
Abstract:
The aim of this work is to elucidate the impact of changes in solar irradiance and energetic particles versus volcanic eruptions on tropospheric global climate during the Dalton Minimum (DM, AD 1780–1840). Variations in (i) solar irradiance in the UV-C at wavelengths λ < 250 nm, (ii) irradiance at wavelengths λ > 250 nm, (iii) the energetic particle spectrum, and (iv) volcanic aerosol forcing were analyzed separately, and (v) in combination, by means of small-ensemble calculations using a coupled atmosphere–ocean chemistry–climate model. Global and hemispheric mean surface temperatures show a significant dependence on solar irradiance at λ > 250 nm. Also, powerful volcanic eruptions in 1809, 1815, 1831 and 1835 significantly decreased global mean temperature by up to 0.5 K for 2–3 years after each eruption. However, while the volcanic effect is clearly discernible in the Southern Hemispheric mean temperature, it is less significant in the Northern Hemisphere, partly because the two largest volcanic eruptions occurred in the SH tropics and during seasons when the aerosols were mainly transported southward, and partly because of the higher northern internal variability. In the simulation including all forcings, temperatures are in reasonable agreement with the tree-ring-based temperature anomalies of the Northern Hemisphere. Interestingly, the model suggests that solar irradiance changes at λ < 250 nm and in energetic particle spectra have only an insignificant impact on the climate during the Dalton Minimum. This downplays the importance of top–down processes (stemming from changes at λ < 250 nm) relative to bottom–up processes (from λ > 250 nm). Reduction of irradiance at λ > 250 nm leads to a significant (up to 2%) decrease in the ocean heat content (OHC) between 0 and 300 m depth, whereas the changes in irradiance at λ < 250 nm or in energetic particles have virtually no effect. Volcanic aerosol also yields a very strong response, reducing the OHC of the upper ocean by up to 1.5%. In the simulation with all forcings, the OHC of the uppermost levels recovers 8–15 years after a volcanic eruption, while the solar signal and the different volcanic eruptions dominate the OHC changes in the deeper ocean and prevent its recovery during the DM. Finally, the simulations suggest that the volcanic eruptions during the DM had a significant impact on precipitation patterns, caused by a widening of the Hadley cell and a shift in the intertropical convergence zone.
Abstract:
BACKGROUND Estimates of the size of the undiagnosed HIV-infected population are important to understand the HIV epidemic and to plan interventions, including "test-and-treat" strategies. METHODS We developed a multi-state back-calculation model to estimate HIV incidence, time between infection and diagnosis, and the undiagnosed population by CD4 count strata, using surveillance data on new HIV and AIDS diagnoses. The HIV incidence curve was modelled using cubic splines. The model was tested on simulated data and applied to surveillance data on men who have sex with men in The Netherlands. RESULTS The number of HIV infections could be estimated accurately using simulated data, with most values within the 95% confidence intervals of model predictions. When applying the model to Dutch surveillance data, 15,400 (95% confidence interval [CI] = 15,000, 16,000) men who have sex with men were estimated to have been infected between 1980 and 2011. HIV incidence showed a bimodal distribution, with peaks around 1985 and 2005 and a decline in recent years. Mean time to diagnosis was 6.1 (95% CI = 5.8, 6.4) years between 1984 and 1995 and decreased to 2.6 (2.3, 3.0) years in 2011. By the end of 2011, 11,500 (11,000, 12,000) men who have sex with men in The Netherlands were estimated to be living with HIV, of whom 1,750 (1,450, 2,200) were still undiagnosed. Of the undiagnosed men who have sex with men, 29% (22, 37) were infected for less than 1 year, and 16% (13, 20) for more than 5 years. CONCLUSIONS This multi-state back-calculation model will be useful to estimate HIV incidence, time to diagnosis, and the undiagnosed HIV epidemic based on routine surveillance data.
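The core of the back-calculation is that the expected number of new diagnoses in an interval is the convolution of past incidence, modelled by a cubic spline, with the distribution of time from infection to diagnosis, fitted to the observed diagnosis counts by maximum likelihood. The sketch below illustrates that mechanism in a single-stage form (assumed libraries: NumPy, SciPy); the CD4-stratified multi-state structure of the actual model is omitted.

```python
# Minimal single-stage back-calculation sketch (the published model additionally
# stratifies the undiagnosed population by CD4 count, omitted here for brevity).
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.optimize import minimize

def expected_diagnoses(log_incidence_knots, knot_times, t_grid, diag_time_pmf):
    """Expected diagnoses per interval = incidence convolved with time-to-diagnosis.
    diag_time_pmf[k] = probability of being diagnosed k intervals after infection."""
    incidence = np.exp(CubicSpline(knot_times, log_incidence_knots)(t_grid))
    expected = np.zeros(len(t_grid))
    for i in range(len(t_grid)):
        lags = np.arange(i + 1)                      # possible infection-to-diagnosis lags
        expected[i] = np.sum(incidence[i - lags] * diag_time_pmf[lags])
    return expected

def neg_log_likelihood(params, knot_times, t_grid, diag_time_pmf, observed):
    mu = expected_diagnoses(params, knot_times, t_grid, diag_time_pmf)
    return -np.sum(observed * np.log(mu + 1e-12) - mu)  # Poisson likelihood, up to a constant

# Fit the spline-parameterized incidence curve to observed diagnosis counts:
# minimize(neg_log_likelihood, x0, args=(knot_times, t_grid, diag_time_pmf, observed))
```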
Abstract:
Statistical methods are developed which assess survival data for two attributes: (1) prolongation of life and (2) quality of life. Health state transition probabilities correspond to prolongation of life and are modeled as a discrete-time semi-Markov process. Embedded within the sojourn time of a particular health state are the quality-of-life transitions. They reflect events which differentiate perceptions of pain and suffering over a fixed time period. Quality-of-life transition probabilities are derived from the assumptions of a simple Markov process. These probabilities depend on the health state currently occupied and the next health state to which a transition is made. Utilizing the two forms of attributes, the model can estimate the distribution of expected quality-adjusted life years (in addition to the distribution of expected survival times). The expected quality of life can also be estimated within the health state sojourn time, making the assessment of utility preferences more flexible. The methods are demonstrated on a subset of follow-up data from the Beta Blocker Heart Attack Trial (BHAT). This model contains the structure necessary to make inferences when assessing a general survival problem with a two-dimensional outcome.
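Schematically, the two levels of the model combine into an expected quality-adjusted survival of the form (a simplified rendering of the model's structure, not the paper's exact estimator)
\[
\mathrm{E}[\mathrm{QALY}] \;=\; \mathrm{E}\!\left[\,\sum_{k\ge 0} \sum_{t=1}^{T_k} q\!\left(Z_{k,t}\right)\,\Delta t \right],
\]
where $k$ indexes the successive health states visited by the discrete-time semi-Markov process, $T_k$ is the sojourn time in the $k$-th state, $Z_{k,t}$ is the quality-of-life state occupied at step $t$ of that sojourn (a Markov chain whose transition probabilities depend on the current health state and on the next health state to be entered), and $q(\cdot)$ maps quality states to utility weights.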
Abstract:
A high-resolution stratigraphy is essential for deciphering climate variability in detail and for understanding causality arguments of events in Earth history. Because the highly dynamic middle to late Eocene provides a suitable testing ground for carbon cycle models of a waning warm world, an accurate time scale is needed to decode climate-driving mechanisms. Here we present new results from ODP Site 1260 (Leg 207), which covers a uniquely expanded middle Eocene section (magnetochrons C18r to C20r, late Lutetian to early Bartonian) of the tropical western Atlantic, including the chron C19r transient hyperthermal event and the Middle Eocene Climate Optimum (MECO). To establish a detailed cyclostratigraphy we acquired distinctive iron intensity records by XRF scanning of Site 1260 cores. We revise the shipboard composite section, establish a cyclostratigraphy and use the exceptional eccentricity-modulated precession cycles for orbital tuning. The new astrochronology revises the ages of magnetic polarity chrons C19n to C20n, validates the position of very long eccentricity minima at 40.2 and 43.0 Ma in the orbital solutions, and extends the Astronomically Tuned Geological Time Scale back to 44 Ma. For the first time, the new data provide clear evidence for an orbital pacing of the chron C19r event and a likely involvement of the very long eccentricity cycle in the evolution of the MECO.
Abstract:
Secchi depth is a measure of water transparency. In the Baltic Sea region, Secchi depth maps are used to assess eutrophication and as input for habitat models. Because of their spatial and temporal coverage, satellite data would be the most suitable data source for such maps. But the Baltic Sea's optical properties are so different from those of the open ocean that globally calibrated standard models suffer from large errors. Regional predictive models that take the Baltic Sea's special optical properties into account are thus needed. This paper tests how accurately generalized linear models (GLMs) and generalized additive models (GAMs) with MODIS/Aqua and auxiliary data as inputs can predict Secchi depth at a regional scale. It uses cross-validation to test the prediction accuracy of hundreds of GAMs and GLMs with up to 5 input variables. A GAM with 3 input variables (chlorophyll a, remote sensing reflectance at 678 nm, and long-term mean salinity) made the most accurate predictions. Tested against field observations not used for model selection and calibration, the best model's mean absolute error (MAE) for daily predictions was 1.07 m (22%), more than 50% lower than that of other publicly available Baltic Sea Secchi depth maps. The MAE for predicting monthly averages was 0.86 m (15%). Thus, the proposed model selection process was able to find a regional model with good prediction accuracy. It could also be useful for finding predictive models for environmental variables other than Secchi depth, using data from other satellite sensors, and for other regions where non-standard remote sensing models are needed for prediction and mapping. Annual and monthly mean Secchi depth maps for 2003-2012 are provided with this paper as Supplementary materials.
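A cross-validated fit of a three-term GAM of the kind selected here (chlorophyll a, remote sensing reflectance at 678 nm, long-term mean salinity) could look like the sketch below; pyGAM and scikit-learn are assumed library choices and the variable layout is illustrative, not that of the original data set.

```python
# Hedged sketch: k-fold cross-validated GAM for Secchi depth
# (assumed libraries: pyGAM, scikit-learn, NumPy). Column order is illustrative.
import numpy as np
from pygam import LinearGAM, s
from sklearn.model_selection import KFold

def cv_mae(X, y, n_splits=5):
    """Mean absolute error of a three-smooth-term GAM under k-fold cross-validation.
    X columns: [chl_a, Rrs_678, mean_salinity]; y: in-situ Secchi depth (m)."""
    errors = []
    for train, test in KFold(n_splits=n_splits, shuffle=True, random_state=0).split(X):
        gam = LinearGAM(s(0) + s(1) + s(2)).fit(X[train], y[train])
        errors.append(np.mean(np.abs(gam.predict(X[test]) - y[test])))
    return float(np.mean(errors))
```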
Abstract:
Micro-crystalline barites recovered by deep-sea drilling from Site 684 on the Peru margin and Site 799 in the Japan Sea are highly enriched in the heavy sulfur isotope relative to seawater (δ34S up to +84‰). This isotopic composition is consistent with remobilization of biogenic barite triggered by sulfate reduction, and subsequent reprecipitation as a diagenetic barite front. The high levels of barium sulfate in these deposits (10-50%) cannot be explained by a diffusive transport model in sediments experiencing a constant rate of sedimentation. When sedimentation rates change radically, the barite front will remain at a given depth interval, leading to large accumulations of barium sulfate. Such conditions may have generated the barite deposits at Site 799. At Site 684, on the other hand, there is evidence that the barite deposits are a result of tectonically driven advection of sulfate-bearing fluids through the sediment column.
Abstract:
Distributed real-time embedded systems are becoming increasingly important to society. More demands will be made on them and greater reliance will be placed on the delivery of their services. A relevant subset of them is high-integrity or hard real-time systems, where failure can cause loss of life, environmental harm, or significant financial loss. Additionally, the evolution of communication networks and paradigms, as well as the need for greater processing power and fault tolerance, has motivated the interconnection of electronic devices; many of these networks can transfer data at high speed. The concept of distributed systems emerged to describe systems whose parts execute on several nodes that interact with each other via a communication network. Java's popularity, facilities and platform independence have made it an interesting language for the real-time and embedded community. This was the motivation for the development of the RTSJ (Real-Time Specification for Java), a language extension intended to allow the development of real-time systems. The use of Java in the development of high-integrity systems requires strict development and testing techniques. However, the RTSJ includes a number of language features that are forbidden in such systems. In the context of the HIJA project, the HRTJ (Hard Real-Time Java) profile was developed to define a robust subset of the language that is amenable to static analysis for high-integrity system certification. Currently, a specification is being developed under the Java Community Process (JSR-302); its purpose is to define the capabilities needed to create safety-critical applications with Java technology, called Safety Critical Java (SCJ). However, neither the RTSJ nor its profiles provide facilities to develop distributed real-time applications. This is an important issue, as most current and future systems will be distributed. The Distributed RTSJ (DRTSJ) Expert Group was created under the Java Community Process (JSR-50) in order to define appropriate abstractions to overcome this problem; currently there is no formal specification. The aim of this thesis is to develop a communication middleware that is suitable for the development of distributed hard real-time systems in Java, based on the integration between the RMI (Remote Method Invocation) model and the HRTJ profile. It has been designed and implemented keeping in mind the main requirements, such as predictability and reliability of the timing behavior and the resource usage. The design starts with the definition of a computational model which identifies, among other things, the communication model, the most appropriate underlying network protocols, the analysis model, and a subset of Java for hard real-time systems. In the design, remote references are the basic means for building distributed applications; they are associated with all the non-functional parameters and resources needed to implement synchronous or asynchronous remote invocations with real-time attributes. The proposed middleware separates resource allocation from the execution itself by defining two phases and a specific threading mechanism that guarantees suitable timing behavior. It also includes mechanisms to monitor the functional and timing behavior. It provides independence from the network protocol by defining a network interface and modules. The JRMP protocol was modified to include two phases, non-functional parameters, and message-size optimizations.
Although serialization is one of the fundamental operations to ensure proper data transmission, current implementations are not suitable for hard real-time systems and there are no alternatives. This thesis proposes a predictable serialization that introduces a new compiler to generate optimized code according to the computational model. The proposed solution has the advantage of allowing us to schedule the communications and to adjust the memory usage at compilation time. In order to validate the design and the implementation, a demanding validation process was carried out with emphasis on the functional behavior, the memory usage, the processor usage (the end-to-end response time and the response time in each functional block) and the network usage (real consumption compared with the calculated consumption). The results obtained in an industrial application developed by Thales Avionics (a Flight Management System) and in exhaustive tests show that the design and the prototype are reliable for industrial applications with strict timing requirements.
Distributed real-time embedded systems are increasingly important to society. Their demand is growing and we depend more and more on the services they provide. High-integrity systems constitute a subset of great importance; they are characterized by the fact that a failure in their operation can cause loss of human life, damage to the environment or considerable financial losses. The need to satisfy strict timing requirements makes their development more complex. As embedded systems continue to spread through our society, it is necessary to keep development costs under control by using appropriate techniques in their design, maintenance and certification. In particular, a flexible, hardware-independent technology is required. The evolution of communication networks and paradigms, as well as the need for greater processing power and fault tolerance, has motivated the interconnection of electronic devices. The communication mechanisms allow data to be transferred at high transmission speeds. In this context, the concept of a distributed system has emerged: systems whose components execute in parallel on several nodes and interact with each other through communication networks. An interesting concept is that of real-time systems that are neutral with respect to the execution platform, characterized by a lack of knowledge of that platform during their design. This property is relevant because such systems should run on the widest possible variety of architectures, have an average lifetime of more than ten years, and the place where they execute may change. The Java programming language is a good basis for the development of this type of system. For this reason the RTSJ (Real-Time Specification for Java) was created, a language extension intended to allow the development of real-time systems. However, the RTSJ does not provide facilities for the development of distributed real-time applications. This is an important limitation, given that most current and future systems will be distributed. The DRTSJ (Distributed RTSJ) group was created under the Java Community Process (JSR-50) in order to define abstractions that address this limitation, but at present no formal specification exists.
The aim of this thesis is to develop a communication middleware for the development of distributed real-time systems in Java, based on the integration between the RMI (Remote Method Invocation) model and the HRTJ profile. It has been designed and implemented taking into account the main requirements, such as predictability and reliability of the timing behavior and resource usage. The design starts from the definition of a computational model which identifies, among other things, the communication model, the most appropriate underlying network protocols, the analysis model, and a subset of Java for hard real-time systems. In the design, remote references are the basic means for building distributed applications; they are associated with all the non-functional parameters and resources needed to execute synchronous or asynchronous remote invocations with real-time attributes. The proposed middleware separates resource allocation from the execution itself by defining two phases and a specific threading mechanism that guarantees suitable timing behavior. Mechanisms to monitor the functional and timing behavior have also been included. Independence from the network protocol has been sought by defining a network interface and specific modules. The JRMP protocol has also been modified to include different phases, non-functional parameters and message-size optimizations. Although serialization is one of the fundamental operations to ensure proper data transmission, current implementations are not suitable for critical systems and there are no alternatives. This work proposes a predictable serialization that has involved the development of a new compiler for generating optimized code according to the computational model. The proposed solution has the advantage that, at compilation time, it allows us to schedule the communications and adjust the memory usage. In order to validate the design and the implementation, a demanding validation process has been carried out with emphasis on the functional behavior, the memory usage, the processor usage (end-to-end response time and the response time in each functional block) and the network usage (real consumption compared with the estimated consumption). The good results obtained in an industrial application developed by Thales Avionics (a flight management system) and in exhaustive tests have demonstrated that the design and the prototype are reliable for industrial applications with strict timing requirements.