963 results for continuous-time models


Relevance: 30.00%

Abstract:

Continuous positive airway pressure, aimed at preventing pulmonary atelectasis, has been used for decades to reduce lung injury in critically ill patients. In neonatal practice, it is increasingly used worldwide as a primary form of respiratory support due to its low cost and because it reduces the need for endotracheal intubation and conventional mechanical ventilation. We studied the anesthetized rat in vivo and determined the optimal circuit design for delivery of continuous positive airway pressure. We investigated the effects of continuous positive airway pressure following lipopolysaccharide administration in the anesthetized rat. Whereas neither continuous positive airway pressure nor lipopolysaccharide alone caused lung injury, continuous positive airway pressure applied following intravenous lipopolysaccharide resulted in increased microvascular permeability, elevated cytokine protein and mRNA production, and impaired static compliance. A dose-response relationship was demonstrated whereby higher levels of continuous positive airway pressure (up to 6 cmH2O) caused greater lung injury. Lung injury was attenuated by pretreatment with dexamethasone. These data demonstrate that despite optimal circuit design, continuous positive airway pressure causes significant lung injury (proportional to the airway pressure) in the setting of circulating lipopolysaccharide. Although we would currently avoid direct extrapolation of these findings to clinical practice, we believe that in the context of increasing clinical use, these data are grounds for concern and warrant further investigation.


The τ-function and the η-function are phenomenological models that are widely used in the context of timing interceptive actions and collision avoidance, respectively. Both models were previously considered to be unrelated to each other: τ is a decreasing function that provides an estimation of time-to-contact (ttc) in the early phase of an object approach; in contrast, η has a maximum before ttc. Furthermore, it is not clear how both functions could be implemented at the neuronal level in a biophysically plausible fashion. Here we propose a new framework, the corrected modified Tau function, capable of predicting both τ-type and η-type responses. The outstanding property of our new framework is its resilience to noise. We show that it can be derived from a firing rate equation and, like η, serves to describe the response curves of collision-sensitive neurons. Furthermore, we show that it predicts the psychophysical performance of subjects determining ttc. Our new framework is thus validated successfully against published and novel experimental data. Within the framework, links between τ-type and η-type neurons are established. It could therefore possibly serve as a model for explaining the co-occurrence of such neurons in the brain.
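For context, the two models are usually written as follows in the literature; these are the standard textbook definitions of Lee's τ and the η-function of collision-sensitive-neuron models, supplied here for the reader, not the abstract's corrected modified Tau function itself:

```latex
\tau(t) = \frac{\theta(t)}{\dot{\theta}(t)}, \qquad
\eta(t) = C\,\dot{\theta}(t)\,e^{-\alpha\,\theta(t)}
```

where \(\theta(t)\) is the visual angle subtended by the approaching object and \(C,\alpha\) are constants; \(\tau\) approximates ttc for small angles, while \(\eta\) peaks before contact.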


Over 70% of the total costs of an end product are consequences of decisions that are made during the design process. A search for optimal cross-sections will often have only a marginal effect on the amount of material used if the geometry of a structure is fixed and if the cross-sectional characteristics of its elements are properly designed by conventional methods. In recent years, optimal geometry has become a central area of research in the automated design of structures. It is generally accepted that no single optimisation algorithm is suitable for all engineering design problems. An appropriate algorithm must therefore be selected individually for each optimisation situation. Modelling is the most time-consuming phase in the optimisation of steel and metal structures. In this research, the goal was to develop a method and computer program which reduce the modelling and optimisation time for structural design. The program needed an optimisation algorithm that is suitable for various engineering design problems. Because finite element modelling is commonly used in the design of steel and metal structures, the interaction between a finite element tool and an optimisation tool needed a practical solution. The developed method and computer programs were tested with standard optimisation tests and practical design optimisation cases. Three generations of computer programs were developed. The programs combine an optimisation problem modelling tool and an FE-modelling program using three alternative methods. The modelling and optimisation were demonstrated in the design of a new boom construction and the steel structures of flat and ridge roofs. This thesis demonstrates that the modelling time, previously the most time-consuming phase, is significantly reduced. Modelling errors are reduced and the results are more reliable.
A new selection rule for the evolution algorithm, which eliminates the need for constraint weight factors, is tested with optimisation cases of steel structures that include hundreds of constraints. The tested algorithm can be used nearly as a black box, without parameter settings and penalty factors for the constraints.
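The abstract does not specify the selection rule; a well-known penalty-free scheme with the same property (no constraint weight factors) is Deb's feasibility rule, sketched here purely as an illustration. The dictionary layout and the toy constraint values are my own:

```python
def total_violation(g_values):
    # Sum of constraint violations; convention: g_i(x) <= 0 means satisfied.
    return sum(max(0.0, g) for g in g_values)

def select(a, b):
    """Deb's binary tournament: a feasible candidate beats an infeasible one;
    among feasible candidates the lower objective wins; among infeasible
    candidates the lower total violation wins. No penalty weights needed."""
    va, vb = total_violation(a["g"]), total_violation(b["g"])
    if va == 0.0 and vb == 0.0:
        return a if a["f"] <= b["f"] else b
    if va == 0.0:
        return a
    if vb == 0.0:
        return b
    return a if va <= vb else b

# Toy example: minimise f subject to one constraint g <= 0.
x = {"f": 3.0, "g": [0.0]}   # feasible, worse objective
y = {"f": 1.0, "g": [2.5]}   # infeasible, better objective
print(select(x, y)["f"])     # the feasible candidate wins -> 3.0
```

Because feasibility always dominates, no weighting between objective and constraints ever has to be tuned, which is what lets such an algorithm run "as a black box".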


Background: During the last part of the 1990s the chance of surviving breast cancer increased. Changes in survival functions reflect a mixture of effects: both the introduction of adjuvant treatments and early screening with mammography played a role in the decline in mortality. Evaluating the contribution of these interventions using mathematical models requires survival functions before and after their introduction. Furthermore, the required survival functions may differ by age group and are related to disease stage at diagnosis. Sometimes detailed information is not available, as was the case for the region of Catalonia (Spain). One may then derive the functions using information from other geographical areas. This work presents the methodology used to estimate age- and stage-specific Catalan breast cancer survival functions from scarce Catalan survival data by adapting the age- and stage-specific US functions. Methods: Cubic splines were used to smooth the data and obtain continuous hazard rate functions. Afterwards, we fitted a Poisson model to derive hazard ratios; the model included time as a covariate. The hazard ratios were then applied to US survival functions detailed by age and stage to obtain the Catalan estimations. Results: We started by estimating the hazard ratios for Catalonia versus the USA before and after the introduction of screening. The hazard ratios were then multiplied by the age- and stage-specific breast cancer hazard rates from the USA to obtain the Catalan hazard rates. We also compared breast cancer survival in Catalonia and the USA in two time periods, before cancer control interventions (USA 1975–79, Catalonia 1980–89) and after (USA and Catalonia 1990–2001). Survival in Catalonia in the 1980–89 period was worse than in the USA during 1975–79, but the differences disappeared in 1990–2001. Conclusion: Our results suggest that access to better treatments and quality of care contributed to large improvements in survival in Catalonia.
In addition, we obtained detailed breast cancer survival functions that will be used for modeling the effect of screening and adjuvant treatments in Catalonia.
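Under a proportional-hazards view, applying a hazard ratio to a reference hazard is equivalent to raising the reference survival function to the power HR. A minimal sketch of that mapping follows; the HR value and the survival curve are illustrative, not the paper's estimates:

```python
# Under proportional hazards, h_cat(t) = HR * h_us(t) implies
# S_cat(t) = exp(-HR * H_us(t)) = S_us(t) ** HR.
def catalan_survival(s_us, hr):
    return [s ** hr for s in s_us]

s_us = [1.0, 0.95, 0.88, 0.80]   # hypothetical US survival by year since diagnosis
hr = 1.3                          # hypothetical Catalonia-vs-US hazard ratio
s_cat = catalan_survival(s_us, hr)
print(s_cat[1])                   # below 0.95: HR > 1 means worse survival
```

With an age- and stage-specific HR, the same one-liner transfers each detailed US curve to the target region, which is essentially the adaptation step the abstract describes.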


The paper is motivated by the valuation problem of guaranteed minimum death benefits in various equity-linked products. At the time of death, a benefit payment is due. It may depend not only on the price of a stock or stock fund at that time, but also on prior prices. The problem is to calculate the expected discounted value of the benefit payment. Because the distribution of the time of death can be approximated by a combination of exponential distributions, it suffices to solve the problem for an exponentially distributed time of death. The stock price process is assumed to be the exponential of a Brownian motion plus an independent compound Poisson process whose upward and downward jumps are modeled by combinations (or mixtures) of exponential distributions. Results for exponential stopping of a Lévy process are used to derive a series of closed-form formulas for call, put, lookback, and barrier options, dynamic fund protection, and dynamic withdrawal benefit with guarantee. We also discuss how barrier options can be used to model lapses and surrenders.
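As a toy illustration of the valuation setup (not the paper's closed-form results), the expected discounted benefit with an exponentially distributed time of death can be approximated by Monte Carlo. Assumptions here: plain geometric Brownian motion with no jump component, a single exponential death time rather than a combination of exponentials, and made-up parameter values:

```python
import math, random

def gmdb_value(s0, mu, sigma, lam, delta, n=200_000, seed=1):
    """Monte Carlo estimate of E[exp(-delta*T) * max(K, S_T)] with
    T ~ Exp(lam) (time of death) and S a geometric Brownian motion;
    the guaranteed minimum K is set to s0 here."""
    rng = random.Random(seed)
    k = s0
    total = 0.0
    for _ in range(n):
        t = rng.expovariate(lam)                       # time of death
        z = rng.gauss(0.0, 1.0)
        st = s0 * math.exp((mu - 0.5 * sigma ** 2) * t + sigma * math.sqrt(t) * z)
        total += math.exp(-delta * t) * max(k, st)     # discounted benefit
    return total / n

v = gmdb_value(100.0, 0.04, 0.2, lam=0.05, delta=0.04)
print(v)   # at least K*lam/(lam+delta): the guarantee alone is worth ~55.6
```

The closed-form results in the paper replace this simulation entirely; the point of the exponential death time is precisely that such expectations become analytically tractable.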


This work proposes the development of an embedded real-time fruit detection system for future automatic fruit harvesting. The proposed embedded system is based on an ARM Cortex-M4 (STM32F407VGT6) processor and an Omnivision OV7670 color camera. The future goal of this embedded vision system will be to control a robotized arm to automatically select and pick some fruit directly from the tree. The complete embedded system has been designed to be placed directly in the gripper tool of the future robotized harvesting arm. The embedded system will be able to perform real-time fruit detection and tracking by using a three-dimensional look-up table (LUT) defined in the RGB color space and optimized for fruit picking. Additionally, two different methodologies for creating optimized 3D LUTs based on existing linear color models and fruit histograms were implemented in this work and compared for the case of red peaches. The resulting system is able to acquire general and zoomed orchard images and to update the relative tracking information of a red peach in the tree ten times per second.
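The 3D LUT approach described above can be sketched as follows. The linear color rule used to fill the table (red dominating green and blue by a fixed margin) and the table size are illustrative stand-ins for the paper's fruit-specific models:

```python
import numpy as np

BITS = 5                # 5 bits per channel -> a 32x32x32 LUT
SIZE = 1 << BITS

def build_lut(margin=20):
    """Precompute a fruit/background label for every quantized RGB cell,
    evaluating the color rule once per cell instead of once per pixel."""
    lut = np.zeros((SIZE, SIZE, SIZE), dtype=np.uint8)
    centers = (np.arange(SIZE) << (8 - BITS)) + (1 << (8 - BITS - 1))  # bin centers
    r, g, b = np.meshgrid(centers, centers, centers, indexing="ij")
    lut[(r > g + margin) & (r > b + margin)] = 1   # illustrative "red peach" rule
    return lut

def classify(image, lut):
    """Label each pixel with a single table fetch (no per-pixel arithmetic),
    which is what makes the method cheap enough for a Cortex-M4."""
    q = image >> (8 - BITS)
    return lut[q[..., 0], q[..., 1], q[..., 2]]

lut = build_lut()
img = np.array([[[200, 40, 30], [40, 200, 30]]], dtype=np.uint8)  # red, green
print(classify(img, lut))   # [[1 0]]
```

On the actual microcontroller the LUT would live in flash as a flat byte array; the histogram-based variant mentioned in the abstract would simply fill the same table from training-image statistics instead of a linear rule.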


Heart transplantation (HTx) is the treatment of choice for end-stage heart failure, but the limited availability of donor hearts still represents a major issue. Long-term mechanical circulatory support (MCS) has therefore been proposed as an alternative treatment option to assist patients on the HTx waiting list, bridging them for a variable time period to cardiac transplantation, the so-called bridge-to-transplantation (BTT) strategy. Nowadays, approximately 90% of patients being considered for MCS receive a left ventricular assist device (LVAD). In fact, LVADs have undergone several improvements in the last decade, and the predominance of continuous-flow over pulsatile-flow technology has been evident since 2008. The aim of the present report is to give an overview of continuous-flow LVAD utilization in the specific setting of the BTT strategy, taking into consideration the most representative articles in the scientific literature and focusing on the evolution, clinical outcomes, relevant implications for the HTx strategy, and future perspectives of continuous-flow LVAD technology.


Cooperation and coordination are desirable behaviors that are fundamental for the harmonious development of society. People need to rely on cooperation with other individuals in many aspects of everyday life, such as teamwork and economic exchange in anonymous markets. However, cooperation may easily fall prey to exploitation by selfish individuals who only care about short-term gain. For cooperation to evolve, specific conditions and mechanisms are required, such as kinship, direct and indirect reciprocity through repeated interactions, or external interventions such as punishment. In this dissertation we investigate the effect of the network structure of the population on the evolution of cooperation and coordination. We consider several kinds of static and dynamical network topologies, such as Barabási-Albert networks, social network models and spatial networks. We perform numerical simulations and laboratory experiments using the Prisoner's Dilemma and coordination games in order to contrast human behavior with theoretical results. We show by numerical simulations that even a moderate amount of random noise on the Barabási-Albert scale-free network links causes a significant loss of cooperation, to the point that cooperation almost vanishes altogether in the Prisoner's Dilemma when the noise rate is high enough. Moreover, when we consider fixed social-like networks we find that current models of social networks may allow cooperation to emerge and to be robust at least as much as in scale-free networks. In the framework of spatial networks, we investigate whether cooperation can evolve and be stable when agents move randomly or perform Lévy flights in a continuous space. We also consider discrete space, adopting purposeful mobility and a binary birth-death process to discover emergent cooperative patterns. The fundamental result is that cooperation may be enhanced when this migration is opportunistic or even when agents follow very simple heuristics.
In the experimental laboratory, we investigate the issue of social coordination between individuals located on networks of contacts. In contrast to simulations, we find that human players' dynamics do not converge to the efficient outcome more often in a social-like network than in a random network. In another experiment, we study the behavior of people who play a pure coordination game in a spatial environment in which they can move around and where changing convention is costly. We find that each convention forms homogeneous clusters and is adopted by approximately half of the individuals. When we provide them with global information, i.e., the number of subjects currently adopting one of the conventions, global consensus is reached in most, but not all, cases. Our results allow us to extract the heuristics used by the participants and to build a numerical simulation model that agrees very well with the experiments. Our findings have important implications for policymakers intending to promote specific, desired behaviors in a mobile population. Furthermore, we carry out an experiment with human subjects playing the Prisoner's Dilemma game in a diluted grid where people are able to move around. In contrast to previous results on purposeful rewiring in relational networks, we find no noticeable effect of mobility in space on the level of cooperation. Clusters of cooperators form momentarily but within a few rounds they dissolve, as cooperators at the boundaries stop tolerating being cheated upon. Our results highlight the difficulties that mobile agents have in establishing a cooperative environment in a spatial setting without a device such as reputation or the possibility of retaliation, i.e., punishment. Finally, we test experimentally the evolution of cooperation in social networks in a setting where we allow people to make or break links at will.
In this work we give particular attention to whether information on an individual's actions is freely available to potential partners or not. Studying the role of information is relevant, as information on other people's actions is often not available for free: a recruiting firm may need to call a job candidate's references, a bank may need to find out about the credit history of a new client, etc. We find that people cooperate almost fully when information on their actions is freely available to their potential partners. Cooperation is less likely, however, if people have to pay about half of what they gain from cooperating with a cooperator. Cooperation declines even further if people have to pay a cost that is almost equivalent to the gain from cooperating with a cooperator. Thus, costly information on potential neighbors' actions can undermine the incentive to cooperate in dynamical networks.
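The first simulation setting described above, the Prisoner's Dilemma on a Barabási-Albert network, can be sketched roughly as follows. This is a simplified stand-in for the dissertation's models: preferential attachment is approximated in pure Python, payoffs use the weak Prisoner's Dilemma, and the noise here perturbs strategies rather than links:

```python
import random

def ba_graph(n, m, rng):
    """Rough Barabási-Albert preferential attachment (self-contained, so we
    do not depend on a graph library; duplicate sampled targets simply
    yield fewer than m links for some nodes)."""
    adj = {i: set() for i in range(n)}
    targets = list(range(m))
    repeated = []                      # nodes repeated in proportion to degree
    for v in range(m, n):
        for t in set(targets):
            adj[v].add(t)
            adj[t].add(v)
        repeated.extend(set(targets))
        repeated.extend([v] * m)
        targets = [rng.choice(repeated) for _ in range(m)]
    return adj

def payoff(coop, neighbors, strat, b):
    # Weak Prisoner's Dilemma: R=1 for mutual cooperation, T=b>1 for
    # defecting against a cooperator, S=P=0 otherwise.
    return sum((1.0 if coop else b) for u in neighbors if strat[u])

def simulate(n=300, m=4, b=1.4, noise=0.0, rounds=50, seed=3):
    """Synchronous imitate-the-best dynamics; with probability `noise`
    a player adopts a random strategy instead (a simplification of the
    link noise studied in the dissertation)."""
    rng = random.Random(seed)
    adj = ba_graph(n, m, rng)
    strat = {v: rng.random() < 0.5 for v in range(n)}
    for _ in range(rounds):
        score = {v: payoff(strat[v], adj[v], strat, b) for v in range(n)}
        new = {}
        for v in range(n):
            best = max(list(adj[v]) + [v], key=lambda u: score[u])
            new[v] = strat[best] if rng.random() >= noise else rng.random() < 0.5
        strat = new
    return sum(strat.values()) / n

print(simulate())            # fraction of cooperators, noise-free
print(simulate(noise=0.3))   # fraction of cooperators with strategy noise
```

In heterogeneous networks, accumulated (unnormalized) payoffs let high-degree cooperator hubs dominate imitation, which is the standard mechanism behind network reciprocity that noise disrupts.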


The purpose of this Master's thesis was to determine the fibre curl profile along the birch and softwood pulp lines with respect to the process stages. Based on this profile, the aim was to identify the factors in the fibre-line processes that most strongly affect fibre curl. The literature part of the thesis covered fibre properties, the fibre damage caused by the industrial pulping process, and the possible causes of such damage. In addition, methods for analysing fibre damage and pulp quality values correlating with fibre damage were reviewed. The experimental part was carried out alongside the normal production of the fibre lines. Pulp samples for the experimental part were taken from a wide variety of production situations in order to obtain as comprehensive a curl profile as possible. On the birch pulp line, the production rate of the continuous digester strongly affected the properties of the pulp produced. The factor most strongly affecting the formation of fibre damage turned out to be the consistency in the digester bottom zone, the blow, and the pulp washing stage. The consistency in this region changed with the production rate: as the production rate increased, the consistency decreased. The decrease in consistency reduced fibre curl and pulp stretch and improved tensile stiffness. An increase in the production rate also improved the tensile stiffness potential of the pulp samples taken from the digester. On the softwood pulp line, quite significant quality differences were found between the Lo-Solids cooking model and the conventional cooking model. Pulp fibres cooked with the Lo-Solids model curled more than fibres cooked with the conventional model, and the tensile stiffness of pulp cooked with the Lo-Solids model was lower than that of pulp cooked with the conventional model.


Maximum entropy modeling (Maxent) is a widely used algorithm for predicting species distributions across space and time. Properly assessing the uncertainty in such predictions is non-trivial and requires validation with independent datasets. Notably, model complexity (number of model parameters) remains a major concern in relation to overfitting and, hence, transferability of Maxent models. An emerging approach is to validate the cross-temporal transferability of model predictions using paleoecological data. In this study, we assess the effect of model complexity on the performance of Maxent projections across time using two European plant species (Alnus glutinosa (L.) Gaertn. and Corylus avellana L.) with an extensive late Quaternary fossil record in Spain as a case study. We fitted 110 models with different levels of complexity under present-time conditions and tested model performance using AUC (area under the receiver operating characteristic curve) and AICc (corrected Akaike Information Criterion) through the standard procedure of randomly partitioning current occurrence data. We then compared these results to an independent validation by projecting the models to mid-Holocene (6000 years before present) climatic conditions in Spain to assess their ability to predict fossil pollen presence-absence and abundance. We find that calibrating Maxent models with default settings results in the generation of overly complex models. While model performance increased with model complexity when predicting current distributions, it was highest at intermediate complexity when predicting mid-Holocene distributions. Hence, models of intermediate complexity offered the best trade-off for predicting species distributions across time. Reliable temporal model transferability is especially relevant for forecasting species distributions under future climate change.
Consequently, species-specific model tuning should be used to find the best modeling settings to control for complexity, notably with paleoecological data to independently validate model projections. For cross-temporal projections of species distributions for which paleoecological data are not available, models of intermediate complexity should be selected.
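The AICc criterion used above penalizes the parameter count k increasingly strongly as it approaches the sample size n, which is what pushes selection toward intermediate complexity. A minimal sketch, with made-up log-likelihood values for three hypothetical fits:

```python
def aicc(log_lik, k, n):
    """Corrected Akaike Information Criterion; lower is better.
    The correction term blows up as k approaches n, punishing
    overly complex models fitted to few occurrences."""
    aic = 2 * k - 2 * log_lik
    return aic + (2 * k * (k + 1)) / (n - k - 1)

# Hypothetical fits of increasing complexity on n = 60 occurrence records:
candidates = [
    {"name": "simple",       "log_lik": -110.0, "k": 4},
    {"name": "intermediate", "log_lik": -101.0, "k": 10},
    {"name": "complex",      "log_lik": -99.5,  "k": 25},
]
for c in candidates:
    c["aicc"] = aicc(c["log_lik"], c["k"], n=60)
best = min(candidates, key=lambda c: c["aicc"])
print(best["name"])   # intermediate
```

Note how the complex model has the best raw likelihood yet loses on AICc, mirroring the paper's finding that default (complex) Maxent settings transfer poorly across time.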


In this paper we describe a new Mueller matrix (MM) microscope that generalizes and makes quantitative the polarized light microscopy technique. In this instrument all the elements of the MM are simultaneously determined from the analysis in the frequency domain of the time-dependent intensity of the light beam at every pixel of the camera. The variations in intensity are created by two compensators continuously rotating at different angular frequencies. A typical measurement is completed in a little over one minute, and it can be applied at any visible wavelength. Some examples are presented to demonstrate the capabilities of the instrument.
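The frequency-domain analysis mentioned above amounts to recovering Fourier coefficients of the detected intensity at harmonics of the compensator rotation frequencies. A sketch of that recovery step follows; the harmonic content and amplitudes are invented for illustration, and the instrument-specific mapping from coefficients to MM elements is omitted:

```python
import math

def fourier_coeffs(samples, dt, omega, k):
    """Recover (a_k, b_k) of cos(k*omega*t) and sin(k*omega*t) by discrete
    projection over an integer number of periods (a per-pixel lock-in)."""
    n = len(samples)
    a = 2.0 / n * sum(s * math.cos(k * omega * i * dt) for i, s in enumerate(samples))
    b = 2.0 / n * sum(s * math.sin(k * omega * i * dt) for i, s in enumerate(samples))
    return a, b

# Simulated detector signal with harmonics at 2w and 3w, standing in for the
# modulation produced by two compensators rotating at different rates.
w = 2 * math.pi             # base angular frequency: one turn per second
dt, n = 0.001, 2000         # 1 kHz sampling over exactly two base periods
signal = [5.0 + 1.2 * math.cos(2 * w * i * dt) - 0.7 * math.sin(3 * w * i * dt)
          for i in range(n)]
a2, _ = fourier_coeffs(signal, dt, w, 2)
_, b3 = fourier_coeffs(signal, dt, w, 3)
print(round(a2, 3), round(b3, 3))   # 1.2 -0.7
```

Because the projection is a short sum per pixel, the full set of harmonic coefficients (and hence all 16 MM elements) can be extracted for every camera pixel from one time series, which is what makes the one-minute full-image measurement feasible.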


This work presents new, efficient Markov chain Monte Carlo (MCMC) simulation methods for statistical analysis in various modelling applications. When using MCMC methods, the model is simulated repeatedly to explore the probability distribution describing the uncertainties in model parameters and predictions. In adaptive MCMC methods based on the Metropolis-Hastings algorithm, the proposal distribution needed by the algorithm learns from the target distribution as the simulation proceeds. Adaptive MCMC methods have been the subject of intensive research lately, as they open the way to essentially easier use of the methodology; the lack of user-friendly computer programs has been a main obstacle to wider acceptance of the methods. This work provides two new adaptive MCMC methods: DRAM and AARJ. The DRAM method has been built especially to work in high-dimensional and non-linear problems. The AARJ method is an extension of DRAM for model selection problems, where the mathematical formulation of the model is uncertain and we want to fit several different models to the same observations simultaneously. The methods were developed while keeping in mind the needs of modelling applications typical in environmental sciences, and the development work was pursued while working on several application projects. The applications presented in this work are: a winter-time oxygen concentration model for Lake Tuusulanjärvi and adaptive control of the aerator; a nutrition model for Lake Pyhäjärvi and lake management planning; and validation of the algorithms of the GOMOS ozone remote sensing instrument on board the Envisat satellite of the European Space Agency, together with a study of the effects of aerosol model selection on the GOMOS algorithm.
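The core adaptation idea, letting the proposal learn its scale from the chain itself, can be sketched in one dimension. This is a plain adaptive-Metropolis sketch, not DRAM itself: delayed rejection and the full covariance update are omitted, and all names are my own:

```python
import math, random

def adaptive_metropolis(log_target, x0, n=20000, seed=7):
    """1-D Metropolis sampler whose proposal scale adapts to the running
    sample variance of the chain (the core idea behind adaptive MCMC;
    DRAM additionally retries rejected moves with smaller proposals)."""
    rng = random.Random(seed)
    x, lp = x0, log_target(x0)
    mean, m2, count = 0.0, 0.0, 0
    chain = []
    for _ in range(n):
        # 2.4 * sd is the classic near-optimal scaling in one dimension;
        # use a fixed scale until enough history has accumulated.
        scale = 2.4 * math.sqrt(m2 / count) if count > 100 else 1.0
        prop = x + rng.gauss(0.0, scale)
        lp_prop = log_target(prop)
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            x, lp = prop, lp_prop
        count += 1                       # Welford's online mean/variance update
        d = x - mean
        mean += d / count
        m2 += d * (x - mean)
        chain.append(x)
    return chain

# Target: a standard normal shifted to mean 3; the sampler should find it
# without any manual tuning of the proposal width.
chain = adaptive_metropolis(lambda t: -0.5 * (t - 3.0) ** 2, x0=0.0)
est = sum(chain[5000:]) / len(chain[5000:])
print(est)   # should settle near 3
```

The practical appeal noted in the abstract is visible even in this toy: the user supplies only the log-target, and the proposal tunes itself as the simulation proceeds.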


Alterations in the hepatic lipid content (HLC) and fatty acid composition are associated with disruptions in whole body metabolism, both in humans and in rodent models, and can be non-invasively assessed by (1)H-MRS in vivo. We used (1)H-MRS to characterize the hepatic fatty-acyl chains of healthy mice and to follow changes caused by streptozotocin (STZ) injection. Using STEAM at 14.1 T with an ultra-short TE of 2.8 ms, confounding effects from T2 relaxation and J-coupling were avoided, allowing for accurate estimations of the contribution of unsaturated (UFA), saturated (SFA), mono-unsaturated (MUFA) and poly-unsaturated (PUFA) fatty-acyl chains, number of double bonds, PU bonds and mean chain length. Compared with in vivo (1)H-MRS, high resolution NMR performed in vitro in hepatic lipid extracts reported longer fatty-acyl chains (18 versus 15 carbons) with a lower contribution from UFA (61 ± 1% versus 80 ± 5%) but a higher number of PU bonds per UFA (1.39 ± 0.03 versus 0.58 ± 0.08), driven by the presence of membrane species in the extracts. STZ injection caused a decrease of HLC (from 1.7 ± 0.3% to 0.7 ± 0.1%), an increase in the contribution of SFA (from 21 ± 2% to 45 ± 6%) and a reduction of the mean length (from 15 to 13 carbons) of cytosolic fatty-acyl chains. In addition, SFAs were also likely to have increased in membrane lipids of STZ-induced diabetic mice, along with a decrease of the mean chain length. These studies show the applicability of (1)H-MRS in vivo to monitor changes in the composition of the hepatic fatty-acyl chains in mice even when they exhibit reduced HLC, pointing to the value of this methodology to evaluate lipid-lowering interventions in the scope of metabolic disorders.


BACKGROUND: The heart relies on continuous energy production, and imbalances herein impair cardiac function directly. The tricarboxylic acid (TCA) cycle is the primary means of energy generation in the healthy myocardium, but direct noninvasive quantification of metabolic fluxes is challenging due to the low concentration of most metabolites. Hyperpolarized (13)C magnetic resonance spectroscopy (MRS) provides the opportunity to measure cellular metabolism in real time in vivo. The aim of this work was to noninvasively measure myocardial TCA cycle flux (VTCA) in vivo within a single minute. METHODS AND RESULTS: Hyperpolarized [1-(13)C]acetate was administered at different concentrations in healthy rats. (13)C incorporation into [1-(13)C]acetylcarnitine and the TCA cycle intermediate [5-(13)C]citrate was dynamically detected in vivo with a time resolution of 3 s. Different kinetic models were established and evaluated to determine the metabolic fluxes by simultaneously fitting the evolution of the (13)C labeling in acetate, acetylcarnitine, and citrate. VTCA was estimated to be 6.7 ± 1.7 μmol·g(-1)·min(-1) (dry weight), and was best estimated with a model using only the labeling in citrate and acetylcarnitine, independent of the precursor. The TCA cycle rate was not linear with the citrate-to-acetate metabolite ratio, and could thus not be quantified using a ratiometric approach. The (13)C signal evolution of citrate, i.e. the formation of citrate, was independent of the amount of injected acetate, while the (13)C signal evolution of acetylcarnitine revealed a dose dependency on the injected acetate. The (13)C labeling of citrate did not correlate with that of acetylcarnitine, leading to the hypothesis that acetylcarnitine formation is not an indication of mitochondrial TCA cycle activity in the heart. CONCLUSIONS: Hyperpolarized [1-(13)C]acetate is a metabolic probe independent of pyruvate dehydrogenase (PDH) activity.
It allows the direct estimation of VTCA in vivo, which was shown to be neither dependent on the administered acetate dose nor on the (13)C labeling of acetylcarnitine. Dynamic (13)C MRS coupled to the injection of hyperpolarized [1-(13)C]acetate can enable the measurement of metabolic changes during impaired heart function.
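The kind of kinetic fitting described, matching a precursor-product model to (13)C label time courses, can be caricatured as follows. The toy model (a mono-exponentially decaying precursor feeding one product pool) and every rate constant are invented and far simpler than the models evaluated in the paper:

```python
import math

def simulate_citrate(k, decay, t_end=60.0, dt=0.1):
    """Euler integration of a toy precursor-product label model:
    precursor labeling a(t) = exp(-t/20), and dC/dt = k*a(t) - decay*C,
    standing in for label flow from acetate into citrate."""
    c, out, t = 0.0, [], 0.0
    while t < t_end:
        a = math.exp(-t / 20.0)
        c += dt * (k * a - decay * c)
        out.append(c)
        t += dt
    return out

# Build a synthetic "measured" curve with a known rate constant,
# then recover it by least-squares over a grid of candidate rates.
true_k, decay = 0.15, 0.05
data = simulate_citrate(true_k, decay)
best_k = min((round(0.01 * i, 2) for i in range(1, 51)),
             key=lambda k: sum((m - d) ** 2
                               for m, d in zip(simulate_citrate(k, decay), data)))
print(best_k)   # recovers 0.15
```

A real analysis would fit several pools simultaneously (acetate, acetylcarnitine, citrate) with a proper optimizer and account for polarization decay, but the structure, forward-simulate then minimize the misfit to the dynamic (13)C curves, is the same.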


INTRODUCTION: Time to fitness for work (TFW) was measured as the number of days that were paid as compensation for work disability during the 4 years after discharge from the rehabilitation clinic in a population of patients hospitalised for rehabilitation after orthopaedic trauma. The aim of this study was to test whether some psychological variables can be used as potential early prognostic factors of TFW. MATERIAL AND METHODS: A Cox proportional hazards model was used to estimate the associations between predictive variables and TFW. Predictors were global health, pain at hospitalisation and pain decrease during the stay (all continuous and standardised by subtracting the mean and dividing by two standard deviations), perceived severity of the trauma and expectation of a positive evolution (both binary variables). RESULTS: Full data were available for 807 inpatients (660 men, 147 women). TFW was positively associated with better perceived health (hazard ratio [HR] 1.16, 95% confidence interval [CI] 1.13-1.19), pain decrease (HR 1.46, 95% CI 1.30-1.64) and expectation of a positive evolution (HR 1.50, 95% CI 1.32-1.70) and negatively associated with pain at hospitalisation (HR 0.67, 95% CI 0.59-0.76) and high perceived severity (HR 0.72, 95% CI 0.61-0.85). DISCUSSION: The present results provide some evidence that work disability during a four-year period after rehabilitation may be predicted by prerehabilitation perceptions of general health, pain, injury severity, as well as positive expectation of evolution.
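The "standardised by subtracting the mean and dividing by two standard deviations" step mentioned in the methods follows Gelman's convention, which makes the coefficients of continuous predictors roughly comparable to those of binary ones. A minimal sketch with made-up pain scores:

```python
import math

def standardize_2sd(values):
    """Center on the mean and divide by two sample standard deviations,
    so that a one-unit change spans roughly 'low' to 'high' values,
    comparable to the 0-to-1 change of a binary predictor."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return [(v - mean) / (2 * sd) for v in values]

pain = [2, 5, 7, 4, 6, 3, 8, 5]   # hypothetical 0-10 pain scores at admission
z = standardize_2sd(pain)
print(round(sum(z), 10))           # 0.0: the scores are exactly centered
```

After this rescaling, a Cox hazard ratio reported per unit of the standardized predictor (as in the abstract) describes the effect of moving across two standard deviations of the raw scale.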