993 results for CONTACT APPLICATIONS


Relevance: 20.00%

Abstract:

Carcinoma-associated fibroblasts have been reported to promote colorectal cancer (CRC) invasion by secreting motility factors and extracellular matrix-processing enzymes. Less is known about whether fibroblasts can induce CRC cell motility through contact-dependent mechanisms. To address this question, we characterized the interaction between fibroblasts and the SW620 and HT29 colorectal cancer cell lines in 2D and 3D co-culture models in vitro. Here we show that fibroblasts induce contact-dependent cancer cell elongation, motility and invasiveness independently of deposited matrix or secreted factors. These effects depend on fibroblast cell surface-associated fibroblast growth factor (FGF)-2: inhibition of FGF-2 or of FGF receptor (FGFR) signaling abolishes them. FGFRs activate SRC in cancer cells, and inhibition or silencing of SRC in cancer cells, but not in fibroblasts, prevents the fibroblast-mediated effects. Using an RGD-based integrin antagonist and function-blocking antibodies, we demonstrate that cancer cell adhesion to fibroblasts requires integrin αvβ5. Taken together, these results demonstrate that fibroblasts induce cell-contact-dependent colorectal cancer cell migration and invasion under 2D and 3D conditions in vitro through fibroblast cell surface-associated FGF-2, FGFR-mediated SRC activation and αvβ5 integrin-dependent cancer cell adhesion to fibroblasts. The FGF-2-FGFR-SRC-αvβ5 integrin loop might be explored as a candidate therapeutic target to block colorectal cancer invasion.

Relevance: 20.00%

Abstract:

The number of Internet services grows continuously, and a person typically has one digital identity in each service they use. Secure storage of authentication credentials becomes ever more difficult as a new set accumulates with every service registration. This Master's thesis examines the problem and its solutions from both a service-oriented and a technical perspective. The business concept of service-oriented identity management and its implementation technologies, such as single sign-on (SSO) and the Security Assertion Markup Language (SAML), are reviewed through high-level examples, together with the concept and technical details of the solution produced in the Nokia Account project. Finally, the implementation of the first version of the Nokia Account service is analysed against the design principles and requirements of identity management services.

Relevance: 20.00%

Abstract:

This thesis analyses the calculations performed by the FanSave and PumpSave energy-saving tools. These programs compare the energy consumption of variable speed drive control of fans and pumps with that of other control methods: FanSave covers centrifugal and axial fans, while PumpSave deals with centrifugal pumps. The programs can also select a suitable frequency converter from the ABB range. As initial values, they need information about the equipment, such as the flow rate and efficiencies; the operating time, given as an annual duration and a flow profile, is an important factor in calculating the annual energy consumption. The basic theory of fans and pumps is introduced, without detailed dimensioning instructions. The flow control methods available in FanSave and PumpSave are presented in terms of their operating principles and suitability, and the squirrel-cage motor and the frequency converter are also introduced because of their close connection to fan and pump drives. The second part of the thesis compares the results of FanSave's and PumpSave's calculations with calculations based on performance curves. Laboratory tests were also made with a centrifugal and an axial fan, and with a centrifugal pump. The results of this thesis allow the calculations of these programs to be made more accurate and some new features to be added.
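The kind of comparison such tools perform can be sketched with the standard affinity laws. This is a minimal illustration, not FanSave's or PumpSave's actual model: the cubic speed-power law is standard, but the linearized throttle-control curve used below is an assumed simplification.

```python
# Toy affinity-law comparison of fan/pump control methods
# (assumed simplified models, not the tools' actual formulas).

def vsd_shaft_power(p_nom_kw, q_ratio):
    """Variable speed drive control: by the affinity laws, flow scales
    with speed (Q ~ n) and shaft power with its cube (P ~ n^3)."""
    return p_nom_kw * q_ratio ** 3

def throttle_shaft_power(p_nom_kw, q_ratio):
    """Throttle (damper/valve) control: power falls only mildly with
    flow; a linearized approximation P ~ P_nom * (0.3 + 0.7 * Q/Q_nom)
    is assumed here for illustration."""
    return p_nom_kw * (0.3 + 0.7 * q_ratio)

def annual_energy_kwh(power_fn, p_nom_kw, profile, hours_per_year=8760):
    """Annual energy from an operating profile given as
    (flow_ratio, fraction_of_operating_time) pairs."""
    return sum(power_fn(p_nom_kw, q) * frac * hours_per_year
               for q, frac in profile)

# Example duty profile: 20% of time at full flow, 50% at 70%, 30% at half flow.
profile = [(1.0, 0.2), (0.7, 0.5), (0.5, 0.3)]
e_vsd = annual_energy_kwh(vsd_shaft_power, 10.0, profile)
e_throttle = annual_energy_kwh(throttle_shaft_power, 10.0, profile)
```

At partial flow the cubic law makes the speed-controlled drive far cheaper to run; this gap is exactly what the tools quantify from the duty profile.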

Relevance: 20.00%

Abstract:

This work presents new, efficient Markov chain Monte Carlo (MCMC) simulation methods for statistical analysis in various modelling applications. When using MCMC methods, the model is simulated repeatedly to explore the probability distribution describing the uncertainties in model parameters and predictions. In adaptive MCMC methods based on the Metropolis-Hastings algorithm, the proposal distribution needed by the algorithm learns from the target distribution as the simulation proceeds. Adaptive MCMC methods have lately been the subject of intensive research, as they make the methodology essentially easier to use; the lack of user-friendly computer programs has been a main obstacle to its wider acceptance. This work provides two new adaptive MCMC methods: DRAM and AARJ. The DRAM method is built especially to work in high-dimensional and non-linear problems. The AARJ method is an extension of DRAM to model selection problems, where the mathematical formulation of the model is uncertain and we want to fit several different models to the same observations simultaneously. The methods were developed with the needs of modelling applications typical of the environmental sciences in mind, and the development work was pursued in several application projects. The applications presented in this work are: a winter-time oxygen concentration model for Lake Tuusulanjärvi and adaptive control of its aerator; a nutrition model for Lake Pyhäjärvi and lake management planning; and validation of the algorithms of the GOMOS ozone remote sensing instrument on board the European Space Agency's Envisat satellite, together with a study of the effects of aerosol model selection on the GOMOS algorithm.
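The adaptation idea underlying such methods can be illustrated with a minimal adaptive Metropolis sampler in which the proposal covariance is re-estimated from the chain history. This is a sketch in the spirit of Haario-style adaptive Metropolis, not the DRAM implementation itself (it omits the delayed-rejection stage), and the target and tuning constants are illustrative.

```python
import numpy as np

def adaptive_metropolis(log_target, x0, n_iter=5000, adapt_start=500, eps=1e-6):
    """Minimal adaptive Metropolis: after `adapt_start` iterations, the
    Gaussian proposal covariance is repeatedly re-estimated from the
    chain history, so the proposal 'learns' the target distribution as
    the simulation proceeds."""
    x = np.asarray(x0, dtype=float)
    d = x.size
    scale = 2.4 ** 2 / d              # classic adaptive-Metropolis scaling
    chain = np.empty((n_iter, d))
    logp = log_target(x)
    cov = np.eye(d)
    rng = np.random.default_rng(0)
    for i in range(n_iter):
        if i >= adapt_start:          # adapt from the history so far
            cov = scale * (np.cov(chain[:i].T) + eps * np.eye(d))
        prop = rng.multivariate_normal(x, cov)
        logp_prop = log_target(prop)
        if np.log(rng.random()) < logp_prop - logp:   # Metropolis accept
            x, logp = prop, logp_prop
        chain[i] = x
    return chain

# Sample a 2D standard normal target as a smoke test of the sampler.
chain = adaptive_metropolis(lambda x: -0.5 * float(np.sum(x ** 2)),
                            np.zeros(2), n_iter=4000)
```

The small `eps` jitter keeps the estimated covariance positive definite, which is what makes the adaptation safe to run indefinitely.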

Relevance: 20.00%

Abstract:

Our consumption of groundwater, in particular as drinking water and for irrigation, has considerably increased over the years, and groundwater is becoming an increasingly scarce and endangered resource. Nowadays we face many problems, ranging from water prospection to sustainable management and the remediation of polluted aquifers. Independently of the hydrogeological problem, the main challenge remains dealing with incomplete knowledge of the underground properties. Stochastic approaches have been developed to represent this uncertainty by considering multiple geological scenarios and generating a large number of geostatistical realizations. The main limitation of this approach is the computational cost of performing complex flow simulations for each realization.
In the first part of the thesis, we explore this issue in the context of uncertainty propagation, where an ensemble of geostatistical realizations is identified as representative of the subsurface uncertainty. To propagate this lack of knowledge to the quantity of interest (e.g., the concentration of pollutant in extracted water), it is necessary to evaluate the flow response of each realization. Due to computational constraints, state-of-the-art methods use approximate flow simulations to identify a subset of realizations that represents the variability of the ensemble. The complex and computationally heavy flow model is then run only for this subset, on the basis of which inference is made. Our objective is to increase the performance of this approach by using all of the available information, not solely the subset of exact responses. Following a machine learning approach, the subset for which both the approximate and the exact responses are known (identified here by the distance kernel method) is used to construct an error model, which then corrects the remaining approximate responses and predicts the "expected" responses of the exact model. The proposed methodology makes use of all the available information without perceptible additional computational cost, and leads to a more accurate and more robust uncertainty propagation.
The strategy explored in the first chapter consists in learning, from a subset of realizations, the relationship between the proxy and exact responses. In the second part of the thesis, this strategy is formalized in a rigorous mathematical framework by defining a regression model between functions. As this problem is ill-posed, its dimensionality must be reduced. The novelty of the work comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also allows the quality of the error model to be diagnosed in the functional space. The methodology is applied to a pollution problem involving a non-aqueous phase liquid; the error model strongly reduces the computational cost while providing a good estimate of the uncertainty. Moreover, the individual correction of each proxy response by the error model leads to an excellent prediction of the exact response, opening the door to many applications.
The concept of a functional error model is useful not only for uncertainty propagation but also, and perhaps even more so, for Bayesian inference. Markov chain Monte Carlo (MCMC) algorithms are the most common choice for generating geostatistical realizations in accordance with the observations. However, this approach suffers from low acceptance rates in high-dimensional problems, resulting in a large number of wasted flow simulations. This led to the introduction of two-stage MCMC, in which the computational cost is decreased by avoiding unnecessary simulations of the exact flow model thanks to a preliminary evaluation of the proposal. In the third part of the thesis, the approximate flow model coupled with an error model serves as this preliminary evaluation in the two-stage MCMC set-up. We demonstrate an increase in acceptance rate by a factor of 1.5 to 3 with respect to one-stage MCMC.
An open question remains: how to choose the size of the learning set and identify the realizations that optimize the construction of the error model. This requires an iterative strategy, so that, as new flow simulations are performed, the error model is improved by incorporating the new information. This is developed in the fourth part of the thesis, where the methodology is applied to a saline intrusion problem in a coastal aquifer.
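The two-stage scheme can be sketched generically: a cheap proxy posterior (standing in here for the approximate flow model corrected by an error model) screens each proposal, and the expensive exact posterior is evaluated only for proposals that pass. The function names and the one-dimensional random-walk proposal are illustrative assumptions, not the thesis code.

```python
import math
import random

def two_stage_mcmc(log_post_exact, log_post_proxy, x0,
                   step=0.5, n_iter=2000, seed=1):
    """Two-stage (delayed-acceptance) random-walk MCMC: each proposal is
    first screened with the cheap proxy posterior; the expensive exact
    posterior is evaluated only for proposals that pass stage 1.  The
    stage-2 ratio corrects for the proxy, so the chain still targets
    the exact posterior.  Returns the chain and the number of exact
    evaluations actually performed."""
    rng = random.Random(seed)
    x = x0
    lp_exact = log_post_exact(x)
    lp_proxy = log_post_proxy(x)
    chain, exact_calls = [x], 0
    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, step)
        lpp_proxy = log_post_proxy(prop)
        # Stage 1: cheap screen (symmetric proposal, so no q-ratio term).
        if math.log(rng.random()) < lpp_proxy - lp_proxy:
            exact_calls += 1
            lpp_exact = log_post_exact(prop)
            # Stage 2: exact-over-proxy correction keeps the chain exact.
            if math.log(rng.random()) < ((lpp_exact - lp_exact)
                                         - (lpp_proxy - lp_proxy)):
                x, lp_exact, lp_proxy = prop, lpp_exact, lpp_proxy
        chain.append(x)
    return chain, exact_calls

# Toy targets: a standard normal "exact" posterior and a slightly
# shifted "proxy"; most rejections never touch the exact model.
chain, calls = two_stage_mcmc(lambda x: -0.5 * x * x,
                              lambda x: -0.5 * (x - 0.2) ** 2, 0.0)
```

Every proposal rejected at stage 1 costs only a proxy evaluation, which is the source of the savings; the stage-2 correction guarantees the stationary distribution is still the exact posterior.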

Relevance: 20.00%

Abstract:

Fetoscopic coagulation of placental anastomoses is the treatment of choice for severe twin-to-twin transfusion syndrome. Nowadays, fetal laser therapy is also used to treat amniotic bands, chorioangiomas, sacrococcygeal teratomas, lower urinary tract obstructions and chest masses, all of which are reviewed in this article. Amniotic band syndrome can cause limb amputation by impairing downstream blood flow. Large chorioangiomas (>4 cm), sacrococcygeal teratomas or fetal hyperechoic lung lesions can lead to fetal compromise and hydrops through a vascular steal phenomenon or compression. Renal damage, bladder dysfunction and ultimately death from pulmonary hypoplasia may result from megacystis caused by posterior urethral valves. The prognosis of these pathologies can be dismal, and therapy options are limited, which has brought fetal laser therapy to the forefront. Management options discussed here are laser release of amniotic bands, laser coagulation of the vessels feeding placental or fetal tumors, and laser therapy by fetal cystoscopy. This review, largely based on case reports, does not intend to provide a level of evidence supporting laser therapy over other treatment options. Centralized evaluation by specialists using strict selection criteria, and long-term follow-up of these rare cases, are now needed to prove the value of endoscopic or ultrasound-guided laser therapy.

Relevance: 20.00%

Abstract:

Very large molecular systems can be calculated with the so-called CNDOL approximate Hamiltonians, which have been developed by avoiding oversimplifications and using only a priori parameters and formulas from the simpler NDO methods. A new diagonal monoelectronic term, named CNDOL/21, shows great consistency and easier SCF convergence when used together with an appropriate function for charge repulsion energies derived from traditional formulas. A priori molecular orbitals and electron excitation properties can be obtained reliably after configuration interaction of singly excited determinants, and the method retains its interpretative possibilities even though the Hamiltonian is simplified. Tests against unequivocal gas-phase maxima of simple molecules (benzene, furfural, acetaldehyde, hexyl alcohol, methyl amine, 2,5-dimethyl-2,4-hexadiene, and ethyl sulfide) confirm the general quality of this approach in comparison with other methods. Calculations on large systems, such as porphine in the gas phase and a model of the complete retinal binding pocket in rhodopsin with 622 basis functions on 280 atoms, performed entirely at the quantum mechanical level, prove reliable: the resulting first allowed transition lies at 483 nm, very close to the known experimental value of 500 nm for the "dark state". In this very important case, our model assigns a central role in the excitation to charge transfer from the neighboring Glu(-) counterion to the retinaldehyde polyene chain. Tests with gas-phase maxima of some important molecules corroborate the reliability of the CNDOL/2 Hamiltonians.

Relevance: 20.00%

Abstract:

Optical tweezers are an innovative technique for the non-contact, all-optical manipulation of small material samples that has expanded and evolved extraordinarily since its inception in the mid-1980s. Nowadays the potential of optical tweezers has been clearly proven, and a wide range of applications in both the physical and biological sciences has solidly emerged, turning the early ideas and techniques into a powerful paradigm for experimentation in the micro- and nanoworld. This review aims to highlight the fundamental concepts essential for a thorough understanding of optical trapping, with emphasis on both its manipulation and measurement capabilities, as well as on the vast array of important biological applications that have appeared in recent years.

Relevance: 20.00%

Abstract:

Animal societies rely on interactions between group members to effectively communicate and coordinate their actions. To date, the transmission properties of interaction networks formed by direct physical contacts have been extensively studied for many animal societies and in all cases found to inhibit spreading. Such direct interactions do not, however, represent the only viable pathways. When spreading agents can persist in the environment, indirect transmission via 'same-place, different-time' spatial coincidences becomes possible. Previous studies have neglected these indirect pathways and their role in transmission. Here, we use rock ant colonies, a model social species whose flat nest geometry, coupled with individually tagged workers, allowed us to build temporally and spatially explicit interaction networks in which edges represent either direct physical contacts or indirect spatial coincidences. We show how the addition of indirect pathways allows the network to enhance or inhibit the spreading of different types of agent. This dual-functionality arises from an interplay between the interaction-strength distribution generated by the ants' movement and environmental decay characteristics of the spreading agent. These findings offer a general mechanism for understanding how interaction patterns might be tuned in animal societies to control the simultaneous transmission of harmful and beneficial agents.
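The distinction between direct contacts and indirect 'same-place, different-time' coincidences can be made concrete with a toy sketch (hypothetical data and function name, not the study's analysis pipeline): an agent that survives `decay` time units in the environment can pass along either kind of edge.

```python
def transmission_pairs(visits, decay):
    """Ordered (source, recipient) pairs along which an agent could pass,
    given that it persists `decay` time units in the environment.
    `visits` is a list of (individual, place, time) records; a pair is
    connected by a direct contact (same place, same time) or an indirect
    same-place coincidence within the decay window."""
    pairs = set()
    for a, place_a, t_a in visits:
        for b, place_b, t_b in visits:
            if a != b and place_a == place_b and 0 <= t_b - t_a <= decay:
                pairs.add((a, b))
    return pairs

# A short-lived agent spreads only via direct contact; a persistent one
# also reaches later visitors to the same site.
visits = [("ant1", "nest", 0), ("ant2", "nest", 0), ("ant3", "nest", 3)]
direct_only = transmission_pairs(visits, decay=0)
with_indirect = transmission_pairs(visits, decay=5)
```

Note the asymmetry of indirect edges: the agent can only move forward in time, which is one way the interplay between movement patterns and environmental decay can tune what spreads and to whom.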