Abstract:
More than seventy years after their initial characterisation, the aetiology of inflammatory bowel diseases remains elusive. A recent review evaluating the incidence trends of the last 25 years concluded that an increasing incidence has been observed almost worldwide. A north-south gradient is still found in Europe. Genetic associations are variably reproduced worldwide and indicate a strong impact of environmental factors. Tumour necrosis factor alpha (TNF-alpha) has been shown to play a critical role in the pathogenesis of inflammatory bowel disease (IBD). TNF-alpha blockers are biological agents that specifically target this key cytokine in the inflammatory process and have become a mainstay in the therapy of inflammatory bowel diseases. This paper reviews the necessary investigations before using such agents, their use in pregnancy and lactation, the role of co-immunosuppression, how to monitor efficacy and safety, dose adaptation, and the decision as to when to switch to another TNF-alpha blocker. Finally, it gives recommendations for special situations. Currently three TNF-alpha blockers are available for clinical use in IBD in Switzerland: infliximab (Remicade), adalimumab (Humira) and certolizumab pegol (Cimzia). Infliximab is a chimeric monoclonal antibody composed of a human IgG1 constant region and a murine variable region and is administered intravenously. Adalimumab is a fully human monoclonal antibody, with both human IgG1 constant and variable regions. Certolizumab pegol is a pegylated, humanised anti-TNF Fab (fragment antigen-binding) fragment. Both adalimumab and certolizumab pegol are administered by subcutaneous injection. The efficacy and safety of TNF-alpha blockers in Crohn's disease have been reviewed. The authors conclude that the three above-mentioned agents are effective in luminal Crohn's disease. In fistulizing Crohn's disease, TNF-alpha blockers other than infliximab require additional investigation.
Abstract:
INTRODUCTION The objectives were to characterize alveolar fluid clearance (AFC) in pigs with normal lungs and to analyze the effect of immediate application of positive end-expiratory pressure (PEEP). METHODS Animals (n = 25) were mechanically ventilated and divided into four groups: a small edema (SE) group, in which pulmonary edema (PE) was produced by intratracheal instillation of 4 ml/kg of saline solution; a small edema with PEEP (SE + PEEP) group, same as the previous group but with PEEP of 10 cmH2O applied; a large edema (LE) group, in which PE was produced by instillation of 10 ml/kg of saline solution; and a large edema with PEEP (LE + PEEP) group, same as the LE group but with PEEP of 10 cmH2O applied. AFC was estimated from differences in extravascular lung water values obtained by the transpulmonary thermodilution method. RESULTS At one hour, AFC was 19.4% in the SE group and 18.0% in the LE group. In the SE + PEEP group, the AFC rate was higher at one hour than at subsequent time points and higher than in the SE group (45.4% vs. 19.4% at one hour, P < 0.05). The AFC rate was also significantly higher in the LE + PEEP group than in the LE group at three hours and four hours. CONCLUSIONS In this pig model, the AFC rate is around 20% at one hour and around 50% at four hours, regardless of the amount of edema, and is increased by the application of PEEP.
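As a minimal illustration of how a clearance percentage can be derived from extravascular lung water (EVLW) measurements, the sketch below assumes AFC is expressed as the fraction of the excess EVLW cleared since its peak; the function name and exact formula are illustrative assumptions, not the authors' protocol.

```python
def alveolar_fluid_clearance(evlw_peak_ml, evlw_now_ml, evlw_baseline_ml):
    """Percent of the excess extravascular lung water cleared since the peak.

    Assumed definition (illustrative): AFC% = (peak - current) / (peak - baseline) * 100,
    where 'baseline' is EVLW before edema induction.
    """
    excess_at_peak = evlw_peak_ml - evlw_baseline_ml
    if excess_at_peak <= 0:
        raise ValueError("peak EVLW must exceed baseline")
    return 100.0 * (evlw_peak_ml - evlw_now_ml) / excess_at_peak

# Hypothetical numbers: baseline 300 ml, peak 500 ml, 460 ml one hour later
# -> 20% cleared, close to the ~20% one-hour rate reported for the no-PEEP groups.
print(alveolar_fluid_clearance(500, 460, 300))
```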
Abstract:
The statistical analysis of literary style is the part of stylometry that compares measurable characteristics in a text that are rarely controlled by the author with those in other texts. When the goal is to settle authorship questions, these characteristics should relate to the author's style and not to the genre, epoch or editor, and they should be such that their variation between authors is larger than the variation within comparable texts from the same author. For an overview of the literature on stylometry and some of the techniques involved, see for example Mosteller and Wallace (1964, 82), Herdan (1964), Morton (1978), Holmes (1985), Oakes (1998) or Lebart, Salem and Berry (1998). Tirant lo Blanc, a book of chivalry, is the main work in Catalan literature and was hailed as "the best book of its kind in the world" by Cervantes in Don Quixote. Considered by writers like Vargas Llosa or Damaso Alonso to be the first modern novel in Europe, it has been translated several times into Spanish, Italian and French, with modern English translations by Rosenthal (1996) and La Fontaine (1993). The main body of this book was written between 1460 and 1465, but it was not printed until 1490. There is an intense and long-lasting debate around its authorship, sprouting from its first edition, where the introduction states that the whole book is the work of Martorell (1413?-1468), while at the end it is stated that the last one fourth of the book is by Galba (?-1490), written after the death of Martorell. Some of the authors that support the theory of single authorship are Riquer (1990), Chiner (1993) and Badia (1993), while some of those supporting the double authorship are Riquer (1947), Coromines (1956) and Ferrando (1995). For an overview of this debate, see Riquer (1990). Neither of the two candidate authors left any text comparable to the one under study, and therefore discriminant analysis cannot be used to help classify chapters by author. By using sample texts encompassing about ten percent of the book, and looking at word length and at the use of 44 conjunctions, prepositions and articles, Ginebra and Cabos (1998) detect heterogeneities that might indicate the existence of two authors. By analyzing the diversity of the vocabulary, Riba and Ginebra (2000) estimate that stylistic boundary to be near chapter 383. Following the lead of the extensive literature, this paper looks into word length, the use of the most frequent words and the use of vowels in each chapter of the book. Given that the features selected are categorical, this leads to three contingency tables of ordered rows and therefore to three sequences of multinomial observations. Section 2 explores these sequences graphically, observing a clear shift in their distribution. Section 3 describes the problem of estimating a sudden change-point in those sequences; in the following sections we propose various ways to estimate change-points in multinomial sequences. The method in Section 4 involves fitting models for polytomous data; the one in Section 5 fits gamma models to the sequence of chi-square distances between each row profile and the average profile; the one in Section 6 fits models to the sequence of values taken by the first component of the correspondence analysis, as well as to sequences of other summary measures such as the average word length. In Section 7 we fit models to the marginal binomial sequences to identify the features that distinguish the chapters before and after that boundary. Most methods rely heavily on the use of generalized linear models.
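As a rough illustration of the change-point problem described above (not the paper's own estimators), the sketch below locates the split of a sequence of multinomial counts that maximizes the two-segment multinomial log-likelihood; the row counts and profiles are toy assumptions.

```python
import numpy as np

def multinomial_loglik(counts):
    """Log-likelihood (up to a constant) of count rows under one shared profile."""
    total = counts.sum(axis=0).astype(float)
    p = total / total.sum()
    mask = total > 0
    return float((counts[:, mask] * np.log(p[mask])).sum())

def change_point(counts):
    """Return the split k maximizing loglik(rows < k) + loglik(rows >= k)."""
    n = counts.shape[0]
    best_k, best_ll = None, -np.inf
    for k in range(1, n):
        ll = multinomial_loglik(counts[:k]) + multinomial_loglik(counts[k:])
        if ll > best_ll:
            best_k, best_ll = k, ll
    return best_k

# Toy example: 60 "chapters" x 5 word-category counts with a profile shift at row 40.
rng = np.random.default_rng(0)
a = rng.multinomial(200, [0.3, 0.3, 0.2, 0.1, 0.1], size=40)
b = rng.multinomial(200, [0.15, 0.2, 0.25, 0.2, 0.2], size=20)
print(change_point(np.vstack([a, b])))  # expected near 40
```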
Abstract:
BACKGROUND. Bioinformatics is commonly presented as a well-assorted list of available web resources. Although diversity of services is positive in general, the proliferation of tools, and their dispersion and heterogeneity, complicate the integrated exploitation of such data processing capacity. RESULTS. To facilitate the construction of software clients and make integrated use of this variety of tools, we present a modular programmatic application interface (MAPI) that provides the necessary functionality for uniform representation of Web Service metadata descriptors, including their management, and for invocation of the services they represent. This document describes the main functionality of the framework and how it can be used to facilitate the deployment of new software under a unified structure of bioinformatics Web Services. A notable feature of MAPI is the modular organization of the functionality into different modules associated with specific tasks. This means that only the modules needed by the client have to be installed, and that the module functionality can be extended without the need to rewrite the software client. CONCLUSIONS. The potential utility and versatility of the software library have been demonstrated by the implementation of several currently available clients that cover different aspects of integrated data processing, ranging from service discovery to service invocation, with advanced features such as workflow composition and asynchronous service calls to multiple types of Web Services, including those registered in repositories (e.g. GRID-based, SOAP, BioMOBY, R-bioconductor, and others).
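The modular organization described above can be pictured with a small registry pattern: a client installs only the modules it needs, and new service types plug in without changes to client code. This is a hypothetical sketch of the idea, not MAPI's actual API.

```python
from abc import ABC, abstractmethod

class ServiceModule(ABC):
    """One pluggable unit of functionality (discovery, invocation, ...)."""
    name: str

    @abstractmethod
    def invoke(self, service_id: str, **params):
        ...

class Registry:
    """Clients register only the modules they need."""
    def __init__(self):
        self._modules = {}

    def register(self, module: ServiceModule):
        self._modules[module.name] = module

    def invoke(self, module_name: str, service_id: str, **params):
        return self._modules[module_name].invoke(service_id, **params)

class SoapModule(ServiceModule):
    name = "soap"
    def invoke(self, service_id, **params):
        # Placeholder transport; a real module would carry the SOAP plumbing.
        return f"SOAP call to {service_id} with {params}"

registry = Registry()
registry.register(SoapModule())
print(registry.invoke("soap", "ClustalW", sequence="ACGT"))
```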
Abstract:
Introduction The Andalusian Public Health System Virtual Library (Biblioteca Virtual del Sistema Sanitario Público de Andalucía, BV-SSPA) was set up in June 2006. It is a regional government initiative aimed at democratizing health professionals' access to quality scientific information, regardless of their workplace. Andalusia is a region with more than 8 million inhabitants and 100,000 health professionals working in 41 hospitals, 1,500 primary healthcare centres, and 28 centres for non-clinical purposes (research, management, and educational centres). Objectives The Department of Development, Research and Investigation (R+D+i) of the Andalusian Regional Government has, among its duties, the task of evaluating the hospitals and centres of the Andalusian Public Health System (SSPA) in order to distribute its funding. Among the criteria used is the evaluation of scientific output, which is measured using bibliometrics. It is well known that bibliometrics has a series of limitations and problems that should be taken into account, especially when it is used for purposes beyond information science, such as career evaluation, funding, etc. A few years ago, the bibliometric reports were produced separately by each centre, but without preset, well-defined criteria, which are essential when the results of the reports need to be compared. Some hospitals included Meeting Abstracts in their figures while others did not, and the same happened with Errata and many other differences. Therefore, the main problem the Department of R+D+i had to deal with when evaluating the health system was that the bibliometric data were not accurate and the reports were not comparable. With the aim of having unified criteria for the whole system, the Department of R+D+i commissioned the BV-SSPA to carry out the annual analysis of the scientific output of the system, using well-defined criteria and indicators, among which the Impact Factor stands out. Materials and Methods As the Impact Factor is the bibliometric indicator that the virtual library is asked to consider, it is necessary to use the Web of Science (WoS) database, whose owner also publishes the Impact Factor. The WoS includes the databases Science Citation Index (SCI), Social Sciences Citation Index (SSCI) and Arts & Humanities Citation Index. To gather all the documents, SCI and SSCI are used; to obtain the Impact Factor and quartiles, the Journal Citation Reports (JCR) is used. Unlike other bibliographic databases, such as MEDLINE, the bibliometric database WoS includes the addresses of all the authors. In order to retrieve all the scientific output of the SSPA, we have run general searches, which are afterwards processed by a tool developed by our library. We have run nine different searches using the field 'address': eight of them combine 'Spain' with each of the eight Andalusian provinces, and the ninth combines 'Spain' with all the cities where there are health centres, since we have detected that some authors do not include the province in their signatures. These are some of the search strategies: AD=Malaga AND AD=Spain; AD=Sevill* AND AD=Spain; AD=SPAIN AND (AD=GUADIX OR AD=BAZA OR AD=MOTRIL). Furthermore, the field 'year' is used to determine the period. To exploit the data, the BV-SSPA has developed a tool called Impactia. It is a web application which uses a database to store the information on the documents generated by the SSPA.
Impactia allows the user to automatically process the retrieved documents, assigning them to their corresponding centres. In order to classify documents automatically, it was necessary to account for the huge variability in the names of the centres that authors use in their signatures. Impactia therefore knows that an author signing as "Hospital Universitario Virgen Macarena", "HVM" or "Hosp. Virgin Macarena" belongs to the same centre. The attached figure shows the variability found for the Empresa Publica Hospital de Poniente. Besides the documents from WoS, Impactia includes the documents indexed in Scopus and in other databases, where we run bibliographic searches using strategies similar to the ones above. Aware that the health centres and hospitals produce a lot of grey literature that is not gathered in databases, Impactia allows the centres to feed the application with these documents, so that all the SSPA scientific output is gathered and organised in a centralized place. The librarians of each centre are responsible for locating this grey literature. They can also submit comments on the documents and indicators that Impactia collects and calculates. The bulk upload of documents from WoS and Scopus into Impactia is done monthly. One of the main issues we found during the development of Impactia was the need to deal with duplicate documents obtained from different sources. Taking into account that titles are sometimes written differently, with slashes, commas, and so on, Impactia detects duplicates using the 'DOI' field if it is available, or by comparing the fields page start, page end and ISSN. It is therefore possible to guarantee the absence of duplicates. Results The data gathered in Impactia are available to the administrative teams and hospital managers through a simple web page that allows them to know at any moment, with just one click, the detailed scientific output of their hospitals, including useful graphs such as the percentage of each document type, the journals where their scientists usually publish, annual comparisons, bibliometric indicators and so on. They can also compare the different centres of the SSPA. Impactia allows users to download the data from the application, so that they can work with the information or include it in their centres' reports. This application saves the health system many working hours: what was previously done manually by forty-one librarians is now done by one person in the BV-SSPA in two days a month. To sum up, the benefits of Impactia are: it has shown its effectiveness in the automatic classification, treatment and analysis of the data; it has become an essential tool for managers to evaluate quickly and easily the scientific production of their centres; it optimizes the human resources of the SSPA, saving time and money; and it is the reference point for the Department of R+D+i in evaluating the scientific output of health staff.
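The deduplication rule described above (match on DOI when present, otherwise on start page, end page and ISSN) can be sketched as follows; the field names and normalization are illustrative assumptions, not Impactia's actual schema.

```python
def dedup_key(doc):
    """Key under which two records count as the same document.

    Prefer the DOI when available; otherwise fall back to the
    (start page, end page, ISSN) triple, as described in the text.
    """
    doi = (doc.get("doi") or "").strip().lower()
    if doi:
        return ("doi", doi)
    return ("pages", doc.get("page_start"), doc.get("page_end"), doc.get("issn"))

def merge_sources(*sources):
    """Merge document lists from WoS, Scopus, etc., keeping one copy per key."""
    seen = {}
    for source in sources:
        for doc in source:
            seen.setdefault(dedup_key(doc), doc)
    return list(seen.values())

wos = [{"doi": "10.1000/x1", "page_start": 10, "page_end": 20, "issn": "1234-5678"}]
scopus = [{"doi": "10.1000/X1 ", "page_start": 10, "page_end": 20, "issn": "1234-5678"}]
print(len(merge_sources(wos, scopus)))  # 1: same DOI after normalization
```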
Abstract:
To guarantee the success of a virtual library, it is essential that all users can access all of the library's resources regardless of their location.
Abstract:
When searching for information, learners are very often confronted with problems of guidance and personalization. These are all the more significant when the search takes place in an open environment such as the Web: in that case there is currently no relevance control over the proposed resources, nor any guarantee that they actually match the learner's specific needs. Through a review of the state of the art, we observed the absence of a reference model that addresses the problems related (i) on the one hand to learning resources, in particular the heterogeneity of their structure and description and their protection in terms of copyright, and (ii) on the other hand to the learner as a user, in particular the acquisition of the elements characterizing the learner and the adaptation strategy to offer. Our objective is to propose an adaptive system based on learning resources drawn from an environment with controlled openness. It automatically generates, without the intervention of a pedagogical expert, a personalized learning path from resources made available through trusted sources. The originality of our work lies in the proposal of a reference model, called the Lausanne model, based on what we consider to be the best practices of three communities: (i) the Web, in terms of means of openness; (ii) adaptive hypermedia, in terms of adaptation strategy; and (iii) distance learning, in terms of the handling of learning resources. In our model, personalized paths are generated on the basis of (i) indexed learning resources whose degree of granularity favours sharing and reuse, with trusted sources guaranteeing their usefulness and quality; (ii) user characteristics, compatible with existing standards, allowing the learner to move from one environment to another; and (iii) an adaptation that is both individual and social. To this end, the Lausanne model proposes: (i) to use ISO/MLR (Metadata for Learning Resources) as the description formalism; (ii) to describe the user model with XUM (eXtended User Model), our proposal for a model compatible with the IEEE/PAPI and IMS/LIP standards; and (iii) to adapt the ant-colony algorithm to the distance-learning context in order to generate personalized paths. The individual dimension is also taken into account by mapping MLR to XUM. To validate our model, we developed an application and tested several scenarios involving different users at different times. We then compared what the system returns with what an expert suggests. The results proved satisfactory: in each case the system returned a path similar to the one the expert would have proposed, which supports our approach.
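As a toy illustration of adapting an ant-colony algorithm to path generation (the model's third proposal), the sketch below selects a sequence of learning resources by pheromone-weighted random choice and reinforces choices that score well against a learner profile; the resources, scoring function and parameters are all invented for illustration.

```python
import random

# Hypothetical resources with a difficulty level; the learner profile sets a target.
resources = {"intro": 1, "basics": 2, "practice": 3, "advanced": 4, "project": 5}
target_path_len, n_ants, n_iters, evap = 3, 20, 30, 0.1
pheromone = {r: 1.0 for r in resources}

def path_quality(path, learner_level=2):
    """Invented score: prefer paths whose difficulty rises gently from the learner's level."""
    levels = [resources[r] for r in path]
    steps = [abs(b - a - 1) for a, b in zip([learner_level] + levels, levels)]
    return 1.0 / (1.0 + sum(steps))

def build_path():
    """One ant builds a path, choosing resources in proportion to pheromone."""
    pool = list(resources)
    path = []
    for _ in range(target_path_len):
        choice = random.choices(pool, [pheromone[r] for r in pool])[0]
        path.append(choice)
        pool.remove(choice)
    return path

for _ in range(n_iters):
    paths = [build_path() for _ in range(n_ants)]
    for r in pheromone:                      # evaporation
        pheromone[r] *= (1 - evap)
    for p in paths:                          # reinforcement proportional to quality
        q = path_quality(p)
        for r in p:
            pheromone[r] += q

print(max((build_path() for _ in range(50)), key=path_quality))
```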
Abstract:
Using immunocytochemistry and multiunit recording of afferent activity of the whole vestibular nerve, we investigated the role of metabotropic glutamate receptors (mGluR) in afferent neurotransmission in the frog semicircular canals (SCC). Group I (mGluR1alpha) and group II (mGluR2/3) mGluR immunoreactivities were distributed over the vestibular ganglion neurons, consistent with a postsynaptic locus of metabotropic regulation of rapid excitatory transmission. The effects of the group I/II mGluR agonist (1S,3R)-1-aminocyclopentane-trans-1,3-dicarboxylic acid (ACPD) and antagonist (R,S)-alpha-methyl-4-carboxyphenylglycine (MCPG) on resting and chemically induced afferent activity were studied. ACPD (10-100 microM) enhanced the resting discharge frequency. MCPG (5-100 microM) led to a concentration-dependent decrease of both resting activity and ACPD-induced responses. If the discharge frequency had previously been restored by L-glutamate (L-Glu) in high-Mg2+ solution, ACPD elicited a transient increase in the firing rate of the afferent nerve, suggesting that ACPD acts on postsynaptic receptors. The L-Glu agonists alpha-amino-3-hydroxy-5-methylisoxazole-4-propionate (AMPA) and N-methyl-D-aspartate (NMDA) were tested during application of ACPD. AMPA- and NMDA-induced responses were higher in the presence than in the absence of ACPD, implicating mGluR in the modulation of ionotropic glutamate receptors. These results indicate that activation of mGluR potentiates AMPA and NMDA responses through a postsynaptic interaction. We conclude that ACPD may exert modulating postsynaptic effects on vestibular afferents and that this process is activity-dependent.
Abstract:
THESIS ABSTRACT : Low-temperature thermochronology relies on the application of radioisotopic systems whose closure temperatures are below the temperatures at which the dated phases are formed. In that sense, the results are interpreted as "cooling ages" in contrast to "formation ages". Owing to the low closure temperatures, it is possible to reconstruct exhumation and cooling paths of rocks during their residence at shallow levels of the crust, i.e. within the first ~10 km of depth. Processes occurring at these shallow depths, such as final exhumation, faulting and relief formation, are fundamental for the evolution of mountain belts. This thesis aims at reconstructing the tectono-thermal history of the Aar massif in the Central Swiss Alps by means of zircon (U-Th)/He, apatite (U-Th)/He and apatite fission-track thermochronology. The strategy involved acquisition of a large number of samples from a wide range of elevations in the deeply incised Lötschen valley and a nearby NEAT tunnel. This unique location allowed us to precisely constrain the timing, amount and mechanisms of exhumation of the main orographic feature of the Central Alps, evaluate the role of topography in the thermochronological record, and test the impact of hydrothermal activity. Samples were collected from altitudes ranging between 650 and 3930 m and were grouped into five vertical profiles on the surface and one horizontal profile in the tunnel. Where possible, all three radiometric systems were applied to each sample. Zircon (U-Th)/He ages range from 5.1 to 9.4 Ma and are generally positively correlated with altitude. Age-elevation plots reveal a distinct break in slope, which translates into an exhumation rate increasing from ~0.4 to ~3 km/Ma at 6 Ma. This acceleration is independently confirmed by increased cooling rates on the order of 100°C/Ma constrained on the basis of age differences between the zircon (U-Th)/He and the remaining systems. Apatite fission-track data also plot on a steep age-elevation curve, indicating rapid exhumation until the end of the Miocene. The 6 Ma event is interpreted as reflecting tectonically driven uplift of the Aar massif. The late Miocene timing implies that the increase of precipitation in the Pliocene did not trigger rapid exhumation in the Aar massif. The Messinian salinity crisis in the Mediterranean could not directly intensify erosion of the Aar, but the associated erosional output from the entire Alps may have tapered the orogenic wedge and caused reactivation of thrusting in the Aar massif. The high exhumation rates in the Messinian were followed by a decrease to ~1.3 km/Ma, as evidenced by ~8 km of exhumation during the last 6 Ma. The slowing of exhumation is also apparent from apatite (U-Th)/He age-elevation data in the northern part of the Lötschen valley, where they plot on a ~0.5 km/Ma line and range from 2.4 to 6.4 Ma. However, the apatite (U-Th)/He and fission-track data from the NEAT tunnel indicate a perturbation of the record. The apatite ages are youngest under the axis of the valley, in contrast to the expected pattern in which they would be youngest in the deepest sections of the tunnel due to heat advection into ridges. The valley, however, developed in relatively soft schists, while the ridges are built of solid granitoids. In line with hydrological observations from the tunnel, we suggest that the relatively permeable rocks under the valley floor served as conduits for geothermal fluids that caused reheating, leading to partial helium loss and fission-track annealing in apatites.
In consequence, apatite ages from the lowermost samples are too young and the calculated exhumation rates may underestimate true values. This study demonstrated that high-density sampling is indispensable to provide meaningful thermochronological data in the Alpine setting. The multi-system approach allows the plausibility of the data to be verified and sources of perturbation to be highlighted. THESIS SUMMARY : Low-temperature thermochronology relies on the use of radiometric systems whose closure temperatures are markedly lower than the crystallization temperature of the mineral. The results obtained are consequently interpreted as cooling ages, which differ from the formation ages obtained with other dating systems. Thanks to the low closure temperatures, the cooling and exhumation paths of rocks can be reconstructed for their residence in the shallow crust (down to ~10 km). The processes at play at these shallow depths, such as final exhumation, fracturing and faulting, and relief formation, are fundamental to the evolution of mountain belts. In recent years it has become clear that the thermochronological record in orogens can be influenced by relief and can be reset, after initial cooling, by heat advection linked to the circulation of geothermal fluids. The objective of this thesis is to reconstruct the tectono-thermal history of the Aar massif in the Central Swiss Alps using three thermochronometers: zircon (U-Th)/He, apatite (U-Th)/He and apatite fission tracks. To this end, we collected a large number of samples from different altitudes in the deeply incised Lötschental and from the NEAT tunnel. This sampling strategy allowed us to constrain precisely the chronology, amounts and mechanisms of exhumation of this part of the Central Alps, to evaluate the role of topography in the thermochronological record, and to test the impact of hydrothermal activity on the geochronometers. Samples were taken at altitudes between 650 and 3930 m along five vertical surface profiles and one profile in the tunnel. Where possible, the three radiometric systems were applied to each sample. The zircon (U-Th)/He ages obtained lie between 5.1 and 9.4 Ma and correlate positively with altitude. Age-elevation plots show a clear break in slope, reflecting an increase in exhumation rate from 0.4 to 3 km/Ma at 6 Ma. This acceleration of exhumation is confirmed by cooling rates on the order of 100°C/Ma obtained from the age differences between the zircon ages and the other geochronological systems. The apatite fission-track data also indicate rapid exhumation until the end of the Miocene. We interpret this 6 Ma event as tectonic uplift of the Aar massif. Its late Miocene age implies that the Pliocene increase in precipitation did not trigger this rapid exhumation of the Aar massif. The Messinian salinity crisis of the Mediterranean cannot have had a direct effect on the erosion of the Aar massif, but the erosion associated with it may have reduced the Alpine orogenic wedge and caused reactivation of the thrusts of the Aar massif. The rapid Miocene exhumation was followed by a decrease in exhumation rates over the last 6 Ma (down to 1.3 km/Ma). However, the apatite (U-Th)/He ages and apatite fission tracks of the tunnel samples record a perturbation of the pattern described above. The apatite ages are noticeably younger under the axis of the valley than the expected age profile; one would instead expect younger ages beneath the deepest parts of the tunnel because of heat advection into the valley flanks. The valley is carved into schists, whereas its flanks consist of harder granitoids. In agreement with the hydrological observations from the tunnel, we suggest that the high permeability of the rocks beneath the valley axis allowed the infiltration of geothermal fluids, which reheated the rocks. This reheating would have induced helium loss and fission-track annealing in the apatites, resulting in rejuvenated apatite ages and underestimated exhumation rates beneath the valley axis. This study demonstrates the necessity of dense, precise sampling to provide quality thermochronological data in the Alpine context. The multi-system approach allowed us to check the pertinence of the acquired data and to identify possible sources of error in thermochronological studies. LAY SUMMARY : During an orogeny, rocks go through a cycle comprising subduction, deformation, metamorphism and, finally, a return to the surface (or exhumation). Exhumation results from deformation within the collision zone, leading to shortening and thickening of the rock pile, which translates into uplift of the rocks, creation of topography and erosion. Since erosion acts like a scraper on the upper part of the pile, attempts have been made to correlate episodes of rapid exhumation with periods of intense erosion caused by climate change. Knowing precisely when and where exhumation occurred is of capital importance for any reconstruction of the evolution of a mountain belt. These constraints are obtained by tracing the changes in rock temperature through time, which gives the cooling rate. The moment at which rocks cooled through a given temperature is constrained by radiometric dating techniques. These methods rely on the decay of radiogenic isotopes, such as uranium and potassium, both abundant in the rocks of the Earth's crust. The decay products are not retained in the host minerals until the rock cools below a temperature called the closure temperature, specific to each dating system. For example, the radioactive decay of uranium and thorium atoms produces helium atoms, which escape from a zircon crystal at temperatures above 200°C. By measuring the parent uranium content and the accumulated helium, and knowing the decay rate, it is possible to calculate when the sampled rock passed below 200°C. If the geothermal gradient is known, closure temperatures can be converted into present-day depths (e.g. 200°C ≈ 7 km) and the cooling rate into an exhumation rate. Moreover, by radiometric dating of vertically spaced samples, the exhumation rate of the sampled section can be constrained directly from the age differences between neighbouring samples. In the Swiss Alps, the Aar massif forms a major orographic structure. With altitudes above 4000 m and a spectacular relief of more than 2000 m, the massif dominates the central part of the mountain belt. The rocks exposed at the surface today were buried more than 10 km deep 20 Ma ago, but the present topography of the Aar massif appears to have developed mainly through active uplift over the last few million years, that is, since the late Neogene. This period includes a sudden climate change that affected Europe about 5 Ma ago and brought heavy precipitation, certainly increasing erosion and accelerating the exhumation of the Alps. In this study we used the zircon (U-Th)/He dating system, whose closure temperature of 200°C is low enough to characterize late Neogene/Pliocene exhumation. The samples come from the Lötschental and from the deepest railway tunnel in the world (NEAT), located in the western part of the Aar massif. Taken together, these samples span 3000 m of elevation and give ages from 5.1 to 9.4 Ma. The higher (and therefore older) samples document an exhumation rate of 0.4 km/Ma until 6 Ma ago, whereas the lowest samples have similar ages of 6 to 5.4 Ma, yielding a rate of up to 3 km/Ma. These data reveal a dramatic acceleration of the exhumation of the Aar massif 6 Ma ago; the late Miocene exhumation of the massif thus predates the Pliocene climate change. However, during the salinity crisis of 6-5.3 Ma ago (the Messinian), the level of the Mediterranean Sea dropped by 3 km. Such a lowering of the erosional base level may have accelerated the exhumation of the Alps, but the southern Alpine basin was too far from the Aar massif to influence its erosion. We conclude that (U-Th)/He dating precisely constrains the chronology and exhumation of the Aar massif, and, regarding the tectonics-erosion duality, we suggest that in the case of the Aar massif tectonics predominates.
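The age-elevation reasoning in the summaries above can be made concrete with a small fit: the slope of elevation against cooling age approximates the exhumation rate, and a break in slope marks the acceleration. The numbers below are loosely modelled on the reported pattern (~0.4 km/Ma before 6 Ma, faster after) but are invented for illustration.

```python
import numpy as np

# Hypothetical zircon (U-Th)/He ages (Ma) and sample elevations (km),
# imitating the two-slope pattern reported for the Aar massif.
ages = np.array([9.4, 8.6, 7.9, 7.1, 6.4, 6.0, 5.8, 5.6, 5.4, 5.1])
elev = np.array([3.9, 3.6, 3.3, 3.0, 2.7, 2.5, 2.0, 1.5, 1.0, 0.65])

def exhumation_rate(ages_ma, elev_km):
    """Slope of elevation vs. age (km/Ma): the apparent exhumation rate."""
    slope, _ = np.polyfit(ages_ma, elev_km, 1)
    return abs(slope)

old = ages > 6.0   # samples recording the slow pre-6 Ma phase
print(f"pre-6 Ma: {exhumation_rate(ages[old], elev[old]):.1f} km/Ma")
print(f"post-6 Ma: {exhumation_rate(ages[~old], elev[~old]):.1f} km/Ma")
```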
Abstract:
This document introduces small and medium-sized enterprises to the world of virtualization and cloud computing. Starting from a presentation of both technologies, it walks through the different phases of a technology project consisting of the installation of a virtualized platform that hosts the basic IT systems of an SME.
Abstract:
BACKGROUND: The yeast Schizosaccharomyces pombe is frequently used as a model for studying the cell cycle. The cells are rod-shaped and divide by medial fission. The process of cell division, or cytokinesis, is controlled by a network of signaling proteins called the Septation Initiation Network (SIN); SIN proteins associate with the spindle pole bodies (SPBs) during nuclear division (mitosis). Some SIN proteins associate with both SPBs early in mitosis, and then display strongly asymmetric signal intensity at the SPBs in late mitosis, just before cytokinesis. This asymmetry is thought to be important for correct regulation of SIN signaling and for coordination of cytokinesis and mitosis. In order to study the dynamics of organelles or large protein complexes such as the SPB, which have been labeled with a fluorescent protein tag in living cells, a number of image analysis problems must be solved: the cell outline must be detected automatically, and the position and signal intensity associated with the structures of interest within the cell must be determined. RESULTS: We present a new 2D and 3D image analysis system, implemented as a user-friendly software package, that permits versatile, fast and robust analysis of motile, fluorescently labeled structures in large numbers of rod-shaped cells. We have developed new robust algorithms, which we combined with existing methodologies to facilitate fast and accurate analysis. Our software permits the detection and segmentation of rod-shaped cells in either static or dynamic (i.e. time-lapse) multi-channel images. It enables tracking of two structures (for example SPBs) in two different image channels. For 2D or 3D static images, the locations of the structures are identified, and intensity values are then extracted together with several quantitative parameters, such as length, width, cell orientation, background fluorescence and the distance between the structures of interest. Furthermore, two kinds of kymographs of the tracked structures can be established: one representing the migration of the structures with respect to their relative position, the other representing their individual trajectories inside the cell. This software package, called "RodCellJ", allowed us to analyze a large number of S. pombe cells to understand the rules that govern SIN protein asymmetry. CONCLUSIONS: "RodCellJ" is freely available to the community as a package of several ImageJ plugins for the extensive, simultaneous analysis of the behavior of a large number of rod-shaped cells. The integration of different image-processing techniques in a single package, together with the development of novel algorithms, not only speeds up analysis relative to existing tools but also provides higher accuracy. Its utility was demonstrated on both 2D and 3D, static and dynamic images to study the septation initiation network of the yeast Schizosaccharomyces pombe. More generally, it can be used in any biological context where fluorescently labeled structures need to be analyzed in rod-shaped cells. AVAILABILITY: RodCellJ is freely available under http://bigwww.epfl.ch/algorithms.html (after acceptance of the publication).
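As a generic illustration of the kind of segmentation step such a pipeline performs (not RodCellJ's actual algorithm), the sketch below thresholds a fluorescence image and keeps elongated regions whose shape is consistent with rod-shaped cells, using scikit-image; the thresholds are arbitrary assumptions.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def find_rod_cells(image, min_area=200, min_elongation=2.5):
    """Segment bright regions and keep those elongated enough to be rod-shaped.

    Returns a list of (centroid, orientation, length, width) tuples.
    """
    mask = image > threshold_otsu(image)
    cells = []
    for region in regionprops(label(mask)):
        if region.area < min_area or region.minor_axis_length == 0:
            continue
        elongation = region.major_axis_length / region.minor_axis_length
        if elongation >= min_elongation:
            cells.append((region.centroid, region.orientation,
                          region.major_axis_length, region.minor_axis_length))
    return cells

# Toy image: one bright horizontal rod on a dark background.
img = np.zeros((100, 100))
img[45:55, 20:80] = 1.0
print(find_rod_cells(img))
```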
Abstract:
Intuitively, music has both predictable and unpredictable components. In this work we assess this qualitative statement in a quantitative way using common time series models fitted to state-of-the-art music descriptors. These descriptors cover different musical facets and are extracted from a large collection of real audio recordings comprising a variety of musical genres. Our findings show that music descriptor time series exhibit a certain predictability not only for short time intervals, but also for mid-term and relatively long intervals. This fact is observed independently of the descriptor, musical facet and time series model we consider. Moreover, we show that our findings are not only of theoretical relevance but can also have practical impact. To this end we demonstrate that music predictability at relatively long time intervals can be exploited in a real-world application, namely the automatic identification of cover songs (i.e. different renditions or versions of the same musical piece). Importantly, this prediction strategy yields a parameter-free approach for cover song identification that is substantially faster, allows for reduced computational storage and still maintains highly competitive accuracies when compared to state-of-the-art systems.
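A minimal way to quantify the kind of descriptor predictability discussed above is to fit an autoregressive model to a descriptor time series and compare its one-step forecast error against a naive repeat-last baseline; the sketch below does this with plain least squares on a synthetic signal, as an illustration of the idea rather than the authors' models.

```python
import numpy as np

def fit_ar(series, order):
    """Least-squares AR(p): predict x[t] from the previous p values."""
    X = np.column_stack([series[i:len(series) - order + i] for i in range(order)])
    y = series[order:]
    coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coefs

def forecast_error(series, order, split=0.8):
    """One-step-ahead test MSE of the AR model vs. a repeat-last baseline."""
    n = int(len(series) * split)
    coefs = fit_ar(series[:n], order)
    X = np.column_stack([series[i:len(series) - order + i] for i in range(order)])
    pred = X @ coefs                      # pred[j] predicts series[order + j]
    ar_mse = np.mean((pred[n - order:] - series[n:]) ** 2)
    naive_mse = np.mean((series[n:] - series[n - 1:-1]) ** 2)
    return ar_mse, naive_mse

# Synthetic "descriptor": a noisy, slowly varying signal.
rng = np.random.default_rng(1)
t = np.arange(2000)
x = np.sin(2 * np.pi * t / 120) + 0.3 * rng.standard_normal(2000)
ar_mse, naive_mse = forecast_error(x, order=8)
print(f"AR(8) MSE {ar_mse:.3f} vs naive MSE {naive_mse:.3f}")
```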
Abstract:
This paper presents a new registration algorithm, called Temporal Diffeomorphic Free Form Deformation (TDFFD), and its application to motion and strain quantification from a sequence of 3D ultrasound (US) images. The originality of our approach resides in enforcing time consistency by representing the 4D velocity field as the sum of continuous spatiotemporal B-spline kernels. The spatiotemporal displacement field is then recovered through forward Eulerian integration of the non-stationary velocity field. The strain tensor is computed locally using the spatial derivatives of the reconstructed displacement field. The energy functional considered in this paper weighs two terms: the image similarity and a regularization term. The image similarity metric is the sum of squared differences between the intensities of each frame and a reference one. Any frame in the sequence can be chosen as reference. The regularization term is based on the incompressibility of myocardial tissue. TDFFD was compared to pairwise 3D FFD and 3D+t FFD, both on displacement and velocity fields, on a set of synthetic 3D US images with different noise levels. TDFFD showed increased robustness to noise compared to these two state-of-the-art algorithms. TDFFD also proved to be more resistant to a reduced temporal resolution when decimating this synthetic sequence. Finally, this synthetic dataset was used to determine optimal settings of the TDFFD algorithm. Subsequently, TDFFD was applied to a database of cardiac 3D US images of the left ventricle acquired from 9 healthy volunteers and 13 patients treated by Cardiac Resynchronization Therapy (CRT). On healthy cases, uniform strain patterns were observed over all myocardial segments, as physiologically expected. On all CRT patients, the improvement in synchrony of regional longitudinal strain correlated with CRT clinical outcome as quantified by the reduction of end-systolic left ventricular volume at follow-up (6 and 12 months), showing the potential of the proposed algorithm for the assessment of CRT.
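The recovery of displacement from a non-stationary velocity field by forward Eulerian integration can be illustrated in one spatial dimension: positions are advanced step by step through a time-varying velocity field. The velocity function below is an arbitrary stand-in, not the paper's B-spline parameterization.

```python
import numpy as np

def velocity(x, t):
    """Stand-in non-stationary velocity field v(x, t); the paper instead uses
    sums of continuous spatiotemporal B-spline kernels."""
    return 0.5 * np.sin(2 * np.pi * t) * np.exp(-((x - 0.5) ** 2) / 0.1)

def integrate_displacement(x0, t0, t1, n_steps=100):
    """Forward Euler integration: phi(t + dt) = phi(t) + v(phi(t), t) * dt.

    Returns the displacement phi(t1) - x0 for each starting point in x0."""
    dt = (t1 - t0) / n_steps
    phi = np.asarray(x0, dtype=float).copy()
    t = t0
    for _ in range(n_steps):
        phi += velocity(phi, t) * dt
        t += dt
    return phi - np.asarray(x0, dtype=float)

points = np.linspace(0.0, 1.0, 5)
print(integrate_displacement(points, t0=0.0, t1=0.25))
```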
Abstract:
This project explores the user costs and benefits of winter road closures. Severe winter weather makes travel unsafe and dramatically increases crash rates. When conditions become unsafe due to winter weather, road closures should allow users to avoid crash costs and eliminate the costs associated with rescuing stranded motorists. The benefits of road closures are therefore the avoided safety costs. The costs of road closures are the delays imposed on motorists and motor carriers who would have made the trip had the road not been closed. This project investigated the costs and benefits of road closures and found that evaluating them is not as simple as it appears. To better understand the costs and benefits of road closures, the project reviews the literature, conducts interviews with shippers and motor carriers, and conducts case studies of road closures to determine what actually occurred on roadways during closures. The project also estimates a statistical model that relates weather severity to crash rates. The model is intended to illustrate the possibility of quantitatively relating measurable and predictable weather conditions to the safety performance of a roadway. In the future, weather conditions such as snowfall intensity, visibility, etc., can be used to make objective measures of the safety performance of a roadway rather than relying on the subjective evaluations of field staff. The review of the literature and the interviews clearly illustrate that not all delays (increased travel time) are valued the same. Expected delays (routine delays) are valued at the generalized costs (value of the driver's time, fuel, insurance, wear and tear on the vehicle, etc.), but unexpected delays are valued much higher because they interrupt synchronous activities at the trip's destination. To reduce the costs of delays resulting from road closures, public agencies should communicate the likelihood of a road closure as early as possible.
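A back-of-the-envelope version of the cost-benefit comparison described above, with every dollar figure invented for illustration: the benefit of a closure is the avoided expected crash cost, and the cost is the delay imposed on would-be travelers, with unexpected delay valued at a premium.

```python
def closure_net_benefit(vehicles, crash_prob, cost_per_crash,
                        delay_hours, value_of_time, unexpected_premium=2.0):
    """Net benefit ($) of closing a road segment during a storm.

    All parameters are hypothetical; 'unexpected_premium' reflects the
    finding that unexpected delays are valued well above routine ones.
    """
    avoided_crash_cost = vehicles * crash_prob * cost_per_crash
    delay_cost = vehicles * delay_hours * value_of_time * unexpected_premium
    return avoided_crash_cost - delay_cost

# Illustrative numbers only: 500 vehicles, 2% crash risk in the storm,
# $150,000 per crash, 4 hours of delay valued at $25/h with a 2x premium.
print(closure_net_benefit(500, 0.02, 150_000, 4, 25))  # 1,500,000 - 100,000 = 1,400,000
```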
Abstract:
A short course designed for adult study groups. Prepared by the Curriculum Task Force of the End-of-Life Care Coalition of Central Iowa.