966 results for Precision

Relevance: 10.00%

Publisher:

Abstract:

The present study was initiated with the aim of assessing the in vivo electrochemical corrosion behaviour of CoCrMo biomedical alloys in human synovial fluids, in an attempt to identify possible patient- or pathology-specific effects. For this, electrochemical measurements (open circuit potential OCP, polarization resistance Rp, potentiodynamic polarization curves, electrochemical impedance spectroscopy EIS) were carried out on fluids extracted from patients with different articular pathologies and prosthesis revisions. These electrochemical measurements could be carried out with outstanding precision and signal stability. The results show that the corrosion behaviour of the CoCrMo alloy in synovial fluids depends not only on material reactivity but also on the specific reactions of synovial fluid components, most likely involving reactive oxygen species. In some patients the latter were found to determine the whole cathodic and anodic electrochemical response. Depending on the patient, corrosion rates varied significantly, between 50 and 750 mg dm⁻² year⁻¹.
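Polarization-resistance measurements of the kind listed above can be converted into corrosion rates via the Stern-Geary relation and Faraday's law. The sketch below is a minimal illustration of that conversion, not the authors' procedure; the Stern-Geary constant `b_volt` and the alloy equivalent weight `eq_weight` are assumed placeholder values.

```python
FARADAY = 96485.0  # C per mole of electrons

def corrosion_rate(rp_ohm_cm2, b_volt=0.026, eq_weight=27.6):
    """Convert polarization resistance Rp to a mass-loss rate in mg dm^-2 year^-1.

    b_volt: assumed Stern-Geary constant (V); eq_weight: assumed equivalent
    weight of the alloy (g per mole of electrons). Both are placeholders.
    """
    i_corr = b_volt / rp_ohm_cm2                 # corrosion current density, A/cm^2
    g_per_cm2_s = i_corr * eq_weight / FARADAY   # Faraday's law: mass per charge
    # g/(cm^2 s) -> mg/(dm^2 year): x1000 mg/g, x100 cm^2/dm^2, x31,536,000 s/year
    return g_per_cm2_s * 1000.0 * 100.0 * 365 * 24 * 3600

# A passive surface with Rp ~ 1e5 ohm*cm^2 lands inside the 50-750 mg dm^-2 year^-1
# range reported in the study.
print(corrosion_rate(1e5))
```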


This study investigated fingermark residues using Fourier transform infrared microscopy (μ-FTIR) in order to obtain fundamental information about the marks' initial composition and aging kinetics. This knowledge would be an asset for fundamental research on fingermarks, such as for dating purposes. Attenuated Total Reflection (ATR) and single-point reflection modes were tested on fresh fingermarks. ATR proved to be better suited and this mode was subsequently selected for further aging studies. Eccrine and sebaceous material was found in fresh and aged fingermarks, and the spectral regions 1000-1850 cm⁻¹ and 2700-3600 cm⁻¹ were identified as the most informative. The impact of substrates (aluminium and glass slides) and storage conditions (storage in the light and in the dark) on fingermark aging was also studied. Chemometric analyses showed that fingermarks could be grouped according to their age regardless of the substrate when they were stored in an open box kept in an air-conditioned laboratory at around 20°C next to a window. In contrast, when fingermarks were stored in the dark, only specimens deposited on the same substrate could be grouped by age. Thus, the substrate appeared to influence aging of fingermarks in the dark. Furthermore, PLS regression analyses were conducted in order to study the possibility of modelling fingermark aging for potential fingermark dating applications. The resulting models showed an overall precision of ±3 days and clearly demonstrated their capability to differentiate older fingermarks (20- and 34-day-old) from newer ones (1-, 3-, 7- and 9-day-old) regardless of the substrate and lighting conditions. These results are promising from a fingermark dating perspective. Further research is required to fully validate such models and assess their robustness and limitations in uncontrolled casework conditions.
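The PLS models above are multivariate, but the core idea of an aging calibration can be sketched with a univariate stand-in: assume a single infrared band whose intensity decays exponentially with age, fit the decay, and invert it to predict age. Everything below (ages replicate the study design, but the decay constant, noise level and intensities are synthetic illustration, not the study's data).

```python
import numpy as np

rng = np.random.default_rng(0)
ages = np.array([1, 3, 7, 9, 20, 34], dtype=float)   # days, as in the study design
k_true = 0.05                                        # assumed decay constant (1/day)

# Synthetic "band intensities" for three replicate marks per age.
x = np.repeat(ages, 3)
intensity = np.exp(-k_true * x) + rng.normal(0.0, 0.003, x.size)

# Calibration: a linear fit of log(intensity) vs age recovers the decay constant.
slope, icept = np.polyfit(x, np.log(intensity), 1)

def predict_age(i):
    # Invert the calibration line to turn an intensity into an age estimate.
    return (np.log(i) - icept) / slope

worst_error = np.max(np.abs(predict_age(intensity) - x))
print(worst_error)   # worst-case dating error, in days
```

With this noise level the worst-case error stays well inside the ±3 day precision reported for the real multivariate models.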


The purpose of this research is to determine the practical profit that can be achieved using neural network methods as a prediction instrument. The thesis investigates the ability of neural networks to forecast future events, tested on the example of price prediction during intraday trading on the stock market. The experiments predict average 1-, 2-, 5- and 10-minute prices based on one day of data, using two different types of forecasting systems: recurrent neural networks and backpropagation neural networks. The precision of the predictions is measured by the absolute error and the market-direction error, and the economic effectiveness is estimated with a dedicated trading system. Finally, the best neural network structures are tested on data from a 31-day interval. The best average percentage profits per transaction (buying + selling) were 0.06668654, 0.188299453, 0.349854787 and 0.453178626, achieved for prediction periods of 1, 2, 5 and 10 minutes respectively. The investigation may be of interest to investors who have access to a fast information channel with every-minute data refreshment.
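The two error measures used in the thesis, absolute error and market-direction error, can be sketched as follows; the price series below is a made-up toy example, not the thesis data.

```python
import numpy as np

def mean_absolute_error(actual, predicted):
    return float(np.mean(np.abs(np.asarray(actual) - np.asarray(predicted))))

def direction_error(actual, predicted):
    """Fraction of steps where the predicted price move (relative to the last
    known price) points in the wrong direction."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    true_move = np.sign(np.diff(actual))            # actual up/down moves
    pred_move = np.sign(predicted[1:] - actual[:-1])  # forecast up/down moves
    return float(np.mean(true_move != pred_move))

prices = [100.0, 101.0, 100.0, 102.0]   # toy 1-minute average prices
preds  = [100.0, 100.5, 101.5, 101.5]   # toy model forecasts of those prices
print(mean_absolute_error(prices, preds))   # 0.625
print(direction_error(prices, preds))       # one of three moves wrong, ~0.333
```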


The theoretical part of the thesis first introduces the concept of location data and services that make use of it. In addition, it examines positioning in wireless local area networks, and in particular positioning in the network used in this thesis. The theoretical part also discusses the benefits and drawbacks of location-aware services, reviews the most common instant messaging architectures in use today, and takes a closer look at the protocol used by the Jabber instant messaging software. Finally, legal issues related to the use of location data and the protection of personal privacy are considered. The practical part of the thesis describes the implementation of a location-aware server component built on the Jabber architecture. The Jabber server software and the implemented component run in a wireless local area network (WLPR.NET) maintained by the Department of Communications Engineering at Lappeenranta University of Technology. Users of the network can register for the service, after which the server component keeps track of the registered users' location data and its changes. In addition, users can look up other users' location data with a search function in the client software. The users' location data is obtained using existing technology.


This thesis presents a new way of providing location-dependent information to the users of wireless networks. The information is delivered to every user without knowing anything about the user's identity. HTTP was chosen as the application-level protocol, which allows the system to deliver information to most users despite the wide variety of terminal devices they use. The system operates as an extension to an intercepting web proxy server. Based on the contents of various databases, the system decides whether or not information is delivered. The system also includes simple software for locating users with the accuracy of a single access point. Although the presented solution aims at providing location-based advertisements, it can easily be adapted to delivering any type of information to the users.


Coronary artery magnetic resonance imaging (MRI) has the potential to provide the cardiologist with relevant diagnostic information on the coronary artery disease of patients. The major challenge of cardiac MRI, though, is dealing with all the sources of motion that can corrupt the images, degrading the diagnostic information they provide. This thesis thus focused on the development of new MRI techniques that change the standard approach to cardiac motion compensation in order to increase the efficiency of cardiovascular MRI, to provide more flexibility and robustness, and to deliver new temporal and tissue information.
The proposed approaches help in advancing coronary magnetic resonance angiography (MRA) in the direction of an easy-to-use and multipurpose tool that can be translated to the clinical environment. The first part of the thesis focused on the study of coronary artery motion through gold-standard imaging techniques (x-ray angiography) in patients, in order to measure the precision with which the coronary arteries assume the same position beat after beat (coronary artery repositioning). We learned that intervals with minimal coronary artery repositioning occur in peak systole and in mid diastole, and we responded with a new pulse sequence (T2-post) that is able to provide peak-systolic imaging. This sequence was tested in healthy volunteers and, from the image quality comparison, we learned that the proposed approach provides coronary artery visualization and contrast-to-noise ratio (CNR) comparable with the standard acquisition approach, but with increased signal-to-noise ratio (SNR). The second part of the thesis explored a completely new paradigm for whole-heart cardiovascular MRI. The proposed technique acquires the data continuously (free-running), instead of being triggered, thus increasing the efficiency of the acquisition and providing four-dimensional images of the whole heart, while respiratory self-navigation allows the scan to be performed in free breathing. This enabling technology allows for anatomical and functional evaluation in four dimensions, with high spatial and temporal resolution and without the need for contrast agent injection. The enabling step is the use of a golden-angle based 3D radial trajectory, which allows for continuous sampling of k-space and a retrospective selection of the timing parameters of the reconstructed dataset.
The free-running 4D acquisition was then combined with a compressed sensing reconstruction algorithm that further increases the temporal resolution of the 4D dataset, while at the same time increasing the overall image quality by removing undersampling artifacts. The obtained 4D images provide visualization of the whole coronary artery tree in each phase of the cardiac cycle and, at the same time, allow for the assessment of cardiac function with a single free-breathing scan. The quality of the coronary arteries provided by the frames of the free-running 4D acquisition is in line with that obtained with the standard ECG-triggered one, and the cardiac function evaluation matched that measured with gold-standard stacks of 2D cine acquisitions. Finally, the last part of the thesis focused on the development of an ultrashort echo time (UTE) acquisition scheme for in vivo detection of calcification in the coronary arteries. Recent studies showed that UTE imaging allows for the visualization of coronary artery plaque calcification ex vivo, since it is able to detect the short-T2 components of the calcification. The motion of the heart, though, has so far prevented this technique from being applied in vivo. An ECG-triggered, self-navigated, 3D radial, triple-echo UTE acquisition was therefore developed and tested in healthy volunteers. The proposed sequence combines a 3D self-navigation approach with a 3D radial UTE acquisition, enabling data collection during free breathing. Three echoes are simultaneously acquired to extract the short-T2 components of the calcification, while a water-fat separation technique allows for proper visualization of the coronary arteries. Even though the results are still preliminary, the proposed sequence showed great potential for the in vivo visualization of coronary artery calcification. In conclusion, the thesis presents three novel MRI approaches aimed at improved characterization and assessment of atherosclerotic coronary artery disease.
These approaches provide new anatomical and functional information in four dimensions, and support tissue characterization for coronary artery plaques.
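The golden-angle 3D radial sampling described above can be illustrated with a common construction: spread the readout-direction endpoints over the sphere by stepping the azimuth by the golden angle while marching uniformly in the polar cosine. This is a generic sketch of the idea, not the exact trajectory used in the thesis.

```python
import numpy as np

GOLDEN_ANGLE = np.pi * (3.0 - np.sqrt(5.0))   # ~137.5 degrees, in radians

def radial_directions(n):
    """Unit vectors for n radial readouts: uniform in cos(polar angle),
    azimuth incremented by the golden angle at each readout."""
    i = np.arange(n)
    z = 1.0 - 2.0 * (i + 0.5) / n          # uniform sampling of cos(theta)
    azimuth = i * GOLDEN_ANGLE
    r = np.sqrt(1.0 - z**2)
    return np.stack([r * np.cos(azimuth), r * np.sin(azimuth), z], axis=1)

dirs = radial_directions(500)
# Every direction is a unit vector; because consecutive readouts never cluster,
# any contiguous subset keeps quasi-uniform angular coverage, which is what
# enables the retrospective selection of the timing parameters.
print(np.allclose(np.linalg.norm(dirs, axis=1), 1.0))   # True
```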


Although a player's specific endurance has a direct influence on performance in tennis, a long-duration intermittent sport, the tests used to assess it do not usually include motor tasks close to real game situations and can be considered of low specificity. The aim of this study is to develop a field test for assessing specific endurance in tennis (Specific Endurance Tennis Test, SET-Test), analysing the behaviour of heart rate (HR) and of technical effectiveness (TE) parameters in order to explore a possible relationship between the two parameters, and between them and competitive performance in competition players. Seven male tennis players took part; they were administered a triangular, progressive, continuous, maximal-intensity test driven by a ball machine, during which HR was recorded together with objective TE parameters (precision and power) calculated as the percentage of hits and errors. An HR deflection point (HRDP) was observed in 86% of the subjects studied, prior to or coinciding with a decrease in TE (technical effectiveness deflection point, TEDP). These two points, measured simultaneously throughout the test, were related to the competitive performance of the players studied. It is concluded that the proposed test appears to be a specific and valid method for evaluating specific endurance and aerobic fitness in tennis players, although further studies are needed to confirm the hypotheses put forward and the external validity of the test.
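Detecting a heart-rate deflection point in an incremental test can be sketched as a two-segment linear fit: try every candidate breakpoint and keep the one that minimizes the residual error. The stage/HR numbers below are invented for illustration, not the study's data.

```python
import numpy as np

def deflection_point(x, y):
    """Return the x at which a two-segment linear fit of y(x) breaks best."""
    best_k, best_sse = None, np.inf
    for k in range(2, len(x) - 2):          # candidate breakpoint indices
        sse = 0.0
        for xs, ys in ((x[:k + 1], y[:k + 1]), (x[k:], y[k:])):
            coeffs = np.polyfit(xs, ys, 1)  # straight line on each segment
            sse += float(np.sum((np.polyval(coeffs, xs) - ys) ** 2))
        if sse < best_sse:
            best_k, best_sse = k, sse
    return x[best_k]

stage = np.arange(1.0, 13.0)                # toy incremental test stages
# HR rises steeply up to stage 7, then flattens (the deflection).
hr = np.where(stage <= 7, 100 + 8 * stage, 156 + 3 * (stage - 7))
print(deflection_point(stage, hr))          # 7.0
```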


The neutron skin thickness of nuclei is a sensitive probe of the nuclear symmetry energy and has multiple implications for nuclear and astrophysical studies. However, precision measurements of this observable are difficult to obtain. The analysis of the experimental data may imply some assumptions about the bulk or surface nature of the formation of the neutron skin. Here we study the bulk or surface character of neutron skins of nuclei following from calculations with Gogny, Skyrme, and covariant nuclear mean-field interactions. These interactions are successful in describing nuclear charge radii and binding energies but predict different values for neutron skins. We perform the study by fitting two-parameter Fermi distributions to the calculated self-consistent neutron and proton densities. We note that the equivalent sharp radius is a more suitable reference quantity than the half-density radius parameter of the Fermi distributions to discern between the bulk and surface contributions in neutron skins. We present calculations for nuclei in the stability valley and for the isotopic chains of Sn and Pb.
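Fitting a two-parameter Fermi (2pF) profile to a density, and extracting the rms and equivalent sharp radii, can be sketched as below. The "data" here are themselves generated from a 2pF with 208Pb-like placeholder parameters, so this only demonstrates the fitting machinery, not a mean-field calculation.

```python
import numpy as np
from scipy.optimize import curve_fit

def fermi2p(r, rho0, C, a):
    """Two-parameter Fermi profile: half-density radius C, diffuseness a."""
    return rho0 / (1.0 + np.exp((r - C) / a))

r = np.linspace(0.0, 15.0, 600)              # radial grid, fm
rho = fermi2p(r, 0.16, 6.7, 0.55)            # assumed 208Pb-like density

popt, _ = curve_fit(fermi2p, r, rho, p0=[0.1, 6.0, 0.5])
rho0_fit, C_fit, a_fit = popt

# rms radius of the fitted profile (the grid spacing cancels in the ratio),
# and the equivalent sharp radius: the uniform sphere with the same rms has
# R_sharp = sqrt(5/3) * r_rms.
rho_fit = fermi2p(r, *popt)
r_rms = np.sqrt(np.sum(r**4 * rho_fit) / np.sum(r**2 * rho_fit))
R_sharp = np.sqrt(5.0 / 3.0) * r_rms
print(C_fit, a_fit, r_rms, R_sharp)
```

Comparing neutron and proton fits of this kind is how the bulk (different C) and surface (different a) contributions to a neutron skin are separated.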


In this work, methods for the multiresidue determination of the series of quinolones included in European regulation in food of animal origin are developed and validated in line with Commission Decision 2002/657/EC in terms of linearity, decision limit, detection capability, precision and stability. Multiresidue methods were established to allow the determination of the quinolones covered by EU legislation 2377/90/EC in muscle of chicken, turkey, pig and cow, plasma of cow and pig, liver of pig and milk of cow. First, an extraction step was optimized and an SPE step was applied to clean up and preconcentrate the quinolones prior to their separation by CE or LC and determination by CE-UV, LC-UV, LC-Fl, and LC-MS with different ion sources (ESI, APCI) and different mass analysers (Q, ToF), as well as LC-ESI-QqQ tandem mass spectrometry. The limits of quantification obtained are always lower than the Maximum Residue Limits (MRL) established by the EU for quinolones in animal products, so the methods can be applied to the control of quinolones in foodstuffs of animal origin. Finally, the proposed methods were applied to determine quinolones in samples of turkey and pig muscle, pig plasma and cow milk. Excellent quality parameters and reduced analysis time were obtained when LC-ESI-MS/MS was used, although the other techniques also gave satisfactory results.


This study shows how a new generation of terrestrial laser scanners can be used to investigate glacier surface ablation and other elements of glacial hydrodynamics at exceptionally high spatial and temporal resolution. The study area is an Alpine valley glacier, Haut Glacier d'Arolla, Switzerland. Here we use an ultra-long-range lidar RIEGL VZ-6000 scanner, having a laser specifically designed for measurement of snow- and ice-cover surfaces. We focus on two timescales: seasonal and daily. Our results show that a near-infrared scanning laser system can provide high-precision elevation change and ablation data from long ranges, and over relatively large sections of the glacier surface. We use it to quantify spatial variations in the patterns of surface melt at the seasonal scale, as controlled by both aspect and differential debris cover. At the daily scale, we quantify the effects of ogive-related differences in ice surface debris content on spatial patterns of ablation. Daily scale measurements point to possible hydraulic jacking of the glacier associated with short-term water pressure rises. This latter demonstration shows that this type of lidar may be used to address subglacial hydrologic questions, in addition to motion and ablation measurements.
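Once repeated lidar scans are gridded, surface ablation reduces to differencing digital elevation models and normalizing by the time interval. A minimal sketch with toy 2D grids (not the RIEGL data); the insulating debris patch is an invented illustration:

```python
import numpy as np

def ablation_rate(dem_early, dem_late, days):
    """Elevation change per day (m/day); negative values mean surface lowering."""
    return (np.asarray(dem_late) - np.asarray(dem_early)) / days

# Toy 3x3 DEMs one week apart: the debris-covered centre cell melts less.
dem_t0 = np.full((3, 3), 2900.0)        # glacier surface elevation, m
dem_t1 = dem_t0 - 0.35                  # a week of clean-ice melt
dem_t1[1, 1] = dem_t0[1, 1] - 0.10      # assumed insulating debris patch
rate = ablation_rate(dem_t0, dem_t1, days=7.0)
print(rate)
```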


A precise determination of the neutron skin thickness of a heavy nucleus sets a basic constraint on the nuclear symmetry energy (the neutron skin thickness is the difference between the neutron and proton rms radii of the nucleus). The parity radius experiment (PREX) may achieve it by electroweak parity-violating electron scattering (PVES) on 208Pb. We investigate PVES in the nuclear mean-field approach to allow accurate extraction of the neutron skin thickness of 208Pb from the parity-violating asymmetry probed in the experiment. We demonstrate a high linear correlation between the parity-violating asymmetry and the neutron skin thickness in successful mean-field forces, as the best means to constrain the neutron skin of 208Pb from PREX without assumptions on the shape of the neutron density. Continuation of the experiment with higher precision in the parity-violating asymmetry is motivated, since the present method can then constrain the density slope of the nuclear symmetry energy to new accuracy.
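The strategy above, calibrating a linear relation between parity-violating asymmetry and neutron skin across mean-field forces and then inverting it for a measured asymmetry, can be sketched numerically. The (asymmetry, skin) pairs below are invented placeholders, not the published correlation:

```python
import numpy as np

# Hypothetical (asymmetry, neutron skin in fm) pairs for a set of mean-field
# forces, showing the near-perfect anticorrelation reported in such studies.
asym = np.array([0.700, 0.715, 0.730, 0.745, 0.760])
skin = np.array([0.280, 0.245, 0.210, 0.178, 0.142])

r = np.corrcoef(asym, skin)[0, 1]          # Pearson correlation coefficient
slope, intercept = np.polyfit(asym, skin, 1)

def skin_from_asymmetry(a_measured):
    # Invert the calibration line: a measured asymmetry gives a skin estimate.
    return slope * a_measured + intercept

print(r, skin_from_asymmetry(0.7225))
```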


The present study proposes a method based on ski-fixed inertial sensors to automatically compute spatio-temporal parameters (phase durations, cycle speed and cycle length) for the diagonal stride in classical cross-country skiing. The proposed system was validated against a marker-based motion capture system during indoor treadmill skiing. The skiing movement of 10 junior to world-cup athletes was measured for four different conditions. The accuracy (i.e. median error) and precision (i.e. interquartile range of the error) of the system were below 6 ms for cycle duration and ski thrust duration, and below 35 ms for pole push duration. Cycle speed precision (accuracy) was below 0.1 m/s (0.005 m/s) and cycle length precision (accuracy) was below 0.15 m (0.005 m). The system was sensitive to changes of conditions and was accurate enough to detect significant differences reported in previous studies. Since the capture volume is not limited and the setup is simple, the system would be well suited for outdoor measurements on snow.
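The accuracy/precision definitions used above (accuracy as the median error, precision as the interquartile range of the error) are straightforward to compute; the cycle-duration numbers below are a toy example, not the study's measurements.

```python
import numpy as np

def accuracy_precision(reference, measured):
    """Accuracy = median error; precision = interquartile range of the error,
    following the definitions used in the study."""
    err = np.asarray(measured, dtype=float) - np.asarray(reference, dtype=float)
    accuracy = float(np.median(err))
    q75, q25 = np.percentile(err, [75, 25])
    return accuracy, float(q75 - q25)

# Toy cycle durations (s): inertial system vs motion-capture reference.
ref  = [1.40, 1.42, 1.38, 1.45, 1.41, 1.39, 1.43]
meas = [1.402, 1.423, 1.379, 1.452, 1.413, 1.388, 1.431]
acc, prec = accuracy_precision(ref, meas)
print(acc, prec)   # both well below the 6 ms reported for cycle duration
```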


Nowadays, Species Distribution Models (SDMs) are a widely used tool. Using different statistical approaches, these models reconstruct the realized niche of a species from presence data and a set of variables, often topoclimatic. Their range of applications is quite large, from understanding the requirements of a single species, to the creation of nature reserves based on species hotspots, to modelling the impact of climate change. Most of the time these models use variables at a resolution of 50 km x 50 km or 1 km x 1 km. In some cases, however, they are used with resolutions below the kilometre scale and are then called high resolution models (100 m x 100 m or 25 m x 25 m). Quite recently a new kind of data has emerged enabling precision up to 1 m x 1 m and thus allowing very high resolution modelling. These new variables are very costly, however, and require a considerable amount of time to process, especially when they are used in complex calculations such as model projections over large areas. Moreover, the importance of very high resolution data in SDMs has not yet been assessed and is not well understood; some basic knowledge of what drives species presences and absences is still missing. Indeed, it is not clear whether, in mountain areas like the Alps, coarse topoclimatic gradients drive species distributions, whether fine-scale temperature or topography is more important, or whether their importance can be neglected when balanced against competition or stochasticity. In this thesis I investigated the importance of very high resolution data (2-5 m) in species distribution models using very high resolution topographic, climatic and edaphic variables over a 2000 m elevation gradient in the Western Swiss Alps. I also investigated more local responses of these variables for a subset of species living in this area at two specific elevation belts. During this thesis I showed that high resolution data require very good datasets (species and variables for the models) to produce satisfactory results. Indeed, in mountain areas, temperature is the most important factor driving species distributions and needs to be modelled at very fine resolution, instead of being interpolated over large surfaces, to produce satisfactory results. Despite the intuitive idea that topography should be very important at high resolution, the results are mixed, since looking at the importance of variables over a large gradient buffers their importance. Topographic factors were shown to be highly important at the subalpine level, but their importance decreases at lower elevations: whereas at the montane level edaphic and land use factors are more important, high resolution topographic data matter more at the subalpine level. Finally, the biggest improvement in the models comes when edaphic variables are added. Adding soil variables is of high importance, and variables like pH surpass the usual topographic variables in SDMs in terms of importance in the models. To conclude, high resolution is very important in modelling but requires very good datasets. Only increasing the resolution of the usual topoclimatic predictors is not sufficient, and the use of edaphic predictors was highlighted as fundamental to produce significantly better models. This is of primary importance, especially if these models are used to reconstruct communities or as a basis for biodiversity assessments.
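In its simplest form an SDM is a presence/absence classifier over environmental predictors. The sketch below fits a logistic regression by gradient descent to synthetic presence data driven by elevation and a soil pH term, echoing the finding that adding an edaphic predictor carries real signal; all data are simulated, and the coefficients and thresholds are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 400
elev = rng.uniform(400.0, 2400.0, n)     # m a.s.l.
ph = rng.uniform(4.0, 8.0, n)            # soil pH

# Simulated species: favours mid elevations and acidic soils.
logit = 2.0 - ((elev - 1600.0) / 400.0) ** 2 - 1.5 * (ph - 5.0)
presence = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

def fit_logistic(X, y, lr=0.1, steps=3000):
    """Plain gradient-descent logistic regression (intercept added internally)."""
    X = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

# Standardize predictors so plain gradient descent behaves well.
Z = np.column_stack([(elev - elev.mean()) / elev.std(),
                     (ph - ph.mean()) / ph.std()])
w = fit_logistic(Z, presence)
p_hat = 1.0 / (1.0 + np.exp(-np.column_stack([np.ones(n), Z]) @ w))
# Predicted suitability should be higher at presence sites than at absences.
print(p_hat[presence == 1].mean(), p_hat[presence == 0].mean())
```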


The ancient temple dedicated to the Roman Emperor Augustus on the hilltop of Tarraco (today's Tarragona) was the main element of the sacred precinct of the Imperial cult. It was a two-hectare square, bordered by a portico with an attic decorated with a sequence of clypei (monumental shields) made with marble plates from the Luni-Carrara quarries. This contribution presents the results of the analysis of a three-dimensional photogrammetric survey of one of these clypei, partially restored and exhibited at the National Archaeological Museum of Tarragona. The perimeter ring was bounded by a sequence of meanders inscribed in a polygon of 11 sides, a hendecagon. Moreover, a closer geometric analysis suggests that the relationship between the outer meander rim and the oval pearl ring that delimited the divinity of Jupiter Ammon can be accurately determined by the diagonals of an octagon inscribed in the perimeter of the clypeus. This double evidence suggests a combined layout, in the same design, of an octagon and a hendecagon. Hypothetically, this could be achieved by combining the octagon with the approximation to Pi used in antiquity: the circumference taken as 22/7 of the circle's diameter. This method allows the drawing of a hendecagon with clearly higher precision than other ancient methods. Even the modelling of the motifs that separate the different decorative stripes corroborates the geometric scheme that we propose.
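One reading of the 22/7 construction is that the circumference, taken as 22/7 of the diameter, is divided into 11 equal arcs, so each hendecagon side comes out as 2/7 of the diameter. That reading (ours, not the paper's own computation) can be checked numerically against the exact chord:

```python
import math

def hendecagon_side_exact(diameter):
    # Exact chord of a regular 11-gon inscribed in a circle of this diameter.
    return diameter * math.sin(math.pi / 11)

def hendecagon_side_22_7(diameter):
    # Circumference approximated as (22/7) * d, divided into 11 equal parts;
    # using the arc length as the chord gives side = 2d/7.
    return (22.0 / 7.0) * diameter / 11.0

d = 1.0
exact = hendecagon_side_exact(d)
approx = hendecagon_side_22_7(d)
rel_error = (approx - exact) / exact
print(exact, approx, rel_error)   # relative error ~1.4 %
```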


Owing to recent advances in genomic technologies, personalized oncology is poised to fundamentally alter cancer therapy. In this paradigm, the mutational and transcriptional profiles of tumors are assessed, and personalized treatments are designed based on the specific molecular abnormalities relevant to each patient's cancer. To date, such approaches have yielded impressive clinical responses in some patients. However, a major limitation of this strategy has also been revealed: the vast majority of tumor mutations are not targetable by current pharmacological approaches. Immunotherapy offers a promising alternative to exploit tumor mutations as targets for clinical intervention. Mutated proteins can give rise to novel antigens (called neoantigens) that are recognized with high specificity by patient T cells. Indeed, neoantigen-specific T cells have been shown to underlie clinical responses to many standard treatments and immunotherapeutic interventions. Moreover, studies in mouse models targeting neoantigens, and early results from clinical trials, have established proof of concept for personalized immunotherapies targeting next-generation sequencing identified neoantigens. Here, we review basic immunological principles related to T-cell recognition of neoantigens, and we examine recent studies that use genomic data to design personalized immunotherapies. We discuss the opportunities and challenges that lie ahead on the road to improving patient outcomes by incorporating immunotherapy into the paradigm of personalized oncology.