922 results for Elements, Elettrofisiologia, Acquisizione Real Time, Analisi Real Time, High Throughput Data
Abstract:
In recent years, a growing number of researchers have focused on developing strategies to characterize the ADMET properties of drug candidates as early as possible. This trend stems from the awareness that roughly half of all drug candidates never reach the market because of shortcomings in their ADME profile, and that at least half of the molecules that do reach the market still present some toxicological or ADME liability [1]. Indeed, no matter how active or specific a molecule may be, to become a drug it must be well absorbed, distributed throughout the body, metabolized neither too quickly nor too slowly, and completely eliminated; moreover, the molecule and its metabolites should not be toxic to the organism. A rapid determination of ADMET parameters in the early stages of drug development therefore saves time and money, making it possible to select the most promising compounds immediately and to discard those with unfavorable characteristics. This thesis is set in this context and demonstrates the application of a simple technique, biochromatography, to rapidly characterize the binding of compound libraries to human serum albumin (HSA). It also shows the use of an independent technique, circular dichroism, which allows the same drug-protein systems to be studied in solution, providing additional information on the stereochemistry of the binding process. HSA is the most abundant protein in blood. It acts as a carrier for a large number of molecules, both endogenous (e.g., bilirubin, thyroxine, steroid hormones, fatty acids) and xenobiotic, and it increases the solubility of poorly water-soluble lipophilic molecules such as the taxanes. Binding to HSA is generally stereoselective and occurs at high-affinity binding sites. It is also well known that competition between drugs, or between a drug and endogenous metabolites, can significantly alter their free fraction, modifying their activity and toxicity. Through these properties, HSA can influence both the pharmacokinetic and the pharmacodynamic behavior of drugs. It is not unusual for an entire drug development project to be abandoned because of an excessively high affinity for HSA, a half-life that is too short, or poor distribution due to weak HSA binding. From a pharmacokinetic standpoint, HSA is therefore the most important plasma transport protein. A large number of publications demonstrate the reliability of the biochromatographic technique for studying biorecognition phenomena between proteins and small molecules [2-6]. My work focused mainly on the use of biochromatography to evaluate the HSA-binding characteristics of several series of compounds of pharmaceutical interest, and on the improvement of the technique itself. To gain a better understanding of the binding mechanisms of the molecules studied, the same drug-HSA systems were also investigated by circular dichroism (CD). Initially, HSA was immobilized on a packed epoxy-silica column (50 x 4.6 mm i.d.), following a procedure previously reported in the literature [7], with minor modifications.
Briefly, immobilization was carried out by recirculating an HSA solution, at defined pH and ionic strength, through a pre-packed column. The column was then characterized in terms of the amount of correctly immobilized protein by frontal analysis of L-tryptophan [8]. Next, racemic solutions of molecules known to bind HSA enantioselectively were injected to verify that the immobilization procedure had not altered the binding properties of the protein. Once characterized, the column was used to determine the binding percentage of a small series of HIV protease inhibitors (PIs) and to identify their binding site(s). The binding percentage was calculated from the retention factor (k) of the samples. The value of this parameter in purely aqueous mobile phase was obtained by linear extrapolation of the plot of log k versus the percentage (v/v) of 1-propanol in the mobile phase; only for two of the five compounds analyzed could k be measured directly in the absence of organic solvent. All the PIs analyzed showed a high percentage of binding to HSA; in particular, the values for ritonavir, lopinavir and saquinavir exceeded 95%. These results agree with literature data obtained with an optical biosensor [9], and they are consistent with the significant reduction of the inhibitory activity of these compounds observed in the presence of HSA, a reduction that appears greater for the compounds that bind the protein more strongly [10]. Competition studies were then performed by zonal chromatography, in which a solution of a competitor at known concentration is used as the mobile phase while small amounts of analyte are injected onto the HSA column. The competitors were selected for their selective binding to one of the main binding sites of the protein: sodium salicylate, ibuprofen and sodium valproate were used as markers of site I, site II and the bilirubin site, respectively. These studies showed independent binding of the PIs at sites I and II, whereas weak anticooperativity was observed for the bilirubin site. The same drug-protein system was finally investigated in solution by circular dichroism: the variation of the induced CD signal of an equimolar [HSA]/[bilirubin] complex was monitored upon addition of aliquots of ritonavir, chosen as representative of the series. The results confirm the slight anticooperativity at the bilirubin site previously observed in the biochromatographic studies. The same protocol was subsequently applied to a monolithic epoxy-silica column (50 x 4.6 mm) to assess the reliability of the monolithic support for biochromatographic applications. The monolithic support showed good chromatographic characteristics in terms of backpressure, efficiency and stability, as well as reliability in the determination of HSA binding parameters.
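As an illustration of the retention-based estimate described above, the sketch below extrapolates log k to a purely aqueous mobile phase and converts the retention factor into a binding percentage using %bound = 100·k/(1+k), a convention commonly used in affinity chromatography; the data points are hypothetical placeholders, not values from the thesis.

```python
import numpy as np

# Hypothetical retention factors measured at increasing 1-propanol content (v/v %).
propanol_pct = np.array([1.0, 2.0, 3.0, 4.0])     # % 1-propanol in the mobile phase
k_measured   = np.array([28.0, 17.0, 10.5, 6.3])  # retention factors (placeholders)

# log k is assumed linear in the organic-modifier percentage, as in the thesis.
slope, intercept = np.polyfit(propanol_pct, np.log10(k_measured), 1)

# Extrapolate to 0% organic solvent to recover k in purely aqueous conditions.
k_aqueous = 10 ** intercept

# One common convention: the fraction bound on the column is k / (1 + k).
percent_bound = 100.0 * k_aqueous / (1.0 + k_aqueous)

print(f"extrapolated k (0% 1-propanol): {k_aqueous:.1f}")
print(f"estimated HSA binding: {percent_bound:.1f}%")
```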
This column was used to determine the HSA binding percentage of a series of polyaminoquinones developed within a research project on Alzheimer's disease. All the compounds showed a binding percentage above 95%, and a correlation was observed between binding percentage and the characteristics of the side chain (length and number of amino groups). Competition studies on these compounds were then carried out by circular dichroism, revealing an anticooperative effect of the polyaminoquinones at sites I and II, whereas binding with respect to the bilirubin site proved independent. The know-how acquired with the monolithic support described above was then applied to a shorter epoxy-silica column (10 x 4.6 mm). The method used in the previous studies to determine the binding percentage relies on data from several experiments, so considerable time is needed to obtain the final result. A shorter column reduces analyte retention times, making the determination of HSA binding much faster and turning a medium-throughput analysis into high-throughput screening (HTS). Moreover, the reduction in analysis time makes it possible to avoid organic solvents in the mobile phase. After characterizing the 10 mm column with the same method described for the other columns, a series of standards was injected at different mobile-phase flow rates to evaluate the possibility of using high flow rates. The column was then employed to estimate the binding percentage of a series of molecules with different chemical characteristics. The possibility of using such a short column for competition studies was also evaluated, and the binding of a series of compounds to site I was investigated. Finally, the stability of the column after extensive use was assessed. The use of chromatographic supports functionalized with albumins of different origin (rat, dog, guinea pig, hamster, mouse, rabbit) can be proposed as a future application of these HTS columns: information on the binding of drug candidates to the different albumins would allow a better comparison between in vitro data and animal data, facilitating the subsequent extrapolation to humans with the speed of an HTS method, while also reducing the number of animals used in experiments. Several published studies demonstrate the reliability of columns functionalized with albumins of different origin [11-13]; the use of shorter columns could broaden their applications.
Abstract:
Background Abstractor training is a key element in creating valid and reliable data collection procedures. The choice between in-person and remote, or simultaneous and sequential, abstractor training has considerable consequences for time and resource utilization. We conducted a web-based (webinar) abstractor training session to standardize training across six individual Cancer Research Network (CRN) sites for a study of breast cancer treatment effects in older women (BOWII). The goals of this manuscript are to describe the training session, its participants, and the participants' evaluation of webinar technology for abstraction training. Findings A webinar was held for all six sites with the primary purpose of simultaneously training staff and ensuring consistent abstraction across sites. The training session involved sequential review of over 600 data elements outlined in the coding manual, in conjunction with the display of data entry fields in the study's electronic data collection system. Post-training evaluation was conducted via Survey Monkey©. Inter-rater reliability measures for abstractors within each site were conducted three months after the commencement of data collection. Ten of the 16 people who participated in the training completed the online survey. Almost all (90%) of the 10 trainees had previous medical record abstraction experience, and nearly two-thirds reported over 10 years of experience. Half of the respondents had previously participated in a webinar, of whom three had participated in a webinar for training purposes. All rated the knowledge and information delivered through the webinar as useful and reported that it adequately prepared them for data collection. Moreover, all participants would recommend this platform for multi-site abstraction training. Consistent with participant-reported training effectiveness, within-site inter-rater agreement for data collection ranged from 89 to 98%, with a weighted average of 95% agreement across sites. Conclusions Conducting training via web-based technology was an acceptable and effective approach to standardizing medical record review across multiple sites for this group of experienced abstractors. Given the substantial time and cost savings achieved with the webinar, coupled with participants' positive evaluation of the training session, researchers should consider this instructional method as part of training efforts to ensure high-quality data collection in multi-site studies.
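For concreteness, the weighted average agreement reported above can be reproduced with a calculation of the following form; the per-site agreement rates and record counts here are hypothetical stand-ins, not the BOWII data.

```python
# Hypothetical per-site inter-rater agreement and number of double-abstracted records.
site_agreement = {"A": 0.89, "B": 0.93, "C": 0.95, "D": 0.96, "E": 0.97, "F": 0.98}
site_n_records = {"A": 40,   "B": 55,  "C": 60,  "D": 35,  "E": 50,  "F": 45}

# Weight each site's agreement by the number of records it contributed.
total = sum(site_n_records.values())
weighted_avg = sum(site_agreement[s] * site_n_records[s] for s in site_agreement) / total

print(f"weighted average agreement: {100 * weighted_avg:.0f}%")
```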
Abstract:
In this dissertation, I discovered that the function of TRIM24 as a co-activator of ERα-mediated transcriptional activation depends on specific histone modifications in tumorigenic human breast cancer-derived MCF7 cells. In the first part, I showed that the TRIM24 PHD finger domain, which recognizes unmethylated histone H3 lysine 4 (H3K4me0), is critical for ERα-regulated transcription. Accordingly, when LSD1-mediated demethylation of H3K4 is inhibited, activation of TRIM24-regulated ERα target genes is greatly impaired. Importantly, I demonstrated that TRIM24 and LSD1 are cyclically recruited to estrogen response elements (EREs) in a time-dependent manner upon estrogen induction, and that depletion of either protein exerts a corresponding time-dependent effect on target gene activation. I also found that phosphorylation of histone H3 threonine 6 prevents TRIM24 from binding chromatin and from activating ERα-regulated targets. In the second part, I revealed that TRIM24 depletion has an additive effect with LSD1 inhibitor- and tamoxifen-mediated reductions in the survival and proliferation of breast cancer cells.
Abstract:
Alpine glacier samples were collected in four contrasting regions to measure the geochemical composition of supraglacial dust and debris. A total of 70 surface glacier ice, snow and debris samples were collected in 2009 and 2010 in Svalbard, Norway, Nepal and New Zealand. Trace element abundances in snow and ice samples were measured via inductively coupled plasma mass spectrometry (ICP-MS). Supraglacial debris mineral, bulk oxide and trace element compositions were determined via X-ray diffraction (XRD) and X-ray fluorescence spectroscopy (XRF). A total of 45 element and 10 oxide compound abundances are reported. Uniform data collection procedures, analytical measurement methods and geochemical comparison techniques are used to evaluate the variability of supraglacial dust and debris composition across the contrasting glacier study regions. Elemental abundances revealed sea salt aerosol and metal enrichment in Svalbard, low levels of crustal dust and marine influences in southern Norway, high crustal dust and anthropogenic enrichment in the Khumbu Himalaya, and sulfur and metals attributed to quiescent degassing and volcanic activity in northern New Zealand. Rare earth element and Al/Ti elemental ratios demonstrated distinct provenance of particulates in each study region, and Ca/S elemental ratio data showed seasonal denudation in Svalbard and Norway. Ablation-season atmospheric particulate transport trajectories were mapped in each of the study regions and suggest provenance pathways. The in situ data presented provide a first-order picture of glacier-surface geochemical variability as measured in four diverse alpine glacier regions. These surface-glacier geochemical data are relevant to understanding glaciologic ablation rates as well as to satellite atmospheric and land-surface mapping techniques currently in development.
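Metal enrichment of the kind reported here is typically screened with crustal enrichment factors, EF = (X/Al)_sample / (X/Al)_crust; the sketch below shows that calculation with placeholder concentrations and rounded, illustrative crustal reference values (the study does not publish this code).

```python
# Illustrative enrichment-factor calculation: EF = (X/Al)_sample / (X/Al)_crust.
# Concentrations are placeholders (ppm); crustal values are rounded upper-crust
# estimates used only for illustration.
crust  = {"Al": 80400.0, "Pb": 17.0, "Zn": 67.0, "S": 620.0}
sample = {"Al": 1200.0,  "Pb": 2.1,  "Zn": 9.5,  "S": 310.0}  # hypothetical snow sample

for element in ("Pb", "Zn", "S"):
    ef = (sample[element] / sample["Al"]) / (crust[element] / crust["Al"])
    origin = "enriched (non-crustal source likely)" if ef > 10 else "crustal dust"
    print(f"{element}: EF = {ef:.1f} -> {origin}")
```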
Abstract:
The 50 km-long West Valley segment of the northern Juan de Fuca Ridge is a young, extension-dominated spreading centre, with volcanic activity concentrated in its southern half. A suite of basalts dredged from the West Valley floor, the adjacent Heck Seamount chain, and a small near-axis cone here named Southwest Seamount includes a spectrum of geochemical compositions ranging from highly depleted normal (N-) MORB to enriched (E-) MORB. Heck Seamount lavas have chondrite-normalized La/Sm ratios of ~0.3, 87Sr/86Sr = 0.70235 - 0.70242, and 206Pb/204Pb = 18.22 - 18.44, requiring a source which is highly depleted in trace elements both at the time of melt generation and over geologic time. The E-MORB from Southwest Seamount have chondrite-normalized La/Sm ratios of ~1.8, 87Sr/86Sr = 0.70245 - 0.70260, and 206Pb/204Pb = 18.73 - 19.15, indicating a more enriched source. Basalts from the West Valley floor have chemical compositions intermediate between these two end-members. As a group, West Valley basalts form a two-component mixing array in element-element and element-isotope plots which is best explained by magma mixing. Evidence for crustal-level magma mixing in some basalts includes mineral-melt chemical and isotopic disequilibrium, but mixing of melts at depth (within the mantle) may also occur. The mantle beneath the northern Juan de Fuca Ridge is modelled as a plum-pudding, with "plums" of enriched, amphibole-bearing peridotite floating in a depleted matrix (DM). Low degrees of melting preferentially melt the "plums", initially removing only the amphibole component and producing alkaline to transitional E-MORB. Higher degrees of melting tap both the "plums" and the depleted matrix to yield N-MORB. The subtly different isotopic compositions of the E-MORBs compared to the N-MORBs require that any enriched component in the upper mantle was derived from a depleted source. If the enriched component crystallized from fluids with a DM source, the "plums" could have evolved to their more radiogenic isotopic composition over a period of 1.5-2.0 Ga. Alternatively, the enriched component could have formed recently from fluids with a less-depleted source than DM, such as subducted oceanic crust. A third possibility is that enriched material might be dispersed as "plums" throughout the upper mantle, transported from depth by mantle plumes.
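The two-component mixing array can be illustrated with the standard binary isotope-mixing relation, in which the mixture's 87Sr/86Sr is the Sr-concentration-weighted average of the end-members; the end-member isotope ratios below are taken from the ranges quoted in the abstract, while the Sr concentrations are hypothetical.

```python
import numpy as np

# End-member 87Sr/86Sr taken from the abstract; Sr concentrations (ppm) are
# hypothetical values for illustration only.
r_dep, sr_dep = 0.70238, 90.0    # depleted N-MORB end-member (Heck Seamount)
r_enr, sr_enr = 0.70255, 180.0   # enriched E-MORB end-member (Southwest Seamount)

f = np.linspace(0.0, 1.0, 6)     # mass fraction of the enriched component

# Binary mixing: the isotope ratio of the mixture is weighted by Sr concentration.
sr_mix = f * sr_enr + (1 - f) * sr_dep
r_mix = (f * sr_enr * r_enr + (1 - f) * sr_dep * r_dep) / sr_mix

for fi, ri in zip(f, r_mix):
    print(f"f_enriched = {fi:.1f} -> 87Sr/86Sr = {ri:.5f}")
```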
Abstract:
A technique for onsite application of X-ray fluorescence (XRF) spectrometry to samples from sediment cores aboard a research vessel was developed and tested. The method is sufficiently simple, precise, and fast to be used routinely for high-resolution analyses of depth profiles as well as surface samples. Analyses were performed with the compact high-performance energy-dispersive polarisation X-ray fluorescence (EDPXRF) analyser Spectro Xepos. Contents of the elements Si, Ti, Al, Fe, Mn, Mg, Ca, K, Sr, Ba, Rb, Cu, Ni, Zn, P, S, Cl and Br were determined simultaneously on 200-225 samples of each core within 24 h of recovery. This study presents a description of the shipboard preparation and analysis technique employed, along with some example data. We show land-based datasets that support our decisions to use powder samples and to reduce the original measuring time for onboard analyses, and we demonstrate how well the results of shipboard measurements for the various elements compare with the land-based findings. The onboard geochemical data enabled us to establish an element stratigraphy during the cruise itself; correlation of iron, calcium and silicon enrichment trends with an older reference core provided an age model for the newly retrieved cores. The Spectro Xepos instrument performed without any analytical or technical difficulties that might have been caused by rough weather conditions or the continuous movement and vibration of the research vessel. To date, this XRF technique has been applied during three RV Meteor cruises to approximately 5,000 Late Quaternary sediment samples from altogether 23 gravity cores, 25 multicorer cores and two box cores from the eastern South Atlantic off South Africa/Namibia and the eastern Atlantic off NW Africa.
Abstract:
Laminated sediment records from the oxygen minimum zone in the Arabian Sea offer unique ultrahigh-resolution archives for deciphering climate variability in the Arabian Sea region. Although numerous analytical techniques are available, it has become increasingly popular during the past decade to analyze relative variations in sediment cores' chemical signature by non-destructive X-ray fluorescence (XRF) core scanning. We carefully selected an approximately 5 m long sediment core from the northern Arabian Sea (GeoB12309-5: 24°52.3' N; 62°59.9' E, 956 m water depth) for a detailed comparative study of high-resolution techniques, namely non-destructive XRF core scanning (0.8 mm resolution) and ICP-MS/OES analysis of carefully selected discrete samples (1 mm resolution). The aim of our study was to define more precisely which chemical elements can be accurately analyzed and which elemental ratios can be interpreted down to sub-millimeter-scale resolution. Applying Student's t-test, our results show significantly correlated (1% significance level) elemental patterns for S, Ca, Fe, Zr, Rb, and Sr, as well as for the K/Ca, Fe/Ti and Ti/Al ratios, all of which are related to distinct lithological changes. After careful consideration of all errors in the ICP analysis, we further quantify, using Chi-square tests, the factors by which the XRF core scanner software underestimates errors, which is especially relevant for elements with high count rates. As demonstrated by these new ultra-high-resolution data, core scanning has major advantages (high speed, low cost, few sample preparation steps) and represents an increasingly attractive alternative to the time-consuming, expensive, laborious, and destructive wet-chemical analyses (e.g., by ICP-MS/OES after acid digestion), while also providing high-quality data at unprecedented resolution.
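A comparison of the kind described, testing whether down-core XRF count profiles correlate significantly with ICP concentrations, can be sketched as below; the profiles are synthetic placeholders and scipy's pearsonr stands in for the study's statistical workflow.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Synthetic down-core profiles: ICP-measured Ca (wt%) and XRF Ca counts that
# track it with instrumental noise (placeholders for the GeoB12309-5 data).
depth = np.arange(0, 500, 1)                              # sample index down-core
ca_icp = 5 + 2 * np.sin(depth / 40) + rng.normal(0, 0.3, depth.size)
ca_xrf = 1e4 * ca_icp + rng.normal(0, 3e3, depth.size)    # counts

r, p = pearsonr(ca_icp, ca_xrf)
print(f"Pearson r = {r:.3f}, p = {p:.2e}")
print("significant at the 1% level" if p < 0.01 else "not significant")
```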
Abstract:
Small-scale shear zones are present in drillcore samples of abyssal peridotites from the Mid-Atlantic Ridge at 15°20'N (Ocean Drilling Program Leg 209). The shear zones act as pathways for both evolved melts and hydrothermal fluids. We examined serpentinites directly adjacent to such zones to evaluate chemical changes resulting from melt-rock and fluid-rock interaction and their influence on the mineralogy. Compared to fresh harzburgite and melt-unaffected serpentinites, serpentinites adjacent to melt-bearing veins show a marked enrichment in rare earth elements (REE), strontium, and the high field strength elements (HFSE) zirconium and niobium. From comparison with published chemical data on variably serpentinized, melt-unaffected harzburgites, one possible interpretation is that interaction with the adjacent melt veins caused the enrichment in HFSE, whereas the REE contents might also have been enriched by hydrothermal processes. Enrichment in alumina during serpentinization is corroborated by reaction path models for the interaction of seawater with harzburgite-plagiogranite mixtures. These models explain both the increased amounts of alumina in the serpentinizing fluid as the proportion of plagiogranitic material mixed with the harzburgite increases, and the absence of brucite from the secondary mineralogy due to elevated silica activity. By destabilizing brucite, nearby melt veins might fundamentally influence the low-temperature alteration behaviour of serpentinites. Although observations and model results are in general agreement, in the absence of an unaltered protolith a quantification of element transport during serpentinization is not straightforward.
Abstract:
The application of Geographic Information Systems (GIS) to archaeology, or to other humanities disciplines, is nothing new. What is new is their evolution towards distributed, interoperable systems and towards structures with policies for the shared and coordinated use of data, all aspects addressed by Spatial Data Infrastructures (SDI). INSPIRE is the leading European initiative and legal framework in this field. Archaeological methodology gathers and generates large amounts of data, whose attributes or intrinsic characteristics include position and time, aspects traditionally exploited by GIS. The data are catalogued, organized, maintained, shared and published, and potential consumers are beginning to have them at their disposal. All this information, traditionally stored on record cards and later in relational alphanumeric databases, can in many cases be considered «metadata», since it contains information useful to other users in the processes of data discovery and exploitation. These data are also usually accompanied by information about themselves, describing their specifications, quality, and so on. We use metadata every day: the bibliographic record of a book, or the specifications of a computer. Metadata can be defined as «descriptive information about the context, quality, condition and characteristics of a resource, datum or object, whose purpose is to facilitate its retrieval, identification, evaluation, preservation and/or interoperability». In Spain there is an initiative to standardize the description of metadata for geospatial datasets: the Núcleo Español de Metadatos (NEM, Spanish Metadata Nucleus). The NEM contains elements for describing the particular characteristics of geographic data, and includes all the mandatory items of the ISO 19115 standard and of the Dublin Core metadata set, traditionally used in library science. Aware of the need for metadata to optimize the search and retrieval of data, this work aims to formalize the documentation of archaeological data using the NEM, thereby achieving interoperability of archaeological information.
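As a minimal illustration of the kind of record involved, the sketch below describes a hypothetical archaeological dataset using Dublin Core elements, one of the vocabularies the NEM builds on; the field values are invented and the record is simplified with respect to a full NEM/ISO 19115 document.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical, simplified metadata record using Dublin Core elements. -->
<metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
  <dc:title>Excavation dataset, sector B (illustrative example)</dc:title>
  <dc:creator>Example Archaeology Team</dc:creator>
  <dc:subject>archaeology; stratigraphy; GIS</dc:subject>
  <dc:description>Georeferenced finds and stratigraphic units from sector B.</dc:description>
  <dc:date>2010-06-15</dc:date>
  <dc:type>Dataset</dc:type>
  <dc:format>ESRI Shapefile</dc:format>
  <dc:coverage>UTM zone 30N; campaign 2009-2010</dc:coverage>
  <dc:rights>Restricted to research use</dc:rights>
</metadata>
```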
Abstract:
One of the main limiting factors in the development of new magnesium (Mg) alloys with enhanced mechanical behavior is the need for vast experimental campaigns for microstructure and property screening. For example, the influence of new alloying additions on the critical resolved shear stresses (CRSSs) is currently evaluated by a combination of macroscopic single-crystal experiments and crystal plasticity finite-element simulations (CPFEM). This time-consuming process could be considerably simplified by the introduction of high-throughput techniques for efficient property testing. The aim of this paper is to propose a new and fast methodology for estimating the CRSSs of hexagonal close-packed metals which, moreover, requires only small amounts of material. The proposed method, which combines instrumented nanoindentation and CPFEM modeling, determines CRSS values by comparing the variation of hardness (H) across different grain orientations with the outcome of CPFEM. This novel approach has been validated on a rolled and annealed pure Mg sheet, whose variation of H with grain orientation was successfully predicted using a set of CRSSs taken from recent crystal plasticity simulations of single-crystal experiments. Moreover, the proposed methodology has been used to infer the effect of the alloying elements of an MN11 (Mg–1% Mn–1% Nd) alloy. The results support the hypothesis that selected rare earth intermetallic precipitates help to bring the CRSS values of basal and non-basal slip systems closer together, thus contributing to the reduced plastic anisotropy observed in these alloys.
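The proposed inverse approach, choosing the CRSS set whose simulated hardness-versus-orientation curve best matches the nanoindentation data, can be caricatured as a least-squares search; everything below (the hardness surrogate, candidate CRSS sets, data) is an invented placeholder for the actual CPFEM workflow.

```python
import numpy as np

# Hypothetical hardness (GPa) vs. angle between indentation axis and c-axis (deg),
# standing in for the nanoindentation measurements.
angles = np.array([0.0, 30.0, 60.0, 90.0])
h_measured = np.array([0.95, 0.80, 0.62, 0.58])

def simulated_hardness(angles_deg, crss_basal, crss_prismatic):
    """Toy surrogate for a CPFEM prediction: hardness interpolates between
    the prismatic-dominated (0 deg) and basal-dominated (90 deg) responses."""
    w = np.cos(np.radians(angles_deg)) ** 2
    return 0.55 + 0.4 * (w * crss_prismatic + (1 - w) * crss_basal)

# Grid search over candidate CRSS pairs (normalized units), keeping the best fit.
best = None
for cb in np.linspace(0.1, 1.0, 19):
    for cp in np.linspace(0.1, 1.0, 19):
        err = np.sum((simulated_hardness(angles, cb, cp) - h_measured) ** 2)
        if best is None or err < best[0]:
            best = (err, cb, cp)

err, cb, cp = best
print(f"best-fit CRSS (basal, prismatic) = ({cb:.2f}, {cp:.2f}), SSE = {err:.4f}")
```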
Abstract:
As a contribution to the study of heterogeneous media, this thesis covers theoretical modelling and simulation work on the optical properties of the skin and of seawater, taken as paradigmatic examples of heterogeneous media. The starting point is the study of the propagation of optical radiation, and more specifically of laser radiation, in biological tissue. Optical characterization of a tissue is fundamental for managing the radiation-tissue interaction that underlies both the diagnosis and the therapy of diseases and dysfunctions in the health sciences. Throughout, the aim is to offer a study methodology, with an «engineering approach», for the optical properties of a heterogeneous medium that need not be exclusively biological tissue. Given the above, and the importance of water within biological tissues, a further chapter studies the optical properties of water in a heterogeneous environment, namely seawater. Seawater was selected as an additional object of study mainly because it is a heterogeneous system whose individual elements are easy to describe and for which an extensive literature is available; moreover, recent advances in photonic technologies are expected to enable their use in experimental methods of water analysis. Knowledge of its optical properties makes it possible to characterize different types of water according to their constituents and to identify their presence, which opens up a wide range of applications.
In general terms, this thesis has achieved the following: a study of the state of the art on the optical properties of the skin and the identification of its light-scattering elements; a study methodology for obtaining data on the possible effects of radiation on biological tissues; the use of several software tools to simulate the transport of laser radiation in biological tissues; simulation experiments involving lasers, biological tissues and detectors; comparison of known experimental results with the simulated ones; a study of the instruments that measure the response to laser radiation propagating in anisotropic tissues; original results for the diagnosis and treatment of skin, considering different racial skin types and, as a possible skin alteration, the presence of basal cell carcinoma (basalioma); the application of the methodology developed for skin to the simulation of seawater; and original simulation results on the amount of phytoplankton in water, with the aim of facilitating the characterization of different types of water.
The thesis is organized into six chapters and three annexes, each clearly differentiated and with its own bibliography. The first chapter focuses on the difficulty of studying and characterizing heterogeneous media, which behave inhomogeneously and anisotropically under optical radiation; it gives a brief introduction to the behaviour of both tissues and the ocean under optical radiation and defines their main properties: absorption, scattering, anisotropy and reflection coefficients. The second chapter addresses how to characterize the optical properties described in the first: it introduces the theoretical models, then the most widely used simulation methods, and finally lists the main techniques for measuring light propagation in living tissue. The third chapter, centred on the skin and its properties, synthesizes what is known about the behaviour of the skin under optical radiation, studies its constituent elements and the different skin types, and describes an example of an immediate application that benefits from this knowledge. Since the percentage of water in the human body is very high (in the skin it is about 70%), knowing how water affects the propagation of optical radiation would provide useful reference patterns; for this reason the study of seawater is undertaken. The fourth chapter studies the properties of seawater as a heterogeneous medium of particles, presenting a synthesis of the most significant scattering elements in the ocean, a study of their individual behaviour under optical radiation, and their contribution to the ocean as a whole. The fifth chapter describes the results of the various simulations performed; since the same simulation tools were used for the skin and for seawater, both sets of results are presented in the same chapter. In the first case, different types of ocean water are analysed by varying the phytoplankton concentration, allowing the differences relevant to water characterization and diagnosis to be assessed. In the second case, the behaviour of different skin types is studied to validate the method, and the results are shown to be compatible with current commercial applications such as laser hair removal; as a significant result, a possible methodology for the diagnosis of the skin cancer known as basal cell carcinoma is presented. A final chapter is devoted to future work based on real experimentation and its associated cost. The annexes deal with the common thread of the thesis, the laser: the first summarizes the main characteristics of laser radiation from the point of view of its generation, the second addresses safety in its use, and the third contains the tables of absorption and scattering coefficients used in the experimental section. Although this thesis does not follow the canonical doctoral thesis model, the reader will find woven into it the structure common to all theses and research projects: a state of the art with pedagogical examples to aid understanding and a statement of objectives (chapters 1-4), a chapter subdivided into materials and methods, results and discussion (chapter 5 and its subsections), and a closing look at the future work arising from the thesis (chapter 6).
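To give a flavour of the kind of simulation involved (the thesis simulates laser transport through scattering media), below is a minimal Monte Carlo photon random-walk sketch with Henyey-Greenstein scattering; the optical coefficients are illustrative placeholders, not the thesis's tissue or seawater parameters.

```python
import math
import random

# Illustrative optical properties (per mm); placeholders, not real tissue values.
MU_A, MU_S, G = 0.03, 10.0, 0.9    # absorption, scattering, anisotropy
MU_T = MU_A + MU_S
SLAB_DEPTH = 1.0                   # slab thickness (mm)

def sample_hg(g):
    """Sample the cosine of the scattering angle from Henyey-Greenstein."""
    if g == 0.0:
        return 2.0 * random.random() - 1.0
    frac = (1.0 - g * g) / (1.0 - g + 2.0 * g * random.random())
    return (1.0 + g * g - frac * frac) / (2.0 * g)

def run_photon():
    """Track a photon's depth; return 'absorbed', 'reflected' or 'transmitted'."""
    z, uz = 0.0, 1.0                 # depth and direction cosine w.r.t. the z-axis
    while True:
        step = -math.log(1.0 - random.random()) / MU_T   # free path length
        z += uz * step
        if z < 0.0:
            return "reflected"
        if z > SLAB_DEPTH:
            return "transmitted"
        if random.random() < MU_A / MU_T:                # interaction: absorb or scatter
            return "absorbed"
        cos_theta = sample_hg(G)                         # new polar angle
        phi = 2.0 * math.pi * random.random()            # azimuth (isotropic)
        sin_theta = math.sqrt(max(0.0, 1.0 - cos_theta ** 2))
        # Update the z-direction cosine only (sufficient for depth tracking).
        uz = uz * cos_theta + math.sqrt(max(0.0, 1.0 - uz * uz)) * sin_theta * math.cos(phi)

random.seed(1)
counts = {"absorbed": 0, "reflected": 0, "transmitted": 0}
for _ in range(20000):
    counts[run_photon()] += 1
print(counts)
```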
Abstract:
Tubular polymerization reactors can exhibit a strongly distorted velocity profile. Starting from this observation, a stochastic model based on the axial dispersion model was proposed as a mathematical representation of the fluid dynamics of a tubular reactor for polystyrene production. The differential equation was obtained by introducing randomness into the dispersion parameter, which adds a stochastic term to the model capable of reproducing the oscillations observed experimentally. The resulting stochastic differential equation was discretized and solved satisfactorily by the Euler-Maruyama method. An estimator function was developed to obtain the parameter of the stochastic term, and the parameter of the deterministic term was calculated by the least-squares method. A convergence analysis was carried out to determine the number of discretization elements, and the model was validated by comparing trajectories and computational confidence intervals with experimental data. The results were satisfactory, which helps in understanding the complex fluid-dynamic behaviour of the reactor studied.
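The Euler-Maruyama discretization named above advances an SDE dX = a(X) dt + b(X) dW by adding a Gaussian increment at each step; below is a generic sketch on a mean-reverting test equation, not the thesis's discretized dispersion model (whose drift comes from the axial-dispersion PDE).

```python
import numpy as np

# Generic Euler-Maruyama demo on dX = theta*(mu - X) dt + sigma dW.
# The test equation and parameters are illustrative, not the reactor model.
theta, mu, sigma = 2.0, 1.0, 0.3
T, n_steps, n_paths = 5.0, 1000, 200
dt = T / n_steps

rng = np.random.default_rng(42)
x = np.full(n_paths, 0.0)                        # initial condition for every path

for _ in range(n_steps):
    dw = rng.normal(0.0, np.sqrt(dt), n_paths)   # Brownian increments
    x = x + theta * (mu - x) * dt + sigma * dw   # Euler-Maruyama update

# Sample mean and a simple 95% computational confidence band at final time.
ci = 1.96 * x.std(ddof=1) / np.sqrt(n_paths)
print(f"E[X_T] ~ {x.mean():.3f} +/- {ci:.3f} (exact mean -> {mu:.3f})")
```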
Abstract:
This communication traces the history of interventions in the Renaissance fortress built from scratch in 1554-57 in Santa Pola, one of the earliest and best preserved examples built with reference to the theoretical treatises on military architecture of the 15th and 16th centuries. The study follows its story from its initial military use, through its use as civic infrastructure, to its final role as a cultural and museum centre. First, the project of Italian origin and its use as barracks for troops over three centuries (1557-1850) are examined, pointing out the architectural constants of a war machine in a defensive position and its origin as a rainwater collector and cistern: a perfect square with two bastions, of which an elevation drawing (1778) is preserved. Secondly, we study the changes this architecture underwent over a century and a half (1850-1990) after its change of ownership (from the state to the municipality) and as a result of its new use as a town hall and public facility (a market and a health and leisure centre), which entailed the demolition of defensive elements and the opening of the inner parade ground to the outside. Thirdly, the transfer of the municipal offices elsewhere marked the beginning of a programme of transformations (1990-2015) that recovers the demolished elements while dedicating the entire fort to a cultural centre for exhibitions, research and a history museum, promoting the identification of the citizens with the building that stands at the foundations of their city. The conclusions take us along an interesting route from the approach of defensive tactics, through the fort's use as administrative headquarters, to the current cultural policy of preservation. In addition, all the known plans of the fort (from its military, civil and cultural uses) are recovered, some previously unpublished, as well as the project for the north wing that guided the most recent operation and has been set as a pattern of reference.
Abstract:
Eukaryotic genomes display segmental patterns of variation in various properties, including GC content and degree of evolutionary conservation. DNA segmentation algorithms are aimed at identifying statistically significant boundaries between such segments. Such algorithms may provide a means of discovering new classes of functional elements in eukaryotic genomes. This paper presents a model and an algorithm for Bayesian DNA segmentation and considers the feasibility of using it to segment whole eukaryotic genomes. The algorithm is tested on a range of simulated and real DNA sequences, and the following conclusions are drawn. Firstly, the algorithm correctly identifies non-segmented sequence, and can thus be used to reject the null hypothesis of uniformity in the property of interest. Secondly, estimates of the number and locations of change-points produced by the algorithm are robust to variations in algorithm parameters and initial starting conditions and correspond to real features in the data. Thirdly, the algorithm is successfully used to segment human chromosome 1 according to GC content, thus demonstrating the feasibility of Bayesian segmentation of eukaryotic genomes. The software described in this paper is available from the author's website (www.uq.edu.au/~uqjkeith/) or upon request to the author.
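To make the flavour of the approach concrete, the sketch below scores every possible single change-point in a binary (GC vs. AT) sequence by a Beta-Bernoulli marginal likelihood, a drastically simplified one-change-point version of the multiple-change-point algorithm described in the paper; the sequence and prior are illustrative.

```python
import math
import random

def log_evidence(heads, tails, a=1.0, b=1.0):
    """Log marginal likelihood of a Bernoulli segment under a Beta(a, b) prior."""
    return (math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
            + math.lgamma(a + heads) + math.lgamma(b + tails)
            - math.lgamma(a + b + heads + tails))

# Synthetic GC-indicator sequence with a change in GC content at position 300.
random.seed(0)
seq = [1 if random.random() < 0.35 else 0 for _ in range(300)] + \
      [1 if random.random() < 0.60 else 0 for _ in range(300)]

# Score every single change-point position by the product of segment evidences.
best_pos, best_score = None, -float("inf")
for pos in range(1, len(seq)):
    left, right = seq[:pos], seq[pos:]
    score = (log_evidence(sum(left), len(left) - sum(left))
             + log_evidence(sum(right), len(right) - sum(right)))
    if score > best_score:
        best_pos, best_score = pos, score

print(f"most probable change-point: {best_pos} (true value: 300)")
```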
Abstract:
In this paper the low autocorrelation binary sequence problem (LABSP) is modeled as a mixed integer quadratic programming (MIQP) problem and proof of the model’s validity is given. Since the MIQP model is semidefinite, general optimization solvers can be used, and converge in a finite number of iterations. The experimental results show that IQP solvers, based on this MIQP formulation, are capable of optimally solving general/skew-symmetric LABSP instances of up to 30/51 elements in a moderate time. ACM Computing Classification System (1998): G.1.6, I.2.8.
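For reference, the objective behind LABSP is the energy E(s) = sum over k = 1..N-1 of C_k(s)^2, where C_k is the aperiodic autocorrelation; the sketch below evaluates it (and the associated merit factor) for a sample ±1 sequence, independently of the MIQP formulation.

```python
def autocorrelations(s):
    """Aperiodic autocorrelations C_k = sum_i s_i * s_{i+k}, for k = 1..N-1."""
    n = len(s)
    return [sum(s[i] * s[i + k] for i in range(n - k)) for k in range(1, n)]

def energy(s):
    """LABSP objective: sum of squared off-peak autocorrelations."""
    return sum(c * c for c in autocorrelations(s))

# Example +/-1 sequence (the Barker sequence of length 13, which has |C_k| <= 1).
s = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]
e = energy(s)
merit_factor = len(s) ** 2 / (2 * e)
print(f"energy E(s) = {e}, merit factor F = {merit_factor:.3f}")
```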