14 results for Best Possible Medication History (BPMH)

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

100.00%

Publisher:

Abstract:

A single picture provides a largely incomplete representation of the scene one is looking at. Usually it reproduces only a limited spatial portion of the scene, according to the standpoint and the viewing angle, and it contains only instantaneous information. Thus very little can be understood about the geometrical structure of the scene, and the position and orientation of the observer with respect to it also remain hard to guess. When multiple views, taken from different positions in space and time, observe the same scene, a much deeper knowledge is potentially achievable. Understanding inter-view relations enables the construction of a collective representation by fusing the information contained in every single image. Visual reconstruction methods confront the formidable, and still unanswered, challenge of delivering a comprehensive representation of the structure, motion and appearance of a scene from visual information. Multi-view visual reconstruction deals with the inference of relations among multiple views and the exploitation of the revealed connections to attain the best possible representation. This thesis investigates novel methods and applications in the field of visual reconstruction from multiple views. Three main threads of research have been pursued: dense geometric reconstruction, camera pose reconstruction, and sparse geometric reconstruction of deformable surfaces. Dense geometric reconstruction aims at delivering the appearance of a scene at every single point. The construction of a large panoramic image from a set of traditional pictures has been extensively studied in the context of image mosaicing techniques. An original algorithm for sequential registration suitable for real-time applications has been conceived. The integration of the algorithm into a visual surveillance system has led to robust and efficient motion detection with Pan-Tilt-Zoom cameras. Moreover, an evaluation methodology for quantitatively assessing and comparing image mosaicing algorithms has been devised and made available to the community. Camera pose reconstruction deals with the recovery of the camera trajectory across an image sequence. A novel mosaic-based pose reconstruction algorithm has been conceived that exploits image mosaics and traditional pose estimation algorithms to deliver more accurate estimates. An innovative markerless vision-based human-machine interface has also been proposed, allowing a user to interact with a gaming application by moving a hand-held consumer-grade camera in unstructured environments. Finally, sparse geometric reconstruction refers to the computation of the coarse geometry of an object at a few preset points. In this thesis, an innovative shape reconstruction algorithm for deformable objects has been designed. A cooperation with the Solar Impulse project allowed the algorithm to be deployed in a very challenging real-world scenario, i.e. the accurate measurement of airplane wing deformations.
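
As context for the mosaicing step mentioned above, the following minimal Python/OpenCV sketch shows a generic feature-based pairwise registration: it estimates the homography between a new frame and the current mosaic reference and warps the frame accordingly. It is a standard textbook pipeline under assumed parameters (ORB features, RANSAC threshold), not the sequential real-time algorithm or the evaluation methodology developed in the thesis.

```python
# Illustrative sketch only: generic pairwise registration for mosaicing,
# not the sequential real-time algorithm described in the abstract.
import cv2
import numpy as np

def register_pair(img_ref, img_new):
    """Estimate the homography mapping img_new onto img_ref (one mosaicing step)."""
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(img_ref, None)
    k2, d2 = orb.detectAndCompute(img_new, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d2, d1), key=lambda m: m.distance)[:500]
    src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)  # robust to outlier matches
    return H

def warp_into_mosaic(mosaic, img_new, H):
    """Warp the new frame into the mosaic reference frame (no blending)."""
    h, w = mosaic.shape[:2]
    return cv2.warpPerspective(img_new, H, (w, h))
```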

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study is the creation of a Historical GIS that spatially references data retrieved from Italian and Catalan historical sources and records. This metasource was generated through the integral acquisition of source-oriented records and the insertion of mark-up fields, while maintaining, where possible, the original encoding of the source documents. In order to standardize the set of information contained in the original documents and thus allow queries to the database, additional fields were introduced. Once the initial phase of data research and analysis was concluded, the new virtual source was published online within an open-source WebGIS. In conclusion, we have created a dynamic and spatially referenced database of geo-historical information. The configuration of this new source is such as to guarantee the best possible accessibility.
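
As a purely hypothetical illustration of the kind of record described above, the sketch below encodes a spatially referenced, source-oriented entry as GeoJSON: the original transcription is kept verbatim while additional standardized fields support querying. All field names, values and coordinates are invented for the example and do not come from the thesis.

```python
# Hypothetical sketch of a spatially referenced, source-oriented record:
# the original transcription is preserved, while added standardized fields
# enable database queries. Field names and values are illustrative only.
import json

record = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [2.1734, 41.3851]},  # e.g. Barcelona
    "properties": {
        "source_text": "original transcription, kept in its source encoding",
        "source_archive": "name of the Italian or Catalan archive",
        "date_standard": "1348-05-01",   # normalized field added for querying
        "place_standard": "Barcelona",   # normalized toponym
    },
}

print(json.dumps(record, ensure_ascii=False, indent=2))
```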

Relevance:

100.00%

Publisher:

Abstract:

This thesis starts by showing the main characteristics and application fields of the AlGaN/GaN HEMT technology, focusing on reliability aspects essentially due to the presence of low-frequency dispersive phenomena which limit in several ways the microwave performance of this kind of device. Based on an equivalent voltage approach, a new low-frequency device model is presented where the dynamic nonlinearity of the trapping effect is taken into account for the first time, allowing considerable improvements in the prediction of quantities that are very important for the design of power amplifiers, such as power added efficiency, dissipated power and internal device temperature. An innovative and low-cost measurement setup for the characterization of the device under low-frequency, large-amplitude sinusoidal excitation is also presented. This setup allows the identification of the new low-frequency model through suitable procedures explained in detail. In this thesis a new non-invasive empirical method for compact electrothermal modeling and thermal resistance extraction is also described. The new contribution of the proposed approach concerns the nonlinear dependence of the channel temperature on the dissipated power. This is very important for GaN devices since they are capable of operating at relatively high temperatures with high power densities, and the dependence of the thermal resistance on the temperature is quite relevant. Finally, a novel method for device thermal simulation is investigated: based on the analytical solution of the three-dimensional heat equation, a Visual Basic program has been developed to estimate, in real time, the temperature distribution on the hottest surface of planar multilayer structures. The developed solver is particularly useful for peak temperature estimation at the design stage, when critical decisions about circuit design and packaging have to be made. It facilitates layout optimization and reliability improvement, allowing the correct choice of device geometry and configuration to achieve the best possible thermal performance.
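
To illustrate what a nonlinear dependence of channel temperature on dissipated power implies in practice, here is a minimal Python sketch that solves T_ch = T_base + R_th(T_ch)·P_diss by fixed-point iteration, with a temperature-dependent thermal resistance. The power-law form of R_th and all numerical values are assumptions for illustration, not the empirical model identified in the thesis.

```python
# Illustrative sketch only: a simple fixed-point solution of a nonlinear
# electrothermal equation T_ch = T_base + Rth(T_ch) * Pdiss, where the
# thermal resistance grows with temperature. The power-law form of Rth and
# all numbers are assumptions, not the model extracted in the thesis.

def rth(t_kelvin, rth0=8.0, t0=300.0, alpha=1.4):
    """Temperature-dependent thermal resistance (K/W), assumed empirical power law."""
    return rth0 * (t_kelvin / t0) ** alpha

def channel_temperature(p_diss, t_base=300.0, tol=1e-6, max_iter=100):
    """Solve T = T_base + Rth(T) * P by fixed-point iteration."""
    t = t_base
    for _ in range(max_iter):
        t_new = t_base + rth(t) * p_diss
        if abs(t_new - t) < tol:
            return t_new
        t = t_new
    return t

if __name__ == "__main__":
    for p in (1.0, 3.0, 5.0):  # dissipated power in watts
        print(f"P = {p:.1f} W -> T_channel ≈ {channel_temperature(p):.1f} K")
```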

Relevance:

100.00%

Publisher:

Abstract:

Self-organising pervasive ecosystems of devices are set to become a major vehicle for delivering infrastructure and end-user services. The inherent complexity of such systems poses new challenges to those who want to master it by applying the principles of engineering. The recent growth in the number and distribution of devices with decent computational and communication capabilities, which suddenly accelerated with the massive diffusion of smartphones and tablets, is delivering a world with a much higher density of devices in space. Also, communication technologies seem to be focussing on short-range device-to-device (P2P) interactions, with technologies such as Bluetooth and Near-Field Communication gaining greater adoption. Locality and situatedness become key to providing the best possible experience to users, and the classic model of a centralised, enormously powerful server gathering and processing data becomes less and less efficient as device density grows. Accomplishing complex global tasks without a centralised controller responsible for aggregating data, however, is challenging. In particular, there is a local-to-global issue that makes the application of engineering principles challenging to say the least: designing device-local programs that, through interaction, guarantee a certain global service level. In this thesis, we first analyse the state of the art in coordination systems, then motivate the work by describing the main issues of pre-existing tools and practices and identifying the improvements that would benefit the design of such complex software ecosystems. The contribution can be divided into three main branches. First, we introduce a novel simulation toolchain for pervasive ecosystems, designed to allow good expressiveness while retaining high performance. Second, we leverage existing coordination models and patterns in order to create new spatial structures. Third, we introduce a novel language, based on the existing "Field Calculus" and integrated with the aforementioned toolchain, designed to be usable for practical aggregate programming.
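
As a toy illustration of the local-to-global issue described above, the following Python sketch computes a classic "gradient" field (distance to the nearest source) purely from repeated local neighbour exchanges, the kind of self-organising building block that field-based coordination approaches express declaratively. It is a generic example, not the thesis's simulator or its Field Calculus-based language.

```python
# Minimal sketch of the local-to-global idea behind field-based coordination:
# each device repeatedly recomputes a "gradient" field (distance to the nearest
# source) from purely local neighbour information. Generic example only.
import math

def gradient(neighbours, sources, distances, rounds=50):
    """neighbours: node -> list of nodes; distances: (a, b) -> link metric."""
    field = {n: (0.0 if n in sources else math.inf) for n in neighbours}
    for _ in range(rounds):  # each round models one step of local communication
        new_field = {}
        for n in neighbours:
            if n in sources:
                new_field[n] = 0.0
            else:
                candidates = [field[m] + distances[(n, m)] for m in neighbours[n]]
                new_field[n] = min(candidates, default=math.inf)
        field = new_field
    return field

# Tiny line topology 0 - 1 - 2 - 3 with unit-length links and a source at node 0.
nbrs = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
dist = {(a, b): 1.0 for a in nbrs for b in nbrs[a]}
print(gradient(nbrs, {0}, dist))   # expected: {0: 0.0, 1: 1.0, 2: 2.0, 3: 3.0}
```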

Relevance:

100.00%

Publisher:

Abstract:

Changing or creating an organisation means creating a new process. Each process involves many risks that need to be identified and managed. The main risks considered here are procedural and legal risks. The former are related to the risks of errors that may occur during processes, while the latter are related to the compliance of processes with regulations. Managing the risks implies proposing changes to the processes that lead to the desired result: an optimised process. In order to manage a company and optimise it in the best possible way, not only should the organisational aspect, risk management and legal compliance be taken into account, but it is important that they are all analysed simultaneously, with the aim of finding the right balance that satisfies them all. This is the aim of this thesis: to provide methods and tools to balance these three characteristics. To enable this type of optimisation, ICT support is used. This work is not a thesis in computer science or law, but rather an interdisciplinary thesis. Most of the work done so far is vertical and confined to a specific domain. The particularity and aim of this thesis is not to carry out an in-depth analysis of a particular aspect, but rather to combine several important aspects, normally analysed separately, which nevertheless impact and influence each other. In order to carry out this kind of interdisciplinary analysis, the knowledge bases of both areas were involved, and the combination and collaboration of different experts in the various fields was necessary. Although the methodology described is generic and can be applied to all sectors, the case study considered is a new type of healthcare service that allows patients with acute disease to be hospitalised at home. This provides the possibility to perform experiments using a real hospital database.

Relevance:

100.00%

Publisher:

Abstract:

In this thesis, a thorough investigation of acoustic noise control systems for realistic automotive scenarios is presented. The thesis is organized in two parts dealing with the main topics treated: Active Noise Control (ANC) systems and the Virtual Microphone Technique (VMT), respectively. ANC technology allows the driver's/passenger's comfort and safety to be increased by exploiting the principle of mitigating the disturbing acoustic noise through the superposition of a secondary sound wave of equal amplitude but opposite phase. Performance analyses of both FeedForward (FF) and FeedBack (FB) ANC systems, in experimental scenarios, are presented. Since environmental vibration noise within a car cabin is time-varying, most ANC solutions are adaptive. However, in this work, an effective fixed FB ANC system is proposed. Various ANC schemes are considered and compared with each other. In order to find the best possible ANC configuration which optimizes the performance in terms of disturbing noise attenuation, a thorough study of Key Performance Indicators (KPIs), system parameters and experimental setup design is carried out. In the second part of this thesis, VMT, based on the estimation of specific acoustic channels, is investigated with the aim of generating a quiet acoustic zone around a confined area, e.g., the driver's ears. A performance analysis and comparison of various estimation approaches is presented. Several measurement campaigns were performed in order to acquire a sufficient number and duration of microphone signals in a significant variety of driving scenarios and cars. To do this, different experimental setups were designed and their performance compared. Design guidelines are given to obtain a good trade-off between accuracy and equipment costs. Finally, a preliminary analysis with an innovative approach based on Neural Networks (NNs) to improve the current state of the art in microphone virtualization is proposed.
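
The ANC principle stated above (superposing a secondary wave of equal amplitude and opposite phase) can be illustrated numerically. The short Python sketch below measures the residual noise, in dB, left by an imperfect anti-noise wave for a single tone; the amplitude and phase errors are assumptions chosen to show how quickly attenuation degrades, and the code does not reproduce the FF/FB schemes or the fixed feedback controller developed in the thesis.

```python
# Illustrative sketch of the ANC principle only: a disturbing tone is attenuated
# by superposing an anti-noise wave of (nearly) equal amplitude and opposite
# phase. The gain/phase errors are assumptions that show how the residual noise
# grows when the secondary wave is imperfect.
import numpy as np

fs, f0, dur = 8000, 100.0, 1.0               # sample rate, tone frequency, seconds
t = np.arange(int(fs * dur)) / fs
noise = np.sin(2 * np.pi * f0 * t)           # primary (disturbing) noise

def residual_db(gain_error=0.0, phase_error_rad=0.0):
    """Residual noise power relative to the primary noise, in dB."""
    anti = -(1 + gain_error) * np.sin(2 * np.pi * f0 * t + phase_error_rad)
    residual = noise + anti
    return 10 * np.log10(np.mean(residual**2) / np.mean(noise**2))

print(residual_db(0.01, 0.0))    # 1% amplitude error: about -40 dB residual
print(residual_db(0.10, 0.1))    # 10% amplitude and 0.1 rad phase error: much worse
```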

Relevance:

30.00%

Publisher:

Abstract:

The relation between intercepted light and orchard productivity has been considered linear, although this dependence seems to be governed more by the planting system than by light intensity. At the whole-plant level, an increase in irradiance does not always lead to improved productivity. One of the reasons can be the plant's intrinsic inefficiency in using energy. Generally, in full light only 5-10% of the total incoming energy is allocated to net photosynthesis. Therefore, preserving or improving this efficiency becomes pivotal for scientists and fruit growers. Even though a conspicuous amount of energy is reflected or transmitted, plants cannot avoid absorbing excess photons. Over-excitation of chlorophyll promotes the production of reactive species, increasing the risk of photoinhibition. The dangerous consequences of photoinhibition have forced plants to evolve a complex, multilevel machinery able to dissipate the excess energy as heat (Non-Photochemical Quenching), to move electrons (water-water cycle, cyclic transport around PSI, glutathione-ascorbate cycle and photorespiration) and to scavenge the reactive species generated. The price plants must pay for this equipment is the use of CO2 and reducing power, with a consequent decrease in photosynthetic efficiency, both because some photons are not used for carboxylation and because an effective loss of CO2 and reducing power occurs. Net photosynthesis increases with light up to the saturation point; additional PPFD does not improve carboxylation but raises the contribution of the alternative energy-dissipation pathways, as well as ROS production and the risk of photoinhibition. The wide photo-protective apparatus, however, is not always able to cope with the excessive incoming energy, and photodamage therefore occurs. Any event that increases the photon pressure and/or decreases the efficiency of the described photo-protective mechanisms (e.g. thermal stress, water and nutritional deficiency) can exacerbate photoinhibition. In nature, only a small amount of damaged photosystems is likely to be found, thanks to the effective, efficient but energy-consuming recovery system. Since damaged PSII is quickly repaired at an energy expense, it would be interesting to investigate how much PSII recovery costs in terms of plant productivity. This PhD dissertation aims to improve knowledge of the several strategies adopted to manage the incoming energy and of the implications of excess light for photodamage in peach. The thesis is organized in three scientific units. In the first section, a new rapid, non-intrusive, whole-tissue and universal technique for functional PSII determination was implemented and validated on different kinds of plants, such as C3 and C4 species, woody and herbaceous plants, wild type and chlorophyll b-less mutants, and monocot and dicot plants. In the second unit, using a singular experimental orchard named the "Asymmetric orchard", the relation between light environment and photosynthetic performance, water use and photoinhibition was investigated in peach at the whole-plant level; furthermore, the effect of photon pressure variation on energy management was considered on single leaves. In the third section, the quenching analysis method suggested by Kornyeyev and Hendrickson (2007) was validated on peach. Afterwards it was applied in the field, where the influence of moderate light and water reduction on peach photosynthetic performance, water requirements, energy management and photoinhibition was studied.
Using solar energy as the fuel for plant life is intrinsically risky because of the constantly high risk of photodamage. This dissertation tries to highlight the complex relation existing between plants, in particular peach, and light, analysing the principal strategies plants have developed to manage the incoming light so as to derive the maximum possible benefit while minimizing the risks. In the first instance, the new method proposed for functional PSII determination, based on P700 redox kinetics, seems to be a valid, non-intrusive, universal and field-applicable technique, not least because it is able to probe the whole leaf tissue in depth rather than only the first leaf layers, as fluorescence does. The fluorescence Fv/Fm parameter gives a good estimate of functional PSII, but only when the data obtained from the adaxial and abaxial leaf surfaces are averaged. In addition to this method, the energy quenching analysis proposed by Kornyeyev and Hendrickson (2007), combined with the photosynthesis model proposed by von Caemmerer (2000), is a powerful tool to analyse and study, even in the field, the relation between the plant and environmental factors such as water, temperature and, above all, light. The "Asymmetric" training system is a good way to study the relations among light energy, photosynthetic performance and water use in the field. At the whole-plant level, net carboxylation increases with PPFD up to a saturation point. Excess light, rather than improving photosynthesis, may exacerbate water and thermal stress, leading to stomatal limitation. Furthermore, too much light does not improve net carboxylation but damages PSII: in the most light-exposed plants about 50-60% of the total PSII is inactivated. At the single-leaf level, net carboxylation increases up to the saturation point (1000-1200 μmol m-2 s-1) and excess light is dissipated by non-photochemical quenching and non-net-carboxylative electron transport. The latter follows a pattern quite similar to the Pn/PPFD curve, reaching saturation at almost the same photon flux density. At middle-low irradiance, NPQ seems to be lumen-pH limited, because the incoming photon pressure is not enough to generate the optimum lumen pH for full activation of violaxanthin de-epoxidase (VDE). Peach leaves try to cope with the excess light by increasing the non-net-carboxylative transports. As PPFD rises, the xanthophyll cycle is increasingly activated and the rate of non-net-carboxylative transport is reduced. Some of these alternative transports, such as the water-water cycle, the cyclic transport around PSI and the glutathione-ascorbate cycle, are able to generate additional H+ in the lumen in order to support VDE activation when light may be limiting. Moreover, the alternative transports seem to act as an important dissipative pathway when high temperature and sub-optimal conductance increase the risk of photoinhibition. In peach, a moderate reduction in water and light does not decrease net carboxylation; by diminishing the incoming light and the evapo-transpirative demand, stomatal conductance decreases, improving water use efficiency. Therefore, by lowering light intensity to non-limiting levels, water can be saved without compromising net photosynthesis. The quenching analysis is able to partition the absorbed energy into the several utilization, photoprotection and photo-oxidation pathways. When recovery is permitted, only a few PSII remain un-repaired, although more net PSII damage is recorded in plants kept in full light.
Even in this experiment, in over-saturating light the main dissipation pathway is non-photochemical quenching; at middle-low irradiance it seems to be pH limited, and other transports, such as photorespiration and the alternative transports, are used to support photoprotection and to contribute to creating the optimal trans-thylakoid ΔpH for violaxanthin de-epoxidase. These alternative pathways become the main quenching mechanisms in very low light environments. Another aspect pointed out by this study is the role of NPQ as a dissipative pathway when conductance becomes severely limiting. The evidence that in nature only a small amount of damaged PSII is seen indicates the presence of an effective and efficient recovery mechanism that masks the real photodamage occurring during the day. At the single-leaf level, when repair is not allowed, leaves in full light are twofold more photoinhibited than shaded ones. Therefore, light in excess of the photosynthetic optimum does not promote net carboxylation but increases water loss and PSII damage. The greater the photoinhibition, the more photosystems must be repaired and, consequently, the more energy and dry matter must be allocated to this essential activity. Since above the saturation point net photosynthesis is constant while photoinhibition increases, it would be interesting to investigate what photodamage costs in terms of tree productivity. Another aspect of pivotal importance to be further explored is the combined influence of light and other environmental parameters, such as water status, temperature and nutrition, on peach light, water and photosynthate management.
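
For readers unfamiliar with light-response curves, the following minimal Python sketch uses a standard non-rectangular hyperbola model to show how net photosynthesis saturates with PPFD; the parameter values are plausible assumptions for illustration only and are not the measurements reported in the thesis.

```python
# Minimal sketch, not the thesis's data: a standard non-rectangular hyperbola
# light-response model, often used to describe how net photosynthesis (Pn)
# saturates with PPFD. All parameter values are assumed for illustration.
import numpy as np

def net_photosynthesis(ppfd, phi=0.05, a_max=20.0, theta=0.8, rd=1.5):
    """Gross assimilation from a non-rectangular hyperbola, minus dark respiration.
    ppfd in umol m-2 s-1, rates in umol CO2 m-2 s-1."""
    b = phi * ppfd + a_max
    gross = (b - np.sqrt(b**2 - 4.0 * theta * phi * ppfd * a_max)) / (2.0 * theta)
    return gross - rd

for q in (0, 200, 600, 1000, 1200, 1800):
    print(f"PPFD = {q:4d} -> Pn ≈ {net_photosynthesis(q):5.1f}")
# Pn rises steeply at low light and flattens above roughly 1000-1200 umol m-2 s-1,
# mirroring the saturation behaviour described in the abstract.
```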

Relevance:

30.00%

Publisher:

Abstract:

The aim of this PhD thesis is to study accurately and in depth the figure and the literary production of the intellectual Jacopo Aconcio. This minor author of the 16th century has long been considered a sort of "enigmatic character", a profile which results from the work of those who, for many centuries, have left his writing to its fate: a story of constant re-readings and equally incessant oversights. This is why it is necessary to re-read Aconcio's production in its entirety and to devote a monographic study to it. Previous scholars' interpretations will obviously be considered, but at the same time an effort will be made to go beyond them through the analysis of both published and manuscript sources, in the attempt to attain a deeper understanding of the figure of this man, who was a Christian, a military and hydraulic engineer and a political philosopher. The title of the thesis was chosen to emphasise how, throughout the three years of the doctorate, my research concentrated in equal measure, and with the same degree of importance, on all the reflections and activities of Jacopo Aconcio. My object, in fact, was to establish how and to what extent the methodological thinking of the intellectual found application in, and at the same time guided, his theoretical and practical production. I did not mention in the title the author's religious thinking, which has always been considered by everyone the most original and interesting element of his production, because religion, from the Reformation onwards, was primarily a political question and thus it was treated by almost all the authors involved in the Protestant movement, Aconcio in the first place. Even the remarks concerning the private, intimate sphere of faith have therefore been analysed in this light: only by acknowledging the centrality of the "problem of politics" in Aconcio's theories, in fact, is it possible to interpret them correctly. This approach proves the truth of the theoretical premise to my research, that is to say the unity and orderliness of the author's thought: in every field of knowledge, Aconcio applies the rules of the methodus resolutiva as a means to achieve knowledge and to elaborate models of pacific cohabitation in society. Aconcio's continuous references to method can make his writing pedantic and rather complex, but at the same time they allow for a consistent and valid analysis of different disciplines. I have not considered as a limit the fact that most of his reflections appear to our eyes as strongly conditioned by the time in which he lived. To see in him, as some have done, the forerunner of Descartes' methodological discourse or, conversely, to judge his religious theories as not very modern, is to force the thought of an author who was first and foremost a Christian man of his own time. Aconcio repeats this himself several times in his writings: he wants to provide individuals with the necessary tools to reach a full-fledged scientific knowledge in the various fields, and also to enable them to seek truth incessantly in the religious domain, which is the duty of every human being. The will to find rules, instruments and effective solutions characterizes the whole of the author's corpus: Aconcio feels he must look for truth in all the arts, aware as he is that anything can become science as long as it is analysed with method. Nevertheless, he remains a man of his own time, a Christian convinced of the existence of God, creator and governor of the world, to whom people must account for their own actions.
To neglect this fact in order to construct a "character", a generic forerunner of, but not a participant in, whatever philosophical current, is a dangerous and misleading operation. In this study, I have highlighted how Aconcio's arguments only reveal their full meaning when read in the context in which they were born, without depriving them of their originality but also without charging them with meanings they do not possess. Through a historical-doctrinal approach, I have tried to analyse the complex web of theories and events which constitute the substratum of Aconcio's reflection, in order to trace the correct relations between texts and contexts. The thesis is therefore organised in six chapters, dedicated respectively to Aconcio's biography, to the methodological question, to the author's engineering activity, to his historical knowledge and to his religious thinking, followed by a last section concerning his fortune throughout the centuries. The above-mentioned complexity is determined by the special historical moment in which the author lived. On the one hand, thanks to the new union between science and technique, the 16th century produced discoveries and inventions which made available a previously unthinkable number of notions and led to a "revolution" in the way of studying and teaching the different subjects; by producing a new kind of intellectual, involved in politics but also aware of scientific-technological issues, this would contribute to the subsequent birth of modern science. On the other hand, the 16th century was ravaged by religious conflicts, which shattered the unity of the Christian world and generated theological-political disputes which would inform the history of European states for many decades. My aim is to show how Aconcio's multifarious activity is the conscious fruit of this historical and religious situation, as well as an attempt to answer the demand for a new kind of engagement on the intellectual's part. Plunged into the discussions around methodus, employed in the most important European courts, involved in the abrupt acceleration of technical-scientific activities, and especially concerned by the radical religious reformation brought on by the Protestant movement, Jacopo Aconcio reflects this complex conjunction in his writings, without lacking in order and consistency, differently from what many scholars assume. The object of this work, therefore, is to highlight the unity of the author's thought, in which science, technique, faith and politics are woven into a combination which, although it may appear illogical and confused, is actually tidy and methodical, and therefore in agreement with Aconcio's own intentions and with the specific characters of European culture in the Renaissance. This theory is confirmed by the reading of the Ars muniendorum oppidorum, the only work by Aconcio which had until now been unavailable. I am persuaded that only a methodical reading of Aconcio's works, without neglecting or glorifying any single one, respects the author's will. From De methodo (1558) onwards, all his writings are summae, guides for the reader who wishes to approach the study of the various disciplines. Undoubtedly, Satan's Stratagems (1565) is something more, not only because of its length, but because it deals with the author's main interest: the celebration of doubt and debate as the bases on which to build religious tolerance, which is the best method for pacific cohabitation in society.
This, however, does not justify the total centrality which the Stratagems have enjoyed for centuries, at the expense of a proper understanding of the author's will to offer examples of methodological rigour in all sciences. Maybe it is precisely because of the reforming power of Aconcio's thought that, albeit often forgotten throughout the centuries, he has never ceased to reappear and continues to draw attention, both as a man and as an author. His ideas never stop stimulating the reader's curiosity, and this may ultimately be the best demonstration of their worth, independently of the historical moment in which they come back to the surface.

Relevance:

30.00%

Publisher:

Abstract:

The Variscan basement of the Northern Apennines (Northern Italy) is a polymetamorphic portion of continental crust. This thesis investigated the metamorphic history of this basement as it occurs in the Cerreto Pass, in the Pontremoli well, and in the Pisani Mountains. The study comprised fieldwork, petrography and microstructural analysis, determination of bulk-rock and mineral compositions, thermodynamic modelling, conventional geothermobarometry, monazite chemical dating and Ar/Ar dating of muscovite. The reconstructed metamorphic evolution of the selected samples made it possible to define a long-lasting metamorphic history straddling the Variscan and Alpine orogeneses. Some petrological issues commonly encountered in low- to medium-grade metapelites were also tackled: (i) With medium-grade micaschist it is possible to reconstruct a complete P-T-D path by combining microstructural analysis and thermodynamic modelling. Prekinematic white mica may preserve Mg-rich cores related to the pre-peak stage. The Mn-poor garnet rim records peak metamorphism. Na-rich mylonitic white mica, the XFe of chlorite and the late paragenesis may constrain the retrograde stage. (ii) Metapelites may contain coronitic microstructures of apatite + Th-silicate, allanite and epidote around unstable monazite grains. The chemistry and microstructure of Th-rich monazite relics surrounded by this coronitic microstructure may suggest that the monazite was inherited and underwent partial dissolution and fluid-aided replacement by REE-accessory minerals at 500-600°C and 5-7 kbar. (iii) Fish-shaped white mica is not always a (prekinematic) mica-fish. Observed in high-magnification BSE images, it may consist of several white micas formed during a mylonitic stage. Hence, the asymmetric foliation boudin is a suitable microstructure from which to obtain geochronological information about the shearing stage. (iv) Thermodynamic modelling of a hematite-rich metasedimentary rock fails to reproduce the observed mineral compositions when the bulk Fe2O3 is neglected or determined through titration. The mismatch between observed and computed mineral compositions and assemblages is resolved by tuning the effective ferric iron content using P-XFe2O3 diagrams.
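
For reference, the Ar/Ar ages mentioned above are conventionally obtained from the standard age equation (a textbook relation, not a formula specific to this thesis):

\[ t \;=\; \frac{1}{\lambda}\,\ln\!\left(1 + J\,\frac{^{40}\mathrm{Ar}^{*}}{^{39}\mathrm{Ar}_{K}}\right) \]

where λ is the total decay constant of ⁴⁰K, J is the neutron-irradiation parameter calibrated against a standard of known age, and ⁴⁰Ar*/³⁹Ar_K is the measured ratio of radiogenic ⁴⁰Ar to reactor-produced ³⁹Ar.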

Relevance:

30.00%

Publisher:

Abstract:

The ever-growing interest in scientific techniques able to characterise the materials and retrace the steps behind the execution of a painting has made them widely accepted in its investigation. This research discusses issues emerging from attribution and authentication studies and proposes best practice for the characterisation of materials and techniques, favouring the contextualisation of the results in an integrated approach; the work aims to systematically classify paintings into categories that aid the examination of the objects. A first grouping of paintings is based on the information initially available on them, identifying four categories. A focus of this study is the examination of case studies, spanning from the 16th to the 20th century, to evaluate and validate the different protocols associated with each category, to show the problems arising from paintings, and to explain the advantages and limits of the approach. The research methodology incorporates a combined set of scientific techniques (non-invasive, such as technical imaging and XRF; micro-invasive, such as optical microscopy, SEM-EDS, FTIR, Raman microscopy and, in one case, radiocarbon dating) to answer the questions and, where necessary for the classification, to exhaustively characterise the materials of the paintings; the creation of, and contribution to, shared technical databases related to various artists and their evolution over time is an objective tool that benefits this kind of study. Close collaboration among different professionals is an essential aspect of this research in order to study a painting comprehensively, as the integration of stylistic, documentary and provenance studies corroborates the scientific findings and helps in the successful contextualisation of the results and the reconstruction of the history of the object.

Relevance:

30.00%

Publisher:

Abstract:

Alzheimer's disease (AD) is the most common form of dementia, currently affecting more than 50 million people worldwide. In recent years, attention towards this disease has risen in the search for the discovery and development of a drug that can stop it. Indeed, current therapies for AD provide only temporary symptomatic relief. The high attrition rate in AD drug discovery has been attributed to several factors, including the fact that the pathogenesis of AD is not yet fully understood. Nevertheless, what is increasingly recognized is that AD is a multifactorial syndrome, characterized by many conditions which may lead to neuronal death. Given this, it is widely accepted that a molecule able to modulate more than one target would bring benefit to the therapy of AD. In the first chapter of this thesis, two projects are reported regarding the design and synthesis of new series of dual inhibitors of GSK-3 and HDAC, two of the main enzymes involved in AD. Two different series of compounds were synthesized and evaluated for their inhibitory activity towards the target enzymes. The best compounds of the series were selected for further biological investigation to evaluate their properties. The second project focused on the design of non-ATP-competitive GSK-3 inhibitors combined with HDAC inhibition properties. Also in this case, the best compounds of the series were selected for biological investigation to further evaluate their properties. In chapter 2, the design and synthesis of GSK-3-directed Proteolysis Targeting Chimeras (PROTACs), a new technology in drug discovery that acts through degradation rather than inhibition, are reported. The design and synthesis of a small series of GSK-3-directed PROTACs were achieved. In vitro assays were performed to evaluate their GSK-3-degradation ability, the effective involvement of the E3 ubiquitin ligase in the process and their neuroprotective abilities.

Relevance:

30.00%

Publisher:

Abstract:

The existential crisis of the European integration project is the main challenge the Community must face in its immediate future, since it stands as a real threat to the continuity of the system established since the founding Treaty of Rome (1957). The three crises that have occurred during the first years of the 21st century, namely the great economic crisis of 2008, the global Covid-19 pandemic and the current intervention of the Russian Federation in Ukraine, are constantly testing the firmness of the foundations and values on which the Community has been built over all these years. This doctoral thesis aims to be an academic contribution to the debate open in the Community's societies about where the evolution of the European integration project should head in the coming years. To achieve this objective, the research goes back to the origins of the integration process and moves along the timeline up to the present day, analysing the possible causes that may lie behind the current existential crisis. In turn, the research examines in detail the multiple effects that this existential problem has been generating in the Member States in recent years, such as the increase in social support for actors considered populist or the phenomenon of growing citizen disaffection. This research analyses the different scenarios proposed by the European Commission in its White Paper on the Future of Europe in order to determine which scenario would best match the new economic and socio-political reality currently prevailing in the Member States. The goal is to formulate a proposal that can bring the Community's existential crisis to an end, thereby opening a new stage in the history of European integration.

Relevance:

30.00%

Publisher:

Abstract:

Multiple Myeloma (MM) is a hematologic cancer with a heterogeneous and complex genomic landscape, where Copy Number Alterations (CNAs) play a key role in the disease's pathogenesis and prognosis. It is of biological and clinical interest to study the temporal occurrence of early alterations, as they play a disease "driver" role by deregulating key tumor pathways. This study presents an innovative suite of bioinformatic tools created for harmonizing and tracing the origin of CNAs throughout the evolutionary history of MM. To this aim, large cohorts of newly-diagnosed MM (NDMM, N=1582) and Smoldering MM (SMM, N=282) were aggregated. The tools developed in this study enable the harmonization of CNAs obtained from different genomic platforms in such a way that high statistical power can be achieved. By doing so, the large size of these cohorts was harnessed for the identification of novel genes characterized as "drivers" (NFKB2, NOTCH2, MAX, EVI5 and the MYC-ME2 enhancer), and for the generation of an innovative timing model, implemented with a statistical method that introduces confidence intervals in the CNA calls. By applying this model to both the NDMM and SMM cohorts, it was possible to identify specific CNAs (1q(CKS1B)amp, 13q(RB1)del, 11q(CCND1)amp and 14q(MAX)del) and categorize them as "early"/"driver" events. A high level of precision was guaranteed by the narrow confidence intervals in the timing estimates. These CNAs were proposed as critical MM alterations, which play a foundational role in the evolutionary history of both SMM and NDMM. Finally, a multivariate survival model was able to identify the independent genomic alterations with the greatest effect on patients' survival, including RB1-del, CKS1B-amp, MYC-amp, NOTCH2-amp and TRAF3-del/mut. In conclusion, the alterations identified both as "early drivers" and as correlated with patients' survival were proposed as biomarkers that, if included in wider survival models, could provide better disease stratification and an improved definition of prognosis.
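
As a sketch of the kind of multivariate survival analysis mentioned above, the following Python example fits a Cox proportional hazards model to binary alteration flags using the lifelines library. The data are synthetic and the specific model, covariates and software used in the thesis may differ.

```python
# Hypothetical sketch of a multivariate survival analysis on binary alteration
# flags, using a Cox proportional hazards model. Data are synthetic; the model
# actually used in the thesis may differ.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "RB1_del":   rng.integers(0, 2, n),
    "CKS1B_amp": rng.integers(0, 2, n),
    "MYC_amp":   rng.integers(0, 2, n),
})
# Synthetic survival times: carriers of the alterations fail earlier on average.
risk = 0.02 * np.exp(0.5 * df["RB1_del"] + 0.4 * df["CKS1B_amp"] + 0.3 * df["MYC_amp"])
df["time"] = rng.exponential(1.0 / risk)          # follow-up time (e.g. months)
df["event"] = rng.integers(0, 2, n)               # 1 = event observed, 0 = censored

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()   # hazard ratio of each alteration, adjusted for the others
```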

Relevance:

30.00%

Publisher:

Abstract:

This Thesis explores two novel and independent cosmological probes, Cosmic Chronometers (CCs) and Gravitational Waves (GWs), to measure the expansion history of the Universe. CCs provide direct and cosmology-independent measurements of the Hubble parameter H(z) up to z∼2. In parallel, GWs provide a direct measurement of the luminosity distance without requiring additional calibration, thus yielding a direct measurement of the Hubble constant H0=H(z=0). This Thesis extends the methodologies of both of these probes to maximize their scientific yield. This is achieved by accounting for the interplay of cosmological and astrophysical parameters to derive them jointly, study possible degeneracies, and eventually minimize potential systematic effects. As a legacy value, this work also provides interesting insights into galaxy evolution and compact binary population properties. The first part presents a detailed study of intermediate-redshift passive galaxies as CCs, with a focus on the selection process and the study of their stellar population properties using specific spectral features. From their differential aging, we derive a new measurement of the Hubble parameter H(z) and thoroughly assess potential systematics. In the second part, we develop a novel methodology and pipeline to obtain joint cosmological and astrophysical population constraints using GWs in combination with galaxy catalogs. This is applied to GW170817 to obtain a measurement of H0. We then perform realistic forecasts to predict joint cosmological and astrophysical constraints from black hole binary mergers for upcoming gravitational wave observatories and galaxy surveys. Using these two probes we provide an independent reconstruction of H(z) with direct measurements of H0 from GWs and H(z) up to z∼2 from CCs and demonstrate that they can be powerful independent probes to unveil the expansion history of the Universe.
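
For context, both probes rest on standard relations (textbook identities, not results of this thesis). Cosmic chronometers exploit the differential-age relation

\[ H(z) \;=\; -\frac{1}{1+z}\,\frac{dz}{dt} \;\approx\; -\frac{1}{1+z}\,\frac{\Delta z}{\Delta t}, \]

so that measuring the age difference Δt of passive galaxies across a small redshift interval Δz yields H(z) directly. Gravitational-wave standard sirens measure the luminosity distance, which in a flat universe is

\[ d_L(z) \;=\; (1+z)\,c\int_0^z \frac{dz'}{H(z')}, \]

so that combining d_L with the source redshift (e.g. from a galaxy catalogue) constrains H0 and the expansion history.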