964 results for Relation quantitative structure-propri


Relevance:

30.00%

Publisher:

Abstract:

Context. The angular diameter distances toward galaxy clusters can be determined from measurements of the Sunyaev-Zel'dovich effect and the X-ray surface brightness, combined with the validity of the distance-duality relation, D_L(z)/[(1 + z)^2 D_A(z)] = 1, where D_L(z) and D_A(z) are, respectively, the luminosity and angular diameter distances. This combination enables us to probe galaxy cluster physics or even to test the validity of the distance-duality relation itself. Aims. We explore these possibilities with two different but complementary approaches. First, in order to constrain the possible galaxy cluster morphologies, the validity of the distance-duality (DD) relation is assumed within the Lambda CDM framework (WMAP7). Second, adopting a cosmological-model-independent test, we directly confront the angular diameter distances from galaxy clusters with two type Ia supernova (SNe Ia) subsamples (carefully chosen to coincide with the cluster positions). The influence of the different SNe Ia light-curve fitters on the previous analysis is also discussed. Methods. We assumed that η is a function of redshift, parametrized by two different relations: η(z) = 1 + η₀z and η(z) = 1 + η₀z/(1 + z), where η₀ is a constant parameter quantifying a possible departure from the strict validity of the DD relation. In order to determine the probability density function (PDF) of η₀, we considered the angular diameter distances from galaxy clusters recently studied by two different groups, assuming elliptical and spherical isothermal β models and a spherical non-isothermal β model. The DD relation is strictly valid only if the maximum of the η₀ PDF is centered on η₀ = 0. Results. For both approaches we find that the elliptical β model agrees with the distance-duality relation, whereas the non-isothermal spherical description is, at best, only marginally compatible.
We find that the two light-curve fitters (SALT2 and MLCS2K2) are in statistically significant conflict, and a joint analysis involving the different approaches suggests that clusters are endowed with an elliptical geometry, as previously assumed. Conclusions. The statistical analysis presented here provides new evidence that the true geometry of clusters is elliptical. In principle, it is remarkable that a local property such as the geometry of galaxy clusters can be constrained by a global argument like the one provided by the cosmological distance-duality relation.
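The DD test described above can be sketched numerically. The following is a minimal illustration, not the authors' actual likelihood analysis: the distances are invented and the fit is a brute-force grid search. Given cluster angular diameter distances D_A and SNe Ia luminosity distances D_L at matching redshifts, one forms η_obs = D_L/[(1+z)^2 D_A] and fits η₀ under each parametrization; strict duality corresponds to η₀ = 0.

```python
import numpy as np

def eta_linear(z, eta0):
    # Parametrization I: eta(z) = 1 + eta0 * z
    return 1.0 + eta0 * z

def eta_bounded(z, eta0):
    # Parametrization II: eta(z) = 1 + eta0 * z / (1 + z)
    return 1.0 + eta0 * z / (1.0 + z)

def eta_observed(z, D_L, D_A):
    # Observed ratio; equals 1 exactly if the distance-duality
    # relation D_L = (1 + z)^2 * D_A holds.
    return D_L / ((1.0 + z) ** 2 * D_A)

# Hypothetical data: duality is enforced exactly, so eta_obs = 1
# and a least-squares fit should return eta0 = 0 for both forms.
z = np.array([0.1, 0.2, 0.3, 0.4])
D_A = np.array([370.0, 650.0, 860.0, 1020.0])  # Mpc, illustrative
D_L = (1.0 + z) ** 2 * D_A                      # enforce duality

eta_obs = eta_observed(z, D_L, D_A)

def best_eta0(model):
    # Brute-force chi-square minimization over a grid of eta0
    # (unit measurement errors, for illustration only).
    grid = np.linspace(-1.0, 1.0, 2001)
    chi2 = [np.sum((eta_obs - model(z, e)) ** 2) for e in grid]
    return grid[int(np.argmin(chi2))]

print(best_eta0(eta_linear))   # 0.0: strict duality recovered
print(best_eta0(eta_bounded))  # 0.0
```

With real data, eta_obs scatters around 1 and the maximum of the η₀ PDF may shift away from zero, which is precisely the diagnostic used in the abstract.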


Purpose - The objective of this paper is to characterise the transactions between European buyers and Brazilian mango and grape producers. Design/methodology/approach - The method selected for this paper was multiple case studies. The export activities of the Brazilian mango and grape supply chains to Europe were investigated. The field research was undertaken in Brazil, Germany, The Netherlands and the UK. In total, 41 face-to-face interviews were carried out. Findings - The supermarket literature tends to generalise the strategies of retailers, focusing on differentiation and preferred suppliers. However, from the empirical research conducted in the UK, Germany and The Netherlands it is possible to conclude that the procurement strategies of supermarkets can vary sharply. The results reveal the presence of different agents who demand different quality standards. The level of intensity depends on consumer behaviour, the features of the product commercialised and the characteristics of the production segment in each country. Research limitations/implications - First, in relation to the empirical method there is a limitation because the case study does not allow statistical generalisation. Consequently, it would be interesting to undertake quantitative research in order to quantify the variables presented and their impact on the structure of value chains. Second, the research focuses on only two stages of the supply chain, producers and buyers. Practical implications - The differences between UK and German supermarkets challenge the supermarket literature, which tends to generalise the strategies of retailers focusing on differentiation and preferred suppliers. Originality/value - The study shows that the influence and activities of retail agents along the value chain can be analysed taking several variables into consideration: the products commercialised; the distribution segment; and the consumer market. This result opens the way for analysing different structures of the value chain and the impact of these differences on the entry of producers from developing countries into the global market.


The aim of this work is to study, in a simple replicator chemical model, the relation between kinetic stability and entropy production under the action of external perturbations. We quantitatively explore the different paths leading to evolution in a toy model where two independent replicators compete for the same substrate. To do so, the scenario originally described by Pross (J Phys Org Chem 17:312–316, 2004) is revisited and new criteria for defining kinetic stability are proposed. Our results suggest that fast replicator populations are continually favored by the effects of strong stochastic environmental fluctuations capable of determining the global population, these fluctuations being assumed to be the only evolutionary force at work. We demonstrate that the process is driven by strong perturbations only, and that population crashes may be useful proxies for these catastrophic environmental fluctuations. As expected, such behavior is particularly enhanced under very large-scale perturbations, suggesting a likely dynamical footprint in the recovery patterns of new species after mass-extinction events in the Earth's geological past. Furthermore, the hypothesis that natural selection always favors the faster processes may give theoretical support to different studies that claim the applicability of maximum principles such as the Maximum Metabolic Flux (MMF) or the Maximum Entropy Production Principle (MEPP), seen as the main goal of biological evolution.
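The competition scenario can be illustrated with a minimal toy simulation, assuming (hypothetically) logistic-like growth on a shared substrate and multiplicative population crashes; the rate constants and crash statistics below are invented for illustration and are not the model parameters of the paper.

```python
import random

def simulate(k_fast=1.0, k_slow=0.6, crash_prob=0.05,
             crash_survival=0.01, steps=2000, seed=1):
    """Toy competition of two replicators for one substrate.

    Both grow on a shared, finite substrate; at each step a strong
    environmental perturbation may crash the whole population.
    All numbers are illustrative, not fitted values.
    """
    rng = random.Random(seed)
    fast, slow, capacity = 10.0, 10.0, 1000.0
    for _ in range(steps):
        free = max(0.0, 1.0 - (fast + slow) / capacity)  # free substrate
        fast += k_fast * fast * free   # faster replicator
        slow += k_slow * slow * free   # slower replicator
        if rng.random() < crash_prob:  # catastrophic fluctuation
            fast *= crash_survival
            slow *= crash_survival
    return fast, slow

fast, slow = simulate()
# Repeated crashes keep the system far from saturation, so the
# kinetically faster replicator dominates each recovery phase.
print(fast > slow)  # True
```

Because the crashes hit both populations equally, the fast/slow ratio only grows over time: without perturbations the system would simply saturate, whereas repeated crashes keep selecting for kinetic superiority, which is the qualitative point of the abstract.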


Type Ia supernovae have been successfully used as standardized candles to study the expansion history of the Universe. In the past few years, these studies led to the exciting result of an accelerated expansion caused by the repelling action of some sort of dark energy. This result has been confirmed by measurements of the cosmic microwave background radiation, the large-scale structure, and the dynamics of galaxy clusters. The combination of all these experiments points to a "concordance model" of the Universe with flat large-scale geometry and a dominant component of dark energy. However, several points related to supernova measurements need careful analysis in order to establish the validity of the concordance model beyond doubt. As the amount and quality of data increase, the need to control possible systematic effects which may bias the results becomes crucial. Also important is the improvement of our knowledge of the physics of supernova events, to secure and possibly refine their calibration as standardized candles. This thesis addresses some of those issues through the quantitative analysis of supernova spectra. Emphasis is placed on a careful treatment of the data and on the definition of spectral measurement methods. The comparison of measurements for a large set of spectra from nearby supernovae is used to study their homogeneity and to search for spectral parameters which may further refine the calibration of the standardized candle. One such parameter is found to reduce the dispersion in the distance estimates of a sample of supernovae to below 6%, a precision comparable with the current lightcurve-based calibration, and obtained in an independent manner. Finally, the comparison of spectral measurements from nearby and distant objects is used to test for possible evolution with cosmic time of the intrinsic brightness of type Ia supernovae.
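The role of a spectral parameter in the standardized-candle calibration can be sketched with synthetic data. Everything below is hypothetical: the spectral ratio R, its coefficient, and the noise levels are invented, and the "true" distance moduli stand in for those derived from redshifts in a fiducial cosmology. The point is only that fitting out a brightness-correlated spectral parameter reduces the dispersion of the distance estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Hypothetical nearby sample: true distance moduli mu_true, and
# peak magnitudes whose intrinsic scatter correlates with a
# measurable spectral parameter R (assumed, for illustration only;
# this is not the thesis' actual spectral indicator).
mu_true = rng.uniform(33.0, 36.0, n)
R = rng.normal(0.0, 1.0, n)               # spectral parameter
alpha_true = 0.15                         # mag per unit of R (invented)
noise = rng.normal(0.0, 0.05, n)          # measurement noise
m_obs = mu_true - 19.3 + alpha_true * R + noise  # apparent peak mag

# Uncorrected standardized-candle estimate (fixed M = -19.3):
mu_raw = m_obs + 19.3
# Spectrally corrected estimate: fit the slope of the Hubble
# residuals against R, then subtract the fitted trend.
alpha_fit = np.polyfit(R, mu_raw - mu_true, 1)[0]
mu_corr = mu_raw - alpha_fit * R

print(np.std(mu_raw - mu_true))   # dispersion before correction
print(np.std(mu_corr - mu_true))  # smaller dispersion after correction
```

In the thesis the comparison is against the lightcurve-based calibration; here the sketch only shows the mechanism by which an independent spectral parameter can tighten distance estimates.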


Introduction 1.1 Occurrence of polycyclic aromatic hydrocarbons (PAH) in the environment Worldwide industrial and agricultural developments have released a large number of natural and synthetic hazardous compounds into the environment through careless waste disposal, illegal waste dumping and accidental spills. As a result, there are numerous sites in the world that require cleanup of soils and groundwater. Polycyclic aromatic hydrocarbons (PAHs) are one of the major groups of these contaminants (Da Silva et al., 2003). PAHs constitute a diverse class of organic compounds consisting of two or more aromatic rings with various structural configurations (Prabhu and Phale, 2003). Being benzene derivatives, PAHs are thermodynamically stable. In addition, these chemicals tend to adhere to particle surfaces, such as soils, because of their low water solubility and strong hydrophobicity, and this results in greater persistence under natural conditions. This persistence, coupled with their potential carcinogenicity, makes PAHs problematic environmental contaminants (Cerniglia, 1992; Sutherland, 1992). PAHs are widely found in high concentrations at many industrial sites, particularly those associated with the petroleum, gas production and wood preserving industries (Wilson and Jones, 1993). 1.2 Remediation technologies Conventional techniques used for the remediation of soil polluted with organic contaminants include excavation of the contaminated soil and disposal to a landfill, or capping - containment - of the contaminated areas of a site. These methods have some drawbacks. The first method simply moves the contamination elsewhere and may create significant risks in the excavation, handling and transport of hazardous material. Additionally, it is very difficult and increasingly expensive to find new landfill sites for the final disposal of the material.
The cap-and-containment method is only an interim solution, since the contamination remains on site, requiring monitoring and maintenance of the isolation barriers long into the future, with all the associated costs and potential liability. A better approach than these traditional methods is to completely destroy the pollutants, if possible, or to transform them into harmless substances. Some technologies that have been used are high-temperature incineration and various types of chemical decomposition (for example, base-catalyzed dechlorination and UV oxidation). However, these methods have significant disadvantages, principally their technological complexity, high cost, and lack of public acceptance. Bioremediation, in contrast, is a promising option for the complete removal and destruction of contaminants. 1.3 Bioremediation of PAH contaminated soil & groundwater Bioremediation is the use of living organisms, primarily microorganisms, to degrade or detoxify hazardous wastes into harmless substances such as carbon dioxide, water and cell biomass. Most PAHs are biodegradable under natural conditions (Da Silva et al., 2003; Meysami and Baheri, 2003), and bioremediation for the cleanup of PAH wastes has been extensively studied at both laboratory and commercial levels. It has been implemented at a number of contaminated sites, including the cleanup of the Exxon Valdez oil spill in Prince William Sound, Alaska in 1989, the Mega Borg spill off the Texas coast in 1990 and the Burgan Oil Field, Kuwait in 1994 (Purwaningsih, 2002). Different strategies for PAH bioremediation, such as in situ, ex situ or on-site bioremediation, were developed in recent years. In situ bioremediation is a technique applied to soil and groundwater at the site without removing the contaminated soil or groundwater; it is based on providing optimum conditions for microbiological contaminant breakdown.
Ex situ bioremediation of PAHs, on the other hand, is a technique applied to soil and groundwater which have been removed from the site via excavation (soil) or pumping (water). Hazardous contaminants are converted in controlled bioreactors into harmless compounds in an efficient manner. 1.4 Bioavailability of PAH in the subsurface Frequently, PAH contamination in the environment occurs as contaminants sorbed onto soil particles rather than as a free phase (NAPL, non-aqueous phase liquid). It is known that the biodegradation rate of most PAHs sorbed onto soil is far lower than the rates measured in solution cultures of microorganisms with pure solid pollutants (Alexander and Scow, 1989; Hamaker, 1972). It is generally believed that only the fraction of PAHs dissolved in the solution can be metabolized by microorganisms in soil. The amount of contaminant that can be readily taken up and degraded by microorganisms is defined as its bioavailability (Bosma et al., 1997; Maier, 2000). Two phenomena have been suggested to cause the low bioavailability of PAHs in soil (Danielsson, 2000). The first is strong adsorption of the contaminants to the soil constituents, which leads to very slow release rates of contaminants to the aqueous phase. Sorption is often well correlated with soil organic matter content (Means, 1980) and significantly reduces biodegradation (Manilal and Alexander, 1991). The second phenomenon is slow mass transfer of pollutants, such as pore diffusion in the soil aggregates or diffusion in the organic matter in the soil. The complex set of these physical, chemical and biological processes is schematically illustrated in Figure 1. As shown in Figure 1, biodegradation processes take place in the soil solution, while diffusion processes occur in the narrow pores in and between soil aggregates (Danielsson, 2000).
Seemingly contradictory studies can be found in the literature, indicating that the rate and final extent of metabolism may be either lower or higher for PAHs sorbed onto soil than for pure PAHs (Van Loosdrecht et al., 1990). These contrasting results demonstrate that the bioavailability of organic contaminants sorbed onto soil is far from well understood. Besides bioavailability, several other factors influence the rate and extent of biodegradation of PAHs in soil, including microbial population characteristics, the physical and chemical properties of the PAHs, and environmental factors (temperature, moisture, pH, degree of contamination). Figure 1: Schematic diagram showing possible rate-limiting processes during bioremediation of hydrophobic organic contaminants in a contaminated soil-water system (not to scale) (Danielsson, 2000). 1.5 Increasing the bioavailability of PAH in soil Attempts to improve the biodegradation of PAHs in soil by increasing their bioavailability include the use of surfactants, solvents or solubility enhancers. However, the introduction of a synthetic surfactant may amount to the addition of one more pollutant (Wang and Brusseau, 1993). A study conducted by Mulder et al. showed that the introduction of hydroxypropyl-β-cyclodextrin (HPCD), a well-known PAH solubility enhancer, significantly increased the solubilization of PAHs although it did not improve their biodegradation rate (Mulder et al., 1998), indicating that further research is required in order to develop a feasible and efficient remediation method. Enhancing the extent of PAH mass transfer from the soil phase to the liquid phase might prove an efficient and environmentally low-risk way of addressing the problem of slow PAH biodegradation in soil.
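The desorption-limited kinetics described in Section 1.4 can be sketched with a minimal two-compartment model: sorbed PAH desorbs first-order into the aqueous phase, where microbes degrade it first-order. The rate constants below are invented for illustration; the point is that when desorption is slow, speeding up the biology barely changes the outcome, while enhancing mass transfer does.

```python
def remaining_pah(k_des, k_bio, t_end=100.0, dt=0.01):
    """Two-compartment sketch: sorbed PAH desorbs (rate k_des) into
    the aqueous phase, where it is biodegraded (rate k_bio).
    Explicit Euler integration; all constants are illustrative.
    """
    sorbed, aqueous = 1.0, 0.0   # normalized initial PAH, all sorbed
    for _ in range(int(t_end / dt)):
        desorb = k_des * sorbed * dt     # slow release to solution
        degrade = k_bio * aqueous * dt   # microbial degradation
        sorbed -= desorb
        aqueous += desorb - degrade
    return sorbed + aqueous              # total PAH remaining

# Fast biology, slow desorption: degradation is desorption-limited,
# so doubling the biodegradation rate barely helps ...
slow_release = remaining_pah(k_des=0.02, k_bio=5.0)
faster_bio = remaining_pah(k_des=0.02, k_bio=10.0)
# ... whereas enhancing mass transfer (e.g. a solubility enhancer
# raising the effective desorption rate) speeds removal substantially.
enhanced = remaining_pah(k_des=0.2, k_bio=5.0)

print(slow_release, faster_bio, enhanced)
```

This mirrors the Mulder et al. observation qualitatively: raising aqueous-phase availability, not microbial capacity, is the lever when release from the soil matrix is the bottleneck.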


An extensive literature indicates that the quality of the mother-foetus relationship is the main factor determining the quality of postnatal mother-infant interaction and the child's psychic development. Nowadays the relationship between the pregnant woman and her foetus is viewed as the central factor of the somatic dialogue between the maternal and the foetal organisms. This dialogue is responsible for the physical development of the child, as well as for its psychosomatic structure. The research area has therefore had to extend to the analysis of the psychological processes concerning the pregnancy, the couple bound by parenthood, and the influence of intergenerational dynamics. In fact, the formation of maternal identity, as well as of the relationship between the woman and the foetus, refers back to the pregnant woman's relationship with her own parents, especially with her mother. Pregnancy itself, considered as a psychosomatic event, is directly influenced by relational, affective and social factors, particularly by the quality of the interiorized parental relations and the quality of current relationships (such as those with the partner and the family of origin). Some studies have begun to investigate the relationship between the pregnant woman and the foetus in terms of "prenatal attachment" and its relation to socio-demographic, psychological and psychopathological aspects (such as pre- and post-partum depression), but the research area is still largely unexplored. The present longitudinal research aimed to investigate the quality of the pregnant woman-foetus relationship through a prenatal attachment index, the quality of the interiorized relationship with the woman's parents, and the levels of alexithymic features and maternity social support, in relation to the physiology of delivery and of the postpartum period, as well as the physical development of the child.
A consecutive sample of 62 Italian primiparous women without any pathology participated in the longitudinal study. The first phase (third trimester of pregnancy) investigated the psychological processes connected with the pregnant women's affective investment in the unborn baby (Prenatal Attachment Inventory), the mothers' interiorized relationship with their own parents (Parental Bonding Instrument), the social and affective support their partner and family of origin were able to supply (Maternity Social Support Scale), and the level of alexithymia (20-item Toronto Alexithymia Scale). The second phase collected data on the course of childbirth from a "deliverygram" (such as labour and induction durations and the mode of delivery) and data on the newborns' state of well-being (such as Apgar scores and pH values). Finally, in the third phase the women were telephoned one month after childbirth. The semi-structured interview investigated the following areas: the memory of the delivery, the return home, the first interactions between mother and newborn, breastfeeding, and the biological rhythms achieved by the newborns. The data analysis revealed a sample with good levels of prenatal attachment and social support and a good capacity for mental functioning. An interesting result is that most of the women showed a high percentage of an "affectionless control" style with both parents, but the data are not sufficient to interpret this result. Moreover, the medical and welfare procedures required during delivery were consistent with the Italian average, while the percentage of caesarean sections (12.9%) was lower than the national figure (30%). Of the vaginal deliveries, 29% involved epidural analgesia, which explains the high number (37%) of obstetric manoeuvres (such as the Kristeller manoeuvre).
The data on the newborns (22 male, 40 female) indicate a good state of well-being, since Apgar scores and pH values were above 7 at the first and fifth minutes. Concerning the prenatal phase, correlation analysis showed that prenatal attachment scores correlated positively with the expected social support and negatively with the "externally oriented thinking" dimension of alexithymia; maternity social support correlated negatively with the total alexithymia score, particularly with the "externally oriented thinking" dimension, and negatively with the maternal control dimension of parental bonding. Concerning the delivery data, there are many (largely expected) correlations among the variables; the most important is that labour duration correlated negatively with the newborn's well-being indexes. Finally, concerning the postpartum data, the women's assessments of the delivery correlated negatively with its duration and positively with their assessments of the return home and of the interaction with the newborn. Moreover, the length of the hospital stay correlated negatively with the women's assessments of the return home, which in turn correlated positively with the quality of breastfeeding, the mother-newborn interaction and the biological regulation of the child. Finally, the women's assessments of breastfeeding correlated positively with mother-child interactions and the children's biological rhythms.
The correlation analysis between the prenatal variables and the delivery data showed that prenatal attachment scores correlated positively with the dilatation-stage scores and with the newborn's Apgar score at the first minute; the paternal care dimension of parental bonding correlated positively with the durations of the various stages of childbirth, as did the paternal control dimension with the placental stage. It also emerged that the expected social support correlated positively with the durations of the various stages of childbirth, and that the global alexithymia scores, particularly the "difficulty describing feelings" dimension, correlated negatively with the total childbirth scores. The correlation analysis between the prenatal and postpartum variables showed that the total alexithymia scores correlated positively with the time elapsed between childbirth and the first breastfeeding, the "difficulty describing feelings" dimension correlated negatively with the quality of breastfeeding, the "externally oriented thinking" dimension correlated negatively with mother-child interactions, and the paternal control dimension of parental bonding correlated negatively with the time elapsed between childbirth and the first breastfeeding. Finally, the analysis of the correlations between the delivery data and the women's postpartum assessments showed a negative correlation between the woman's assessment of the delivery and both the number of obstetric manoeuvres and the durations of the various stages of childbirth, a positive correlation between the women's assessment of the length of the delivery stages and their actual durations, and a positive relation between the woman's assessment of the delivery and the children's Apgar scores.
In conclusion, there is a remarkable relation in which the quality of the relationship the woman establishes with the foetus influences the course of pregnancy and delivery, which in turn influence the postpartum outcome, particularly the mother-child relationship. These data should be confirmed in heterogeneous populations in order to identify vulnerable women and to design focused interventions.


If the historian's work is to understand the past as it was understood by the people who lived it, then perhaps it is not far-fetched to think that it is also necessary to communicate the results of research with the tools that belong to an era and that shape the mentality of those who live in it. Emerging technologies, especially in the area of multimedia such as virtual reality, allow historians to communicate the experience of the past in more than one sense. How does history collaborate with information technologies, focusing on the possibility of making virtual historical reconstructions, with related examples and reviews? What most concerns historians is whether a reconstruction of a past event, experienced through its recreation in pixels, is a method of knowing history that can be considered valid. That is, is the emotion that navigating a 3D environment can arouse a means capable of transmitting knowledge? Or is the idea we have of the past and of its study subtly changed the moment it is disseminated through 3D graphics? For some time, however, the discipline has begun to come to terms with this situation, forced above all by the invasiveness of this type of media, by the spectacularization of the past and by partial, unscientific popularizations of it. In a post-literary world we must begin to recognize that the visual culture in which we are immersed is changing our relationship with the past: this does not make the knowledge built up so far false, but it is necessary to acknowledge that more than one historical truth exists, sometimes written, sometimes visual. The computer has become a ubiquitous platform for the representation and dissemination of information. Methods of interaction and representation are constantly evolving, and it is along these two tracks that information technologies offer their services to history.
The aim of this thesis is precisely to explore, through the use of and experimentation with different tools and information technologies, how the past can be told effectively through three-dimensional objects and virtual environments, and how, as defining elements of communication, they can collaborate, in this particular case, with the discipline of history. The present research reconstructs some lines of the history of the main factories active in Turin during the Second World War. Recalling the close relationship that exists between structures and individuals, and in this city in particular between the factory and the workers' movement, it is inevitable to delve into the events of the Turin workers' movement, which during the Liberation struggle was a political and social actor of prime importance in the city. In the city, understood as a biological entity involved in the war, the factory (or the factories) becomes the conceptual nucleus through which to read the city: the factories are the main targets of the bombings, and it is in the factories that a war of liberation is fought between the working class and the factory and city authorities. The factory becomes the place of the "usurpation of power" of which Weber speaks, the stage on which the various episodes of the war play out: strikes, deportations, occupations... The model of the city represented here is not a simple visualization but an information system in which the modelled reality is represented by objects that serve as the theatre for events with a precise chronological placement; within it, the user can select static renders (images), precomputed films (animations) and interactively navigable scenarios, as well as search bibliographic sources and scholars' comments specifically linked to the event in question.
The objective of this work is to make the historical disciplines and computer science interact, through various projects, across the different technological opportunities the latter offers. The reconstruction possibilities offered by 3D are thus placed at the service of research, offering an integral vision capable of bringing us closer to the reality of the period under consideration and channelling all the results into a single presentation platform. Dissemination - Multimedia Information Map Project, Turin 1945. On a practical level, the project provides a navigable interface (Flash technology) representing the map of the city of the period, through which it is possible to view the places and times in which the Liberation took shape, both conceptually and practically. This intertwining of coordinates in space and time not only improves the understanding of the phenomena, but creates greater interest in the subject through the use of highly effective (and appealing) dissemination tools, without losing sight of the need to validate the historical theses, offering itself as a teaching platform. Such a context requires an in-depth study of the historical events in order to reconstruct clearly a map of the city that is precise both topographically and at the level of multimedia navigation. The preparation of the map must follow current standards, so the software solutions used are those provided by Adobe Illustrator for the production of the topography and by Macromedia Flash for the creation of a navigation interface. The underlying descriptive data are of course consultable, being contained in the media support and fully annotated in the bibliography.
It is the continuous evolution of information technologies and the massive spread of computer use that is bringing about a substantial change in the study and learning of history; academic institutions and economic operators have taken up the demand coming from users (teachers, students, cultural-heritage professionals) for a wider dissemination of historical knowledge through its computerized representation. On the teaching front, the reconstruction of a historical reality through computer tools allows even non-historians to experience at first hand the problems of research, such as missing sources, gaps in the chronology and the assessment of the truthfulness of facts through evidence. Information technologies allow a complete, unitary and exhaustive view of the past, channelling all the information onto a single platform and allowing even non-specialists to understand immediately what is being discussed. The best history book, by its nature, cannot do this, since it divides and organizes the information differently. In this way students are given the opportunity to learn through a representation different from those they are used to. The central premise of the project is that student learning outcomes can be improved if a concept or a piece of content is communicated through several channels of expression, in our case through text, images and a multimedia object. Teaching - The Conceria Fiorio is one of the symbolic places of the Turin Resistance. The project is a virtual-reality reconstruction of the Conceria Fiorio in Turin. The reconstruction serves to enrich historical culture both for those who produce it, through careful research of the sources, and for those who can then benefit from it, above all young people who, attracted by the playful aspect of the reconstruction, learn more easily.
Building a 3D artefact gives students the basis for recognizing and expressing the proper relation between the model and the historical object. The stages of work through which the 3D reconstruction of the Conceria was achieved were: in-depth historical research, based on sources, which may be archival documents or archaeological excavations, iconographic and cartographic sources, etc.; modelling of the buildings on the basis of the historical research, to provide the polygonal geometric structure that allows three-dimensional navigation; and production of the 3D navigation using computer-graphics tools. Unreal Technology is the name given to the graphics engine used in numerous commercial video games. One of its fundamental features is a tool called the Unreal editor, with which it is possible to build virtual worlds, and this is what was used for this project. UnrealEd (UEd) is the software for creating levels for Unreal and for games based on the Unreal engine; the free version of the editor was used. The final result of the project is a navigable virtual environment depicting an accurate reconstruction of the Conceria Fiorio at the time of the Resistance. The user can visit the building and view specific information on some points of interest. Navigation is in first person; a process of "spectacularization" of the visited environments, through period-appropriate furnishings, gives the user greater immersion, making the environment more credible and immediately decodable. The Unreal Technology architecture made it possible to obtain a good result in a very short time, without any programming work being necessary. This engine is therefore particularly suitable for the rapid production of prototypes of decent quality; the presence of a certain number of bugs, however, makes it partly unreliable.
Using a video-game editor for this reconstruction points to the possibility of its use in teaching: what 3D simulations specifically allow is for students to experience the work of historical reconstruction, with all the problems the historian must face in recreating the past. For historians, this work is intended as a step toward building a broader expressive repertoire, one that includes three-dimensional environments. The risk of spending time learning how this technology for generating virtual spaces works makes many of those engaged in teaching sceptical, but the experience of projects developed, above all abroad, shows that it is a good investment. The fact that a software house producing a highly successful video game includes in its product a set of tools that allow users to create their own worlds to play in is symptomatic of the fact that the computer literacy of average users is growing ever faster, and that the use of an editor such as the Unreal engine will in future be an activity within the reach of an ever wider public. This puts us in a position to design more immersive teaching modules, in which the experience of researching and reconstructing the past is interwoven with the more traditional study of the events of a given period. Interactive virtual worlds are often described as the key cultural form of the twenty-first century, as cinema was for the twentieth. The purpose of this work has been to suggest that there are great opportunities for historians in the use of 3D objects and settings, and that historians must seize them. Consider the fact that aesthetics has an effect on epistemology, or at least on the form that the results of historical research take when they must be disseminated.
A historical analysis carried out superficially or on mistaken premises can nevertheless circulate and gain credit in many circles if it is disseminated with attractive, modern means. This is why it is not worth burying a good piece of work in some library, waiting for someone to discover it; and this is why historians must not ignore 3D. Our ability, as scholars and students, to perceive important ideas and trends often depends on the methods we use to represent data and evidence. For historians to obtain the benefit that 3D brings, however, they must develop a research agenda aimed at ensuring that 3D serves their goals as researchers and teachers. A historical reconstruction can be very useful educationally not only to those who visit it but also to those who build it: the research phase required for the reconstruction can only increase the developer's cultural background. Conclusions. The most important outcome has been the opportunity to gain experience in using media of this kind to narrate and make known the past. Reversing the cognitive paradigm I had learned in my humanities studies, I tried to derive what we might call "universal laws" from the objective data that emerged from these experiments. From an epistemological point of view, computing, with its capacity to handle impressive masses of data, gives scholars the possibility of formulating hypotheses and then confirming or refuting them through reconstructions and simulations. My work has gone in this direction, seeking to know and use current tools that will have an ever greater presence in communication (including scientific communication) in the future, and that are the communication media par excellence for certain age groups (adolescents).
Pushing the terms to their extreme, we might say that the challenge that visual culture today poses to the traditional methods of doing history is the same one that Herodotus and Thucydides posed to the tellers of myths and legends. Before Herodotus there was myth, which was a perfectly adequate means of narrating and giving meaning to the past of a tribe or a city. In a post-literary world, our knowledge of the past is subtly changing the moment we see it represented in pixels, or when information emerges not on its own but through our interaction with the medium. Our ability as scholars and students to perceive important ideas and trends often depends on the methods we use to represent data and evidence. For historians to obtain the benefit implicit in 3D, however, they must develop a research agenda aimed at ensuring that 3D serves their goals as researchers and teachers. The experience gathered in the preceding pages leads us to think that, in a not-too-distant future, a tool such as the computer will be the only medium through which knowledge is transmitted, and that, from a didactic point of view, its interactivity engages students as no other modern communication medium can.

Relevância: 30.00%

Resumo:

Watching swarms of nocturnal insects around street lights raises the question of the extent to which this phenomenon can be said to threaten an ordered natural cycle. The problem was investigated comprehensively using the example of the threatened diversity of aquatic insect species along anthropogenically polluted streams. The field tests ran from summer 1998 to summer 2000. The present study addresses several questions: (1) What proportion of aquatic insects is attracted from its site of emergence to a nearby street lamp? (2) Which lamp type attracts fewer insects: OSRAM HQL (mixed white light) or PHILIPS SON (yellow light)? (3) Which wavelength ranges of the light are particularly preferred during the flight to the lamp? Regarding (1): Aquatic insects showed no uniform flight-to-light behaviour. During the summer months, some insect groups, e.g. the Trichoptera, approached the single street lamp set up in the study area in large numbers. In this period the mean nightly light-trap catch corresponded to the emergence from 25 metres of stream bank per 72 hours. For the Diptera, the mean value corresponded to the emergence from just under 10 metres of bank length per 72 hours. Thanks to high catch numbers, the following results at species level were obtained for the chironomids: among the ten most frequent species in the emergence and light-trap samples, there were strong fluctuations. For one species, the light-trap catch corresponded at its maximum to an emergence rate of 61 metres of bank per 72 hours; for another, the light-trap catch amounted to only the emergence from 0.3 metres per 72 hours. Regarding (2): In the comparison between PHILIPS SON 70W and OSRAM HQL 125W without competing light, the catch relation was 1:1.6 (SON:HQL).
In a parallel catch (lamps 30 metres apart) with SON/HQL, considerably more animals flew to the HQL: here the relation was 1:3.97 (SON:HQL). Under light competition, the insects thus migrated to the "more attractive" lamp. Regarding (3): Finally, the attractive effect of three colour spectra with intensity maxima at wavelengths of 437 nm, 579 nm and 599 nm was tested. 437 nm was the intensity peak measured in the HQL in the short-wavelength range and, according to a widespread view, of particular attractive effect. The wavelength range around 579 nm represented the intensity maximum of the SON lamp (yellow light); 599 nm had been chosen as an alternative for low-attraction lighting. With alternating catches (without light competition), the catch relation was 1.8 : 3.4 : 1 (437 nm : 579 nm : 599 nm).
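The catch relations reported above (1:1.6, 1:3.97, 1.8:3.4:1) are simply catch counts normalized to the smallest count. A minimal sketch of that arithmetic; the counts below are hypothetical, not the study's raw data:

```python
def catch_relation(counts):
    """Normalize a list of light-trap catch counts to the smallest count,
    yielding a relation such as 1 : 1.6 (SON : HQL)."""
    smallest = min(counts)
    return [round(c / smallest, 2) for c in counts]

# Hypothetical counts for PHILIPS SON vs. OSRAM HQL without light competition
print(catch_relation([500, 800]))        # SON : HQL

# Hypothetical counts for the three test spectra, 437 nm : 579 nm : 599 nm
print(catch_relation([540, 1020, 300]))
```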

Relevância: 30.00%

Resumo:

Food technologies today mean reducing agricultural food waste, improving food security, enhancing the sensory properties of food, and enlarging food markets and food economies. Food technologists must be highly skilled technicians with sound scientific knowledge of food hygiene, food chemistry, industrial technologies and food engineering, sensory evaluation experience, and analytical chemistry. Their role is to apply the modern vision of science to the field of human nutrition, advancing knowledge in food science. The present PhD project started with the aim of studying and improving the quality of frozen fruits. The freezing process is very powerful in preserving the characteristics of the initial raw material, but pre-treatments before freezing are necessary to improve quality, in particular the texture and enzymatic activity of frozen foods. Osmotic Dehydration (OD) and Vacuum Impregnation (VI) are useful techniques for modifying the composition of fruits and vegetables and preparing them for the freezing process. These techniques make it possible to introduce cryo-protective agents into the food matrix without significant changes to the original structure, but they cause a slight leaching of important intrinsic compounds. Phenolic and polyphenolic compounds in apples and nectarines treated with hypertonic solutions, for example, are slightly decreased, but the concentration effect of the water removed by the osmotic gradient leads to a final phenolic content similar to that of the raw material. In many experiments, a very important change in fruit composition concerned the aroma profile. This occurred in strawberries osmo-dehydrated under vacuum or at atmospheric pressure.
The increase of some volatiles, probably due to fermentative metabolism induced by the osmotic stress of the hypertonic treatment, modifies the sensory profile of the frozen fruits, in some cases improving acceptability: consumers preferred treated frozen fruits to untreated ones. Among the different processes used, a very interesting result was obtained by applying an osmotic pre-treatment carried out at refrigerated temperature for a long time. The final quality of the frozen strawberries was very high, and a peculiar increase in the phenolic profile was detected. This interesting phenomenon was probably due to the induction of phenolic biosynthesis (for example as a reaction to osmotic stress), or to hydrolysis of polymeric phenolic compounds. Alongside this investigation into the cryo-stabilization and dehydrofreezing of fruits, deeper investigations of VI techniques were carried out, such as studies of texture changes in vacuum-impregnated prickly pear, and of the use of VI and ultrasound (US) for aroma enrichment of fruit pieces. Moreover, to develop sensory evaluation tools and analytical chemistry determinations (of volatiles and phenolic compounds), several studies were carried out and published in these fields, specifically dealing with off-flavour development during storage of boiled potatoes, and with capillary zone electrophoresis (CZE) and high-performance liquid chromatography (HPLC) determination of phenolic compounds.

Relevância: 30.00%

Resumo:

The Aerodyne Time-of-Flight Aerosol Mass Spectrometer (ToF-AMS) is a further development of the Aerodyne aerosol mass spectrometer (Q-AMS), which is well characterized and deployed worldwide. Both instruments use an aerodynamic lens, aerodynamic particle sizing, thermal vaporization and electron-impact ionization. In contrast to the Q-AMS, where a quadrupole mass spectrometer is used to analyze the ions, the ToF-AMS employs a time-of-flight mass spectrometer. In the present work, laboratory experiments and field campaigns show that the ToF-AMS is suitable for quantitative measurement of the chemical composition of aerosol particles with high time and size resolution. In addition, a complete scheme for ToF-AMS data analysis is presented, developed to obtain quantitative and meaningful results from the raw data recorded during both field campaigns and laboratory experiments. This scheme is based on the characterization experiments performed within this work. It comprises the corrections that must be applied and the calibrations that must be carried out to extract reliable results from the raw data. Considerable work was also invested in the development of a reliable and user-friendly data analysis program, which can be used for automatic and systematic ToF-AMS data analysis and correction.
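A central calibration in any AMS quantification scheme of this kind converts summed ion signals into mass concentrations via the ionization efficiency. A simplified sketch of that conversion, assuming illustrative parameter names and values (not the thesis's actual calibration constants):

```python
N_A = 6.022e23  # Avogadro's number, molecules/mol

def mass_concentration(signal_hz, mw_gmol, ie, rie=1.0, ce=1.0, flow_cm3_s=1.4):
    """Convert a summed ion rate into a mass concentration (ug/m^3).

    signal_hz   -- summed ion rate for the species (ions/s)
    mw_gmol     -- molecular weight of the species (g/mol)
    ie          -- ionization efficiency (ions per molecule), from calibration
    rie         -- relative ionization efficiency vs. the calibrant
    ce          -- particle collection efficiency correction
    flow_cm3_s  -- volumetric sample flow into the instrument (cm^3/s)
    """
    molecules_per_s = signal_hz / (ie * rie * ce)
    grams_per_s = molecules_per_s * mw_gmol / N_A
    # g/cm^3 -> ug/m^3: 1 g/cm^3 = 1e12 ug/m^3
    return grams_per_s / flow_cm3_s * 1e12
```

The point of the sketch is the structure of the correction chain: every calibration factor (IE, RIE, CE) enters multiplicatively, so an error in any one of them propagates linearly into the reported concentration.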

Relevância: 30.00%

Resumo:

Today, high-precision mass measurements with Penning traps provide deep insight into the fundamental properties of nuclear matter. To this end, the free cyclotron frequency of an ion stored in a strong, homogeneous magnetic field is determined. At the ISOLTRAP mass spectrometer at ISOLDE / CERN, the masses of short-lived radioactive nuclides with half-lives down to a few tens of ms can be determined with an uncertainty of the order of 10^-8. ISOLTRAP consists of a radio-frequency quadrupole for accumulating the ions delivered by ISOLDE, and two Penning traps for purifying the ion samples and for the mass determination. Within this work, the masses of neutron-rich xenon and radon isotopes (138-146Xe and 223-229Rn) were measured. For eleven of them the mass was determined directly for the first time; 229Rn was even observed for the first time in the course of this experiment, and its half-life was determined to be about 12 s. Since the mass of a nuclide reflects all interactions within the nucleus, it is unique to each nuclide. One of these interactions, the interaction between protons and neutrons, leads for example to deformation. The aim of this work is to find a connection between collective effects, such as deformation, and double differences of binding energies, so-called deltaVpn values. In the regions investigated here in particular, deltaVpn values show very unusual behaviour that cannot be interpreted with simple arguments. One explanation could be the occurrence of octupole deformation in these regions. Nevertheless, a quantitative description of deltaVpn values that takes the effect of such deformations into account is not yet possible with modern theories.
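The two quantities at the heart of the abstract can be sketched numerically: the free cyclotron frequency ν_c = qB/(2πm) links the measured frequency to the ion mass, and δVpn is a double difference of binding energies. A minimal illustration, using CODATA constants, an illustrative magnetic field strength, and the δVpn convention commonly used for even-even nuclei (the sign/normalization convention is an assumption here):

```python
import math

Q_E = 1.602176634e-19    # elementary charge, C
U   = 1.66053906660e-27  # atomic mass unit, kg

def cyclotron_freq(mass_u, b_tesla, charge_state=1):
    """Free cyclotron frequency nu_c = qB / (2*pi*m) of a stored ion, in Hz."""
    return charge_state * Q_E * b_tesla / (2 * math.pi * mass_u * U)

def mass_from_freq(nu_c, b_tesla, charge_state=1):
    """Invert nu_c to obtain the ion mass in atomic mass units."""
    return charge_state * Q_E * b_tesla / (2 * math.pi * nu_c * U)

def delta_v_pn(B, Z, N):
    """Double difference of binding energies for even-even nuclei:
    deltaVpn(Z,N) = 1/4 [B(Z,N) - B(Z-2,N) - B(Z,N-2) + B(Z-2,N-2)],
    where B is a callable returning the (positive) binding energy."""
    return 0.25 * (B(Z, N) - B(Z - 2, N) - B(Z, N - 2) + B(Z - 2, N - 2))

# A singly charged A = 138 ion in a 5.9 T field (illustrative values):
nu = cyclotron_freq(138.0, 5.9)  # roughly 6.6e5 Hz
```

The double difference cancels all terms in the binding energy that are linear in Z or N, which is why δVpn isolates the residual proton-neutron interaction.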

Relevância: 30.00%

Resumo:

A thorough investigation was made of the structure-property relation of well-defined statistical, gradient and block copolymers of various compositions. Among the copolymers studied were those synthesized from isobornyl acrylate (IBA) and n-butyl acrylate (nBA) monomer units. The copolymers exhibited several unique properties that make them suitable materials for a range of applications. The thermomechanical properties of these new materials were compared to those of acrylate homopolymers. By the proper choice of the IBA/nBA monomer ratio, it was possible to tune the glass transition temperature of the statistical P(IBA-co-nBA) copolymers. The measured Tg's of copolymers with different IBA/nBA monomer ratios followed a trend that fitted well with the Fox equation prediction. While statistical copolymers showed a single glass transition by DSC (Tg between -50 and 90 °C depending on composition), block copolymers showed two Tg's, and the gradient copolymer showed a single, but very broad, glass transition. PMBL-PBA-PMBL triblock copolymers of different composition ratios were also studied and revealed a microphase-separated morphology of mostly cylindrical PMBL domains hexagonally arranged in the PBA matrix. DMA studies confirmed the phase-separated morphology of the copolymers. Tensile studies showed that the linear PMBL-PBA-PMBL triblock copolymers have a relatively low elongation at break, which was increased by replacing the PMBL hard blocks with the less brittle random PMBL-r-PMMA blocks. The 10- and 20-arm PBA-PMBL copolymers studied revealed even more unique properties: SAXS results showed a mixture of cylindrical PMBL domains hexagonally arranged in the PBA matrix and lamellar morphology. Despite PMBL's brittleness, the triblock and multi-arm PBA-PMBL copolymers could become suitable materials for high-temperature applications due to PMBL's high glass transition temperature and high thermal stability.
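The Fox prediction mentioned above is 1/Tg = w1/Tg,1 + w2/Tg,2, with temperatures in kelvin and w the weight fractions. A small sketch, using the composition-dependent Tg limits quoted in the abstract (-50 and 90 °C) as illustrative homopolymer values:

```python
def fox_tg(w1, tg1_k, tg2_k):
    """Fox equation: 1/Tg = w1/Tg1 + w2/Tg2 (Tg in kelvin, w1 + w2 = 1).
    Predicts the single Tg of a statistical copolymer from the
    homopolymer Tg's and the weight fraction of component 1."""
    w2 = 1.0 - w1
    return 1.0 / (w1 / tg1_k + w2 / tg2_k)

# Illustrative homopolymer values: ~90 C (363 K) and ~-50 C (223 K)
tg = fox_tg(0.5, 363.0, 223.0)
print(round(tg - 273.15, 1))  # predicted Tg in C for a 50/50 copolymer
```

Note that the Fox prediction always falls between the two homopolymer Tg's and is biased toward the lower one, which matches the tunability described in the text.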
The structure-property relation of multi-arm star PBA-PMMA block copolymers was also investigated. Small-angle X-ray scattering revealed a phase-separated morphology of cylindrical PMMA domains hexagonally arranged in the PBA matrix. DMA studies found that these materials possess typical elastomeric behavior over a broad range of service temperatures up to at least 250 °C. The ultimate tensile strength and the elastic modulus of the 10- and 20-arm star PBA-PMMA block copolymers are significantly higher than those of their 3-arm or linear ABA-type counterparts of similar composition, indicating a strong effect of the number of arms on the tensile properties. Siloxane-based copolymers were also studied, and one of the main objectives here was to examine the possibility of synthesizing trifluoropropyl-containing siloxane copolymers with a gradient distribution of trifluoropropyl groups along the chain. DMA results for the PDMS-PMTFPS siloxane copolymers synthesized via simultaneous copolymerization showed that, due to the large difference in reactivity between 2,4,6-tris(3,3,3-trifluoropropyl)-2,4,6-trimethylcyclotrisiloxane (F) and hexamethylcyclotrisiloxane (D), a copolymer of almost block structure was obtained, containing only a narrow intermediate fragment with a gradient distribution of the component units. A more dispersed distribution of the trifluoropropyl groups was obtained by the semi-batch copolymerization process, as the DMA results revealed more "pure gradient type" features for the siloxane copolymers synthesized by adding F at a controlled rate to the polymerization of the less reactive D. As with the trifluoropropyl-containing siloxane copolymers, vinyl-containing polysiloxanes may be converted to a variety of useful polysiloxane materials by chemical modification.
But much like the trifluoropropyl-containing siloxane copolymers, as a result of the large difference in reactivity between the component units 2,4,6-trivinyl-2,4,6-trimethylcyclotrisiloxane (V) and hexamethylcyclotrisiloxane (D), the thermal and mechanical properties of the PDMS-PMVS copolymers obtained by simultaneous copolymerization were similar to those of block copolymers. Only the copolymers obtained by the semi-batch method showed properties typical of gradient copolymers.

Relevância: 30.00%

Resumo:

The development and growth of plants is strongly affected by the interactions between roots, root-associated organisms and rhizosphere communities. Methods to assess such interactions are hard to develop, particularly in perennial and woody plants, due to their complex root system structure and their temporally changing physiology. In this respect, grape root systems are not well investigated. The aim of the present work was to develop a method to assess and predict interactions at the root system of rootstocks (Vitis berlandieri x Vitis riparia) in the field. To achieve this aim, grape phylloxera (Daktulosphaira vitifoliae Fitch, Hemiptera, Aphidoidea) was used as a grape-root-parasitizing model. To develop the methodical approach, a long-term trial (2006-2009) was set up in a commercially used vineyard in Geisenheim/Rheingau. Every 2 to 8 weeks, the topmost 20 cm of soil under the foliage wall were investigated and root material was extracted (n=8-10). To include temporal, spatial and cultivar-specific root system dynamics, the extracted root material was analyzed digitally for its morphological properties. The grape phylloxera population was quantified and characterized visually on the basis of its larval stages (oviparous, non-oviparous and winged preliminary stages). Infection patches (nodosities) were characterized visually as well, partly supported by digital root colour analyses. Due to the known effects of fungal endophytes on the vitality of grape-phylloxera-infested grapevines, fungal endophytes were isolated from nodosity and root tissue and subsequently characterized (morphotypes). Further abiotic and biotic soil conditions of the vineyard were assessed.
The temporal, spatial and cultivar-specific sensitivity of single parameters was analyzed by omnibus tests (ANOVAs) and adjacent post-hoc tests. The relations between different parameters were analyzed by multiple regression models. Quantitative parameters were developed to assess the degeneration of nodosities and the development of nodosity-attached roots, and to differentiate between nodosities and other root swellings in the field. Significant differences were shown between parameters that include root dynamics and parameters that ignore them. Regarding the description of grape phylloxera population and root system dynamics, the method showed a high temporal, spatial and cultivar-specific sensitivity. Further, specific differences could be shown in the frequency of endophyte morphotypes between root and nodosity tissue as well as between cultivars. Degeneration of nodosities as well as nodosity occupation rates could be related to the calculated abundances of the grape phylloxera population. Further ecological questions concerning grape root development (e.g. the relation between moisture and root development) and grape phylloxera population development (e.g. the relation between temperature and population structure) could be answered for field conditions. In general, the presented work provides an approach to evaluating the vitality of grape root systems. This approach can be useful for developing control strategies against soilborne pests in viticulture (e.g. grape phylloxera, Sorosphaera viticola, Roesleria subterranea (Weinm.) Redhead) as well as for evaluating integrated management systems in viticulture.
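An omnibus test such as the one-way ANOVA used here reduces to comparing between-group and within-group variance via the F statistic. A self-contained sketch of that computation (the data below are toy values, not the study's measurements):

```python
def one_way_anova_f(groups):
    """F statistic of a one-way ANOVA: the ratio of the between-group
    mean square to the within-group mean square."""
    k = len(groups)                           # number of groups
    n = sum(len(g) for g in groups)           # total observations
    grand = sum(sum(g) for g in groups) / n   # grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

# Toy example: nodosity counts for two hypothetical cultivars
f_stat = one_way_anova_f([[4, 5, 6, 5], [8, 9, 10, 9]])
```

A large F (relative to the F distribution with k-1 and n-k degrees of freedom) indicates that at least one group mean differs, which is then localized by the post-hoc tests mentioned above.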

Relevância: 30.00%

Resumo:

This research primarily represents a contribution to lobbying regulation research. It introduces an index which, for the first time, attempts to measure the direct compliance costs of lobbying regulation. The Cost Indicator Index (CII) offers a brand-new platform for qualitative and quantitative assessment of adopted lobbying laws and proposals of such laws, both in the comparative and the sui generis dimension. The CII is not only the sole new tool introduced in the last decade; it is the only tool available for comparative assessment of the costs of lobbying regulations. Beside this quantitative contribution, the research introduces an additional theoretical framework for complementary qualitative analysis of lobbying laws. The Ninefold theory allows a more structured assessment and classification of lobbying regulations, by indication of both benefits and costs. Lastly, this research introduces the Cost-Benefit Labels (CBL). These labels might improve ex-ante lobbying regulation impact assessment procedures, primarily in the sui generis perspective. In its final part, the research focuses on four South East European countries (Slovenia, Serbia, Montenegro and Macedonia), brings them into the discussion for the first time, and calculates their CPI and CII scores. The special focus of the application was on Serbia, whose proposal for the Law on Lobbying has been extensively analysed in qualitative and quantitative terms, taking into consideration the specific political and economic circumstances of the country. Although the obtained results are of an indicative nature, the CII will probably find its place within the academic and policymaking arena, and will hopefully contribute to a better understanding of lobbying regulations worldwide.

Relevância: 30.00%

Resumo:

The physico-chemical characterization and the structure-pharmacokinetic and metabolism studies of new semi-synthetic analogues of natural bile acids (BAs), drug candidates, have been performed. Recent studies discovered a role of BAs as agonists of the FXR and TGR5 receptors, thus opening new therapeutic targets for the treatment of liver diseases and metabolic disorders. Up to twenty new semi-synthetic analogues have been synthesized and studied in order to find promising novel drug candidates. In order to define the BAs' structure-activity relationship, their main physico-chemical properties (solubility, detergency, lipophilicity and affinity for serum albumin) have been measured with validated analytical methodologies. Their metabolism and biodistribution have been studied in the "bile fistula rat" model, where each BA is acutely administered through duodenal and femoral infusion and bile is collected at different time intervals, allowing the relationships between structure and intestinal absorption, hepatic uptake, metabolism and systemic spill-over to be defined. One of the studied analogues, 6α-ethyl-3α,7α-dihydroxy-5β-cholanic acid, an analogue of CDCA (INT 747, Obeticholic Acid (OCA)), recently under approval for the treatment of cholestatic liver diseases, requires additional studies to ensure its safety and lack of toxicity when administered to patients with strong liver impairment. For this purpose, an animal model of hepatic decompensation (cirrhosis) induced in the rat by CCl4 inhalation has been developed and used to define the differences in OCA biodistribution with respect to control animals, in order to establish whether peripheral tissues might also be exposed as a result of toxic plasma levels of OCA; the biodistribution of endogenous BAs was evaluated as well. An accurate and sensitive HPLC-ES-MS/MS method was developed to identify and quantify all BAs in biological matrices (bile, plasma, urine, liver, kidney, intestinal content and tissue), for which sample pretreatment has been optimized.