Abstract:
Hans-Georg Gadamer's philosophical hermeneutics – undoubtedly one of the cornerstones of twentieth-century thought – is a highly composite, multifaceted and articulated philosophy, made up, so to speak, of a multiplicity of different dimensions interwoven with one another. This is already evident from a glance at the internal composition of his major work, Wahrheit und Methode (1960), which presents a theory of understanding that examines three different dimensions of human experience – art, history and language – conceived, of course, as fundamentally interrelated. This overall picture becomes considerably more complicated, however, as soon as one considers at least some of the many contributions Gadamer wrote and published before and after his opus magnum, contributions that attest to the significant presence of other themes in his thought. Gadamer's interpreters have not always taken full account of this complexity, since a large part of the exegetical literature on his thought is essentially centred on the 1960 masterpiece (and in particular on the problem of legitimating the Geisteswissenschaften), devoting less attention to the other paths he followed and, in particular, to the properly ethical and political dimension of his hermeneutical philosophy. Moreover, it seems to me that due attention has not always been paid to the fundamental unity – not to be confused with an alleged "systematicity", which Gadamer explicitly rejected – that nonetheless holds within his thought despite its undeniable multiplicity and heterogeneity. My thesis, then, is that aesthetics and the human sciences, philosophy of language and moral philosophy, dialogue with the Greeks and critical engagement with modern thought, considerations on anthropological problems and reflections on our socio-political and techno-scientific present are the different dimensions of a single body of thought, all of which in some way converge towards a single centre. That "unifying" centre, in my view, is to be found in what we might call the malaise of modernity. In other words, all of Gadamer's philosophical reflection ultimately springs from the acknowledgement of a situation of crisis or malaise in which our world and our civilisation find themselves today – a crisis which, given its depth and complexity, has "ramified" in multiple directions, affecting a wide range of spheres of human existence. Gadamer analyses and investigates these spheres with a critical eye, seeking to bring out the main problematic knots and, in their light, to advance alternative proposals, remedies, "correctives" and possible solutions. On the basis of this underlying understanding, my research is organised into three large sections devoted, respectively, to the pars destruens of Gadamerian hermeneutics (first and second sections) and to its pars construens (third section).
In the first section – entitled A Phenomenology of Modernity: The Manifold Symptoms of the Crisis – after showing how a good part of twentieth-century philosophy was dominated by the idea of a crisis currently afflicting Western civilisation, and how Gadamer's hermeneutics too can be placed within this underlying philosophical discourse, I illustrate one by one what, in the eyes of the philosopher of Truth and Method, are the main symptoms of the present crisis. These symptoms include: the socio-economic pathologies of our "administered" and bureaucratised world; the indiscriminate planetary expansion of the Western way of life at the expense of other cultures; the crisis of values and certainties, with the concomitant spread of relativism, scepticism and nihilism; the growing inability to relate in an adequate and meaningful way to art, poetry and culture, increasingly degraded to mere entertainment; and, finally, the problems connected with the spread of weapons of mass destruction, the concrete possibility of an ecological catastrophe and the disquieting prospects opened up by some recent scientific discoveries (above all in the field of genetics). Having outlined the general profile Gadamer provides of our age, in the second section – entitled A Diagnosis of the Malaise of Modernity: The Spread of Technical-Scientific Instrumental Rationality – I try to show how, at the root of all these phenomena, he sees essentially a single source, which in his judgement coincides with the very origin of modernity: the birth of modern science and its intrinsic link with technology and with a specific form of rationality that Gadamer – evidently drawing on interpretive categories elaborated by Max Weber, Martin Heidegger and the Frankfurt School – also calls "instrumental rationality" or "calculative thinking". On the basis of this underlying vision, I then offer an analysis of Gadamer's conception of technoscience, highlighting several points: first, that Gadamer's philosophical hermeneutics should not be read as a one-sidedly anti-scientific philosophy, but rather as an anti-scientistic one (which is of course something quite different); second, that his reconstruction of the crisis of modernity never issues in a "totalising" critique of reason, nor in a pessimistic-negative philosophy of history centred on the idea of an ineluctable course of events driven by an "irrational" rationality contaminated by the lust for power and domination; third, and finally, that Gadamer's philosophy – despite the entrenched interpretations that habitually see in it a traditionalist, authoritarian and radically anti-Enlightenment way of thinking – does not intend to reject the modern scientific Enlightenment tout court, nor to disown its most important achievements, but simply to "correct" some of its tendencies and to recover a broader and more comprehensive notion of reason, capable of accounting also for those aspects of human experience which, in the eyes of a "limited" rationality such as the scientistic one, can only appear as mere residues of irrationality.
Having thus examined in the first two sections what we may call the pars destruens of Gadamer's philosophy, in the third and final section – entitled A Therapy for the Crisis of Modernity: The Rediscovery of Experience and Practical Knowledge – I turn to its pars construens, which in my judgement consists in a critical recovery of what he calls "another kind of knowledge": that is, in an attempt to rehabilitate all those pre- and extra-scientific forms of knowledge and experience that Gadamer regards as constitutive of the "hermeneutical dimension" of human existence. My analysis of Gadamer's conception of Verstehen and Erfahrung – as forms of a "practical knowledge (praktisches Wissen)" that differs in principle from theoretical and technical knowledge – thus leads to an overall interpretation of philosophical hermeneutics as a genuine practical philosophy: an effort of philosophical clarification of that pre-scientific, intersubjective, "common-sense" knowledge actually at work in the sphere of our Lebenswelt and of our practical existence. This, finally, also leads inevitably to an emphasis on the ethical-political implications of Gadamer's hermeneutics. In particular, I examine Gadamer's conception of ethics – taking into account its relations with the moral doctrines of Plato, Aristotle, Kant and Hegel – and conclude by outlining a profile of his philosophical hermeneutics as a philosophy of dialogue, solidarity and freedom.
Abstract:
The main part of this thesis describes a method of calculating the massless two-loop two-point function which allows expanding the integral up to an arbitrary order in the dimensional regularization parameter epsilon by rewriting it as a double Mellin-Barnes integral. Closing the contour and collecting the residues then transforms this integral into a form that enables us to utilize S. Weinzierl's computer library nestedsums. We were able to show that multiple zeta values and rational numbers are sufficient for expanding the massless two-loop two-point function to all orders in epsilon. We then use the Hopf algebra of Feynman diagrams and its antipode to investigate the appearance of Riemann's zeta function in counterterms of Feynman diagrams in massless Yukawa theory and massless QED. The class of Feynman diagrams we consider consists of graphs built from primitive one-loop diagrams and the non-planar vertex correction, where the vertex corrections only depend on one external momentum. We showed the absence of powers of pi in the counterterms of the non-planar vertex correction and of diagrams built by shuffling it with the one-loop vertex correction. We also found that some coefficients of the zeta functions are invariant under a change of the momentum flow through these vertex corrections.
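As background, the core identity behind such a representation is the standard Mellin-Barnes splitting, stated here purely for illustration (it is not specific to this thesis):
\[
\frac{1}{(A+B)^{\lambda}} \;=\; \frac{1}{\Gamma(\lambda)}\,\frac{1}{2\pi i}\int_{-i\infty}^{+i\infty} dz\; \Gamma(\lambda+z)\,\Gamma(-z)\,\frac{B^{z}}{A^{\lambda+z}},
\]
with the contour separating the poles of Γ(λ+z) from those of Γ(-z). Applied repeatedly (for example to the Feynman-parameter integrand), it yields multi-fold Mellin-Barnes integrals; closing the contours and collecting the residues of Γ(-z) at z = 0, 1, 2, ... produces nested sums of the type that nestedsums can expand order by order in epsilon.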
Abstract:
Throughout the world, pressures on water resources are increasing, mainly as a result of human activity. Because of their accessibility, groundwater and surface water are the most used reservoirs. The evaluation of water quality requires identifying the interconnections among water reservoirs, natural landscape features, human activities and aquatic health. This study focuses on the estimation of water pollution linked to two different environmental issues related to the exploitation of natural resources: salt water intrusion and acid mine drainage. The effects of the salt water intrusion occurring in the shallow aquifer north of Ravenna (Italy) were analysed by studying the ion exchange occurring in the area and its variation throughout the year, applying a depth-specific sampling method. Ion exchange, calcite and dolomite precipitation, gypsum dissolution and sulphate reduction were identified as the main processes controlling the groundwater composition in the study area. High concentrations of arsenic, detected only at specific depths, indicate its connection with the organic matter. The effects of acid mine drainage related to tin extraction in the Bolivian Altiplano were studied in both the water and the sediment matrix. Water contamination proved to be strictly dependent on seasonal variation and on pH and redox conditions. During the dry season, strong evaporation and scarce water flow lead to low pH values, high concentrations of heavy metals in surface waters and precipitation of secondary minerals along the river, which could be released under oxidizing conditions, as demonstrated through the sequential extraction analysis. The increase of the water flow during the wet season leads to an increase in pH values and a decrease in heavy metal concentrations, owing to dilution and, as in the case of iron, to precipitation.
Abstract:
This research focuses on the study of the behavior and collapse of masonry arch bridges. Recent decades have seen an increasing interest in this structural type, which is still present and in use despite the passage of time and changes in the means of transport. Several strategies have been developed over time to simulate the response of this type of structure, although even today there is no generally accepted standard for the assessment of masonry arch bridges. The aim of this thesis is to compare the principal analytical and numerical methods existing in the literature on case studies, trying to highlight their merits and weaknesses. The methods examined are mainly three: i) the Thrust Line Analysis Method; ii) the Mechanism Method; iii) the Finite Element Method. The Thrust Line Analysis Method and the Mechanism Method are analytical methods derived from two of the fundamental theorems of plastic analysis, while the Finite Element Method is a numerical method that uses different discretization strategies to analyze the structure. Each method is applied to the case studies through computer-based representations that allow a user-friendly application of the principles explained. A particular closed-form approach based on an elasto-plastic material model and developed by some Belgian researchers is also studied. To compare the three methods, two different case studies have been analyzed: i) a generic masonry arch bridge with a single span; ii) a real masonry arch bridge, the Clemente Bridge, built over the Savio River in Cesena. In the analyses performed, all the models are two-dimensional, so that the results are comparable across the different methods examined. The methods have been compared with each other in terms of collapse load and hinge positions.
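For context, under the classical limit-analysis assumptions for masonry (no tensile strength, infinite compressive strength, no sliding), the two analytical methods correspond to the two bounding theorems; the notation below is my own, given as a sketch. A thrust line is statically admissible (safe theorem) if at every cross-section
\[
|e(x)| \;=\; \left|\frac{M(x)}{N(x)}\right| \;\le\; \frac{t(x)}{2},
\]
where M and N are the bending moment and normal force transmitted across the section and t its thickness, while the mechanism (kinematic) method seeks the smallest load multiplier λ for which an assumed hinge mechanism satisfies the virtual work balance
\[
\lambda \sum_i P_i\,\delta_i \;+\; \sum_j W_j\,\eta_j \;=\; 0,
\]
with δ_i and η_j the virtual displacements of the live loads P_i and of the self-weights W_j; minimizing over hinge positions gives the estimate of the collapse load.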
Abstract:
In accordance with the Treaties, the EU develops a common policy on asylum, immigration and external border control, founded on solidarity and respect for fundamental rights, and to this end it establishes strategic relations with third countries and international organisations. A phenomenon "without borders" such as migration indeed demands coherent and coordinated action both internally and externally. The implementation of the latter, however, runs up against significant difficulties. First of all, the EU and its Member States must create the preconditions for international cooperation, that is, foster mutual trust with third countries by strengthening their own international credibility. To this end, the EU's institutions, bodies and agencies and the Member States must commit themselves to providing a coherent model for the promotion of founding values, such as solidarity and respect for fundamental rights, and to coordinating their initiatives so as to identify, together with third countries and international organisations, a common strategy for action. Secondly, the EU and the Member States must adopt solutions aimed at promoting the effectiveness of international cooperation and, more precisely, ensure that external competence is exercised by the level of government capable of providing added value and that the form of cooperation best suited to achieving the intended objectives is used in each case. Ultimately, the EU's external action in the field of migration policy requires a coherent and flexible strategy. While coherence is today guaranteed by the justiciability of the principles of solidarity, respect for fundamental rights and, precisely, coherence, flexibility translates into the criterion of added value, which, read in conjunction with the principle of sincere cooperation, lies at the heart of the new partnership model proposed by the global approach, potentially capable of ensuring effective management of the migration phenomenon.
Abstract:
In recent years, the use of Reverse Engineering systems has attracted considerable interest for a wide range of applications, and many research activities focus on the accuracy and precision of the acquired data and on improvements to the post-processing phase. In this context, this PhD thesis deals with the definition of two novel methods for data post-processing and for data fusion between physical and geometrical information. In particular, a technique has been defined for characterising the error in the 3D point coordinates acquired by an optical triangulation laser scanner, with the aim of identifying suitable correction arrays to be applied under different acquisition parameters and operating conditions. The systematic error in the acquired data is thus compensated, increasing accuracy. Moreover, the definition of a 3D thermogram is examined: the geometrical information of the object and its thermal properties, obtained from a thermographic inspection, are combined so that a temperature value is associated with each recognizable point. Data acquired by the optical triangulation laser scanner are also used to normalize the temperature values and make the thermal data independent of the thermal camera's point of view.
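As a purely illustrative sketch (all names and data structures here are hypothetical, not taken from the thesis), the two steps described above, applying a precomputed correction array to the scanned coordinates and attaching a temperature to each point, could look like this in Python:

import numpy as np
from scipy.spatial import cKDTree

def fuse_thermogram(points, correction, thermo_xyz, thermo_temp):
    # points: (N, 3) scanner coordinates; correction: (N, 3) systematic-error offsets
    # estimated for the current acquisition parameters (hypothetical correction array).
    corrected = points + correction
    # thermo_xyz: (M, 3) thermographic samples registered in the scanner frame;
    # thermo_temp: (M,) temperatures. Assign each point its nearest thermal sample.
    _, idx = cKDTree(thermo_xyz).query(corrected)
    return corrected, thermo_temp[idx]  # geometry plus a temperature per point (3D thermogram)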
Electrostatic supramolecular assembly of charged dendritic polymers and their biological application
Abstract:
The aim of this study was the development of functional multilayer films through electrostatic layer-by-layer (LbL) assembly of dendritic macromolecules, the investigation of the fundamental properties of these multilayered films, and the study of their biological applications.
The synthesis of anionic hyperbranched polyglycerols (hbPG) and the preparation of hbPG/phosphorus dendrimer multilayers are reported, together with the influence of the deposition conditions on the multilayers. The thickness of the multilayer films increases as the molecular weight of the anionic hbPG decreases. Multilayer films fabricated from low-molecular-weight hbPGs grow less regularly, because the smaller number of charged carboxylic acid groups provides relatively weaker electrostatic forces for deposition. The thickness of the multilayer films is reduced with increasing pH and decreasing NaCl concentration. The observed changes in multilayer thickness and surface morphology can be interpreted with the aid of theories regarding the charge density and conformation of the anionic hbPG chains in solution.
Besides the study of the fundamental properties of hbPG/phosphorus multilayer films, antifouling thin films derived from hbPG layers were developed. The antifouling properties of the hbPG layers were found to correlate with the molecular weight of the anionic hbPG and with the film thickness. It was demonstrated that a single anionic hbPG layer of the highest molecular weight reduces non-specific protein adsorption more efficiently than single layers of lower molecular weight, and that all hbPG bilayers possessed excellent antifouling properties.
Phosphorus dendrimer multilayers were successfully prepared as platforms to detect DNA immobilization and hybridization. The effect of the NaCl concentration on the multilayer film thickness was evaluated to obtain an optimized film thickness. Using a multilayer deposited under the optimized conditions as a substrate, a high loading of DNA probes was achieved through covalent coupling of probe DNA with the as-formed multilayer films. The hybridization of target DNA with the immobilized probe DNA was then carried out and studied by SPFS. The limit of detection upon hybridization was estimated on various dendrimer multilayer platforms; the minimum detectable concentration for DNA hybridization is of the same order of magnitude as that of other neutral phosphorus dendrimer systems. Furthermore, LbL deposition of phosphorus dendrimer multilayers provided a mild and simple way to prepare platforms for DNA microarrays.
Building on the phosphorus dendrimer multilayer systems, dendritic star polymers were employed, which have more reactive groups than the phosphorus dendrimers. The as-assembled dendritic star polymer multilayer films exhibited distinct morphological characteristics and underwent extensive structural reorganization upon post-treatment under different pH conditions. The kinetics of probe DNA binding on the outermost negatively charged dendritic surface was also studied by SPR. The binding capacities of probe DNA on multilayer surfaces fabricated from the first and second generations of dendritic star polymers were compared; an improved binding capacity was achieved with the second-generation dendritic star polymer multilayer films owing to their larger number of reactive groups. The DNA hybridization reaction on dendritic multilayer films was investigated by SPFS.
Similar hybridization behaviour was found on both multilayer surfaces. The hybridization kinetics and affinities were compared with those of the phosphorus dendrimer multilayer surfaces, showing improved detection sensitivity relative to the phosphorus dendrimer multilayer films.
Abstract:
At the time of writing, all three elements evoked in the title – emancipation and social inclusion of sexual minorities, labour and labour activism, and the idea and substance of "Europe" – are undergoing deep, long-term and, to varying degrees, radical processes of social transformation. The meaning of words like "equality", "rights", "inclusion", and even "democracy" is as precarious and uncertain as the lives of those European citizens who are marginalised by intersecting conditions of gender, sexuality, ethnicity, and class – in a constellation of precarities that is both unifying and fragmented (fragmenting). Conflicts are played out, in hidden or explicit ways, over material processes of redistribution as well as over the discursive practices that revolve around these words. Against this backdrop, and roughly ten years after the European Union provided an input for institutional commitment to the protection of LGBT* workers' rights with Council Directive 2000/78/EC, the dissertation contrasts discourses on workplace equality for LGBT* persons produced by a plurality of actors. It seeks to identify the values, semantics, and agendas framing and informing these organisations' views, and shows how each actor has incorporated LGBT* rights into its own discourse, each time in a way that is functional to the construction and/or confirmation of its organisational identity: transnational union networks, by presenting LGBT* rights as a natural, neutral commitment within the framework of universal human rights protection; left-wing organisations, by situating activism for LGBT* rights within a wider project of social emancipation that is for all the marginalised, yet is not neutral, but attached to specific values and opposed to specific political adversaries (the right wing, the nationalists); and business networks, by acknowledging diversity as a path to better performance and profits, thus encouraging inclusion and non-discrimination of "deserving" LGBT* workers.
Abstract:
Spatial prediction of hourly rainfall via radar calibration is addressed. The change of support problem (COSP), which arises when the spatial supports of different data sources do not coincide, is faced in a non-Gaussian setting; in fact, hourly rainfall in the Emilia-Romagna region of Italy is characterized by an abundance of zero values and by the right-skewness of the distribution of positive amounts. Rain gauge direct measurements at sparsely distributed locations and hourly cumulated radar grids are provided by ARPA-SIMC Emilia-Romagna. We propose a three-stage Bayesian hierarchical model for radar calibration, exploiting rain gauges as the reference measure. Rain probability and amounts are modeled via linear relationships with radar in the log scale; spatially correlated Gaussian effects capture the residual information. We employ a probit link for rainfall probability and a Gamma distribution for positive rainfall amounts; the two steps are joined via a two-part semicontinuous model. Three model specifications differently addressing the COSP are presented; in particular, a stochastic weighting of all radar pixels, driven by a latent Gaussian process defined on the grid, is employed. Estimation is performed via MCMC procedures implemented in C, linked to the R software. The communication and evaluation of probabilistic, point and interval predictions are investigated. A non-randomized PIT histogram is proposed for correctly assessing calibration and coverage of two-part semicontinuous models. Predictions obtained with the different model specifications are evaluated via graphical tools (Reliability Plot, Sharpness Histogram, PIT Histogram, Brier Score Plot and Quantile Decomposition Plot), proper scoring rules (Brier Score, Continuous Ranked Probability Score) and consistent scoring functions (Root Mean Square Error and Mean Absolute Error, addressing the predictive mean and median, respectively). Calibration is reached and the inclusion of neighbouring information slightly improves predictions. All specifications outperform a benchmark model with uncorrelated effects, confirming the relevance of spatial correlation for modeling rainfall probability and accumulation.
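As an illustrative sketch (the notation is mine, not taken from the thesis), the two-part semicontinuous structure described above can be written with Y(s) the hourly rainfall at location s, R(s) the co-located radar value, and w1, w2 spatially correlated Gaussian effects:
\[
\Phi^{-1}\!\big(\Pr[\,Y(s) > 0\,]\big) \;=\; \alpha_0 + \alpha_1 \log R(s) + w_1(s),
\]
\[
Y(s) \mid Y(s) > 0 \;\sim\; \mathrm{Gamma}\!\big(a,\; a/\mu(s)\big), \qquad \log \mu(s) \;=\; \beta_0 + \beta_1 \log R(s) + w_2(s),
\]
so that the probit part governs rain occurrence and the Gamma part the positive amounts, both linked linearly to the radar on the log scale.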
Towards the 3D attenuation imaging of active volcanoes: methods and tests on real and simulated data
Abstract:
The purpose of my PhD thesis has been to address the problem of retrieving a three-dimensional attenuation model in volcanic areas. To this end, I first developed a robust strategy for the analysis of seismic data, performing several synthetic tests to assess the applicability of the spectral ratio method to our purposes. The results of the tests allowed us to conclude that: 1) the spectral ratio method gives reliable differential attenuation (dt*) measurements in smooth velocity models; 2) a short signal time window has to be chosen to perform the spectral analysis; 3) the frequency range over which spectral ratios are computed greatly affects the dt* measurements. Furthermore, a refined approach for the application of the spectral ratio method was developed and tested; through this procedure, the effects caused by heterogeneities of the propagation medium on the seismic signals may be removed. The tested data analysis technique was applied to the real active-seismic SERAPIS database, providing a dataset of dt* measurements that was used to obtain a three-dimensional attenuation model of the shallowest part of the Campi Flegrei caldera. A linearized, iterative, damped attenuation tomography technique was then tested and applied to the selected dataset. The tomography, with a resolution of 0.5 km in the horizontal directions and 0.25 km in the vertical direction, made it possible to image important features in the offshore part of the Campi Flegrei caldera. High-Qp bodies are immersed in a high-attenuation body (Qp = 30); the latter is well correlated with low Vp and high Vp/Vs values and is interpreted as a layer of saturated marine and volcanic sediments. The high-Qp anomalies, instead, are interpreted as the effects either of cooled lava bodies or of a CO2 reservoir. A pseudo-circular high-Qp anomaly was detected and interpreted as the buried rim of the NYT caldera.
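For reference, the spectral ratio method in its standard form estimates the differential attenuation from the slope of the log spectral ratio of two recordings sharing approximately the same source spectrum:
\[
\ln\frac{A_i(f)}{A_j(f)} \;=\; c_{ij} \;-\; \pi f\,\big(t_i^{*} - t_j^{*}\big) \;=\; c_{ij} \;-\; \pi f\,\Delta t^{*},
\]
where the constant c_ij absorbs source and geometrical/site terms. A linear fit of the log ratio over the chosen frequency band yields dt* = Δt*, which also makes clear why the frequency range used in the fit directly affects the dt* measurements.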
Abstract:
The major light-harvesting complex II (LHCII) of higher plants is the most abundant membrane protein in the world and is integrated into the chloroplast thylakoid membrane. LHCII can be used as a model system for better understanding how membrane proteins work, since 96% of its structure has been resolved crystallographically and it can be refolded in vitro in recombinant form. This yields a fully functional protein-pigment complex that is nearly identical to the in vivo variant.
Electron paramagnetic resonance (EPR) spectroscopy is a highly sensitive method, ideally suited to studying the structural dynamics of proteins. This requires site-specific labelling with spin probes that bind covalently to cysteines, which is made possible by carefully exchanging selected amino acids for cysteines without impairing the function of LHCII.
In this work, the stability of the spin label used and the sample quality were improved by examining every step of the sample preparation. With these insights, both the risk of protein aggregation and the loss of the EPR signal could be markedly reduced. In combination with the simultaneous establishment of Q-band EPR, considerably less concentrated samples can now be measured reliably. In addition, a reproducible method was developed for preparing heterogeneous trimers, consisting of one doubly labelled monomer and two unlabelled monomers, which make it possible to study the N-terminal domain, incompletely resolved in the crystal structure, in both the monomeric and the trimeric assembly state. The results confirmed, on the one hand, the assumption that this domain is very flexible compared with the rigid protein core and, on the other, that it is even more mobile in monomers than in trimers. The lumenal loop region was also investigated at different pH values and with varying pigment composition, since this region is the subject of considerable controversy. The measurements revealed that this region contains both rigid and more flexible sections. While the pH value had no influence on the conformation, the absence of neoxanthin was found to lead to a change in conformation. Further analyses of the structural dynamics of LHCII in a lipid membrane could not be carried out, however, since this requires directed insertion of the refolded protein into liposomes, which, despite intensive efforts, was not successful.
Abstract:
Oceans are key sources and sinks in the global budgets of significant atmospheric trace gases, termed Volatile Organic Compounds (VOCs). Despite their low concentrations, these species play an important role in the atmosphere, influencing ozone photochemistry and aerosol physics. Surprisingly little work has been done on assessing their emissions or their transport mechanisms and rates between ocean and atmosphere, all of which are important for modelling the atmosphere accurately.
A new Needle Trap Device (NTD) - GC-MS method was developed for the effective sampling and analysis of VOCs in seawater. Good repeatability (RSDs < 16%), linearity (R2 = 0.96-0.99) and limits of detection in the pM range were obtained for DMS, isoprene, benzene, toluene, p-xylene, (+)-α-pinene and (-)-α-pinene. Laboratory evaluation and subsequent field application indicated that the proposed method can be used successfully in place of the more commonly applied extraction techniques (P&T, SPME) to extend the suite of species typically measured in the ocean and to improve detection limits.
During a mesocosm CO2 enrichment study, DMS, isoprene and α-pinene were identified and quantified in seawater samples using the above-mentioned method. Based on correlations with the available biological datasets, the effects of ocean acidification as well as possible oceanic biological sources were investigated for all examined compounds. The future ocean's acidity was shown to decrease oceanic DMS production, possibly to impact isoprene emissions, but not to affect the production of α-pinene.
In a separate activity, ocean-atmosphere interactions were simulated in a large-scale wind-wave canal facility in order to investigate the gas exchange process and its controlling mechanisms. Air-water exchange rates of 14 chemical species (11 of them VOCs), spanning a wide range of solubility (dimensionless solubility, α = 0.4 to 5470) and diffusivity (Schmidt number in water, Scw = 594 to 1194), were obtained under various turbulent (wind speed at ten meters height, u10 = 0.8 to 15 m s-1) and surfactant-modulated (two differently sized Triton X-100 layers) surface conditions. Reliable and reproducible total gas transfer velocities were obtained, and the derived values and trends were comparable to previous investigations. Through this study, a much better and more comprehensive understanding of the gas exchange process was achieved. The role of the friction velocity, uw*, and of the mean square slope, σs2, in defining phenomena such as waves and wave breaking, near-surface turbulence, bubbles and surface films was recognized as very significant. uw* was determined to be the ideal turbulent parameter, while σs2 best described the related surface conditions. A combination of the uw* and σs2 variables was found to reproduce the air-water gas exchange process faithfully.
A Total Transfer Velocity (TTV) model, derived from a compilation of 14 tracers and a combination of the uw* and σs2 parameters, is proposed for the first time. Through the proposed TTV parameterization, a new physical perspective is presented which provides an accurate TTV for any tracer within the examined solubility range. The development of such a comprehensive air-sea gas exchange parameterization represents a highly useful tool for regional and global models, providing accurate total transfer velocity estimations for any tracer and any sea-surface status, simplifying the calculation process and eliminating the calculation uncertainty inevitably connected with the selection or combination of different parameterizations.
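As background (the standard two-layer, or two-film, resistance model, not the new TTV parameterization developed in the thesis), the total water-side transfer velocity combines the water- and air-side resistances weighted by the dimensionless solubility α:
\[
\frac{1}{K_w} \;=\; \frac{1}{k_w} \;+\; \frac{\alpha}{k_a},
\]
where k_w and k_a are the water-side and air-side transfer velocities. Sparingly soluble tracers (small α) are therefore water-side controlled, while highly soluble ones become increasingly air-side controlled, which motivates a parameterization covering the full solubility range.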
Abstract:
Introduction: As a previous study revealed, arts speech therapy (AST) affects cardiorespiratory interaction [1]. The aim of the present study was to investigate whether AST also has effects on brain oxygenation and hemodynamics measured non-invasively using near-infrared spectroscopy (NIRS). Material and methods: NIRS measurements were performed on 17 subjects (8 men and 9 women, mean age: 35.6 ± 12.7 y) during AST. Each measurement lasted 35 min, comprising 8 min pre-baseline, 10 min recitation and 20 min post-baseline. For each subject, measurements were performed for three different AST recitation tasks (recitation of alliterative, hexameter and prose verse). Relative concentration changes of oxyhemoglobin (Δ[O2Hb]) and deoxyhemoglobin (Δ[HHb]) as well as the tissue oxygenation index (TOI) were measured using a Hamamatsu NIRO300 NIRS device and a sensor placed on the subject's forehead. Movement artifacts were removed using a novel method [2]. Statistical analysis (Wilcoxon test) was applied to the data to investigate (i) whether the recitation causes changes in the median values and/or in the Mayer wave power spectral density (MW-PSD, range: 0.07–0.13 Hz) of Δ[O2Hb], Δ[HHb] or TOI, and (ii) whether these changes vary between the three recitation forms. Results: For all three recitation styles a significant (p < 0.05) decrease in Δ[O2Hb] and TOI was found, indicating a decrease in blood flow. These decreases did not vary significantly between the three styles. MW-PSD increased significantly for Δ[O2Hb] when reciting the hexameter and prose verse, and for Δ[HHb] and TOI when reciting alliterations and hexameter, representing an increase in Mayer waves. The MW-PSD increase for Δ[O2Hb] was significantly larger for the hexameter verse compared with the alliterative and prose verses. Conclusion: The study showed that AST affects brain hemodynamics (oxygenation, blood flow and Mayer waves). Recitation caused a significant decrease in cerebral blood flow for all recitation styles as well as an increase in Mayer waves, particularly for the hexameter, which may indicate a sympathetic activation. References: 1. D. Cysarz, D. von Bonin, H. Lackner, P. Heusser, M. Moser, H. Bettermann. Am J Physiol Heart Circ Physiol, 287 (2) (2004), pp. H579–H587. 2. F. Scholkmann, S. Spichtig, T. Muehlemann, M. Wolf. Physiol Meas, 31 (5) (2010), pp. 649–662.
Abstract:
The purpose of this study was to search the orthodontic literature and determine the frequency of reporting of confidence intervals (CIs) in orthodontic journals with an impact factor. The six latest issues of the American Journal of Orthodontics and Dentofacial Orthopedics, the European Journal of Orthodontics, and the Angle Orthodontist were hand searched, and the reporting of CIs, P values, and the implementation of univariate or multivariate statistical analyses were recorded. Additionally, studies were classified according to type/design as cross-sectional, case-control, cohort, or clinical trial, and according to subject as growth/genetics, behaviour/psychology, diagnosis/treatment, or biomaterials/biomechanics. The data were analyzed using descriptive statistics followed by univariate examination of statistical associations, logistic regression, and multivariate modelling. CI reporting was very limited, being recorded in only 6 per cent of the included published studies. CI reporting was independent of journal, study area, and design. Studies that used multivariate statistical analyses had a higher probability of reporting CIs than those using univariate statistical analyses. Misunderstanding of the use of P values and CIs may have important implications for the implementation of research findings in clinical practice.
Abstract:
Recent demographic changes have made settlement patterns in the Canadian Arctic increasingly urban. Iqaluit, capital of Canada's newest territory, Nunavut, is home to the largest concentration of Inuit and non-Inuit populations in the Canadian North. Despite these trends, Inuit cultural identity continues to rest heavily on the perception that to learn how to be authentically Inuit (or to be a better person), a person needs to spend time out on the land (and sea) hunting, fishing, trapping, and camping. Many Inuit also maintain a rather negative view of urban spaces in the Arctic, identifying them as places where Inuit values and practices have been eclipsed by Qallunaat ("white people") ones. Some Inuit have even gone so far as to claim that a person is no longer able to be Inuit while living in towns like Iqaluit. This article examines those aspects of Canadian Inuit identity, culture, and tradition that disfavor the acceptance of an urban cultural identity. Drawing on ethnographic research conducted on Baffin Island in the mid-1990s and early 2000s, it describes the many ways in which Iqaluit and outpost-camp Inuit express the differences and similarities between living on the land and living in town. There follows an examination of how the contrast between land and town is used in the rhetoric of Inuit politicians and leaders. Finally, a series of counterexamples is presented that favor the creation of an authentic urban Inuit identity in the Arctic, including recent attempts on the part of the Nunavut Territorial Government to make education and wage employment in the region more reliant on Inuit Qaujimajatuqangit, or Inuit traditional knowledge.