422 results for "scientist"
Abstract:
Objectives: To evaluate the prevalence of human papillomavirus (HPV) types and the risk factors for HPV positivity across the cervix, vagina and anus, we conducted a study among 138 women with human immunodeficiency virus (HIV), comparing the prevalence of the different HPV types and the risk factors for HPV positivity across the three sites. Results: The most frequently detected HPV types at all sites were, in decreasing order, HPV16, 53, 18, 61 and 81. Agreement between the cervix and vagina was good (kappa 0.60-0.80) for HPV16 and 53 and excellent (kappa > 0.80) for HPV18 and 61. HPV positivity was inversely associated with age for all site combinations including the anal site. Conclusion: In HIV-positive women, HPV18 is the most widespread HPV type found in combined anal and genital infection. The relationship of anal to genital infection has implications for the development of anal malignancies. Thus, the efficacy of the current HPV vaccine should be considered not only for the cervix but also for the prevention of anal HPV18 infection among immunosuppressed individuals.
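The agreement bands quoted above are the conventional thresholds for Cohen's kappa, which measures agreement between two sites beyond what chance alone would produce:

\kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is the observed proportion of concordant results and p_e the agreement expected by chance from the marginal positivity rates. As a purely hypothetical worked example (these counts are illustrative, not the study's data): if both sites test positive in 40% of women and both negative in 45%, then p_o = 0.85; with marginal positivity rates of 45% and 50%, p_e = 0.45 × 0.50 + 0.55 × 0.50 = 0.50, giving kappa = (0.85 − 0.50)/(1 − 0.50) = 0.70, inside the "good" band.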
Abstract:
[ES] The project consists of creating an artistically attractive photography book, of both popular and scientific quality, about the island of Boa Vista, in order to publicize and raise awareness, through images, of the culture, richness and fragility of this biodiversity enclave in Cabo Verde. The plan for its publication is crowdfunding through the web platform VERKAMI.
Abstract:
The relation between intercepted light and orchard productivity is generally considered linear, although this dependence appears to be governed more by the planting system than by light intensity. At the whole-plant level, an increase in irradiance does not always translate into higher productivity. One reason is the plant's intrinsic inefficiency in using energy: in full light, generally only 5-10% of the total incoming energy is allocated to net photosynthesis. Preserving or improving this efficiency is therefore pivotal for scientists and fruit growers. Even though a conspicuous amount of energy is reflected or transmitted, plants cannot avoid absorbing photons in excess. Over-excitation of chlorophyll promotes the production of reactive species, increasing the risk of photoinhibition. The dangerous consequences of photoinhibition have forced plants to evolve a complex, multilevel machinery able to dissipate excess energy by quenching it as heat (non-photochemical quenching), by moving electrons (the water-water cycle, cyclic transport around PSI, the glutathione-ascorbate cycle and photorespiration) and by scavenging the reactive species generated. The price plants pay for this equipment is the consumption of CO2 and reducing power, with a consequent decrease in photosynthetic efficiency, both because some photons are not used for carboxylation and because an effective loss of CO2 and reducing power occurs. Net photosynthesis increases with light up to the saturation point; additional PPFD does not improve carboxylation but raises the share of energy dissipated through the alternative pathways, along with ROS production and photoinhibition risk. The wide photo-protective apparatus is nevertheless unable to cope with all the excess incoming energy, so photodamage occurs. Any event that increases photon pressure and/or decreases the efficiency of the photo-protective mechanisms described (e.g. thermal stress, water or nutritional deficiency) can exacerbate photoinhibition. In nature only a small fraction of photosystems is typically found damaged, thanks to an effective, efficient but energy-consuming recovery system. Since damaged PSII is quickly repaired at an energetic cost, it would be interesting to investigate how much PSII recovery costs in terms of plant productivity. This PhD dissertation aims to improve our knowledge of the several strategies deployed to manage incoming energy, and of the implications of excess light for photodamage, in peach. The thesis is organized into three scientific units. In the first section, a new rapid, non-intrusive, whole-tissue and universal technique for determining functional PSII was implemented and validated on different kinds of plants: C3 and C4 species, woody and herbaceous plants, wild type and a chlorophyll b-less mutant, and monocots and dicots. In the second unit, using a singular experimental orchard named the "Asymmetric orchard", the relation between the light environment and photosynthetic performance, water use and photoinhibition was investigated in peach at the whole-plant level; furthermore, the effect of variation in photon pressure on energy management was considered at the single-leaf level. In the third section, the quenching analysis method suggested by Kornyeyev and Hendrickson (2007) was validated on peach and then applied in the field, where the influence of moderate reductions in light and water on peach photosynthetic performance, water requirements, energy management and photoinhibition was studied.
For plants, using solar energy as the fuel for life is intrinsically hazardous because of the constantly high risk of photodamage. This dissertation seeks to highlight the complex relation between plants, in particular peach, and light, analysing the principal strategies plants have developed to manage incoming light so as to derive the maximum possible benefit while minimizing the risks. First, the new method proposed for determining functional PSII, based on P700 redox kinetics, appears to be a valid, non-intrusive, universal and field-applicable technique, not least because it probes the whole leaf tissue in depth rather than only the outermost leaf layers, as fluorescence does. The fluorescence parameter Fv/Fm gives a good estimate of functional PSII, but only when data obtained from the adaxial and abaxial leaf surfaces are averaged. In addition to this method, the energy quenching analysis proposed by Kornyeyev and Hendrickson (2007), combined with the photosynthesis model proposed by von Caemmerer (2000), is a powerful tool for analysing and studying, even in the field, the relation between the plant and environmental factors such as water and temperature but, above all, light. The "Asymmetric" training system is a good way to study the relations among light energy, photosynthetic performance and water use in the field. At the whole-plant level, net carboxylation increases with PPFD up to a saturation point. Rather than improving photosynthesis, excess light may exacerbate water and thermal stress, leading to stomatal limitation. Furthermore, too much light promotes not an improvement in net carboxylation but PSII damage: in the most light-exposed plants, about 50-60% of total PSII is inactivated. At the single-leaf level, net carboxylation increases up to the saturation point (1000-1200 μmol m-2 s-1), and excess light is dissipated by non-photochemical quenching and by non-net-carboxylative electron transport. The latter follows a pattern quite similar to the Pn/PPFD curve, reaching saturation at almost the same photon flux density. At middle-to-low irradiance, NPQ seems to be limited by lumen pH, because the incoming photon pressure is not sufficient to generate the lumen pH required for full activation of violaxanthin de-epoxidase (VDE). Peach leaves try to cope with excess light by increasing non-net-carboxylative transport. As PPFD rises, the xanthophyll cycle is increasingly activated and the rate of non-net-carboxylative transport is reduced. Some of these alternative transports, such as the water-water cycle, cyclic transport around PSI and the glutathione-ascorbate cycle, are able to generate additional H+ in the lumen to support VDE activation when light would otherwise be limiting. Moreover, the alternative transports seem to serve as an important dissipative pathway when high temperature and sub-optimal conductance increase the risk of photoinhibition. In peach, a moderate reduction in water and light does not cause a decrease in net carboxylation; rather, by diminishing the incoming light and the evapo-transpirative demand of the environment, it lowers stomatal conductance and improves water use efficiency. Therefore, by lowering light intensity to levels that are still non-limiting, water can be saved without compromising net photosynthesis. The quenching analysis is able to partition the absorbed energy among the several utilization, photoprotection and photo-oxidation pathways. When recovery is permitted, only a few PSII remain unrepaired, although more net PSII damage is recorded in plants placed in full light.
Even in this experiment, under over-saturating light the main dissipation pathway is non-photochemical quenching; at middle-to-low irradiance it appears to be pH-limited, and other pathways, such as photorespiration and the alternative electron transports, are used to support photoprotection and to contribute to creating the optimal trans-thylakoidal ΔpH for violaxanthin de-epoxidase. These alternative pathways become the main quenching mechanisms in very low-light environments. Another aspect pointed out by this study is the role of NPQ as a dissipative pathway when conductance becomes severely limiting. The fact that in nature only a small fraction of PSII is found damaged indicates the presence of an effective and efficient recovery mechanism that masks the real photodamage occurring during the day. At the single-leaf level, when repair is not allowed, leaves in full light are twofold more photoinhibited than shaded ones. Therefore, light in excess of the photosynthetic optimum does not promote net carboxylation but increases water loss and PSII damage. The greater the photoinhibition, the more photosystems must be repaired and, consequently, the more energy and dry matter must be allocated to this essential activity. Since above the saturation point net photosynthesis is constant while photoinhibition increases, it would be interesting to investigate what photodamage costs in terms of tree productivity. Another aspect of pivotal importance to be investigated further is the combined influence of light and other environmental parameters, such as water status, temperature and nutrition, on the management of light, water and photosynthate in peach.
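The quenching analysis invoked throughout is a fluorescence-based partitioning of the light absorbed by PSII. As a hedged sketch, one widely used formulation of this family (that of Hendrickson et al. 2004, on which the Kornyeyev and Hendrickson 2007 method builds; the 2007 variant refines this to account for photoinactivated PSII, with terms not reproduced here) splits the absorbed quanta as

\Phi_{\mathrm{PSII}} + \Phi_{\mathrm{NPQ}} + \Phi_{f,D} = 1,
\qquad
\Phi_{\mathrm{PSII}} = 1 - \frac{F}{F_m'},\quad
\Phi_{\mathrm{NPQ}} = \frac{F}{F_m'} - \frac{F}{F_m},\quad
\Phi_{f,D} = \frac{F}{F_m}

where F, F_m' and F_m are the steady-state, light-adapted maximal and dark-adapted maximal fluorescence yields; \Phi_{\mathrm{PSII}} is the fraction of absorbed light used for photochemistry, \Phi_{\mathrm{NPQ}} the fraction lost to regulated thermal dissipation (the NPQ discussed above) and \Phi_{f,D} the fraction lost to fluorescence and constitutive dissipation. The three fractions sum to one by construction.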
Abstract:
Thermally induced flows exist both in nature and in industry. Of interest for this research are the convection currents in the Earth's mantle and in glass melting tanks. The material transport taking place there results from differences in density, temperature and chemical concentration within the convecting material. To improve understanding of the processes involved, numerous research groups carry out numerical modelling, and the algorithms used are usually verified through the analysis of laboratory experiments. The focus of this research is the development of a method for determining the three-dimensional temperature distribution for the investigation of thermally induced flows in an experimental tank. Direct temperature measurement inside the experimental material or the glass melt, however, influences the flow behaviour. For this reason, impedance tomography, which does not disturb the flow, is used. The basis of this method is the extended Arrhenius relationship between temperature and specific electrical conductivity. During the laboratory experiments, a viscous polyethylene-glycol-water mixture in a tank is heated from below. Taking scaling into account, the flows generated in this way are analogous both to the Earth's mantle and to glass melting tanks. The geoelectrical measurements are made via several electrodes installed on the tank walls. The subsequent three-dimensional inversion of the electrical resistances yields a model of the distribution of the specific electrical conductivity inside the experimental tank, which is converted into a temperature distribution by means of the extended Arrhenius formula. To demonstrate the suitability of this method for the non-invasive determination of the three-dimensional temperature distribution, additional direct temperature measurements were carried out with several thermocouples on the tank walls and the values were compared. On the whole, the interior temperatures can be reconstructed well, the achieved accuracy depending on the spatial and temporal resolution of the direct-current geoelectrics.
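As a hedged sketch of the conversion step (the standard Arrhenius form is shown; the "extended" relationship used in the thesis may add correction terms not reproduced here), the conductivity model and its inversion to temperature read:

\sigma(T) = \sigma_0 \, e^{-E_a/(k_B T)}
\qquad\Longrightarrow\qquad
T = \frac{E_a}{k_B \,\ln(\sigma_0/\sigma)}

where \sigma is the specific electrical conductivity, \sigma_0 a material-dependent pre-factor, E_a an activation energy and k_B the Boltzmann constant. Each inverted conductivity value in the tank model can thus be mapped to a local temperature.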
Abstract:
Informatics and its technologies are often summed up, in modern society, in a misleading axiom: the field is commonly reduced to the idea that what technology offers can be accessed by everyone and exploited in everyday life in more or less simple ways. Even though this is indeed a fundamental goal of the high-tech world, one point must be clarified at once: informatics is not simply everything that technology offers us, because that summary view suggests a "generalizing" informatics; informatics is instead divided among multiple fields, touching several interdisciplinary worlds. The importance of these technologies in modern society should push us to ask why informatics, in all its facets, has over recent decades brought a genuine revolution into our lives, our habits and, no less importantly, our working and business context, and shows (fortunately) no intention of halting its development. This work sets out to describe a particular modern technique belonging to that complex world known as "Artificial Intelligence". Artificial Intelligence (AI) is a science that developed alongside technological progress and its powerful tools, which are not only computational but above all theoretical-mathematical (probabilistic) and also related to electronics and telecommunications (consider robotics): hence the interdisciplinarity. This concept is fundamental for tackling the core of the path presented in the second chapter of this document: the two possible approaches, semantic and probabilistic, to natural language processing (NLP), a fundamental branch of AI. Although the thesis devotes ample space to how semantic and statistical NLP techniques have developed over time, attention is paid above all to the fundamental concepts of these fields because, as noted above, even though it is essential to build foundations and to know the evolution of these technologies, the goal is at a certain point to step back and study the modern state of the art in this area, with an eye to tomorrow: in this case, Sentiment Analysis (chapter 3). Sentiment Analysis (SA) is an NLP technique that is taking shape in our own day, one that has developed above all in connection with the explosion of the social network phenomenon, which we live with and "touch" constantly. The central part of the thesis presents some modern examples and models of SA covering both approaches (statistical and semantic), with particular attention to SA models proposed for Twitter in recent years, assessing the scenarios this modern technique opens up and the contextual (and other) consequences it may bring.
Abstract:
Our generation of computational scientists is living in an exciting time: not only do we get to pioneer important algorithms and computations, we also get to set standards on how computational research should be conducted and published. From Euclid's reasoning and Galileo's experiments, it took hundreds of years for the theoretical and experimental branches of science to develop standards for publication and peer review. Computational science, rightly regarded as the third branch, can walk the same road much faster. The success and credibility of science are anchored in the willingness of scientists to expose their ideas and results to independent testing and replication by other scientists. This requires the complete and open exchange of data, procedures and materials. The idea of "replication by other scientists" in reference to computations is more commonly known as "reproducible research". In this context, the journal "EAI Endorsed Transactions on Performance & Modeling, Simulation, Experimentation and Complex Systems" had the exciting and original idea of enabling scientists to submit, together with the article, the computational materials (software, data, etc.) used to produce its contents. The goal of this procedure is to allow the scientific community to verify the content of the paper by reproducing it on the platform, independently of the chosen OS, to confirm or invalidate it and, above all, to allow its reuse to produce new results. This procedure is of little help, however, without minimal methodological support: raw data sets and software are difficult to exploit without the logic that guided their use or production. This led us to conclude that, in addition to the data sets and the software, one further element must be provided: the workflow that ties them all together.
Abstract:
Big Data have forged new technologies that improve quality of life by combining heterogeneous representations of data across various disciplines. What is needed, therefore, is a real-time system able to process data as they arrive. Such a system is called a speed layer: as the name suggests, it is designed to guarantee that new data are reflected in the query functions as quickly as they arrive. This thesis concerns the construction of an architecture modelled on the Speed Layer of the Lambda Architecture, able to receive meteorological data published on an MQTT queue, process them in real time and store them in a database to make them available to data scientists. The programming environment used is Java; the project was installed on the Hortonworks platform, which is based on the Hadoop framework and on the Storm computation system, which makes it possible to work with unbounded data streams, performing processing in real time. Unlike traditional stream-processing approaches based on networks of queues and workers, Storm is fault-tolerant and scalable. The effort devoted to its development by the Apache Software Foundation, its growing use in production by major companies and the support from cloud-hosting providers are all signs that this technology will gain ever more ground as a solution for managing distributed, event-oriented computation. To store and analyse these volumes of data, which have always posed a problem that traditional databases could not overcome, a non-relational database was used: HBase.
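As a minimal, hypothetical sketch of how such a speed layer can be wired in Storm (this is not the thesis code: the Storm 1.x Java API is assumed, the spout below is a synthetic stand-in for the MQTT subscriber, the sink bolt stands in for the HBase writer, and all class, stream and field names are illustrative):

import java.util.Map;
import org.apache.storm.Config;
import org.apache.storm.LocalCluster;
import org.apache.storm.spout.SpoutOutputCollector;
import org.apache.storm.task.TopologyContext;
import org.apache.storm.topology.BasicOutputCollector;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.TopologyBuilder;
import org.apache.storm.topology.base.BaseBasicBolt;
import org.apache.storm.topology.base.BaseRichSpout;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;
import org.apache.storm.utils.Utils;

public class SpeedLayerTopology {

    // Stand-in for the MQTT spout: the real system would subscribe to the
    // broker's weather topic and emit one tuple per published message.
    public static class WeatherSpout extends BaseRichSpout {
        private SpoutOutputCollector collector;

        public void open(Map conf, TopologyContext ctx, SpoutOutputCollector collector) {
            this.collector = collector;
        }

        public void nextTuple() {
            Utils.sleep(100); // throttle the synthetic feed
            // Emit a synthetic "station, temperature" reading.
            collector.emit(new Values("station-1", 20.0 + Math.random() * 5.0));
        }

        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            declarer.declare(new Fields("station", "temperature"));
        }
    }

    // Stand-in for the HBase sink: the real bolt would persist each reading
    // with the HBase client; here it just prints what it would store.
    public static class StoreBolt extends BaseBasicBolt {
        public void execute(Tuple tuple, BasicOutputCollector collector) {
            System.out.printf("store %s -> %.2f%n",
                    tuple.getStringByField("station"),
                    tuple.getDoubleByField("temperature"));
        }

        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            // Terminal bolt: emits nothing downstream.
        }
    }

    public static void main(String[] args) throws Exception {
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("weather-spout", new WeatherSpout(), 1);
        builder.setBolt("store-bolt", new StoreBolt(), 2)
               .shuffleGrouping("weather-spout");

        // Run locally for a few seconds for demonstration purposes.
        LocalCluster cluster = new LocalCluster();
        cluster.submitTopology("speed-layer", new Config(), builder.createTopology());
        Thread.sleep(10_000);
        cluster.shutdown();
    }
}

In the real topology, the spout would wrap an MQTT client subscribed to the weather queue and the sink would write each tuple to an HBase table; both are reduced to self-contained stubs here so the sketch runs on its own, and the topology would be submitted to the Hortonworks cluster with StormSubmitter rather than run in a LocalCluster.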
Abstract:
Co-production of knowledge between academic and non-academic communities is a prerequisite for research aiming at more sustainable development paths. Sustainability researchers face three challenges in such co-production: (a) addressing power relations; (b) interrelating different perspectives on the issues at stake; and (c) promoting a previously negotiated orientation towards sustainable development. A systematic comparison of four sustainability research projects in Kenya (vulnerability to drought), Switzerland (soil protection), Bolivia and Nepal (conservation vs. development) shows how the researchers intuitively adopted three different roles to face these challenges: the roles of reflective scientist, intermediary, and facilitator of a joint learning process. From this systematized and iterative self-reflection on the roles that a researcher can assume in the indeterminate social space where knowledge is co-produced, we draw conclusions regarding training.
Abstract:
This second part of our publication, entitled "The Image of Dentistry", discusses the properties that make up the ideal image of dentistry, or even of the ideal scientist, such as the management of the dental practice, the dentist-patient relationship and the appropriate handling of the patient's emotions, such as anxiety or pain. The quality of treatment and the friendly, honest and compassionate attitude of the dentist can directly affect the image of dentistry. The dental professional must therefore try to keep the balance between practice profit, staffing and patient well-being in order to fulfill both social and public health responsibilities.
Abstract:
With "Marx Illusion", Claudiu Coman recalls sociology - this science of analytic and methodological strictness - to its uncorrupted philosophical dignity. Not only theorization - that is, the seduction of the quality essay, Claudiu Coman demonstrating here a singular ability - but philosophy itself: a science, if we wish, but as a way of cognitively enclosing the beings of the world, first of all those who make the world possible, but also the transcendent world that transcends them while making them possible - as a 20th-century scientist would say - man's connection to an existence that is not his own, but that identifies itself with difference. Being in the world - M. Heidegger writes (in 1928, after the publication of "Being and Time") - is specific only to man - and here, within the Territory of Existence, I have the impression that philosophy and sociology have answered together. However, working together with philosophy, sociology extends to envelop the "world-less" beings, too. These things would already have been in the world as "direct" ("handy") things, grasped by man by referring to them within the world itself.
Abstract:
This paper is meant to provide guidance to anyone wishing to write a neurological guideline for diagnosis or treatment, and is directed at the Scientist Panels and task forces of the European Federation of Neurological Societies (EFNS). It replaces the previous guidance paper from 2004 and contains several new aspects: the guidance is now based on a changed system for grading evidence and the resulting recommendations, having adopted the Grading of Recommendations, Assessment, Development and Evaluation (GRADE) system. The process of grading the quality of evidence and the strength of recommendations can now be improved and made more transparent. Task forces embarking on the development of a guideline must now make clearer and more transparent choices about the outcomes considered most relevant when searching the literature and evaluating their findings. Thus, the outcomes chosen will be more critical, more patient-oriented and easier to translate into simple recommendations. This paper also provides updated practical recommendations for planning a guideline task force within the framework of the EFNS. Finally, it is hoped that this paper will also find the approval of the relevant bodies of our future organization, the European Academy of Neurology.
Abstract:
It is a challenging time to be a social scientist. Many of the concepts and categories we took for granted have been revealed as temporally and geographically specific. It is now widely accepted that the nation-state is no longer the sole container for economic, political and social processes, if indeed it ever was. This is where Kevin Stenson begins his paper. He traces the re-ordering of both state and nation, highlighting recent discussions about the unbundling and rescaling of the state and outlining how increasing ethnic and cultural diversity challenges homogeneous conceptions of the nation. In Stenson's account these are largely empirical processes, and they form the basis for the important questions he raises about changing understandings of publics and social order, and their implications for the local governance of community safety. He contrasts two alternative positions: the 'universal human rights position', which refuses to privilege the interests of majority populations, and a more 'communitarian and nationalistic position', which he argues is most likely to be deployed by right-wing politicians and interest groups. Drawing on extensive research in the Thames Valley region of the United Kingdom, he shows how these two understandings have both shaped the local policy response to crime and disorder.