Abstract:
Doctoral thesis, Biology (Microbiology), Universidade de Lisboa, Faculdade de Ciências, 2014
Abstract:
Against Beck’s claims that conventional sociological concepts and categories are zombie categories, this paper argues that Durkheim’s theoretical framework, in which suicide is a symptom of an anomic state of society, can help us understand the diversity of trajectories that transnational migrants follow and that shape their suicide rates within a cosmopolitan society. Drawing on ethnographic data collected on eight suicides and three attempted suicides of second-generation male Alevi Kurdish migrants living in London, this article explains the impact of segmented assimilation/adaptation trajectories on the incidence of suicide, and how membership of a ‘new rainbow underclass’, as a manifestation of cosmopolitan society, is itself an anomic social position marked by a lack of integration and regulation.
Abstract:
Nanomaterials (NMs) with the same chemistry can differ greatly in size, surface area, shape, stability, rigidity, coating, or electrical charge, and these characteristics affect nano-bio interactions, leading to different toxic potential. This communication shows that closely related NMs can have different genotoxic effects, underscoring the importance of investigating the toxic potential of each NM individually instead of assuming a common mechanism and equal genotoxic effects for a set of similar NMs. It also underlines the importance of capturing the complexity of in vivo systems in nanotoxicology, for instance through three-dimensional cellular models, air-liquid interface exposure, or in vivo models that mimic human routes of exposure.
Abstract:
Phenylketonuria is an inborn error of metabolism involving, in most cases, deficient activity of phenylalanine hydroxylase. Neonatal diagnosis and a prompt special diet (low-phenylalanine and natural-protein-restricted) are essential to treatment. The lack of data on the phenylalanine content of processed foodstuffs is an additional limitation for an already very restrictive diet. Our goals were to quantify protein (Kjeldahl method) and the content of 18 amino acids (HPLC/fluorescence) in 16 dishes specifically designed for phenylketonuric patients, and to compare the most relevant results with those of several international food composition databases. As might be expected, all the meals contained low protein levels (0.67–3.15 g/100 g), with the highest occurring in boiled rice and potatoes. These foods also contained the highest amounts of phenylalanine (158.51 and 62.65 mg/100 g, respectively). In contrast to the other amino acids, it was possible to predict phenylalanine content from protein content alone. Slight deviations were observed when comparing our results with the different food composition databases.
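The reported predictability of phenylalanine from protein alone suggests a simple linear fit. A minimal sketch of how such a prediction could be checked, with hypothetical (protein, phenylalanine) pairs standing in for the 16 measured dishes:

```python
# Minimal sketch: can phenylalanine (Phe) content be predicted from
# total protein alone, as the abstract reports? The (protein, Phe)
# pairs below are hypothetical placeholders, not the study's data.
import numpy as np
from scipy import stats

protein_g_per_100g = np.array([0.67, 0.9, 1.2, 1.8, 2.4, 3.15])       # hypothetical
phe_mg_per_100g = np.array([30.0, 42.0, 55.0, 90.0, 120.0, 158.5])    # hypothetical

fit = stats.linregress(protein_g_per_100g, phe_mg_per_100g)
print(f"Phe ≈ {fit.slope:.1f} * protein + {fit.intercept:.1f} (r² = {fit.rvalue**2:.2f})")

# Predict Phe for a new low-protein dish (1.0 g protein / 100 g):
print(f"predicted Phe: {fit.slope * 1.0 + fit.intercept:.1f} mg/100 g")
```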
Abstract:
Dissertation presented to obtain the Ph.D. degree in Biology.
Abstract:
This work project aims to analyse the choices Portuguese listed firms make in reporting Comprehensive Income (CI) and to understand the reasons behind them. Additionally, it studies the value relevance of CI versus Net Income (NI). It was found that firm size and the volume of Other Comprehensive Income (OCI) are positively related to the choice of separate statements, while smaller firms with positive NI and negative OCI tend to disclose less information about taxes. The value relevance of CI proved to be superior to that of NI, but OCI seems to have no incremental value relevance.
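A minimal sketch of how such a value-relevance comparison is commonly set up, regressing returns on each income measure and comparing explanatory power (R²); the data below are simulated, not the study's sample of Portuguese listed firms:

```python
# Minimal sketch of a value-relevance comparison, assuming the common
# approach of regressing returns on each income measure and comparing
# R². All series are simulated for illustration only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
ni = rng.normal(size=n)                 # net income (scaled), simulated
oci = rng.normal(scale=0.5, size=n)     # other comprehensive income, simulated
ci = ni + oci                           # comprehensive income
ret = 0.8 * ci + rng.normal(size=n)     # stock returns, simulated

r2_ni = sm.OLS(ret, sm.add_constant(ni)).fit().rsquared
r2_ci = sm.OLS(ret, sm.add_constant(ci)).fit().rsquared
# Incremental relevance of OCI beyond NI:
r2_ni_oci = sm.OLS(ret, sm.add_constant(np.column_stack([ni, oci]))).fit().rsquared
print(f"R²(NI)={r2_ni:.3f}  R²(CI)={r2_ci:.3f}  R²(NI+OCI)={r2_ni_oci:.3f}")
```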
Abstract:
A series of studies in schizophrenic patients reports a decrease of glutathione (GSH) in the prefrontal cortex (PFC) and cerebrospinal fluid, a decrease in mRNA levels for two GSH-synthesizing enzymes, and a deficit in parvalbumin (PV) expression in a subclass of GABA neurons in the PFC. GSH is an important redox regulator, and its deficit could be responsible for cortical anomalies, particularly in regions rich in dopamine innervation. We tested in an animal model whether redox imbalance (GSH deficit and excess extracellular dopamine) during postnatal development would affect PV-expressing neurons. Three populations of interneurons immunolabeled for calcium-binding proteins were analyzed quantitatively in brain sections of 16-day-old rats. Treated rats showed a specific reduction in parvalbumin immunoreactivity in the anterior cingulate cortex, but not in calbindin or calretinin. These results provide experimental evidence for the critical role of redox regulation in cortical development and validate this animal model used in schizophrenia research.
Abstract:
Novel cancer vaccines are capable of efficiently inducing and boosting human tumor-antigen-specific T-cells. However, the properties of these CD8 T-cells are only partially characterized. For in-depth investigation of T-cells following Melan-A/MART-1 peptide vaccination in melanoma patients, we conducted a detailed prospective study at the single-cell level. We first sorted individual human naive and effector CD8 T-cells from peripheral blood by flow cytometry, and tested a modified RT-PCR protocol including a global amplification of expressed mRNAs to obtain sufficient cDNA from single cells. We successfully detected the expression of several specific genes of interest even down to 10⁶-fold dilution (equivalent to 10⁻⁵ cell). We then analyzed tumor-specific effector memory (EM) CD8 T-cell subpopulations ex vivo, as single cells from vaccinated melanoma patients. To elucidate the hallmarks of effective immunity, gene signatures were defined by a panel of genes related to effector functions (e.g. IFN-γ, granzyme B, perforin), and individual clonotypes were identified according to the expression of distinct T-cell receptors (TCR). Using this novel single-cell analysis approach, we observed that T-cell differentiation is clonotype-dependent, with a progressive restriction in TCR BV clonotype diversity from EM CD28pos to EM CD28neg subsets. However, the effector-function gene imprinting is clonotype-independent but dependent on differentiation, since it correlates with the subset of origin (EM CD28pos or EM CD28neg). We also conducted a detailed comparative analysis after vaccination with natural vs. analog Melan-A peptide. We found that the peptide used for vaccination determines the functional outcome of individual T-cell clonotypes, with the native peptide inducing more potent effector functions. Yet, selective clonotypic expansion with differentiation was preserved regardless of the peptide used for vaccination. In summary, the ex vivo single-cell RT-PCR approach is highly sensitive and efficient, and represents a reliable and powerful tool to refine our current view of the molecular processes taking place during T-cell differentiation.
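The reported restriction in TCR BV clonotype diversity between subsets can be illustrated with a Shannon-diversity comparison. A minimal sketch with hypothetical clonotype counts, not the patients' data:

```python
# Minimal sketch of a clonotype-diversity comparison: Shannon diversity
# of TCR BV clonotype counts in two effector-memory subsets. A lower
# value indicates a more restricted (fewer, more dominant) repertoire.
# Counts below are hypothetical illustrations only.
import numpy as np

def shannon_diversity(counts):
    """Shannon diversity index of a vector of clonotype counts."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return -(p * np.log(p)).sum()

em_cd28pos = [12, 9, 7, 5, 4, 3, 2, 2, 1, 1]   # many clonotypes, hypothetical
em_cd28neg = [25, 14, 4, 2]                    # restricted repertoire, hypothetical

print(f"EM CD28pos diversity: {shannon_diversity(em_cd28pos):.2f}")
print(f"EM CD28neg diversity: {shannon_diversity(em_cd28neg):.2f}")  # lower
```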
Abstract:
The debate on the merits of observational studies as compared with randomized trials is ongoing. We will briefly touch on this subject and demonstrate the role of cohort studies in describing infectious disease patterns after transplantation. The potential benefits of cohort studies for the clinical management of patients, beyond the expected gain in epidemiological knowledge, are reviewed. The newly established Swiss Transplantation Cohort Study, and in particular the part focusing on infectious diseases, will serve as an illustration. A neglected area of research is the indirect value of large, multicenter cohort studies. These benefits can range from deepened collaboration to the development of common definitions and guidelines. Unfortunately, very few data exist on the role of such indirect effects in improving the quality of patient management. This review postulates an important role for cohort studies, which should be viewed not as inferior but as complementary to established research tools, in particular randomized trials. Randomized trials remain the least bias-prone method for establishing the significance of diagnostic or therapeutic measures. Cohort studies have the power to reflect a real-world situation and to pinpoint areas of knowledge as well as of uncertainty. The prerequisite is a prospective design requiring a set of inclusive data, coupled with meticulous insistence on data retrieval and quality.
Abstract:
Aims: To investigate whether differences in gender-income equity at the country level explain national differences in the links between alcohol use and the combination of motherhood and paid labour. Design: Cross-sectional data from 16 established market economies participating in the Gender, Alcohol and Culture: An International Study (GenACIS). Setting: Population surveys. Participants: A total of 12 454 mothers (aged 25-49 years). Measurements: Alcohol use was assessed as the quantity consumed per drinking day. Paid labour, having a partner, the gender-income ratio at the country level, and the interaction between individual and country characteristics were regressed on alcohol consumed per drinking day using multi-level modelling. Findings: Mothers with a partner who were in paid labour reported consuming more alcohol on drinking days than partnered housewives. In countries with high gender-income equity, partnered mothers in paid labour drank less alcohol per occasion, while alcohol use was higher among working partnered mothers living in countries with lower income equity. Conclusion: In countries which facilitate working mothers, daily alcohol use decreases as female social roles increase; in contrast, in countries where there are fewer incentives for mothers to remain in work, the protective effect of being a working mother (with partner) on alcohol use is weaker. These data suggest that a country's investment in measures to improve the compatibility of motherhood and paid labour may reduce women's alcohol use.
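A minimal sketch of the multi-level set-up described (individual-level predictors, a country-level moderator, and a cross-level interaction, with a random intercept per country), using simulated data and hypothetical variable names rather than the GenACIS sample:

```python
# Minimal sketch of a multi-level (mixed-effects) model: alcohol per
# drinking day regressed on paid labour, a country-level gender-income
# ratio, and their cross-level interaction, with a random intercept per
# country. All data and variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "country": rng.integers(0, 16, n),   # 16 market economies
    "paid_labour": rng.integers(0, 2, n),
    "partner": 1,                        # subsample: partnered mothers
})
income_ratio = rng.uniform(0.5, 0.9, 16)  # country-level equity, hypothetical
df["income_ratio"] = income_ratio[df["country"]]
# Simulated outcome: paid labour raises drinking, less so at high equity.
df["drinks_per_day"] = (1.5 + 0.4 * df["paid_labour"]
                        - 0.6 * df["paid_labour"] * df["income_ratio"]
                        + rng.normal(scale=0.5, size=n))

model = smf.mixedlm("drinks_per_day ~ paid_labour * income_ratio",
                    df, groups=df["country"]).fit()
print(model.summary())
```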
Abstract:
DNA undergoes a series of complex structural transformations during cell division, resulting in its compaction into mitotic chromosomes through a process called chromosome condensation. The pentameric condensin complex is strongly implicated as a major effector of this phenomenon. It is a multi-subunit protein complex with two catalytic subunits [SMC, Structural Maintenance of Chromosomes] and three regulatory subunits, highly conserved from yeast to humans. The condensin complex in Saccharomyces cerevisiae consists of two SMC subunits [Smc2 and Smc4] and three non-SMC regulatory proteins [Brn1, Ycs4, Ycg1]. Despite its importance, condensin's mechanism of action remains largely unknown. The objective of this research is therefore to understand the mechanism of action of condensin and how it is affected by the interaction between its catalytic and regulatory subunits. This thesis identifies four distinct cell-cycle-dependent morphologies of the rDNA locus. This transformation of the rDNA phenotype from G1 to mitosis is condensin-dependent. To determine the role of the interaction between the catalytic and regulatory subunits in the regulation of the condensin complex, we identified six positively charged residues at the C-terminus of Brn1 that affect condensin complex formation, condensation activity, and the interaction with tubulin, suggesting that these residues play a role in condensin regulation. Together, our results suggest a model of condensin regulation through the interaction between condensin subunits.
Abstract:
We work out a semiclassical theory of shot noise in ballistic n+-i-n+ semiconductor structures to study two fundamental physical correlations arising from the Pauli exclusion principle and the long-range Coulomb interaction. The theory provides a unifying scheme which, in addition to the current-voltage characteristics, describes the suppression of shot noise due to Pauli and Coulomb correlations over the whole range of system parameters and applied bias. The whole scenario is summarized by a phase diagram in the plane of two dimensionless variables related to the sample length and the contact chemical potential. Different regions of physical interest can be identified where only Coulomb or only Pauli correlations are active, or where both are present with different relevance. The predictions of the theory are fully corroborated by Monte Carlo simulations.
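The degree of shot-noise suppression is conventionally summarized by the Fano factor F = S/(2qI), which equals 1 for uncorrelated (full Poissonian) noise and falls below 1 when Pauli or Coulomb correlations regularize the carrier flow. A minimal Monte Carlo sketch using a simple dead-time toy model, not the paper's semiclassical transport model:

```python
# Minimal sketch: quantify shot-noise suppression via the Fano factor
# F = var(N)/<N> of electron counts per time window. F ~ 1 for
# uncorrelated (Poissonian) injection; correlations (here a toy dead
# time standing in for Pauli/Coulomb effects) push F below 1.
import numpy as np

rng = np.random.default_rng(2)

def fano(arrival_times, window=1.0):
    """Fano factor of counts in fixed time windows."""
    t_max = arrival_times[-1]
    counts, _ = np.histogram(arrival_times, bins=int(t_max / window))
    return counts.var() / counts.mean()

# Uncorrelated (Poisson) injection: exponential waiting times.
poisson = np.cumsum(rng.exponential(0.01, 100_000))
# Correlated injection: a dead time after each passage regularizes flow.
correlated = np.cumsum(rng.exponential(0.005, 100_000) + 0.005)

print(f"Poisson:    F = {fano(poisson):.2f}")     # ~1 (full shot noise)
print(f"Correlated: F = {fano(correlated):.2f}")  # < 1 (suppressed)
```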
Abstract:
Econometrics is a young science. It developed during the twentieth century, from the mid-1930s onwards, primarily after World War II. Econometrics is the unification of statistical analysis, economic theory, and mathematics. Its history can be traced to the use of statistical and mathematical analysis in economics. The most prominent contributions during the initial period can be seen in the works of Tinbergen and Frisch, and also that of Haavelmo from the 1940s through the mid-1950s. From the rudimentary application of statistics to economic data, such as the use of the laws of error through the development of least squares by Legendre, Laplace, and Gauss, the discipline later witnessed the applied works of Edgeworth and Mitchell. A very significant milestone in its evolution was the work of Tinbergen, Frisch, and Haavelmo in developing multiple regression and correlation analysis, techniques they used to test different economic theories against time series data. Although some predictions based on econometric methodology may have gone wrong, the sound scientific nature of the discipline cannot be ignored; this is reflected in the economic rationale underlying any econometric model and in the statistical and mathematical reasoning behind the inferences drawn. The relevance of econometrics as an academic discipline assumes high significance in this context. Because of its interdisciplinary nature (a unification of economics, statistics, and mathematics), the subject can be taught in all these broad areas, notwithstanding the fact that most often economics students alone are offered it, as students of other disciplines might not have an adequate economics background to follow the subject. In fact, econometrics is quite relevant even for technical courses (like engineering), business management courses (like the MBA), and professional accountancy courses, and more so for research students in the social sciences, commerce, and management. In the ongoing scenario of globalization and economic deregulation, there is a need to give added thrust to econometrics in higher education across the social sciences, commerce, management, and professional accountancy. This would sharpen students' analytical ability, improve their capacity to examine socio-economic problems with a mathematical approach, and enable them to derive scientific inferences and solutions to such problems. The utmost significance of hands-on practical training in computer-based econometric packages, especially at the postgraduate and research levels, must be pointed out here: mere learning of econometric methodology or the underlying theories would not have much practical utility for students in their future careers, whether in academia, industry, or practice. This paper seeks to trace the historical development of econometrics and to study its current status as an academic discipline in higher education. Besides, the paper looks into the problems faced by teachers in teaching econometrics and by students in learning the subject, including the effective application of the methodology in real-life situations. Accordingly, the paper offers some meaningful suggestions for the effective teaching of econometrics in higher education.
Abstract:
Traditionally, we've focussed on the question of how to make a system easy to code the first time, or perhaps on how to ease the system's continued evolution. But if we look at life-cycle costs, we must conclude that the important question is how to make a system easy to operate. To do this we need to make it easy for operators to see what's going on and then to manipulate the system so that it does what it is supposed to. This is a radically different criterion for success. What makes a computer system visible and controllable? This is a difficult question, but it's clear that today's modern operating systems, with nearly 50 million source lines of code, are neither. Strikingly, the MIT Lisp Machine and its commercial successors provided almost the same functionality as today's mainstream systems, but with only 1 million lines of code. This paper is a retrospective examination of the features of the Lisp Machine hardware and software system. Our key claim is that by building the Object Abstraction into the lowest tiers of the system, great synergy and clarity were obtained. It is our hope that this lesson can impact tomorrow's designs. We also speculate on how the spirit of the Lisp Machine could be extended to include a comprehensive access-control model and how new layers of abstraction could further enrich this model.
Abstract:
The book under review aims to be a synthesis and an assessment of the state of geography in the United States. Structured in eight chapters, the first six analyse various aspects of the discipline, and the last two set out the strategy that should be pursued to strengthen the role of geography in academia and in society at large.