890 results for "scenarios of development"
Abstract:
Formaldehyde (FA) is a colourless gas widely used in industry and hospitals as an aqueous solution, formalin. It is extremely reactive and induces various genotoxic effects in proliferating cultured mammalian cells. Tobacco smoke has been epidemiologically associated with a higher risk of cancer development, especially in the oral cavity, larynx and lungs, as these are sites of direct contact with many of tobacco's carcinogenic compounds. Approximately 90% of human cancers originate from epithelial cells. Therefore, it could be argued that oral epithelial cells represent a preferred target site for early genotoxic events induced by carcinogenic agents entering the body via inhalation and ingestion. The cytokinesis-blocked micronucleus (CBMN) assay in human lymphocytes is one of the most commonly used methods for measuring DNA damage, namely the detection of micronuclei, nucleoplasmic bridges, and nuclear buds.
Abstract:
Formaldehyde (FA) is a colourless gas widely used in industry and hospitals as an aqueous solution, formalin. It is extremely reactive and induces various genotoxic effects in proliferating cultured mammalian cells. Tobacco smoke has been epidemiologically associated with a higher risk of cancer development, especially in the oral cavity, larynx and lungs, as these are sites of direct contact with many of tobacco's carcinogenic compounds. Genetic polymorphisms in metabolic enzymes are very important and can alter individual susceptibility to disease. Alcohol dehydrogenase class 3 (ADH3), also known as glutathione-dependent formaldehyde dehydrogenase, is the major enzyme involved in formaldehyde oxidation, especially in the buccal mucosa. The polymorphism under study is a substitution of isoleucine for valine at codon 349. The cytokinesis-blocked micronucleus (CBMN) assay in human lymphocytes is one of the most commonly used methods for measuring DNA damage, namely the detection of micronuclei, nucleoplasmic bridges, and nuclear buds, which are classified as genotoxicity biomarkers.
Abstract:
Formaldehyde is classified by IARC as carcinogenic to humans (nasopharyngeal cancer). Tobacco smoke has been epidemiologically associated with a higher risk of cancer development, especially in the oral cavity, larynx and lungs, as these are sites of direct contact with many of tobacco's carcinogenic compounds. XRCC3 is involved in the homologous recombination repair of cross-links and chromosomal double-strand breaks (Thr241Met polymorphism). The aim of this study is to determine whether there is an in vivo association between the genetic polymorphism of the XRCC3 gene and the frequency of genotoxicity biomarkers in subjects exposed or not exposed to formaldehyde, with or without tobacco consumption.
Abstract:
Occupational exposure to formaldehyde (FA) has been shown to induce nasopharyngeal cancer, and FA has been classified as carcinogenic to humans (Group 1) on the basis of sufficient evidence in humans. Tobacco smoke has been associated with a higher risk of cancer development, especially in the oral cavity, larynx and lungs, as these are sites of direct contact with many of tobacco's carcinogenic compounds. Alcohol is a recognized genotoxic agent, frequently cited as having strong potential in the development of carcinogenic lesions. Epidemiological evidence points to a strong synergistic effect between cigarette smoking and alcohol consumption in the induction of cancers of the oral cavity. Approximately 90% of human cancers originate from epithelial cells. Therefore, it could be argued that oral epithelial cells represent a preferred target site for early genotoxic events induced by carcinogenic agents entering the body via inhalation and ingestion. The MN assay in buccal cells has also been used to study cancerous and precancerous lesions and to monitor the effects of a number of chemopreventive agents.
Abstract:
OBJECTIVE: To describe the overall transmission of malaria through a compartmental model, considering the human host and the mosquito vector. METHODS: A mathematical model was developed based on the following assumptions: for the human host, the existence of acquired immunity and immunological memory, which boosts the protective response upon reinfection; for the mosquito vector, the fact that the average period of development from egg to adult mosquito and the extrinsic incubation period of the parasite (the transformation of infected but non-infectious mosquitoes into infectious mosquitoes) depend on the ambient temperature. RESULTS: The steady-state equilibrium values obtained with the model allowed the calculation of the basic reproduction ratio in terms of the model's parameters. CONCLUSIONS: The model allowed the calculation of the basic reproduction ratio, one of the most important epidemiological variables.
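To show how such a host-vector formulation translates into computation, the sketch below implements a minimal Ross-Macdonald-style compartmental model in Python. All parameter names and values (biting rate, incubation period, mosquito density, etc.) are illustrative placeholders, not taken from the abstract, and the R0 expression is the classical reduced form rather than the paper's own derivation.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative Ross-Macdonald-style host-vector model (not the paper's exact system).
# All parameter values below are placeholders for demonstration only.
a     = 0.3     # mosquito biting rate (bites per mosquito per day)
b     = 0.5     # probability that a bite by an infectious mosquito infects a human
c     = 0.5     # probability that a bite on an infectious human infects a mosquito
gamma = 1/100   # human recovery rate (1/day)
mu    = 1/10    # adult mosquito mortality rate (1/day)
tau   = 10.0    # extrinsic incubation period (days); temperature-dependent in the paper
m     = 5.0     # mosquitoes per human

def rhs(t, y):
    """Fractions of infectious humans (Ih) and infectious mosquitoes (Iv)."""
    Ih, Iv = y
    dIh = m * a * b * Iv * (1 - Ih) - gamma * Ih
    dIv = a * c * Ih * (1 - Iv) - mu * Iv
    return [dIh, dIv]

sol = solve_ivp(rhs, (0, 365), [0.01, 0.0])
print("fractions after one year (Ih, Iv):", sol.y[:, -1])

# Basic reproduction ratio for this reduced model (classical Ross-Macdonald form;
# exp(-mu*tau) accounts for mosquito survival through the incubation period).
R0 = (m * a**2 * b * c * np.exp(-mu * tau)) / (gamma * mu)
print(f"R0 = {R0:.2f}")
```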
Abstract:
This paper seeks to investigate the effectiveness of sea-defense structures in preventing or reducing tsunami overtopping, as well as evaluating the resulting tsunami impact at El Jadida, Morocco. Different tsunami wave conditions are generated by considering various earthquake scenarios with magnitudes ranging from Mw = 8.0 to Mw = 8.6. These scenarios represent the main active earthquake faults in the SW Iberia margin and are consistent with two past events that generated tsunamis along the Atlantic coast of Morocco. The behavior of incident tsunami waves when interacting with coastal infrastructures is analyzed on the basis of numerical simulations of near-shore tsunami wave propagation. Tsunami impact at the affected site is assessed by computing inundation and current velocity using a high-resolution digital terrain model that incorporates bathymetric, topographic and coastal structure data. Results, in terms of near-shore tsunami propagation snapshots, wave interaction with coastal barriers, and spatial distributions of flow depths and speeds, are presented and discussed in light of what was observed during the 2011 Tohoku-oki tsunami. Predicted results show the different levels of impact that different tsunami wave conditions could generate in the region. Existing coastal barriers around the El Jadida harbour succeeded in reflecting the relatively small waves generated by some scenarios, but failed to prevent the overtopping caused by waves from others. Considering the scenario with the highest impact on the El Jadida coast, significant inundation is computed at the sandy beach and in unprotected areas. The dramatic tsunami impact modeled for the region shows the need for additional tsunami standards, not only for sea-defense structures but also for coastal dwellings and houses, to provide potential for in-place evacuation.
Abstract:
The population growth of a Staphylococcus aureus culture, an active colloidal system of spherical cells, was followed by rheological measurements under steady-state and oscillatory shear flows. We observed a rich viscoelastic behavior as a consequence of bacterial activity, namely of their multiplication and density-dependent aggregation properties. In the early stages of growth (lag and exponential phases), the viscosity increases by about a factor of 20, presenting several drops and full recoveries. This allows us to invoke the existence of a percolation phenomenon. Remarkably, as the bacteria reach their late phase of development, in which the population stabilizes, the viscosity returns close to its initial value. Most probably, this is caused by a change in the bacteria's physiological activity and, in particular, by the decrease of their adhesion properties. The viscous and elastic moduli exhibit power-law behaviors compatible with the "soft glassy materials" model, whose exponents depend on the bacterial growth stage. DOI: 10.1103/PhysRevE.87.030701.
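As an illustration of the kind of analysis implied by the last sentence, the sketch below fits power laws G' ∝ ω^a and G'' ∝ ω^b to a synthetic frequency sweep; the data, prefactors and exponents are invented placeholders, not the measurements reported in the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic frequency sweep standing in for the measured oscillatory-shear data.
rng = np.random.default_rng(1)
omega    = np.logspace(-1, 2, 30)                                     # rad/s
G_prime  = 2.0 * omega**0.15 * (1 + 0.05 * rng.standard_normal(30))   # storage modulus (Pa)
G_dprime = 0.8 * omega**0.15 * (1 + 0.05 * rng.standard_normal(30))   # loss modulus (Pa)

def power_law(w, prefactor, exponent):
    return prefactor * w**exponent

# In the soft-glassy-materials picture both moduli follow a weak power law in
# frequency; the fitted exponent is what tracks the bacterial growth stage.
(_, a_storage), _ = curve_fit(power_law, omega, G_prime,  p0=(1.0, 0.2))
(_, a_loss),    _ = curve_fit(power_law, omega, G_dprime, p0=(1.0, 0.2))
print(f"G' ~ omega^{a_storage:.2f},  G'' ~ omega^{a_loss:.2f}")
```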
Reproductive dynamics of Sterna hirundinacea Lesson, 1831 in Ilha dos Cardos, Santa Catarina, Brazil
Abstract:
In this work, we intend to describe the reproductive dynamics of Sterna hirundinacea on an island in South Brazil. We studied the reproductive biology of this species in its natural environment and provide data on growth, survival, and reproductive success in Ilha dos Cardos, Santa Catarina, South Brazil. Samplings were carried out daily on the island throughout the reproductive seasons of 2003, 2005, and 2006, and the different stages of development of the chicks were characterized according to age, beak length, and plumage characteristics. We provide a basic growth equation, Lm = 167.91 (1 - e^(-0.062(t - (-0.23)))), to determine the approximate age of individuals from their body mass. The main cause of chick mortality on the island was natural (63.17% in 2003, 81.41% in 2005, and 79.96% in 2006), whereas predation contributed to mortality in a proportion of 38.83% in 2003, 18.59% in 2005, and 20.04% in 2006. The absence in the area of the chicks' main predator, the Kelp gull (Larus dominicanus), the large number of chicks that reached the final stages of development, and their reproductive success demonstrate that Ilha dos Cardos is an important breeding site for the species in southern Brazil.
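A small sketch of how the reported growth curve can be inverted to estimate chick age from body mass is given below; the grouping of the exponent (read as a standard von Bertalanffy form, since the printed version lost its parentheses) and the units assumed for t and Lm are our reconstruction, not stated in the abstract.

```python
import numpy as np

# Growth curve from the abstract, read as a von Bertalanffy-type form:
#   Lm = 167.91 * (1 - exp(-0.062 * (t - (-0.23))))
# Units (t in days, Lm in grams) are assumed here, not stated in the abstract.
L_INF, K, T0 = 167.91, 0.062, -0.23

def mass_at_age(t):
    """Forward model: body mass as a function of age."""
    return L_INF * (1.0 - np.exp(-K * (t - T0)))

def age_from_mass(Lm):
    """Analytic inverse of the growth curve; only valid for 0 <= Lm < L_INF."""
    Lm = np.asarray(Lm, dtype=float)
    return T0 - np.log(1.0 - Lm / L_INF) / K

print(age_from_mass([50.0, 100.0, 150.0]))   # approximate ages for three body masses
```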
Abstract:
ECER 2015 "Education and Transition - Contributions from Educational Research", Corvinus University of Budapest, 7-11 September 2015.
Abstract:
Nanotechnology is an important emerging industry with a projected annual market of around one trillion dollars by 2015. It involves the control of atoms and molecules to create new materials with a variety of useful functions. Although there are advantages to the utilization of these nano-scale materials, questions related to their impact on the environment and human health must be addressed too, so that potential risks can be limited at early stages of development. At this time, the occupational health risks associated with the manufacturing and use of nanoparticles are not yet clearly understood. However, workers may be exposed to nanoparticles through inhalation at levels that can greatly exceed ambient concentrations. Current workplace exposure limits are based on particle mass, but this criterion may not be adequate in this case, as nanoparticles are characterized by a very large surface area, which has been pointed out as the distinctive characteristic that could even turn an inert substance into one exhibiting very different interactions with biological fluids and cells. Therefore, it seems that assessing human exposure based on the mass concentration of particles, which is widely adopted for particles over 1 μm, would not work in this particular case. In fact, nanoparticles have far more surface area than the equivalent mass of larger particles, which increases the chance that they may react with body tissues. Thus, it has been claimed that surface area should be used for nanoparticle exposure and dosing. As a result, assessing exposure based on the measurement of particle surface area is of increasing interest. It is well known that lung deposition is the most efficient way for airborne particles to enter the body and cause adverse health effects. If nanoparticles can deposit in the lung and remain there, have an active surface chemistry and interact with the body, then there is potential for exposure. It has been shown that surface area plays an important role in the toxicity of nanoparticles and is the metric that best correlates with particle-induced adverse health effects. The potential for adverse health effects seems to be directly proportional to particle surface area. The objective of this study is to identify and validate methods and tools for measuring nanoparticles during the production, manipulation and use of nanomaterials.
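To make the surface-area argument concrete, the short sketch below computes the specific surface area of spherical particles of a few diameters using the geometric relation S/m = 6/(ρd); the density value is an arbitrary assumption chosen only for illustration.

```python
import numpy as np

# For spherical particles of density rho, the surface-area-to-mass ratio is
#   S/m = (pi * d**2) / (rho * pi * d**3 / 6) = 6 / (rho * d),
# so at equal mass the total surface area grows as 1/d as particles shrink.
rho = 2200.0                                   # kg/m^3, an oxide-like density (assumed)
diameters = np.array([1e-6, 100e-9, 10e-9])    # 1 um, 100 nm, 10 nm

specific_surface = 6.0 / (rho * diameters)     # m^2 of surface per kg of material
for d, s in zip(diameters, specific_surface):
    print(f"d = {d * 1e9:7.1f} nm  ->  {s:10.1f} m^2/kg")
```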
Abstract:
Over the last two decades, research and development of legged locomotion robots has grown steadily. Legged systems present major advantages when compared with 'traditional' vehicles, because they allow locomotion in terrain that is inaccessible to vehicles with wheels and tracks. However, the robustness of legged robots, and especially their energy consumption, among other aspects, still lag behind mechanisms that use wheels and tracks. Therefore, in the present state of development, there are several aspects that need to be improved and optimized. Keeping these ideas in mind, this paper presents a review of the literature on the different methods adopted for the optimization of the structure and locomotion gaits of walking robots. Among the distinct strategies often used for these tasks are approaches such as the mimicking of biological animals, the use of evolutionary schemes to find optimal parameters and structures, the adoption of sound mechanical design rules, and the optimization of power-based indexes.
Abstract:
This paper suggests that the thought of the North-American critical theorist James W. Carey provides a relevant perspective on communication and technology. Having as background American social pragmatism and progressive thinkers of the beginning of the 20th century (such as Dewey, Mead, Cooley, and Park), Carey built a perspective that brought together the political economy of Harold A. Innis and the social criticism of David Riesman and Charles W. Mills, and incorporated Marxist topics such as commodification and sociocultural domination. The main goal of this paper is to explore the connection established by Carey between modern technological communication and what he called the "transmissive model", a model which not only reduces the symbolic process of communication to instrumentalization and to information delivery, but also politically converges with capitalism as well as with power, control and expansionist goals. Conceiving communication as a process that creates symbolic and cultural systems, in which and through which social life takes place, Carey gives equal emphasis to the incorporation processes of communication. If symbolic forms and culture are ways of conditioning action, they are also influenced by technological and economic materializations of symbolic systems and by other conditioning structures. In Carey's view, communication is never a disembodied force; rather, it is a set of practices in which conceptions, techniques and social relations co-exist. These practices configure reality or, alternatively, can refute, transform and celebrate it. Exhibiting a sensitivity favourable to the historical understanding of communication, media and information technologies, one of the issues Carey explored most was the history of the telegraph as a harbinger of the Internet, of its problems and contradictions. For Carey, the Internet was the contemporary heir of the communications revolution triggered by the prototype of transmission technologies, the telegraph, in the 19th century. In the telegraph, Carey saw the prototype of many subsequent commercial empires based on science and technology; a pioneer model for complex business management; an example of conflicts of interest over the control of patents; an inducer of changes both in language and in structures of knowledge; and a promoter of futurist and utopian thinking about information technologies. After a brief approach to Carey's communication theory, this paper focuses on his seminal essay "Technology and Ideology. The Case of the Telegraph", bearing in mind the prospect of the communication revolution introduced by the Internet. We maintain that this essay has seminal relevance for critically studying the information society. Our reading of it highlights the reach, as well as the problems, of an approach which conceives the innovation of the telegraph as a metaphor for all innovations, announcing the modern stage of history and determining to this day the major lines of development in modern communication systems.
Abstract:
In the history of modern communication, after the development of the printing press, the telegraph unleashed a revolution in communications. Today, the Internet is in many ways its heir. Reflections on the telegraph may open up perspectives concerning tendencies, possibilities and pitfalls of the Internet. The telegraph has been well explored in important literature on communication and media, which tends to emphasize the history of this technology, its social context and its institutional meaning (e.g. Robert L. Thompson, 1947; Tom Standage, 2007 [1998]). James W. Carey, the mentor of North-American critical cultural studies, suggests a distinctive approach in his essay "Technology and Ideology. The Case of the Telegraph" (2009 [1983]). In the telegraph, Carey sees the prototype of many subsequent commercial empires based on science and technology; a pioneer model for complex business management; an example of conflicts of interest over the control of patents; an inducer of changes both in language and in structures of knowledge; and a promoter of futurist and utopian thinking about information technologies. Having in mind the revolution in communications promoted by the Internet, this paper revisits this seminal essay to explore its great attainment, as well as the problems of this kind of approach, which conceives the innovation of the telegraph as a metaphor for all innovations, announcing the modern stage of history and still determining today the major lines of development in modern communication systems.
Abstract:
BACKGROUND: Examining changes in brain activation linked with emotion-inducing stimuli is essential to the study of emotions. Given the ecological potential of techniques such as virtual reality (VR), it is important to inspect whether brain activation in response to emotional stimuli can be modulated by the three-dimensional (3D) properties of the images. OBJECTIVE: The current study sought to test whether the activation of brain areas involved in the emotional processing of scenarios of different valences can be modulated by 3D. Therefore, the focus was on the interaction effect between emotion-inducing stimuli of different emotional valences (pleasant, unpleasant and neutral) and visualization types (2D, 3D); main effects were also analyzed. METHODS: The effects of emotional valence and visualization type, and their interaction, were analyzed through a 3x2 repeated-measures ANOVA. Post-hoc t-tests were performed under a ROI-analysis approach. RESULTS: The results show increased brain activation for the 3D affective-inducing stimuli in comparison with the same stimuli in 2D scenarios, mostly in cortical and subcortical regions related to emotional processing, in addition to visual processing regions. CONCLUSIONS: This study has the potential to clarify brain mechanisms involved in the processing of emotional stimuli (scenarios' valence) and their interaction with three-dimensionality.
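As an illustration of the 3x2 repeated-measures design described above, the sketch below runs such an ANOVA on synthetic ROI values with statsmodels; the subject count, factor labels and effect sizes are invented for the example and do not reproduce the study's data.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Synthetic ROI values standing in for the study's activation estimates: one value
# per subject for each valence (pleasant/neutral/unpleasant) x viewing (2D/3D) cell.
rng = np.random.default_rng(0)
rows = []
for subject in range(1, 21):
    for valence in ("pleasant", "neutral", "unpleasant"):
        for viewing in ("2D", "3D"):
            boost = 0.3 if (viewing == "3D" and valence != "neutral") else 0.0
            rows.append({"subject": subject, "valence": valence, "viewing": viewing,
                         "roi_activation": rng.normal(loc=1.0 + boost, scale=0.5)})
df = pd.DataFrame(rows)

# 3 (valence) x 2 (viewing) repeated-measures ANOVA, mirroring the described design.
res = AnovaRM(df, depvar="roi_activation", subject="subject",
              within=["valence", "viewing"]).fit()
print(res.summary())
```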
Abstract:
It has been shown that, in reality, at least two general scenarios of data structuring are possible: (a) a self-similar (SS) scenario, in which the measured data form an SS structure, and (b) a quasi-periodic (QP) scenario, in which the repeated (strongly correlated) data form random sequences that are almost periodic with respect to each other. In the second case it becomes possible to describe their behavior and to express a part of their randomness quantitatively in terms of the deterministic amplitude–frequency response belonging to the generalized Prony spectrum. This possibility allows us to re-examine the conventional concept of measurements and opens a new way for the description of a wide set of different data. In particular, it concerns complex systems for which a 'best-fit' model pretending to describe the measured data is absent, but where there is a pressing need to describe these data in terms of a reduced number of quantitative parameters. The possibilities of the proposed approach and of the detection algorithm for QP processes were demonstrated on actual data: spectroscopic data recorded for pure water and acoustic data for a test hole. The suggested methodology allows revising the accepted classification of different incommensurable and self-affine spatial structures and finding an accurate interpretation of generalized Prony spectroscopy, which includes Fourier spectroscopy as a special case.
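For orientation, the sketch below implements the classical (textbook) Prony decomposition, fitting a signal as a sum of damped complex exponentials; it is not the generalized Prony spectrum of the abstract, and the model order, test signal and sampling step are arbitrary choices made for the demonstration.

```python
import numpy as np

def prony(x, p):
    """Classical Prony fit: x[n] ~ sum_k A_k * z_k**n, with p complex modes."""
    N = len(x)
    # 1) Linear prediction: x[n] = -sum_{j=1..p} a_j * x[n-j], solved by least squares.
    H = np.column_stack([x[p - j:N - j] for j in range(1, p + 1)])
    a = np.linalg.lstsq(H, -x[p:N], rcond=None)[0]
    # 2) Roots of the characteristic polynomial give the complex modes z_k.
    z = np.roots(np.r_[1.0, a])
    # 3) Complex amplitudes from a Vandermonde least-squares fit.
    V = np.vander(z, N, increasing=True).T
    A = np.linalg.lstsq(V, x.astype(complex), rcond=None)[0]
    return A, z

# Demo signal: two damped cosines plus a little noise (arbitrary test case).
dt = 0.01
t = np.arange(0, 2, dt)
x = (1.0 * np.exp(-0.5 * t) * np.cos(2 * np.pi * 5 * t)
     + 0.5 * np.exp(-0.2 * t) * np.cos(2 * np.pi * 12 * t)
     + 0.01 * np.random.default_rng(2).standard_normal(t.size))

A, z = prony(x, p=4)                      # two real components -> four complex modes
freqs = np.angle(z) / (2 * np.pi * dt)    # recovered frequencies (Hz)
decay = np.log(np.abs(z)) / dt            # recovered decay rates (1/s, negative = damped)
print(np.sort(np.abs(freqs)), decay)
```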