953 results for What-if Analysis
Abstract:
The purpose of this study was to investigate whether experimental alloxanic diabetes could cause qualitative changes in intestinal anastomoses of the terminal ileum and distal colon in rats, as compared to controls. After 3 months of follow-up, 192 male Wistar rats weighing approximately 300 g were split into four experimental groups of 48 animals each: a control group with ileum anastomoses (G1), a control group with colon anastomoses (G2), a diabetic group with ileum anastomoses (G3) and a diabetic group with colon anastomoses (G4). Animals were evaluated and sacrificed on days 4, 14, 21 and 30 after surgery, and fragments of the small and large intestine where the anastomoses had been performed were removed. Samples from 6 animals at each sacrifice time point were subjected to ultrastructural analysis of the collagen fibers using a scanning electron microscope, and samples from another 6 animals were subjected to histopathology and optical microscopy studies using picrosirius red staining. Histopathological analysis of picrosirius red-stained anastomosis slides under an optical microscope at 40x magnification showed that the distribution of collagen fibers was disarranged and also revealed a delay in scar tissue retraction. The morphometric study revealed differences in the collagen-filled area for the ileum anastomoses 14 days post surgery, whereas, in the case of colon anastomoses, differences were observed at days 4 and 30 post surgery, with higher values in the diabetic animals. Ultrastructural analysis of the ileum and colon anastomoses using a scanning electron microscope revealed fewer wide collagen fibers, the presence of narrower fibers and a disarranged distribution of the collagen fibers. We conclude that diabetes caused qualitative changes in scar tissue as well as in the structural arrangement of collagen fibers, which could explain the reduced wound strength in the anastomoses of diabetic animals. © J. A. Barth Verlag in Georg Thieme Verlag KG Stuttgart.
Abstract:
"The problems that exist in the world today cannot be solved by the level of thinking we were at when we created them." - Albert Einstein What is higher education? It is a key tool to our gaining the level of thinking Einstein describes - the level of thinking needed to solve the problems that exist in the world today. I am not saying higher education is the only thing that prepares people to think - far from it. But higher education, if it does what it is meant to do, prepares us with a solid base of skills to think critically and analytically to deal successfully with an ever-changing world. It instills in us the desire and ability to be lifelong learners, able to grow and participate as members of both local and global communities. Learning skills, critical analysis skills, skills that allow us to deal with and be successful in the ever-changing world around us are the skills higher education must provide.
Abstract:
We investigated the seasonal patterns of Amazonian forest photosynthetic activity, and the effects thereon of variations in climate and land use, by integrating data from a network of ground-based eddy flux towers in Brazil established as part of the 'Large-Scale Biosphere Atmosphere Experiment in Amazonia' project. We found that the degree of water limitation, as indicated by the seasonality of the ratio of sensible to latent heat flux (Bowen ratio), predicts seasonal patterns of photosynthesis. In equatorial Amazonian forests (5°N–5°S), water limitation is absent, and photosynthetic fluxes (or gross ecosystem productivity, GEP) exhibit high or increasing levels of photosynthetic activity as the dry season progresses, likely a consequence of allocation to growth of new leaves. In contrast, forests along the southern flank of the Amazon, pastures converted from forest, and mixed forest-grass savanna exhibit dry-season declines in GEP, consistent with increasing degrees of water limitation. Although previous work showed tropical ecosystem evapotranspiration (ET) is driven by incoming radiation, the GEP observations reported here surprisingly show no or negative relationships with photosynthetically active radiation (PAR). Instead, GEP fluxes largely followed the phenology of canopy photosynthetic capacity (Pc), with deviations from this primary pattern driven by variations in PAR. Estimates of leaf flush at three
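The water-limitation diagnostic used above, the Bowen ratio, is simply the ratio of sensible to latent heat flux. A minimal sketch, assuming hypothetical monthly flux-tower means (the variable names, values, and the 0.5 flag threshold are illustrative, not from the study):

```python
# Bowen ratio B = H / LE from monthly flux-tower means.
# All numbers below are invented for illustration only.

def bowen_ratio(sensible_w_m2, latent_w_m2):
    """Ratio of sensible to latent heat flux; higher values suggest stronger water limitation."""
    if latent_w_m2 == 0:
        raise ValueError("latent heat flux must be nonzero")
    return sensible_w_m2 / latent_w_m2

# Illustrative dry-season monthly means in W m^-2:
months = ["Jun", "Jul", "Aug", "Sep"]
H  = [25.0, 30.0, 45.0, 60.0]    # sensible heat flux
LE = [110.0, 100.0, 80.0, 60.0]  # latent heat flux

for m, h, le in zip(months, H, LE):
    b = bowen_ratio(h, le)
    flag = "water-limited?" if b > 0.5 else "energy-limited"
    print(f"{m}: B = {b:.2f} ({flag})")
```

A rising Bowen ratio through the dry season, as in the last two invented months, is the kind of seasonality the abstract associates with dry-season GEP declines.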
Abstract:
The hydrologic risk (and the closely related hydro-geologic risk) is, and has always been, a very relevant issue, due to the severe consequences that flooding, or water in general, may provoke in terms of human and economic losses. Floods are natural phenomena, often catastrophic, and cannot be avoided, but their damages can be reduced if they are predicted sufficiently in advance. For this reason, flood forecasting plays an essential role in hydro-geological and hydrological risk prevention. Thanks to the development of sophisticated meteorological, hydrologic and hydraulic models, flood forecasting has made significant progress in recent decades; nonetheless, models are imperfect, which means that we are still left with a residual uncertainty about what will actually happen. This type of uncertainty is what is discussed and analyzed in this thesis. In operational problems, it is possible to affirm that the ultimate aim of forecasting systems is not to reproduce the river's behavior; that is only a means of reducing the uncertainty associated with what will happen as a consequence of a precipitation event. In other words, the main objective is to assess whether or not preventive interventions should be adopted and which operational strategy may represent the best option. The main problem for a decision maker is to interpret model results and translate them into an effective intervention strategy. To make this possible, it is necessary to clearly define what is meant by uncertainty, since in the literature there is often confusion on this issue. Therefore, the first objective of this thesis is to clarify this concept, starting with a key question: should the choice of the intervention strategy be based on evaluating the model prediction by its ability to represent reality, or on evaluating what will actually happen on the basis of the information given by the model forecast?
Once the previous idea is made unambiguous, the other main concern of this work is to develop a tool that can provide effective decision support, making it possible to perform objective and realistic risk evaluations. In particular, such a tool should be able to provide an uncertainty assessment that is as accurate as possible. This means primarily three things: it must correctly combine all the available deterministic forecasts, it must assess the probability distribution of the predicted quantity, and it must quantify the flooding probability. Furthermore, given that the time to implement prevention strategies is often limited, the flooding probability has to be linked to the time of occurrence. For this reason, it is necessary to quantify the flooding probability within a time horizon related to that required to implement the intervention strategy, and it is also necessary to assess the probability of the flooding time.
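One common way to obtain a flooding probability within a time horizon is to count, over an ensemble of model trajectories, the members that exceed a threshold before the horizon. A minimal sketch, not the thesis's actual method; the ensemble, the 4.5 m threshold, and the 12-step horizon are all invented for illustration:

```python
# Ensemble-based estimate of P(flooding within a lead-time horizon).
# Trajectories are synthetic: a rising trend plus Gaussian noise.
import random

random.seed(42)

FLOOD_THRESHOLD_M = 4.5  # assumed flooding level (illustrative)
HORIZON_STEPS = 12       # lead-time steps available to act (illustrative)

# Fake ensemble: 200 trajectories of hourly river levels in metres.
ensemble = [
    [3.0 + 0.15 * t + random.gauss(0, 0.4) for t in range(HORIZON_STEPS)]
    for _ in range(200)
]

# P(flood within horizon) = fraction of members exceeding the threshold
# at any step up to the horizon.
exceeding = [any(level > FLOOD_THRESHOLD_M for level in member) for member in ensemble]
p_flood = sum(exceeding) / len(ensemble)

# Empirical distribution of the first exceedance time, for members that exceed:
first_times = [
    next(t for t, level in enumerate(member) if level > FLOOD_THRESHOLD_M)
    for member, hit in zip(ensemble, exceeding) if hit
]
print(f"P(flood within {HORIZON_STEPS} h) = {p_flood:.2f}")
```

The histogram of `first_times` plays the role of the flooding-time distribution the abstract mentions: it tells the decision maker not only whether flooding is likely, but how soon.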
Abstract:
In this dissertation the pyrolytic conversion of biomass into chemicals and fuels was investigated from the analytical point of view. The study focused on the liquid (bio-oil) and solid (char) fractions obtainable from biomass pyrolysis. The drawbacks of Py-GC-MS described so far were partially solved by coupling different analytical configurations (Py-GC-MS, Py-GC-MIP-AED, and off-line Py-SPE and Py-SPME-GC-MS with derivatization procedures). The application of different techniques allowed a satisfactory comparative analysis of the pyrolysis products of different biomasses and a high-throughput screening of the effect of 33 catalysts on biomass pyrolysis. The screening showed that the most interesting catalysts were those containing copper (able to reduce the high molecular weight fraction of bio-oil without a large yield decrease) and H-ZSM-5 (able to entirely convert the bio-oil into "gasoline-like" aromatic products). In order to establish the content of noxious compounds in the liquid product, a clean-up step was included in the Py-SPE procedure. This made it possible to investigate the generation of pollutants (PAHs) from pyrolysis and catalytic pyrolysis of biomass. In fact, bio-oil from non-catalytic pyrolysis of biomass showed a moderate PAH content, while the use of the H-ZSM-5 catalyst for bio-oil upgrading produced an astonishingly high level of PAHs (compared to what is observed in alkane cracking), indicating an important concern in substituting fossil fuels with biomass-derived bio-oil. Moreover, the analytical procedures developed in this thesis were directly applied to a detailed study of the most useful process scheme and upgrading route to chemical intermediates (anhydrosugars), transportation fuels or commodity chemicals (aromatic hydrocarbons). In the applied study, poplar and microalgae biomass were investigated, and an overall GHG balance of the pyrolysis of agricultural residues in Ravenna province was performed.
Special attention was paid to comparing the effects of different uses of bio-char (as fuel or as soil conditioner) on soil health and GHG emissions.
Abstract:
Animal neocentromeres are defined as ectopic centromeres that have formed in non-centromeric locations and lack some of the features, such as satellite DNA sequences, that normally characterize canonical centromeres. Despite this, they are stable, functional centromeres inherited through generations. The very existence of neocentromeres provides convincing evidence that centromere specification is determined by epigenetic rather than sequence-specific mechanisms. For all these reasons, we used them as simplified models to investigate the molecular mechanisms that underlie the formation and maintenance of functional centromeres. We collected human cell lines carrying neocentromeres in different positions. To investigate the regions involved in the process at the DNA sequence level, we applied a recent technology that integrates Chromatin Immuno-Precipitation and DNA microarrays (ChIP-on-chip), using rabbit polyclonal antibodies directed against the human centromeric proteins CENP-A and CENP-C. These DNA-binding proteins are required for kinetochore function and are exclusively targeted to functional centromeres. Thus, the immunoprecipitation of DNA bound by these proteins allows the isolation of centromeric sequences, including those of neocentromeres. Neocentromeres arise even in protein-coding gene regions. We further analyzed whether the increased scaffold attachment sites, and the correspondingly tighter chromatin, of the region involved in the neocentromerization process were still permissive to transcription of the genes encoded within it. Centromere repositioning is a phenomenon in which a neocentromere that arose without altering the gene order, followed by inactivation of the canonical centromere, becomes fixed in a population. It is a process of chromosome rearrangement fundamental in evolution, at the basis of speciation.
The repeat-free region where the neocentromere initially forms progressively acquires extended arrays of satellite tandem repeats that may contribute to its functional stability. In this light, our attention focused on the repositioned horse ECA11 centromere. ChIP-on-chip analysis was used to define the region involved, and studies of SNPs mapping within the region involved in neocentromerization were carried out. We were able to describe the structural polymorphism of the chromosome 11 centromeric domain in the Equus caballus population. This polymorphism was seen even between homologous chromosomes of the same cells, a discovery that had never been described before. Genomic plasticity has had a fundamental role in evolution. Centromeres are not static, packaged regions of genomes. The key question that fascinates biologists is to understand how this centromere plasticity can be reconciled with the stability and maintenance of centromeric function. Starting from the epigenetic point of view that underlies centromere formation, we decided to analyze the RNA content of centromeric chromatin. RNA, as well as secondary chemical modifications involving both histones and DNA, represents a good candidate to somehow guide centromere formation and maintenance. Many observations suggest that transcription of centromeric DNA or of other non-coding RNAs could affect centromere formation. To date, there has been no thorough investigation addressing the identity of chromatin-associated RNAs (CARs) on a global scale. This prompted us to develop techniques to identify CARs in a genome-wide approach using high-throughput genomic platforms. The future goal of this study will be to focus attention on what happens specifically inside centromeric chromatin.
Abstract:
Over time, Twitter has become a fundamental source of information for news. As a step forward, researchers have tried to analyse whether tweets contain predictive power. In the past, in the financial field, a lot of research has been done to propose a function which takes as input all the tweets for a particular stock or index s, analyses them, and predicts the price of s. In this work, we take an alternative approach: using stock price and tweet information, we investigate the following questions. 1. Is there any relation between the amount of tweets being generated and the stocks being exchanged? 2. Is there any relation between the sentiment of the tweets and stock prices? 3. What is the structure of the graph that describes the relationships between users?
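Question 1 above reduces to measuring the association between two aligned daily series: tweet volume and traded volume. A minimal sketch under invented data (in practice the series would come from the Twitter API and a market data feed; the numbers below are synthetic):

```python
# Pearson correlation between daily tweet counts and daily traded share
# volume for one stock. Data is entirely synthetic and illustrative.
from statistics import mean

def pearson(x, y):
    """Plain Pearson correlation coefficient of two equal-length series."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Illustrative daily series, aligned by date:
tweet_counts  = [120, 340, 150, 90, 510, 280, 200]
shares_traded = [1.1e6, 2.9e6, 1.4e6, 0.8e6, 4.2e6, 2.3e6, 1.9e6]

r = pearson(tweet_counts, shares_traded)
print(f"tweet volume vs. traded volume: r = {r:.2f}")
```

A value of r near +1 would support a relation between tweet activity and trading activity; the same machinery applies to question 2 once each day's tweets are reduced to a sentiment score.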
Abstract:
BACKGROUND: Mechanical pain sensitivity is assessed in every patient with pain, either by palpation or by quantitative pressure algometry. Despite widespread use, no studies have formally addressed the usefulness of this practice for the identification of the source of pain. We tested the hypothesis that assessing mechanical pain sensitivity distinguishes damaged from healthy cervical zygapophysial (facet) joints. METHODS: Thirty-three patients with chronic unilateral neck pain were studied. Pressure pain thresholds (PPTs) were assessed bilaterally at all cervical zygapophysial joints. The diagnosis of zygapophysial joint pain was made by selective nerve blocks. Primary analysis was the comparison of the PPT between symptomatic and contralateral asymptomatic joints. The secondary end points were as follows: differences in PPT between affected and asymptomatic joints of the same side of patients with zygapophysial joint pain; differences in PPT at the painful side between patients with and without zygapophysial joint pain; and sensitivity and specificity of PPT for 2 different cutoffs (difference in PPT between affected and contralateral side by 1 and 30 kPa, meaning that the test was considered positive if the difference in PPT between painful and contralateral side was negative by at least 1 and 30 kPa, respectively). The PPT of patients was also compared with the PPT of 12 pain-free subjects. RESULTS: Zygapophysial joint pain was present in 14 patients. In these cases, the difference in mean PPT between affected and contralateral side (primary analysis) was −6.2 kPa (95% confidence interval: −19.5 to 7.2, P = 0.34). In addition, the secondary analyses yielded no statistically significant differences. For the cutoff of 1 kPa, sensitivity and specificity of PPT were 67% and 16%, respectively, resulting in a positive likelihood ratio of 0.79 and a diagnostic confidence of 38%. 
When the cutoff of 30 kPa was considered, the sensitivity decreased to only 13%, whereas the specificity increased to 95%, resulting in a positive likelihood ratio of 2.53 and a diagnostic confidence of 67%. The PPT was significantly lower in patients than in pain-free subjects (P < 0.001). CONCLUSIONS: Assessing mechanical pain sensitivity is not diagnostic for cervical zygapophysial joint pain. The finding should stimulate further research into a diagnostic tool that is widely used in the clinical examination of patients with pain.
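The likelihood ratios and diagnostic confidences above follow from the standard formulas LR+ = sensitivity / (1 − specificity) and post-test probability via pre-test odds. A worked check using the rounded figures reported in the abstract (small differences from the reported 2.53 and 67% arise because the study presumably used unrounded data):

```python
# Worked example: positive likelihood ratio and post-test probability
# ("diagnostic confidence") from sensitivity, specificity, and prevalence.
# Inputs are the rounded figures from the abstract.

def positive_lr(sensitivity, specificity):
    """LR+ = sensitivity / (1 - specificity)."""
    return sensitivity / (1.0 - specificity)

def post_test_probability(prevalence, lr):
    """Convert pre-test probability to post-test probability via odds."""
    pre_odds = prevalence / (1.0 - prevalence)
    post_odds = pre_odds * lr
    return post_odds / (1.0 + post_odds)

prevalence = 14 / 33  # 14 of 33 patients had zygapophysial joint pain

# 1 kPa cutoff: sensitivity 67%, specificity 16%
lr_1 = positive_lr(0.67, 0.16)                  # ~0.80 (abstract reports 0.79)
p_1 = post_test_probability(prevalence, lr_1)   # ~0.37 (abstract: 38%)

# 30 kPa cutoff: sensitivity 13%, specificity 95%
lr_30 = positive_lr(0.13, 0.95)                 # ~2.6 (abstract reports 2.53)
p_30 = post_test_probability(prevalence, lr_30) # ~0.66 (abstract: 67%)

print(f"1 kPa:  LR+ = {lr_1:.2f}, post-test p = {p_1:.2f}")
print(f"30 kPa: LR+ = {lr_30:.2f}, post-test p = {p_30:.2f}")
```

An LR+ below 1, as at the 1 kPa cutoff, means a positive test actually lowers the probability of zygapophysial joint pain relative to the pre-test prevalence, which is why the authors conclude the test is not diagnostic.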
Abstract:
In this dissertation, the National Survey of Student Engagement (NSSE) serves as a nodal point through which to examine the power relations shaping the direction and practices of higher education in the twenty-first century. Theoretically, my analysis is informed by Foucault’s concept of governmentality, briefly defined as a technology of power that influences or shapes behavior from a distance. This form of governance operates through apparatuses of security, which include higher education. Foucault identified three essential characteristics of an apparatus—the market, the milieu, and the processes of normalization—through which administrative mechanisms and practices operate and govern populations. In this project, my primary focus is on the governance of faculty and administrators, as a population, at residential colleges and universities. I argue that the existing milieu of accountability is one dominated by the neoliberal assumption that all activity—including higher education—works best when governed by market forces alone, reducing higher education to a market-mediated private good. Under these conditions, what many in the academy believe is an essential purpose of higher education—to educate students broadly, to contribute knowledge for the public good, and to serve as society’s critic and social conscience (Washburn 227)—is being eroded. Although NSSE emerged as a form of resistance to commercial college rankings, it did not challenge the forces that empowered the rankings in the first place. Indeed, NSSE data are now being used to make institutions even more responsive to market forces. Furthermore, NSSE’s use has a normalizing effect that tends to homogenize classroom practices and erode the autonomy of faculty in the educational process. It also positions students as part of the system of surveillance. 
In the end, if aspects of higher education that are essential to maintaining a civil society are left to be defined solely in market terms, the result may be a less vibrant and, ultimately, a less just society.
Abstract:
The work described in this thesis had two objectives. The first objective was to develop a physically based computational model that could be used to predict the electronic conductivity, Seebeck coefficient, and thermal conductivity of Pb1-xSnxTe alloys over the 400 K to 700 K temperature range as a function of Sn content and doping level. The second objective was to determine how the secondary phase inclusions observed in Pb1-xSnxTe alloys made by consolidating mechanically alloyed elemental powders impact the ability of the material to harvest waste heat and generate electricity in the 400 K to 700 K temperature range. The motivation for this work was that, though the promise of this alloy as an unusually efficient thermoelectric power generator material in the 400 K to 700 K range had been demonstrated in the literature, methods to reproducibly control and subsequently optimize the material's thermoelectric figure of merit remain elusive. Mechanical alloying, though not typically used to fabricate these alloys, is a potential method for cost-effectively engineering these properties. Given that there are deviations from crystalline perfection in mechanically alloyed material, such as secondary phase inclusions, the question arises as to whether these defects are detrimental to thermoelectric function or, alternatively, whether they enhance the thermoelectric function of the alloy. The hypothesis formed at the onset of this work was that the small secondary phase SnO2 inclusions observed to be present in the mechanically alloyed Pb1-xSnxTe would increase the thermoelectric figure of merit of the material over the temperature range of interest. It was proposed that the increase in the figure of merit would arise because the inclusions in the material would not reduce the electrical conductivity to as great an extent as the thermal conductivity.
If this were true, then the experimentally measured electronic conductivity in mechanically alloyed Pb1-xSnxTe alloys that have these inclusions would not be less than that expected in alloys without these inclusions, while the portion of the thermal conductivity that is not due to charge carriers (the lattice thermal conductivity) would be less than what would be expected from alloys that do not have these inclusions. Furthermore, it would be possible to approximate the observed changes in the electrical and thermal transport properties using existing physical models for the scattering of electrons and phonons by small inclusions. The approach taken to investigate this hypothesis was to first experimentally characterize the mobile carrier concentration at room temperature, along with the extent and type of secondary phase inclusions present, in a series of three mechanically alloyed Pb1-xSnxTe alloys with different Sn content. Second, the physically based computational model was developed. This model was used to determine what the electronic conductivity, Seebeck coefficient, total thermal conductivity, and the portion of the thermal conductivity not due to mobile charge carriers would be in these particular Pb1-xSnxTe alloys if there were no secondary phase inclusions. Third, the electronic conductivity, Seebeck coefficient and total thermal conductivity were experimentally measured for these three alloys, with inclusions present, at elevated temperatures. The model predictions for electrical conductivity and Seebeck coefficient were directly compared to the experimental elevated-temperature electrical transport measurements. The computational model was then used to extract the lattice thermal conductivity from the experimentally measured total thermal conductivity. This lattice thermal conductivity was then compared to what would be expected from the alloys in the absence of secondary phase inclusions.
Secondary phase inclusions were determined by X-ray diffraction analysis to be present in all three alloys to a varying extent. The inclusions were found not to significantly degrade electrical conductivity at temperatures above ~400 K in these alloys, though they do dramatically impact electronic mobility at room temperature. It is shown that, at temperatures above ~400 K, electrons are scattered predominantly by optical and acoustical phonons rather than by an alloy scattering mechanism or the inclusions. The experimental electrical conductivity and Seebeck coefficient data at elevated temperatures were found to be within ~10% of what would be expected for material without inclusions. The inclusions were not found to reduce the lattice thermal conductivity at elevated temperatures. The experimentally measured thermal conductivity data were found to be consistent with the lattice thermal conductivity that would arise from two scattering processes: phonon-phonon scattering (Umklapp scattering) and the scattering of phonons by the disorder induced by the formation of a PbTe-SnTe solid solution (alloy scattering). As opposed to the case in electrical transport, the alloy scattering mechanism in thermal transport is shown to be a significant contributor to the total thermal resistance. An estimation of the extent to which the mean free time between phonon scattering events would be reduced by the presence of the inclusions is consistent with the above analysis of the experimental data. The first important result of this work was the development of an experimentally validated, physically based computational model that can be used to predict the electronic conductivity, Seebeck coefficient, and thermal conductivity of Pb1-xSnxTe alloys over the 400 K to 700 K temperature range as a function of Sn content and doping level.
This model will be critical in future work as a tool to first determine the highest thermoelectric figure of merit one can expect from this alloy system at a given temperature and, second, to determine the optimum Sn content and doping level to achieve this figure of merit. The second important result of this work is the determination that the secondary phase inclusions observed to be present in the Pb1-xSnxTe made by mechanical alloying do not keep the material from having the same electrical and thermal transport that would be expected from "perfect" single crystal material at elevated temperatures. The analytical approach described in this work will be critical in future investigations to predict how changing the size, type, and volume fraction of secondary phase inclusions can be used to impact thermal and electrical transport in this materials system.
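The quantities the model predicts combine into the standard dimensionless figure of merit zT = S²σT/κ, and the lattice thermal conductivity the abstract discusses is commonly extracted by subtracting the electronic contribution estimated from the Wiedemann-Franz law. A minimal sketch of that arithmetic; the numerical values are illustrative assumptions loosely typical of a PbTe-class alloy, not measurements from this work:

```python
# zT = S^2 * sigma * T / kappa_total, and kappa_lattice = kappa_total - L * sigma * T
# (Wiedemann-Franz estimate of the electronic thermal conductivity).
# All numbers are illustrative, not data from the thesis.

LORENZ = 2.44e-8  # W Ohm K^-2, Sommerfeld value of the Lorenz number (assumed)

def figure_of_merit(seebeck_v_per_k, sigma_s_per_m, kappa_w_per_mk, temp_k):
    """Dimensionless thermoelectric figure of merit zT."""
    return seebeck_v_per_k**2 * sigma_s_per_m * temp_k / kappa_w_per_mk

def lattice_thermal_conductivity(kappa_total, sigma_s_per_m, temp_k, lorenz=LORENZ):
    """kappa_lattice = kappa_total - L * sigma * T."""
    return kappa_total - lorenz * sigma_s_per_m * temp_k

# Illustrative values at 600 K:
T = 600.0      # K
S = 220e-6     # Seebeck coefficient, V/K
sigma = 5.0e4  # electrical conductivity, S/m
kappa = 1.5    # total thermal conductivity, W/(m K)

zt = figure_of_merit(S, sigma, kappa, T)
kappa_lat = lattice_thermal_conductivity(kappa, sigma, T)
print(f"zT = {zt:.2f}, kappa_lattice = {kappa_lat:.2f} W/(m K)")
```

This subtraction is why inclusions that suppress κ_lattice more than σ would raise zT, which is the hypothesis the thesis set out to test.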