Abstract:
Summary: Ecotones are sensitive to change because they contain high numbers of species living at the margin of their environmental tolerance. This is equally true of tree-lines, which are determined by altitudinal or latitudinal temperature gradients. In the current context of climate change, they are expected to undergo modifications in position, tree biomass and possibly species composition. Altitudinal and latitudinal tree-lines differ mainly in the steepness of the underlying temperature gradient: distances are larger at latitudinal tree-lines, which could have an impact on the ability of tree species to migrate in response to climate change. Aside from temperature, tree-lines are also affected on a more local level by pressure from human activities. These are also changing as a consequence of modifications in our societies and may interact with the effects of climate change. Forest dynamics models are often used for climate change simulations because they are based on mechanistic processes. The spatially explicit model TreeMig was used as a base to develop a model specifically tuned for the northern European and Alpine tree-line ecotones. For the latter, a module for land-use change processes was also added. The temperature response parameters for the species in the model were first calibrated by means of tree-ring data from various species and sites at both tree-lines. This improved the growth response function in the model, but also led to the conclusion that regeneration is probably more important than growth in controlling tree-line position and species' distributions. The second step was to implement the module for abandonment of agricultural land in the Alps, based on an existing spatial statistical model. The sensitivity of its most important variables was tested and the model's performance compared to other modelling approaches. The probability that agricultural land would be abandoned was strongly influenced by the distance from the nearest forest and the slope, both of which are proxies for cultivation costs. When applied to a case study area, the resulting model, named TreeMig-LAb, gave the most realistic results. These were consistent with observed consequences of land abandonment, such as the expansion of the existing forest and the closing up of gaps. This new model was then applied in two case study areas, one in the Swiss Alps and one in Finnish Lapland, under a variety of climate change scenarios. These were based on forecasts of temperature change over the next century by the IPCC and the HadCM3 climate model (ΔT: +1.3, +3.5 and +5.6 °C) and included a post-change stabilisation period of 300 years. The results showed radical disruptions at both tree-lines. With the most conservative climate change scenario, species' distributions simply shifted, but it took several centuries to reach a new equilibrium. With the more extreme scenarios, some species disappeared from our study areas (e.g. Pinus cembra in the Alps) or dwindled to very low numbers, as they ran out of land into which they could migrate. The most striking result was the lag in the response of most species, independently of the climate change scenario or tree-line type considered. Finally, a statistical model of the effect of reindeer (Rangifer tarandus) browsing on the growth of Pinus sylvestris was developed, as a first step towards implementing human impacts at the boreal tree-line.
The expected effect was an indirect one, as reindeer deplete the ground lichen cover, which is thought to protect the trees against adverse climate conditions. The model showed a small but significant effect of browsing, but as the link with the underlying climate variables was unclear and the model was not spatial, it was not usable as such. Developing the TreeMig-LAb model made it possible to: a) establish a method for deriving species' parameters for the growth equation from tree-rings, b) highlight the importance of regeneration in determining tree-line position and species' distributions and c) improve the integration of social sciences into landscape modelling. Applying the model at the Alpine and northern European tree-lines under different climate change scenarios showed that with most forecasted levels of temperature increase, tree-lines would suffer major disruptions, with shifts in distributions and potential extinction of some tree-line species. However, these responses showed strong lags, so these effects would not become apparent for decades and could take centuries to stabilise.
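As a rough illustration of how a land-abandonment rule of the kind described above can be encoded, the sketch below implements a logistic probability that rises close to the forest edge and on steep slopes. The function name and coefficients are invented for illustration; they are not the calibrated values used in TreeMig-LAb.

```python
import numpy as np

def abandonment_probability(dist_to_forest_m, slope_deg,
                            b0=-2.0, b_dist=-0.004, b_slope=0.08):
    """Illustrative logistic rule: abandonment is more likely close to the
    forest edge and on steep slopes (both proxies for cultivation costs).
    Coefficients are placeholders, not the TreeMig-LAb parameters."""
    z = b0 + b_dist * dist_to_forest_m + b_slope * slope_deg
    return 1.0 / (1.0 + np.exp(-z))

# A steep parcel next to the forest vs. a flat parcel far from it
print(abandonment_probability(50, 25))    # ~0.45
print(abandonment_probability(800, 5))    # ~0.01
```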
Abstract:
Signaling through the Notch1 receptor is essential for the control of numerous developmental processes during embryonic life as well as in adult tissue homeostasis and disease. Since the outcome of Notch1 signaling is highly context-dependent, and its precise physiological and pathological role in many organs is unclear, it is of great interest to localize and identify the cells that receive active Notch1 signals in vivo. Here, we report the generation and characterization of a BAC-transgenic mouse line, N1-Gal4VP16, that when crossed to a Gal4-responsive reporter mouse line allowed the identification of cells undergoing active Notch1 signaling in vivo. Analysis of embryonic and adult N1-Gal4VP16 mice demonstrated that the activation pattern of the transgene coincides with previously observed activation patterns of the endogenous Notch1 receptor. Thus, this novel reporter mouse line provides a unique tool to specifically investigate the spatial and temporal aspects of Notch1 signaling in vivo.
Abstract:
Soil surface roughness increases water retention and infiltration, reduces runoff volume and speed, and influences soil losses by water erosion. Like other soil surface parameters, roughness is affected by the tillage system and rainfall volume. Based on these assumptions, the main purpose of this study was to evaluate the effect of tillage treatments on soil surface roughness (RR) and tortuosity (T) and to investigate their relationship with soil and water losses in a series of simulated rainfall events. The field study was carried out at the experimental station of the EMBRAPA Southeastern Cattle Research Center in São Carlos (Fazenda Canchim), São Paulo State, Brazil. Experimental plots of 33 m² were subjected to two tillage treatments in three replications: untilled (no-tillage) soil (NTS) and conventionally tilled (plowing plus double disking) soil (CTS). Three successive simulated rainfall tests were applied at 24 h intervals: a first rain of 30 mm/h, a second of 30 mm/h and a third of 70 mm/h. Immediately after tillage and after each rain simulation, surface roughness was measured with a laser profile meter. The tillage treatments induced significant changes in soil surface roughness and tortuosity, demonstrating the importance of the tillage system for the physical condition of the surface and favoring water retention and infiltration in the soil. The increase in surface roughness produced by tillage was considerably greater than its reduction by rainfall. Surface roughness and tortuosity had a greater influence on the volume of soil lost by surface runoff in the no-tillage treatment than in the conventional treatment. Possibly, other variables not analyzed in this study, such as soil type, declivity and slope length, influenced soil and water losses from the no-tillage treatment.
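A minimal sketch of how the two surface descriptors might be computed from a single laser-profile transect; the definitions used here (random roughness as the standard deviation of detrended heights, tortuosity as true profile length over projected length) are common ones and may differ in detail from those applied in the study.

```python
import numpy as np

def random_roughness(heights_mm):
    """Random roughness (RR): standard deviation of surface heights after
    removing the linear (slope) trend along the transect."""
    x = np.arange(len(heights_mm))
    detrended = heights_mm - np.polyval(np.polyfit(x, heights_mm, 1), x)
    return detrended.std(ddof=1)

def tortuosity(heights_mm, step_mm):
    """Tortuosity (T): true profile length divided by its projected
    horizontal length; T = 1 for a perfectly smooth surface."""
    dz = np.diff(heights_mm)
    true_length = np.sum(np.sqrt(step_mm**2 + dz**2))
    return true_length / (step_mm * len(dz))

# Toy transect: heights in mm sampled every 5 mm by the profile meter
profile = np.array([10.0, 12.5, 9.0, 15.0, 11.0, 8.5, 13.0, 10.5])
print(random_roughness(profile), tortuosity(profile, step_mm=5.0))
```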
Abstract:
We present computational approaches as alternatives to a recent microwave cavity experiment by S. Sridhar and A. Kudrolli [Phys. Rev. Lett. 72, 2175 (1994)] on isospectral cavities built from triangles. A straightforward proof of isospectrality is given, based on the mode-matching method. Our results show that the experiment is accurate to 0.3% for the first 25 states. The level statistics resemble those of a Gaussian orthogonal ensemble when the integrable part of the spectrum is removed.
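To make the comparison with GOE statistics concrete, the sketch below estimates the nearest-neighbour spacing distribution of a set of eigenvalues after a crude unfolding and places it next to the Wigner surmise P(s) = (π/2) s exp(−πs²/4). The toy spectrum is synthetic and only stands in for the measured cavity eigenfrequencies.

```python
import numpy as np

def spacing_statistics(levels, bins=20):
    """Nearest-neighbour spacing distribution after rescaling to unit mean
    spacing, together with the GOE (Wigner surmise) prediction."""
    s = np.diff(np.sort(levels))
    s = s / s.mean()                                  # crude unfolding
    hist, edges = np.histogram(s, bins=bins, range=(0, 4), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    wigner = 0.5 * np.pi * centers * np.exp(-np.pi * centers**2 / 4)
    return centers, hist, wigner

# Synthetic spectrum standing in for measured eigenfrequencies
rng = np.random.default_rng(1)
fake_levels = np.cumsum(rng.rayleigh(1.0, 500))
s, p_data, p_goe = spacing_statistics(fake_levels)
print(np.round(p_data[:5], 2), np.round(p_goe[:5], 2))
```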
Abstract:
An instrument designed to measure the thermal conductivity of consolidated rocks, dry or saturated, using a transient method is presented. The instrument measures relative values of thermal conductivity and needs calibration to obtain absolute values. The device can be used as a heat pulse line source or as a continuous heat line source. Two parameters for determining thermal conductivity are proposed: TMAX, in heat pulse line-source mode, and SLOPE, in continuous heat line-source mode. Performance is better, and operation simpler, in heat pulse line-source mode, with a measuring time of 170 s and a reproducibility better than 2.5%. Sample preparation is very simple in both modes. The performance has been tested with a set of ten rocks with thermal conductivity values between 1.4 and 5.2 W m⁻¹ K⁻¹, which covers the usual range for consolidated rocks.
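For the continuous heat line-source mode, the SLOPE parameter rests on the classical late-time relation T(t) ≈ (q/4πλ) ln t + const, so that λ = q/(4π·SLOPE). The sketch below applies that textbook relation to a synthetic record; the instrument described above reports relative values and relies on calibration rather than on the heater power directly.

```python
import numpy as np

def conductivity_from_slope(times_s, temps_K, q_W_per_m):
    """Classical line-source relation: at late times T ~ (q / 4*pi*lambda)*ln(t),
    so lambda = q / (4*pi*slope), where slope comes from fitting T against
    ln(t) and q is the heater power per unit length."""
    slope, _ = np.polyfit(np.log(times_s), temps_K, 1)
    return q_W_per_m / (4.0 * np.pi * slope)

# Synthetic check: a rock with lambda = 2.5 W/m/K heated at q = 10 W/m
t = np.linspace(30, 170, 50)
lam_true, q = 2.5, 10.0
T = q / (4 * np.pi * lam_true) * np.log(t) + 295.0
print(conductivity_from_slope(t, T, q))   # recovers ~2.5
```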
Abstract:
The intensity correlation functions C(t) for the colored-gain-noise model of dye lasers are analyzed and compared with those for the loss-noise model. For correlation times τ larger than the deterministic relaxation time td, we show with the use of the adiabatic approximation that C(t) values coincide for both models. For small correlation times we use a method that provides explicit expressions of non-Markovian correlation functions, approximating simultaneously short- and long-time behaviors. Comparison with numerical simulations shows excellent results simultaneously for short- and long-time regimes. It is found that, when the correlation time of the noise increases, differences between the gain- and loss-noise models tend to disappear. The decay of C(t) for both models can be described by a time scale that approaches the deterministic relaxation time. However, in contrast with the loss-noise model, a secondary time scale remains for large times for the gain-noise model, which could allow one to distinguish between both models.
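For reference, the normalized intensity correlation C(t) compared against simulations can be estimated from a sampled intensity record as sketched below; this is a generic estimator, not the analytic non-Markovian expressions derived in the paper, and the AR(1) record is only a stand-in for a simulated laser intensity.

```python
import numpy as np

def intensity_correlation(intensity, max_lag):
    """Estimate C(k) = <dI(0) dI(k)> / <dI^2> from a sampled intensity record."""
    dI = intensity - intensity.mean()
    var = np.mean(dI * dI)
    return np.array([np.mean(dI[:len(dI) - k] * dI[k:])
                     for k in range(max_lag)]) / var

# Exponentially correlated toy record standing in for a simulated intensity
rng = np.random.default_rng(2)
x = np.zeros(5000)
for i in range(1, len(x)):
    x[i] = 0.95 * x[i - 1] + rng.normal()
C = intensity_correlation(1.0 + 0.1 * x, max_lag=50)
print(np.round(C[:5], 3))   # decays roughly like 0.95**k
```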
Abstract:
BACKGROUND: Poor tolerance and adverse drug reactions are main reasons for discontinuation of antiretroviral therapy (ART). Identifying predictors of ART discontinuation is a priority in HIV care. METHODS: A genetic association study in an observational cohort to evaluate the association of pharmacogenetic markers with time to treatment discontinuation during the first year of ART. Analysis included 577 treatment-naive individuals initiating tenofovir (n = 500) or abacavir (n = 77), with efavirenz (n = 272), lopinavir/ritonavir (n = 184), or atazanavir/ritonavir (n = 121). Genotyping included 23 genetic markers in 15 genes associated with toxicity or pharmacokinetics of the study medication. Rates of ART discontinuation between groups with and without genetic risk markers were assessed by survival analysis using Cox regression models. RESULTS: During the first year of ART, 190 individuals (33%) stopped 1 or more drugs. For efavirenz and atazanavir, individuals with genetic risk markers experienced higher discontinuation rates than individuals without (71.15% vs 28.10%, and 62.5% vs 14.6%, respectively). The efavirenz discontinuation hazard ratio (HR) was 3.14 (95% confidence interval (CI): 1.35-7.33, P = .008). The atazanavir discontinuation HR was 9.13 (95% CI: 3.38-24.69, P < .0001). CONCLUSIONS: Several pharmacogenetic markers identify individuals at risk for early treatment discontinuation. These markers should be considered for validation in the clinical setting.
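A minimal sketch of the kind of survival analysis described in METHODS, assuming the Python lifelines package; the data frame below is entirely made up and only mirrors the structure of the analysis (time to discontinuation, event indicator, binary genetic-risk marker), not the cohort data.

```python
import pandas as pd
from lifelines import CoxPHFitter  # assumes: pip install lifelines

# Invented toy data: days to ART discontinuation, event indicator,
# and presence of a genetic risk marker (structure only, not real values).
df = pd.DataFrame({
    "days_to_stop": [90, 200, 180, 365, 45, 365, 270, 120, 300, 365],
    "stopped":      [1,   1,   1,   0,   1,  0,   0,   1,   1,   0],
    "risk_marker":  [1,   0,   1,   0,   1,  0,   1,   1,   0,   0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_stop", event_col="stopped")
cph.print_summary()   # hazard ratio for risk_marker with 95% CI and p-value
```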
Abstract:
General Summary: Although the chapters of this thesis address a variety of issues, the principal aim is common: to test economic ideas in an international economic context. The intention has been to supply empirical findings using the largest suitable data sets and the most appropriate empirical techniques. The thesis can roughly be divided into two parts: the first, corresponding to the first two chapters, investigates the link between trade and the environment; the second, the last three chapters, deals with economic geography issues. Environmental problems are omnipresent in the daily press nowadays, and one of the arguments put forward is that globalisation causes severe environmental problems through the reallocation of investments and production to countries with less stringent environmental regulations. A measure of the amplitude of this undesirable effect is provided in the first part. The third and fourth chapters explore the productivity effects of agglomeration. The computed spillover effects between different sectors indicate how cluster formation might enhance productivity. The last chapter is not about how to better understand the world but about how to measure it, and it was just a great pleasure to work on it. "The Economist" writes every week about the impressive population and economic growth observed in China and India, and everybody agrees that the world's center of gravity has shifted. But by how much and how fast did it shift? An answer is given in the last part, which proposes a global measure for the location of world production and allows us to visualize our results in Google Earth. A short summary of each of the five chapters is provided below. The first chapter, entitled "Unraveling the World-Wide Pollution-Haven Effect", investigates the relative strength of the pollution haven effect (PH: comparative advantage in dirty products due to differences in environmental regulation) and the factor endowment effect (FE: comparative advantage in dirty, capital-intensive products due to differences in endowments). We compute the pollution content of imports using the IPPS coefficients (for three pollutants, namely biological oxygen demand, sulphur dioxide and toxic pollution intensity, for all manufacturing sectors) provided by the World Bank and use a gravity-type framework to isolate the two above-mentioned effects. Our study covers 48 countries, classified into 29 Southern and 19 Northern countries, and uses the lead content of gasoline as a proxy for environmental stringency. For North-South trade we find significant PH and FE effects going in the expected, opposite directions and being of similar magnitude. However, when looking at world trade, the effects become very small because of the high North-North trade share, for which we have no a priori expectations about the signs of these effects. Therefore, popular fears about the trade effects of differences in environmental regulations might be exaggerated. The second chapter is entitled "Is Trade Bad for the Environment? Decomposing Worldwide SO2 Emissions, 1990-2000". First we construct a novel and large database containing reasonable estimates of SO2 emission intensities per unit of labour that vary across countries, periods and manufacturing sectors. Then we use these original data (covering 31 developed and 31 developing countries) to decompose worldwide SO2 emissions into the three well-known dynamic effects (scale, technique and composition effects).
We find that the positive scale effect (+9.5%) and the negative technique effect (-12.5%) are the main driving forces of emission changes. Composition effects between countries and sectors are smaller, both negative and of similar magnitude (-3.5% each). Given that trade matters via the composition effects, this means that trade reduces total emissions. We next construct, in a first experiment, a hypothetical world where no trade happens, i.e. each country produces its imports at home and no longer produces its exports. The difference between the actual world and this no-trade world allows us (neglecting price effects) to compute a static first-order trade effect. The latter now increases total world emissions because it allows, on average, dirty countries to specialize in dirty products. However, this effect is smaller in 2000 (3.5%) than in 1990 (10%), in line with the negative dynamic composition effect identified in the previous exercise. We then propose a second experiment, comparing effective emissions with the maximum or minimum possible level of SO2 emissions. These hypothetical levels of emissions are obtained by reallocating labour across sectors within each country (under the country-employment and the world industry-production constraints). Using linear programming techniques, we show that emissions are reduced by 90% with respect to the worst case, but that they could still be reduced by another 80% if emissions were minimized. The findings from this chapter go together with those from chapter one in the sense that trade-induced composition effects do not seem to be the main source of pollution, at least in the recent past. Turning to the economic geography part of this thesis, the third chapter, entitled "A Dynamic Model with Sectoral Agglomeration Effects", is a short note that derives the theoretical model estimated in the fourth chapter. The derivation is directly based on the multi-regional framework of Ciccone (2002) but extends it to include sectoral disaggregation and a temporal dimension. This allows us to write present productivity formally as a function of past productivity and other contemporaneous and past control variables. The fourth chapter, entitled "Sectoral Agglomeration Effects in a Panel of European Regions", takes the final equation derived in chapter three to the data. We investigate the empirical link between density and labour productivity based on regional data (245 NUTS-2 regions over the period 1980-2003). Using dynamic panel techniques allows us to control for the possible endogeneity of density and for region-specific effects. We find a positive long-run elasticity of labour productivity with respect to density of about 13%. When using data at the sectoral level, it seems that positive cross-sector and negative own-sector externalities are present in manufacturing, while financial services display strong positive own-sector effects. The fifth and last chapter, entitled "Is the World's Economic Center of Gravity Already in Asia?", computes the world's economic, demographic and geographic centers of gravity for 1975-2004 and compares them. Based on data for the largest cities in the world and using the physical concept of center of mass, we find that the world's economic center of gravity is still located in Europe, even though there is a clear shift towards Asia. To sum up, this thesis makes three main contributions.
First, it provides new order-of-magnitude estimates of the role of trade in the globalisation and environment debate. Second, it computes reliable and disaggregated elasticities for the effect of density on labour productivity in European regions. Third, it allows us, in a geometrically rigorous way, to track the path of the world's economic center of gravity.
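The center-of-gravity calculation of the fifth chapter uses the physical center-of-mass idea; a minimal sketch, with invented city coordinates and weights, could look like this:

```python
import numpy as np

def center_of_gravity(lat_deg, lon_deg, weights):
    """Weighted center of mass of points on the globe: convert to 3-D
    Cartesian coordinates, take the weighted mean vector, and map it
    back to latitude/longitude."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    w = np.asarray(weights, dtype=float)
    x, y, z = np.cos(lat) * np.cos(lon), np.cos(lat) * np.sin(lon), np.sin(lat)
    mx, my, mz = (w @ x) / w.sum(), (w @ y) / w.sum(), (w @ z) / w.sum()
    lat_c = np.degrees(np.arctan2(mz, np.hypot(mx, my)))
    lon_c = np.degrees(np.arctan2(my, mx))
    return lat_c, lon_c

# Three illustrative 'cities' (Paris, New York, Tokyo) with arbitrary weights
print(center_of_gravity([48.9, 40.7, 35.7], [2.3, -74.0, 139.7], [100, 120, 110]))
```

Because the weighted mean vector lies inside the sphere, projecting it back to latitude and longitude is what allows the resulting point to be placed on the globe, e.g. in Google Earth.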
Abstract:
BACKGROUND: Sorafenib (Sb) is a multiple kinase inhibitor targeting both tumour cell proliferation and angiogenesis that may further act as a potent radiosensitizer by arresting cells in the most radiosensitive cell cycle phase. This phase I open-label, noncontrolled dose escalation study was performed to determine the safety and maximum tolerated dose (MTD) of Sb in combination with radiation therapy (RT) and temozolomide (TMZ) in 17 patients with newly diagnosed high-grade glioma. METHODS: Patients were treated with RT (60 Gy in 2 Gy fractions) combined with TMZ 75 mg m⁻² daily, and Sb administered at three dose levels (200 mg daily, 200 mg BID, and 400 mg BID) starting on day 8 of RT. Thirty days after the end of RT, patients received monthly TMZ (150-200 mg m⁻² on days 1-5 every 28 days) and Sb (400 mg BID). Pharmacokinetic (PK) analyses were performed on day 8 (TMZ) and on day 21 (TMZ and Sb) (ClinicalTrials.gov ID: NCT00884416). RESULTS: The MTD of Sb was established at 200 mg BID. Dose-limiting toxicities included thrombocytopenia (two patients), diarrhoea (one patient) and hypercholesterolaemia (one patient). Sb administration did not affect the mean area under the curve (0-24 h) or the mean Cmax of TMZ and its metabolite 5-amino-imidazole-4-carboxamide (AIC). The Tmax of both TMZ and AIC was delayed from 0.75 h (TMZ alone) to 1.5 h (combined TMZ/Sb). The median progression-free survival was 7.9 months (95% confidence interval (CI): 5.4-14.55), and the median overall survival was 17.8 months (95% CI: 14.7-25.6). CONCLUSIONS: Although Sb can be combined with RT and TMZ, significant side effects and moderate outcome results do not support further clinical development in malignant gliomas. The robust PK data of the TMZ/Sb combination could be useful in other cancer settings.
Abstract:
PURPOSE: Updated results are presented after a median follow-up of 7.3 years from the phase III First-Line Indolent Trial of yttrium-90 (⁹⁰Y)-ibritumomab tiuxetan in advanced-stage follicular lymphoma (FL) in first remission. PATIENTS AND METHODS: Patients with CD20⁺ stage III or IV FL with complete response (CR), unconfirmed CR (CRu), or partial response (PR) after first-line induction treatment were randomly assigned to ⁹⁰Y-ibritumomab consolidation therapy (rituximab 250 mg/m² on days -7 and 0, then ⁹⁰Y-ibritumomab 14.8 MBq/kg on day 0; maximum 1,184 MBq) or no further treatment (control). The primary end point was progression-free survival (PFS) from the date of random assignment. RESULTS: For the 409 patients available for analysis (⁹⁰Y-ibritumomab, n = 207; control, n = 202), estimated 8-year overall PFS was 41% with ⁹⁰Y-ibritumomab versus 22% for control (hazard ratio [HR], 0.47; P < .001). For patients in CR/CRu after induction, 8-year PFS with ⁹⁰Y-ibritumomab was 48% versus 32% for control (HR, 0.61; P = .008), and for PR patients it was 33% versus 10% (HR, 0.38; P < .001). For ⁹⁰Y-ibritumomab consolidation, median PFS was 4.1 years (vs 1.1 years for control; P < .001). Median time to next treatment (TTNT) was 8.1 years for ⁹⁰Y-ibritumomab versus 3.0 years for control (P < .001), with approximately 80% response rates to second-line therapy in either arm, including autologous stem-cell transplantation. No unexpected toxicities emerged during long-term follow-up. Estimated between-group 8-year overall survival rates were similar. The annualized incidence rate of myelodysplastic syndrome/acute myeloblastic leukemia was 0.50% versus 0.07% in the ⁹⁰Y-ibritumomab and control groups, respectively (P = .042). CONCLUSION: ⁹⁰Y-ibritumomab consolidation after achieving PR or CR/CRu to induction confers a 3-year benefit in median PFS, with a durable 19% PFS advantage at 8 years, and improves TTNT by 5.1 years for patients with advanced FL.
Abstract:
In multiobject pattern recognition the height of the correlation peaks should be controlled when the power spectrum of a joint transform correlator is binarized. In this paper a method to predetermine the value of the detection peaks is demonstrated. The technique is based on a frequency-variant threshold to remove the intraclass terms and on a suitable factor to normalize the binary joint power spectrum. Digital simulations and an experimental hybrid implementation of this method were carried out.
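A simplified sketch of the binary joint transform correlator pipeline the paper builds on: form the joint power spectrum of reference and scene, binarize it, and transform again to obtain correlation peaks. The frequency-variant threshold and the normalization factor proposed in the paper are not reproduced here; a single median threshold stands in for them.

```python
import numpy as np

def binary_jtc(reference, scene, threshold_fn=np.median):
    """Binary joint transform correlator sketch: joint power spectrum ->
    binarization -> inverse transform to the correlation plane."""
    joint = np.concatenate([reference, scene], axis=1)   # side-by-side input plane
    jps = np.abs(np.fft.fft2(joint)) ** 2                # joint power spectrum
    binary = np.where(jps > threshold_fn(jps), 1.0, -1.0)
    return np.abs(np.fft.ifft2(binary))                  # correlation plane

ref = np.zeros((64, 32)); ref[20:40, 10:22] = 1.0        # rectangular reference object
scn = np.zeros((64, 32)); scn[25:45, 8:20] = 1.0         # shifted copy in the scene
corr = binary_jtc(ref, scn)
print(corr.max())                                        # height of the detection peak
```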
Abstract:
Surface topography and light scattering were measured on 15 samples ranging from those having smooth surfaces to others with ground surfaces. The measurement techniques included an atomic force microscope, mechanical and optical profilers, confocal laser scanning microscope, angle-resolved scattering, and total scattering. The samples included polished and ground fused silica, silicon carbide, sapphire, electroplated gold, and diamond-turned brass. The measurement instruments and techniques had different surface spatial wavelength band limits, so the measured roughnesses were not directly comparable. Two-dimensional power spectral density (PSD) functions were calculated from the digitized measurement data, and we obtained rms roughnesses by integrating areas under the PSD curves between fixed upper and lower band limits. In this way, roughnesses measured with different instruments and techniques could be directly compared. Although smaller differences between measurement techniques remained in the calculated roughnesses, these could be explained mostly by surface topographical features such as isolated particles that affected the instruments in different ways.
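The band-limited rms roughness used for the comparison follows from integrating the PSD between fixed spatial-frequency limits; a one-dimensional sketch of that step is below (the study used two-dimensional PSDs, and the profile here is synthetic).

```python
import numpy as np

def band_limited_rms(profile_nm, dx_um, f_lo, f_hi):
    """rms roughness from the 1-D PSD integrated between the spatial-frequency
    band limits f_lo and f_hi (cycles per micrometre)."""
    z = profile_nm - profile_nm.mean()
    n = len(z)
    freqs = np.fft.rfftfreq(n, d=dx_um)
    psd = (np.abs(np.fft.rfft(z)) ** 2) * dx_um / n       # nm^2 * um
    band = (freqs >= f_lo) & (freqs <= f_hi)
    df = freqs[1] - freqs[0]
    return np.sqrt(2.0 * np.sum(psd[band]) * df)          # nm

# Synthetic profile with ~2 nm rms, sampled every 0.5 um
rng = np.random.default_rng(0)
profile = rng.normal(0.0, 2.0, 1024)
print(band_limited_rms(profile, dx_um=0.5, f_lo=0.01, f_hi=0.5))
```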
Abstract:
A 6N-dimensional alternative formulation is proposed for constrained Hamiltonian systems. In this context the noninteraction theorem is derived from the world-line conditions. A model of two interacting particles is exhibited where physical coordinates are canonical.