929 results for Adsorption. Zeolite 13X. Langmuir model. Dynamic modeling. Pyrolysis of sewage sludge
Abstract:
Marx and the writers who followed him produced a number of theories of the breakdown of capitalism. The majority of these theories were based on the historical tendencies: the rise in the composition of capital, the rise in the share of capital, and the fall in the rate of profit. However, these theories were never modelled with mainstream rigour. This paper presents a constant-wage model, with capital, labour and land as factors of production, which reproduces the historical tendencies and so can be used as a foundation for the various theories. The use of Chaplygin's theorem in the proof of the main result also gives the paper a technical interest.
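The abstract does not spell out its model, but the tendencies it names can be illustrated with the textbook Marxian accounting identity: writing s for surplus value, v for variable capital and c for constant capital, the rate of profit is r = s/(c+v) = e/(k+1), where e = s/v is the rate of surplus value and k = c/v the composition of capital, so a rising composition under a constant wage (bounded e) forces r to fall. A minimal sketch with purely illustrative numbers, not the paper's model:

```python
# Illustrative sketch (not the paper's model): falling rate of profit
# under a rising composition of capital with a constant wage.
# r = s/(c+v) = e/(k+1), with e = s/v fixed and k = c/v growing.

def rate_of_profit(e: float, k: float) -> float:
    """Marxian rate of profit r = s/(c+v) = e/(k+1), given e = s/v and k = c/v."""
    return e / (k + 1.0)

e = 1.0  # assumed constant rate of surplus value (constant real wage)
for period, k in enumerate([1, 2, 4, 8, 16]):  # composition of capital rises
    print(f"period {period}: c/v = {k:2d}  ->  r = {rate_of_profit(e, k):.3f}")
# r falls from 0.5 toward 0 as c/v grows without bound.
```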
Abstract:
One of the main implications of the efficient market hypothesis (EMH) is that expected future returns on financial assets are not predictable if investors are risk neutral. In this paper we argue that financial time series offer more information than this hypothesis seems to suggest. In particular, we postulate that runs of very large returns can be predictable over small time periods. To show this, we propose a TAR(3,1)-GARCH(1,1) model that is able to describe two different types of extreme events: a first type generated by large-uncertainty regimes, in which runs of extremes are not predictable, and a second type in which extremes come from isolated dread/joy events. This model is new in the literature on nonlinear processes. Its novelty resides in two features that distinguish it from previous TAR methodologies: the regimes are motivated by the occurrence of extreme values, and the threshold variable is defined by the shock affecting the process in the preceding period. In this way the model is able to uncover dependence and clustering of extremes in high- as well as low-volatility periods. The model is tested with General Motors stock price data covering two crises that had a substantial impact on financial markets worldwide: the Black Monday of October 1987 and September 11th, 2001. By analyzing the periods around these crises we find evidence of statistical significance of our model, and thereby of predictability of extremes, for September 11th but not for Black Monday. These findings support the hypotheses of a big negative event producing runs of negative returns in the first case, and of the burst of a worldwide stock market bubble in the second. JEL classification: C12; C15; C22; C51. Keywords and phrases: asymmetries, crises, extreme values, hypothesis testing, leverage effect, nonlinearities, threshold models
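As a rough illustration of the model class (not the authors' fitted specification), the following sketch simulates a three-regime threshold AR(1) whose regime is selected by the shock in the preceding period, with GARCH(1,1) errors; all coefficients and the thresholds r1 < r2 are assumptions chosen for illustration:

```python
# Minimal simulation sketch in the spirit of a TAR(3,1)-GARCH(1,1):
# three AR(1) regimes selected by the previous shock, GARCH(1,1) variance.
import numpy as np

rng = np.random.default_rng(0)
T = 2000
phi = {0: (0.00, -0.30), 1: (0.00, 0.05), 2: (0.00, -0.30)}  # (intercept, AR) per regime
r1, r2 = -2.0, 2.0                      # assumed thresholds on the previous shock
omega, alpha, beta = 0.05, 0.08, 0.90   # assumed GARCH(1,1) parameters (alpha+beta < 1)

y = np.zeros(T)
eps = np.zeros(T)
h = np.full(T, omega / (1 - alpha - beta))  # start at the unconditional variance

for t in range(1, T):
    h[t] = omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]
    # The regime is selected by the size of the shock in the preceding period:
    regime = 0 if eps[t - 1] < r1 else (2 if eps[t - 1] > r2 else 1)
    c, a = phi[regime]
    eps[t] = np.sqrt(h[t]) * rng.standard_normal()
    y[t] = c + a * y[t - 1] + eps[t]

print("sample std:", y.std(), "| obs in outer (extreme-shock) regimes:",
      np.sum((eps[:-1] < r1) | (eps[:-1] > r2)))
```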
Abstract:
The determination of characteristic cardiac parameters, such as displacement, stress, and strain distributions, is essential for an understanding of the mechanics of the heart. Until recently, the calculation of these parameters was limited by the use of idealised mathematical representations of biventricular geometries and by the application of simple material laws. On the basis of 20 short-axis heart slices, and considering both linear and nonlinear material behaviour, we have developed an FE model with about 100,000 degrees of freedom. Marching Cubes and Phong's incremental shading technique were used to visualise the three-dimensional geometry. In a quasistatic FE analysis, continuous distributions of regional stress and strain corresponding to the end-systolic state were calculated. Substantial regional variation of the von Mises stress and the total strain energy was observed at all levels of the heart model. The results of the linear elastic model and of the model with a nonlinear material description (Mooney-Rivlin) were compared. While the stress distributions and peak stress values were found to be comparable, the displacement vectors obtained with the nonlinear model were generally larger than in the linear elastic case, indicating the need to include nonlinear effects.
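For readers unfamiliar with the nonlinear material description named above, this minimal sketch evaluates the standard uniaxial Cauchy stress of an incompressible Mooney-Rivlin solid, W = C1(I1 - 3) + C2(I2 - 3); the moduli below are placeholders, not the values used in the heart model:

```python
# Hedged sketch: uniaxial Cauchy stress for an incompressible Mooney-Rivlin
# solid, W = C1*(I1 - 3) + C2*(I2 - 3). C1 and C2 are placeholder moduli.
def mooney_rivlin_uniaxial_stress(lam: float, c1: float, c2: float) -> float:
    """Cauchy stress sigma = 2*(lam^2 - 1/lam)*(C1 + C2/lam) at stretch lam."""
    return 2.0 * (lam**2 - 1.0 / lam) * (c1 + c2 / lam)

for lam in (1.0, 1.1, 1.2, 1.3):
    print(f"stretch {lam:.1f} -> stress {mooney_rivlin_uniaxial_stress(lam, 2.0, 6.0):.3f}")
# Unlike linear elasticity, the stress grows faster than linearly in the
# stretch, which is why displacements differ from the linear elastic case.
```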
Abstract:
Chemokines (chemoattractant cytokines) induce potent and selective chemotaxis of leukocyte subsets in vitro. Here, we briefly review the chemokines shown to induce eosinophil chemotaxis in vitro and describe a novel model for studying the ability of chemokines to stimulate eosinophil migration in vivo. Eosinophils were purified from the blood of mice over-expressing the IL-5 gene and labelled with 111In. Only the C-C chemokines eotaxin and MIP-1α, but not RANTES, MCP-1, MCP-3, MCP-4, MIP-1β, KC or MIP-2, effectively induced the recruitment of 111In-eosinophils in mouse skin. We suggest that this mouse model will be useful in assessing the role of endogenously generated chemokines in mediating eosinophil migration to sites of allergic inflammation in vivo.
Abstract:
This summary report follows on from the publication of the Northern Ireland physical activity strategy in 1996 and the subsequent publication of the strategy action plan in 1998. Within this strategy action plan a recommendation was made for the health sector, that research should be carried out to evaluate and compare the cost of investing in physical activity programmes against the cost of treating preventable illness. To help in the development of this key area, the Department of Health, Social Services and Public Safety's Economics Branch agreed to develop a model that would seek to establish the extent of avoidable deaths from physical inactivity and, as a consequence, the avoidable economic and healthcare costs for Northern Ireland.
Abstract:
Every year, debris flows cause huge damage in mountainous areas. Due to population pressure in hazardous zones, the socio-economic impact is much higher than in the past. Therefore, the development of indicative susceptibility hazard maps is of primary importance, particularly in developing countries. However, the complexity of the phenomenon and the variability of local controlling factors limit the use of process-based models for a first assessment. A debris flow model has therefore been developed for regional susceptibility assessments using a digital elevation model (DEM) with a GIS-based approach. The automatic identification of source areas and the estimation of debris flow spreading, based on GIS tools, provide a substantial basis for a preliminary susceptibility assessment at a regional scale. One of the main advantages of this model is its flexibility: everything is open to the user, from the choice of data to the selection of algorithms and their parameters. The Flow-R model was tested in three different contexts, two in Switzerland and one in Pakistan, for indicative susceptibility hazard mapping. It was shown that the quality of the DEM is the most important parameter for obtaining reliable propagation results, but also for identifying potential debris flow sources.
Abstract:
This paper aims to provide empirical support for the use of the principal-agent framework in the analysis of the public sector and public policies. After reviewing the conditions to be met for a relevant analysis of the relationship between population and government using principal-agent theory, our paper focuses on the assumption of conflicting goals between the principal and the agent. A principal-agent analysis assumes, in effect, that inefficiencies may arise because principal and agent pursue different goals. Using data collected during an amalgamation project involving two Swiss municipalities, we show the existence of a gap between the goals of the population and those of the government. Consequently, inefficiencies as predicted by the principal-agent model may arise during the implementation of a public policy, i.e., an amalgamation project. In a context of direct democracy, where policies are regularly subjected to referendum, the conflict of objectives may even lead to a total failure of the policy at the polls.
Abstract:
We present a study of the continuous-time equations governing the dynamics of a susceptible-infected-susceptible (SIS) model on heterogeneous metapopulations. These equations have recently been proposed as an alternative formulation for the spread of infectious diseases in metapopulations in a continuous-time framework. Individual-based Monte Carlo simulations of epidemic spread in uncorrelated networks are also performed, revealing good agreement with analytical predictions under the assumption of simultaneous transmission or recovery and migration processes.
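A minimal sketch of the setting described above, assuming mass-action infection within patches and diffusive migration of both classes along the edges of an uncorrelated random network; the rates and the simple Euler integration are illustrative assumptions, not the paper's exact formulation:

```python
# Hedged sketch: continuous-time SIS metapopulation with within-patch
# infection/recovery and diffusive migration on a random network.
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
G = nx.erdos_renyi_graph(n=100, p=0.05, seed=1)   # uncorrelated network of patches
A = nx.to_numpy_array(G)
deg = A.sum(axis=1)
has_out = (deg > 0).astype(float)                 # isolated patches have no outflow

beta, mu, D = 0.02, 0.5, 1.0   # infection, recovery, migration rates (assumed)
dt, steps = 0.01, 5000

S = np.full(len(G), 50.0)
I = rng.uniform(0.0, 5.0, size=len(G))            # small random seed infection

for _ in range(steps):
    new_inf = beta * S * I                        # mass-action infection per patch
    rec = mu * I
    # migration: leave at rate D, split evenly among a patch's neighbours
    mig_S = D * (A @ (S / np.maximum(deg, 1.0)) - has_out * S)
    mig_I = D * (A @ (I / np.maximum(deg, 1.0)) - has_out * I)
    S = S + dt * (-new_inf + rec + mig_S)
    I = I + dt * (new_inf - rec + mig_I)

print("endemic prevalence:", I.sum() / (S.sum() + I.sum()))
```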
Abstract:
BACKGROUND: Physicians need a specific risk-stratification tool to facilitate safe and cost-effective approaches to the management of patients with cancer and acute pulmonary embolism (PE). The objective of this study was to develop a simple risk score for predicting 30-day mortality in patients with PE and cancer by using measures readily obtained at the time of PE diagnosis. METHODS: Investigators randomly allocated 1,556 consecutive patients with cancer and acute PE from the international multicenter Registro Informatizado de la Enfermedad TromboEmbólica to derivation (67%) and internal validation (33%) samples. The external validation cohort for this study consisted of 261 patients with cancer and acute PE. Investigators compared 30-day all-cause mortality and nonfatal adverse medical outcomes across the derivation and two validation samples. RESULTS: In the derivation sample, multivariable analyses produced the risk score, which contained six variables: age > 80 years, heart rate ≥ 110/min, systolic BP < 100 mm Hg, body weight < 60 kg, recent immobility, and presence of metastases. In the internal validation cohort (n = 508), the 22.2% of patients (113 of 508) classified as low risk by the prognostic model had a 30-day mortality of 4.4% (95% CI, 0.6%-8.2%) compared with 29.9% (95% CI, 25.4%-34.4%) in the high-risk group. In the external validation cohort, the 18% of patients (47 of 261) classified as low risk by the prognostic model had a 30-day mortality of 0%, compared with 19.6% (95% CI, 14.3%-25.0%) in the high-risk group. CONCLUSIONS: The developed clinical prediction rule accurately identifies low-risk patients with cancer and acute PE.
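The abstract names the six predictors but not the points the published rule assigns to each, so the sketch below simply counts how many criteria a patient meets; the helper name pe_cancer_risk_flags and the equal weighting are illustrative assumptions, and the real score's weights and cut-offs are in the original paper:

```python
# Illustrative only: counts the six predictors named in the abstract.
# The published rule's point weights are NOT reproduced here.
def pe_cancer_risk_flags(age_years, heart_rate_bpm, systolic_bp_mmhg,
                         weight_kg, recent_immobility, metastases):
    criteria = [
        age_years > 80,
        heart_rate_bpm >= 110,
        systolic_bp_mmhg < 100,
        weight_kg < 60,
        bool(recent_immobility),
        bool(metastases),
    ]
    return sum(criteria)

# Example: an 84-year-old with tachycardia meets two of the six criteria.
print(pe_cancer_risk_flags(84, 115, 120, 70, False, False))  # -> 2
```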
Abstract:
ABSTRACT: Ultramarathons comprise any sporting event involving running longer than the traditional marathon length of 42.195 km (26.2 miles). Studies on ultramarathon participants can investigate the acute consequences of ultra-endurance exercise on inflammation and on cardiovascular or renal function, as well as endocrine/energetic aspects, and can examine the tissue recovery process over several days of extreme physical load. In a study published in BMC Medicine, Schütz et al. followed 44 ultramarathon runners over 4,487 km from southern Italy to North Cape, Norway (the Trans Europe Foot Race 2009) and recorded daily sets of magnetic resonance imaging, psychometric, body composition and biological measurements. The findings will allow us to better understand the time course of degeneration/regeneration of some lower-leg tissues such as knee joint cartilage, to differentiate running-induced from age-induced pathologies (for example, retropatellar arthritis) and, finally, to assess interindividual susceptibility to injuries. Moreover, the study will also provide new information about the complex interplay between cerebral adaptations/alterations and hormonal influences resulting from endurance exercise, and provide data on the dose-response relationship between exercise and brain structure/function. Overall, this study represents a unique attempt to investigate the limits of the adaptive response of the human body. Please see related article: http://www.biomedcentral.com/1741-7015/10/78.
Abstract:
Dynamic Nuclear Polarization (DNP) is an emerging technique that could revolutionize the NMR study of small molecules at very low concentrations through the increase in sensitivity that results from the transfer of polarization between electronic and nuclear spins. Although the underlying physics has been known for a long time, in the last few years there has been a lot of excitement in the chemistry and biology NMR community, caused by the demonstration that highly polarized nuclei prepared in the solid state at very low temperatures (1-2 K) can be rapidly transferred to liquid samples at room temperature and studied in solution by conventional NMR techniques. In favorable cases, sensitivity increases of several orders of magnitude have been achieved. The technique is now mature enough that a commercial instrument is available. The efficiency of DNP depends on two crucial aspects: (i) the efficiency of the nuclear polarization process and (ii) the efficiency of the transfer from the initial solid state to the fluid state in which NMR is measured. The preferred areas of application (iii) will be dictated by situations in which the low concentration of the sample or its intrinsically low receptivity is the limiting factor.
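On aspect (i), the attraction of DNP can be quantified with a well-known back-of-envelope figure not given in the abstract: the theoretical maximum enhancement is the ratio of gyromagnetic ratios γe/γn, before any losses in the polarization and dissolution/transfer steps:

```python
# Back-of-envelope sketch (standard textbook values, not from the paper):
# the theoretical maximum DNP enhancement is gamma_e / gamma_n.
GAMMA_E = 28024.951   # electron gyromagnetic ratio, MHz/T
GAMMA_1H = 42.577     # proton, MHz/T
GAMMA_13C = 10.708    # carbon-13, MHz/T

for name, g in (("1H", GAMMA_1H), ("13C", GAMMA_13C)):
    print(f"max enhancement for {name}: ~{GAMMA_E / g:.0f}x")
# ~658x for 1H and ~2617x for 13C, which is why "several orders of
# magnitude" gains are achievable in favorable cases.
```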
Abstract:
The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must therefore rely on a simplified approach that is not highly parameter-dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R, for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at www.flow-r.org), and has been successfully applied to different case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model was also found suitable for assessing other natural hazards such as rockfall, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM and avoids over-channelization, and so produces more realistic extents. The choice of datasets and algorithms is open to the user, which makes the model adaptable to various applications and levels of dataset availability. Amongst the possible datasets, the DEM is the only one that is really needed for both the source area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results. However, valuable results have still been obtained on the basis of lower-quality DEMs with 25 m resolution.
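The paper's improved spreading algorithm is not reproduced here, but the base it modifies, Holmgren's (1994) multiple-flow-direction weighting, can be sketched as follows; the exponent x controls divergence, with x = 1 giving wide spreading and large x converging toward single-flow-direction behaviour:

```python
# Sketch of Holmgren's (1994) multiple-flow-direction weighting, the basis
# of the Flow-R spreading; the paper's modified, less DEM-sensitive variant
# is not reproduced here.
import numpy as np

def holmgren_weights(z_center: float, z_neigh: np.ndarray,
                     dist: np.ndarray, x: float = 4.0) -> np.ndarray:
    """Fraction of flow sent to each of the 8 neighbours of a DEM cell.

    z_neigh: neighbour elevations; dist: horizontal distances to them
    (cell size, or cell size * sqrt(2) for diagonals).
    """
    tan_beta = (z_center - z_neigh) / dist        # slope toward each neighbour
    tan_beta = np.clip(tan_beta, 0.0, None)       # only downslope cells receive flow
    w = tan_beta ** x
    total = w.sum()
    return w / total if total > 0 else w          # flat or pit cell: no outflow

# Example on a 10 m DEM cell; neighbours listed N, NE, E, SE, S, SW, W, NW.
z_n = np.array([101.0, 99.5, 98.0, 99.0, 100.5, 102.0, 101.5, 100.0])
d = np.array([10.0, 14.142, 10.0, 14.142, 10.0, 14.142, 10.0, 14.142])
print(holmgren_weights(100.0, z_n, d))
```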