971 results for three-shell model


Relevance: 30.00%

Abstract:

Summary: Throughout my thesis, I elaborate on how real and financing frictions affect corporate decision making under uncertainty, and I explore how firms time their investment and financing decisions given such frictions. While the macroeconomics literature has focused on the impact of real frictions on investment decisions assuming all-equity-financed firms, the financial economics literature has mainly focused on the study of financing frictions. My thesis therefore assesses the joint interaction of real and financing frictions in firms' dynamic investment and financing decisions. My work provides a rationale for the documented poor empirical performance of neoclassical investment models, based on the joint effect of real and financing frictions on investment. A major observation lies in how the infrequency of corporate decisions may affect standard empirical tests. My thesis suggests that the book-to-market sorts commonly used in the empirical asset pricing literature have economic content, as they control for the lumpiness in firms' optimal investment policies. My work also elaborates on the effects of asymmetric information and strategic interaction on firms' investment and financing decisions. I study how firms time their decision to raise public equity when outside investors lack information about their future investment prospects. I derive a real-options model that predicts either cold or hot markets for new stock issues conditional on adverse selection, and I provide a rational approach to studying jointly the market timing of corporate decisions and announcement effects in stock returns. My doctoral dissertation therefore contributes to our understanding of how real and financing frictions may bias standard empirical tests, elaborates on how adverse selection may induce hot and cold markets in new issues' markets, and suggests how the underlying economic behaviour of firms may induce alternative patterns in stock prices.

Relevance: 30.00%

Abstract:

This paper presents and estimates a dynamic choice model in the attribute space considering rational consumers. In light of the evidence of several state-dependence patterns, the standard attribute-based model is extended by considering a general utility function where pure inertia and pure variety-seeking behaviors can be explained in the model as particular linear cases. The dynamics of the model are fully characterized by standard dynamic programming techniques. The model presents a stationary consumption pattern that can be inertial, where the consumer only buys one product, or a variety-seeking one, where the consumer shifts among varied products. We run some simulations to analyze the consumption paths out of the steady state. Under the hybrid utility assumption, the consumer behaves inertially among the unfamiliar brands for several periods, eventually switching to a variety-seeking behavior when the stationary levels are approached. An empirical analysis is run using scanner databases for three different product categories: fabric softener, saltine cracker, and catsup. Non-linear specifications provide the best fit of the data, as hybrid functional forms are found in all the product categories for most attributes and segments. These results reveal the statistical superiority of the non-linear structure and confirm the gradual trend to seek variety as the level of familiarity with the purchased items increases.

Relevance: 30.00%

Abstract:

Departures from pure self-interest in economic experiments have recently inspired models of "social preferences". We conduct experiments on simple two-person and three-person games with binary choices that test these theories more directly than the array of games conventionally considered. Our experiments show strong support for the prevalence of "quasi-maximin" preferences: People sacrifice to increase the payoffs for all recipients, but especially for the lowest-payoff recipients. People are also motivated by reciprocity: While people are reluctant to sacrifice to reciprocate good or bad behavior beyond what they would sacrifice for neutral parties, they withdraw willingness to sacrifice to achieve a fair outcome when others are themselves unwilling to sacrifice. Some participants are averse to getting different payoffs than others, but based on our experiments and reinterpretation of previous experiments we argue that behavior that has been presented as "difference aversion" in recent papers is actually a combination of reciprocal and quasi-maximin motivations. We formulate a model in which each player is willing to sacrifice to allocate the quasi-maximin allocation only to those players also believed to be pursuing the quasi-maximin allocation, and may sacrifice to punish unfair players.
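Quasi-maximin preferences are commonly formalized as a weighted mix of one's own payoff, the minimum payoff among all recipients, and the sum of all payoffs. The functional form and parameter values below are illustrative assumptions for the sake of a concrete sketch, not taken from the paper:

```python
def quasi_maximin_utility(payoffs, i, gamma=0.3, delta=0.5):
    """Utility of player i under a quasi-maximin social criterion.

    gamma : weight on the social criterion vs. own payoff (assumed value).
    delta : weight on the minimum payoff vs. the sum of payoffs (assumed value).
    """
    welfare = delta * min(payoffs) + (1 - delta) * sum(payoffs)
    return (1 - gamma) * payoffs[i] + gamma * welfare

# A player may prefer a smaller own payoff if it raises the lowest-payoff
# recipient enough: compare (750, 400) with (800, 200) from player 0's view.
u_fair = quasi_maximin_utility((750, 400), i=0)      # 757.5
u_selfish = quasi_maximin_utility((800, 200), i=0)   # 740.0
```

With these (assumed) weights the player sacrifices 50 units of own payoff to lift the other recipient's payoff from 200 to 400, consistent with the "sacrifice for the lowest-payoff recipients" pattern described above.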

Relevance: 30.00%

Abstract:

Studies assessing skin irritation caused by chemicals have traditionally used laboratory animals; however, such methods are questionable regarding their relevance for humans. New in vitro methods have been validated, such as the reconstructed human epidermis (RHE) models (Episkin®, Epiderm®). Their accuracy relative to in vivo results such as the 4-h human patch test (HPT) is 76% at best (Epiderm®). There is a need to develop an in vitro method that better simulates the anatomo-pathological changes encountered in vivo. Our objective was to develop an in vitro method to determine skin irritation using viable human skin through histopathology, and to compare the results of 4 tested substances with the main in vitro methods and the in vivo animal method (Draize test). Human skin removed during surgery was dermatomed and mounted on an in vitro flow-through diffusion cell system. Ten chemicals with known non-irritant (heptyl butyrate, hexyl salicylate, butyl methacrylate, isoproturon, bentazon, DEHP and methylisothiazolinone (MI)) or irritant properties (folpet, 1-bromohexane and methylchloroisothiazolinone (MCI/MI)), a negative control (sodium chloride) and a positive control (sodium lauryl sulphate) were applied. The skin was exposed for at least 4 h. Histopathology was performed to investigate irritation signs (spongiosis, necrosis, vacuolization). We obtained 100% accuracy with the HPT model, 75% with the RHE models and 50% with the Draize test for the 4 tested substances. The coefficients of variation (CV) between our three test batches were <0.1, showing good reproducibility. Furthermore, we objectively graded histopathological irritation signs on an irritation scale: strong (folpet), significant (1-bromohexane), slight (MCI/MI at 750/250 ppm) and none (isoproturon, bentazon, DEHP and MI). This new in vitro test method gave effective results for the tested chemicals. It should be further validated with a greater number of substances, and tested in different laboratories, to suitably evaluate reproducibility.

Relevance: 30.00%

Abstract:

Species' geographic ranges are usually considered as basic units in macroecology and biogeography, yet it is still difficult to measure them accurately for many reasons. About 20 years ago, researchers started using local data on species' occurrences to estimate broad-scale ranges, thereby establishing the niche modeling approach. However, there are still many problems in model evaluation and application, and one of the solutions is to find a consensus solution among models derived from different mathematical and statistical methods for niche modeling, different climatic projections, and different variable combinations, all of which are sources of uncertainty during niche modeling. In this paper, we discuss this approach of ensemble forecasting and propose that it can be divided into three phases with increasing levels of complexity. Phase I is the simple combination of maps to achieve a consensual and, hopefully, conservative solution. In Phase II, differences among the maps used are described by multivariate analyses, and Phase III consists of the quantitative evaluation of the relative magnitude of uncertainties from different sources and their mapping. To illustrate these developments, we analyzed the occurrence data of the tiger moth Utetheisa ornatrix (Lepidoptera, Arctiidae), a Neotropical moth species, and modeled its geographic range in current and future climates.
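Phase I, the simple combination of maps, can be sketched as a consensus vote over binary presence/absence predictions produced by several modelling methods. The array shapes and the majority threshold below are illustrative assumptions:

```python
import numpy as np

def consensus_map(prediction_maps, threshold=0.5):
    """Phase-I ensemble: a grid cell is a consensual 'presence' when the
    fraction of models predicting presence exceeds `threshold`."""
    stack = np.stack(prediction_maps)   # shape (n_models, rows, cols), binary
    agreement = stack.mean(axis=0)      # fraction of models voting presence
    return (agreement > threshold).astype(int)

# Three hypothetical model outputs for a 2x2 grid of cells:
maps = [np.array([[1, 0], [1, 1]]),
        np.array([[1, 0], [0, 1]]),
        np.array([[1, 1], [0, 1]])]
cons = consensus_map(maps)   # presence only where >50% of models agree
```

The per-cell `agreement` array is also the natural input for Phase II, since the disagreement among maps is exactly what the multivariate analyses would describe.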

Relevance: 30.00%

Abstract:

Impaired glutathione (GSH) synthesis has been observed in several multifactorial diseases, including schizophrenia and myocardial infarction. Genetic studies revealed an association between schizophrenia and a GAG trinucleotide repeat (TNR) polymorphism in the catalytic subunit (GCLC) of glutamate cysteine ligase (GCL). Disease-associated genotypes of this polymorphism correlated with a decrease in GCLC protein expression, GCL activity and GSH content. To clarify the consequences of decreased GCL activity at the proteome level, three schizophrenia patients and three controls were selected based on the GCLC GAG TNR polymorphism. Fibroblast cultures were obtained by skin biopsy and were challenged with tert-butylhydroquinone (t-BHQ), a substance known to induce oxidative stress. Proteome changes were analyzed by two-dimensional gel electrophoresis (2-DE), and the results revealed 10 spots that were upregulated in patients following t-BHQ treatment, but not in controls. Nine corresponding proteins could be identified by MALDI mass spectrometry; these proteins are involved in various cellular functions, including energy metabolism, oxidative stress response, and cytoskeletal reorganization. In conclusion, skin fibroblasts of subjects with impaired GSH synthesis showed an altered proteome reaction in response to oxidative stress. Furthermore, the study corroborates the use of fibroblasts as an additional means to study vulnerability factors of psychiatric diseases.

Relevance: 30.00%

Abstract:

This paper characterizes the relationship between entrepreneurial wealth and aggregate investment under adverse selection. Its main finding is that such a relationship need not be monotonic. In particular, three results emerge from the analysis: (i) pooling equilibria, in which investment is independent of entrepreneurial wealth, are more likely to arise when entrepreneurial wealth is relatively low; (ii) separating equilibria, in which investment is increasing in entrepreneurial wealth, are most likely to arise when entrepreneurial wealth is relatively high; and (iii) for a given interest rate, an increase in entrepreneurial wealth may generate a discontinuous fall in investment.

Relevance: 30.00%

Abstract:

How did Europe escape the "Iron Law of Wages"? We construct a simple Malthusian model with two sectors and multiple steady states, and use it to explain why European per capita incomes and urbanization rates increased during the period 1350-1700. Productivity growth can only explain a small fraction of the rise in output per capita. Population dynamics (changes in the birth and death schedules) were far more important determinants of steady states. We show how a major shock to population can trigger a transition to a new steady state with higher per capita income. The Black Death was such a shock, raising wages substantially. Because of Engel's Law, demand for urban products increased, and urban centers grew in size. European cities were unhealthy, and rising urbanization pushed up aggregate death rates. This effect was reinforced by diseases spread through war, financed by higher tax revenues. In addition, rising trade also spread diseases. In this way higher wages themselves reduced population pressure. We show in a calibration exercise that our model can account for the sustained rise in European urbanization as well as permanently higher per capita incomes in 1700, without technological change. Wars contributed importantly to the "Rise of Europe", even if they had negative short-run effects. We thus trace Europe's precocious rise to economic riches to interactions of the plague shock with the belligerent political environment and the nature of cities.

Relevance: 30.00%

Abstract:

We propose a method for brain atlas deformation in the presence of large space-occupying tumors, based on an a priori model of lesion growth that assumes radial expansion of the lesion from its starting point. Our approach involves three steps. First, an affine registration brings the atlas and the patient into global correspondence. Then, the seeding of a synthetic tumor into the brain atlas provides a template for the lesion. The last step is the deformation of the seeded atlas, combining a method derived from optical flow principles and a model of lesion growth. Results show that a good registration is performed and that the method can be applied to automatic segmentation of structures and substructures in brains with gross deformation, with important medical applications in neurosurgery, radiosurgery, and radiotherapy.

Relevance: 30.00%

Abstract:

OBJECTIVES: A new caval tree system was designed for realistic in vitro simulation. The objective of our study was to assess cannula performance for virtually wall-less versus standard percutaneous thin-walled venous cannulas in a setting of venous collapse under negative pressure. METHODS: For a collapsible caval model, a very flexible plastic material was selected, and a model with nine afferent veins was designed according to the anatomy of the vena cava. A flow bench was built, including a lower reservoir holding the caval tree, which accounted for the main afferent vessels and their flows, fed by a reservoir 6 cm above. A cannula was inserted into this caval tree and connected to a centrifugal pump that, in turn, was connected to a reservoir positioned 83 cm above the lower reservoir (after-load = 60 mmHg). Using the same pre-load, simulated venous drainage for cardiopulmonary bypass was realized using a 24 F wall-less cannula (Smartcanula) and a 25 F percutaneous cannula (Biomedicus), with stepwise augmentation of venous drainage (1500, 2000 and 2500 RPM). RESULTS: For the thin-wall and wall-less cannulas, 36 pairs of flow and pressure measurements were recorded at the three RPM values. The mean Q-values at 1500, 2000 and 2500 RPM were: 3.98 ± 0.01, 6.27 ± 0.02 and 9.81 ± 0.02 l/min for the wall-less cannula (P <0.0001), versus 2.74 ± 0.02, 3.06 ± 0.05 and 6.78 ± 0.02 l/min for the thin-wall cannula (P <0.0001). The corresponding inlet pressure values were: -8.88 ± 0.01, -23.69 ± 0.81 and -70.22 ± 0.18 mmHg for the wall-less cannula (P <0.0001), versus -36.69 ± 1.88, -80.85 ± 1.71 and -101.83 ± 0.45 mmHg for the thin-wall cannula (P <0.0001). The thin-wall cannula showed mean Q-values 37% less and mean P values 26% more when compared with the wall-less cannula (P <0.0001).
CONCLUSIONS: Our in vitro water test was able to mimic a negative pressure situation, where the wall-less cannula design performs better compared with the traditional thin-wall cannula.
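As an arithmetic check, the 37% flow figure can be reproduced directly from the mean Q-values quoted in the results (the 26% pressure comparison depends on how the ratio is normalized, so only the flow ratio is verified here):

```python
# Mean flows (l/min) at 1500, 2000 and 2500 RPM, from the reported results.
q_wallless = [3.98, 6.27, 9.81]
q_thinwall = [2.74, 3.06, 6.78]

def mean(xs):
    return sum(xs) / len(xs)

# Thin-wall flow relative to wall-less flow across the three RPM steps:
# roughly 37% less, matching the figure quoted in the abstract.
flow_deficit = 1 - mean(q_thinwall) / mean(q_wallless)   # about 0.373
```

This is only a summary over the three RPM steps; the per-step deficits differ (about 31%, 51% and 31%), which the pooled figure hides.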

Relevance: 30.00%

Abstract:

Preface: In this thesis we study several questions related to transaction data measured at an individual level. The questions are addressed in three essays. In the first essay we use tick-by-tick data to estimate non-parametrically the jump process of 37 big stocks traded on the Paris Stock Exchange, and of the CAC 40 index. We separate total daily returns into three components (continuous trading, trading jumps, and overnight), and we characterize each one of them. We estimate, at the individual and index levels, the contribution of each return component to total daily variability. For the index, the contribution of jumps is smaller and is compensated by the larger contribution of overnight returns. We test formally that individual stocks jump more frequently than the index, and that they do not respond independently to the arrival of news. Finally, we find that daily jumps are larger when their arrival rates are larger. At the contemporaneous level there is a strong negative correlation between the jump frequency and the trading activity measures. The second essay studies the general properties of the trade- and volume-duration processes for two stocks traded on the Paris Stock Exchange, one very illiquid and one relatively liquid. We estimate a class of autoregressive processes with conditional distributions from the non-central gamma family (up to a scale factor), introduced by Gouriéroux and Jasiak and known as the autoregressive gamma (ARG) process. We also evaluate the ability of the process to fit the data, using the Diebold, Gunther and Tay (1998) test and the capacity of the model to reproduce the moments of the observed data and the empirical serial and partial serial correlation functions. We establish that the model correctly describes the trade-duration process of illiquid stocks, but has difficulty fitting the trade-duration process of liquid stocks, which exhibits long-memory characteristics. When the model is adjusted to volume durations, it fits the data successfully. In the third essay we study the economic relevance of optimal liquidation strategies by calibrating a recent and realistic microstructure model with data from the Paris Stock Exchange. We distinguish the case of parameters that are constant through the day from time-varying ones. An optimization problem incorporating this realistic microstructure model is presented and solved. Our model endogenizes the number of trades required before the position is liquidated. A comparative statics exercise demonstrates the realism of our model. We find that a sell decision taken in the morning will be liquidated by the early afternoon; if price impacts increase over the day, the liquidation will take place more rapidly.
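The autoregressive gamma process of the second essay can be simulated by Poisson mixing of gamma draws: given the current value, a Poisson count drives the shape of the next gamma draw, yielding a non-central gamma conditional law. The parameter values below are illustrative, not estimates from the thesis data:

```python
import numpy as np

def simulate_arg(n, delta=1.5, beta=0.6, c=1.0, x0=1.0, seed=0):
    """Simulate an autoregressive gamma (ARG) process in the spirit of
    Gouriéroux and Jasiak: X_{t+1} | X_t is non-central gamma, generated
    here by Poisson mixing. Requires beta * c < 1 for stationarity."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        z = rng.poisson(beta * x[t - 1])   # Poisson mixing variable
        x[t] = c * rng.gamma(delta + z)    # gamma draw with shape delta + z
    return x

durations = simulate_arg(5000)
# Conditional mean is c * (delta + beta * X_t), so the stationary mean is
# c * delta / (1 - beta * c) = 3.75 with these parameters.
```

The persistence is governed by `beta * c` (here 0.6), which plays the role of the first-order autocorrelation; durations are positive by construction, as trade and volume durations must be.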

Relevance: 30.00%

Abstract:

The predictive potential of six selected factors was assessed in 72 patients with primary myelodysplastic syndrome using univariate and multivariate logistic regression analysis of survival at 18 months. Factors were age (above the median of 69 years), dysplastic features in the three myeloid bone marrow cell lineages, presence of chromosome defects, all metaphases abnormal, double or complex chromosome defects (C23), and a Bournemouth score of 2, 3, or 4 (B234). In the multivariate approach, B234 and C23 proved to be significantly associated with a reduction in survival probability. The similarity of the regression coefficients associated with these two factors means that they have about the same weight. Consequently, the model was simplified by counting the number of factors (0, 1, or 2) present in each patient, thus generating a scoring system called the Lausanne-Bournemouth score (LB score). The LB score combines the well-recognized and easy-to-use Bournemouth score (B score) with chromosome defect complexity, C23 constituting an additional indicator of patient outcome. The predicted risk of death within 18 months calculated from the model is as follows: 7.1% (confidence interval: 1.7-24.8) for patients with an LB score of 0, 60.1% (44.7-73.8) for an LB score of 1, and 96.8% (84.5-99.4) for an LB score of 2. The scoring system presented here has several interesting features. The LB score may improve the predictive value of the B score, as it is able to recognize two prognostic groups in the intermediate-risk category of patients with B scores of 2 or 3. It also has the ability to identify two distinct prognostic subclasses among RAEB and possibly CMML patients. In addition to its usefulness in prognostic evaluation, the LB score may bring new insights into the understanding of evolution patterns in MDS.
We used the combination of the B score and chromosome complexity to define four classes, which may be considered four possible states of myelodysplasia and which describe two distinct evolutionary pathways.
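The LB scoring rule described above is simple enough to state as code: count which of the two adverse factors are present and look up the predicted 18-month risk. The function name and interface are illustrative; the risk values are those reported in the abstract:

```python
def lausanne_bournemouth(b234, c23):
    """LB score: number of adverse factors present (0, 1 or 2).

    b234 : True if the Bournemouth score is 2, 3 or 4.
    c23  : True if double or complex chromosome defects are present.
    Returns (score, predicted risk of death within 18 months, in percent).
    """
    risk_at = {0: 7.1, 1: 60.1, 2: 96.8}   # point estimates; CIs are wide
    score = int(bool(b234)) + int(bool(c23))
    return score, risk_at[score]

score, risk = lausanne_bournemouth(b234=True, c23=False)   # (1, 60.1)
```

Because B234 and C23 carry roughly equal regression weights, simple counting loses little information relative to the full logistic model, which is what motivated the simplification.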

Relevance: 30.00%

Abstract:

The criterion, based on thermodynamic theory, that the climate system tends to extremize some function has motivated several studies. In particular, special attention has been devoted to the possibility that the climate reaches an extremal rate of planetary entropy production. Because both radiative and material effects contribute to total planetary entropy production, climatic simulations obtained at the extremal rates of total, radiative or material entropy production are of interest in order to elucidate which of the three extremal assumptions behaves most similarly to current data. In the present paper, these results have been obtained by applying a 2-dimensional (2-D) horizontal energy balance box model with a few independent variables (surface temperature, cloud cover and material heat fluxes). In addition, climatic simulations for current conditions assuming a fixed cloud cover have been obtained. Finally, sensitivity analyses for both variable- and fixed-cloud models have been carried out.

Relevance: 30.00%

Abstract:

The purpose of this paper is to study the diffusion and transformation of scientific information in everyday discussions. Based on rumour models and social representations theory, the impact of interpersonal communication and pre-existing beliefs on transmission of the content of a scientific discovery was analysed. In three experiments, a communication chain was simulated to investigate how laypeople make sense of a genetic discovery first published in a scientific outlet, then reported in a mainstream newspaper and finally discussed in groups. Study 1 (N=40) demonstrated a transformation of information when the scientific discovery moved along the communication chain. During successive narratives, scientific expert terminology disappeared while scientific information associated with lay terminology persisted. Moreover, the idea of a discovery of a faithfulness gene emerged. Study 2 (N=70) revealed that transmission of the scientific message varied as a function of attitudes towards genetic explanations of behaviour (pro-genetics vs. anti-genetics). Pro-genetics employed more scientific terminology than anti-genetics. Study 3 (N=75) showed that endorsement of genetic explanations was related to descriptive accounts of the scientific information, whereas rejection of genetic explanations was related to evaluative accounts of the information.

Relevance: 30.00%

Abstract:

Contamination of weather radar echoes by anomalous propagation (anaprop) mechanisms remains a serious issue in quality control of radar precipitation estimates. Although significant progress has been made in identifying clutter due to anaprop, there is no unique method that solves the question of data reliability without removing genuine data. The work described here relates to the development of a software application that uses a numerical weather prediction (NWP) model to obtain the temperature, humidity and pressure fields to calculate the three-dimensional structure of the atmospheric refractive index, from which a physically based prediction of the incidence of clutter can be made. This technique can be used in conjunction with existing methods for clutter removal by modifying parameters of detectors or filters according to the physical evidence for anomalous propagation conditions. The parabolic equation method (PEM) is a well-established technique for solving the equations for beam propagation in a non-uniformly stratified atmosphere, but although intrinsically very efficient, it is not sufficiently fast to be practicable for near real-time modelling of clutter over the entire area observed by a typical weather radar. We demonstrate a fast hybrid PEM technique that is capable of providing acceptable results in conjunction with a high-resolution terrain elevation model, using a standard desktop personal computer. We discuss the performance of the method and approaches for the improvement of the model profiles in the lowest levels of the troposphere.
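The refractive-index fields at the heart of this approach are typically derived from the NWP pressure, temperature and humidity fields via the standard radio-refractivity formula, with anaprop-prone layers diagnosed where the modified refractivity decreases with height. A minimal sketch (the refractivity expression is the standard ITU-style formula; the helper names are illustrative):

```python
def refractivity(p_hpa, t_kelvin, e_hpa):
    """Radio refractivity N from total pressure (hPa), temperature (K)
    and water-vapour partial pressure (hPa) - the fields an NWP model
    would supply at each grid point."""
    return 77.6 / t_kelvin * (p_hpa + 4810.0 * e_hpa / t_kelvin)

def modified_refractivity(n, height_m, earth_radius_m=6.371e6):
    """Modified refractivity M = N + 1e6 * z / a accounts for Earth
    curvature; where M decreases with height the beam can be trapped
    (ducting), a typical cause of anaprop clutter."""
    return n + 1e6 * height_m / earth_radius_m

def ducting_layers(heights_m, n_values):
    """Flag layers whose M gradient is negative (illustrative helper)."""
    m = [modified_refractivity(n, z) for n, z in zip(n_values, heights_m)]
    return [m[i + 1] < m[i] for i in range(len(m) - 1)]

# A sharp humidity drop with height makes N fall fast enough to duct:
duct = ducting_layers([0.0, 100.0], [330.0, 300.0])   # [True]
```

A full PEM solver would propagate the beam through these profiles; the gradient test above is only the cheap screening step that tells the clutter filters where anomalous propagation is physically plausible.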