Resumo:
This paper presents and estimates a dynamic choice model in the attribute space considering rational consumers. In light of the evidence of several state-dependence patterns, the standard attribute-based model is extended by considering a general utility function where pure inertia and pure variety-seeking behaviors can be explained in the model as particular linear cases. The dynamics of the model are fully characterized by standard dynamic programming techniques. The model presents a stationary consumption pattern that can be inertial, where the consumer only buys one product, or a variety-seeking one, where the consumer shifts among varied products. We run some simulations to analyze the consumption paths out of the steady state. Under the hybrid utility assumption, the consumer behaves inertially among the unfamiliar brands for several periods, eventually switching to a variety-seeking behavior when the stationary levels are approached. An empirical analysis is run using scanner databases for three different product categories: fabric softener, saltine cracker, and catsup. Non-linear specifications provide the best fit of the data, as hybrid functional forms are found in all the product categories for most attributes and segments. These results reveal the statistical superiority of the non-linear structure and confirm the gradual trend to seek variety as the level of familiarity with the purchased items increases.
Resumo:
This paper argues that the strategic use of debt favours the revelation of information in dynamic adverse selection problems. Our argument is based on the idea that debt is a credible commitment to end long-term relationships. Consequently, debt encourages a privately informed party to disclose its information at early stages of a relationship. We illustrate our point with the financing decision of a monopolist selling a good to a buyer whose valuation is private information. A high level of (renegotiable) debt, by increasing the scope for liquidation, may induce the high-valuation buyer to buy early at a high price and thus increase the monopolist's expected payoff. By affecting the buyer's strategy, it may reduce the probability of excessive liquidation. We investigate the consequences of good durability and we examine the way debt may alleviate the ratchet effect.
Resumo:
The interaction of a parasite and a host cell is a complex process, which involves several steps: (1) attachment to the plasma membrane, (2) entry into the host cell, and (3) hijacking of the metabolism of the host. In biochemical experiments, only an event averaged over the whole cell population can be analyzed. The power of microscopy, however, is to investigate individual events in individual cells. Therefore, parasitologists frequently perform experiments with fluorescence microscopy using different dyes to label structures of the parasite or the host cell. Though the resolution of light microscopy has greatly improved, it is not sufficient to reveal interactions at the ultrastructural level. Furthermore, only specifically labeled structures can be seen and related to each other. Here, we want to demonstrate the additional value of electron microscopy in this area of research. Investigation of the different steps of parasite-host cell interaction by electron microscopy, however, is often hampered by the fact that only a few cells are infected, making it difficult to find enough cells to study. A solution is to combine the low magnification (and hence large overview) and the specific localization of the players by fluorescence labels in a light microscope with the high resolution and structural information provided by an electron microscope, in short, correlative light and electron microscopy.
Resumo:
This dissertation is an exploratory, descriptive case study. Its aim is to provide a sociolinguistic portrait of Cabo Verde, focused in particular on the current use of the two languages spoken in the archipelago, Portuguese (PCV) and Cape Verdean Creole (LCV). The line of research adopted draws heavily on macro-sociolinguistic studies, taking the language-contact situation that characterizes Cape Verdean society as the starting point for framing a set of questions selected for deeper investigation. The processes involved in this specific contact situation and its linguistic outcomes, such as bilingualism or diglossia, are thus explored (cf. Chap. 3). Drawing on theoretical contributions from related and complementary fields (cf. Chap. 1), the study highlights the importance of analysing the domains in which each of the languages is used, the speakers' social networks, and their language attitudes. The investigation is based on data collected for this purpose on the nine inhabited islands. The units of analysis correspond to a sample of two distinct social groups: young speakers, secondary-school students (surveyed by questionnaire), and adult speakers whose profession entails intense linguistic activity (teachers and 'leaders', surveyed through semi-structured interviews). Data collection followed as rigorous a methodology as possible, and the data were treated statistically (cf. Chap. 2). Comparing the linguistic behaviour and attitudes of the two generations surveyed, with their different characteristics (cf. Introduction), provided important information on the linguistic dynamics of Cape Verdean society, conclusions that will be important for setting guidelines in the field of language policy (cf. Chap. 5).
As a complement, an exploratory analysis of some syntactic features attested in the productions of the university-educated respondents is presented (cf. Chap. 4), a contribution, however modest, to the definition of the standard variety of PCV.
Resumo:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made where the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
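The power-transformation idea in this abstract can be sketched numerically. The following is a toy illustration under my own assumptions (it is not the presenter's actual software): as the power parameter alpha shrinks toward zero, a Box-Cox-style transform tends to the logarithm, so recomputing an SVD-based map "frame by frame" over a grid of alpha values moves a principal-component view of compositional data smoothly toward a logratio analysis.

```python
import numpy as np

def power_transform(X, alpha):
    """Box-Cox-style power transform; tends to log(X) as alpha -> 0."""
    if alpha == 0:
        return np.log(X)
    return (X ** alpha - 1.0) / alpha

# Toy compositional data: each row is a composition summing to 1.
X = np.array([[0.2, 0.3, 0.5],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

# "Frame by frame": recompute the map as the linking parameter varies.
for alpha in (1.0, 0.5, 0.25, 0.0):
    Z = power_transform(X, alpha)
    Z = Z - Z.mean(axis=0)            # column-centre before the SVD
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    coords = U * s                    # principal coordinates of the rows
```

At alpha = 1 the map is an (uncentred-then-centred) PCA of the raw compositions; at alpha = 0 it is a PCA of the log-transformed data, i.e. an unweighted logratio-style analysis, and intermediate alphas interpolate between the two.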
Resumo:
We study the interaction between insurance and capital markets within a single but general framework. We show that capital markets greatly enhance the risk-sharing capacity of insurance markets and the scope of risks that are insurable, because efficiency does not depend on the number of agents at risk, nor on risks being independent, nor on the preferences and endowments of agents at risk being the same. We show that agents share risks by buying full coverage for their individual risks and provide insurance capital through stock markets. We show that aggregate risk enters private insurance as a positive loading on insurance prices and that, despite this, agents will buy full coverage. The loading is determined by the risk premium of investors in the stock market and hence does not depend on the agents' willingness to pay. Agents provide insurance capital by trading an equally weighted portfolio of insurance company shares and the riskless asset. We are able to construct agents' optimal trading strategies explicitly and for very general preferences.
Resumo:
The well-known Minkowski $?(x)$ function is presented as the asymptotic distribution function of an enumeration of the rationals in (0,1] based on their continued fraction representation. Moreover, the singularity of $?(x)$ is clearly proved in two ways: by exhibiting a set of measure one on which $?'(x) = 0$, and again by actually finding a set of measure one which is mapped onto a set of measure zero and vice versa. These sets are described by means of metrical properties of different systems for real number representation.
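The link between $?(x)$ and continued fractions mentioned in the abstract rests on the classical series $?([0; a_1, a_2, \dots]) = 2\sum_{k \ge 1} (-1)^{k+1} 2^{-(a_1+\dots+a_k)}$. As an illustration (not taken from the paper; the function name is my own), a minimal Python sketch that evaluates $?(x)$ for a rational x directly from its partial quotients:

```python
from fractions import Fraction

def question_mark(x, max_terms=64):
    """Minkowski's ?(x), computed from the continued fraction of x.

    Uses ?([0; a1, a2, ...]) = 2 * sum_{k>=1} (-1)^(k+1) * 2^(-(a1+...+ak)).
    Exact rational input (fractions.Fraction) gives a finite expansion.
    """
    x = Fraction(x)
    frac = x - int(x)            # work with the fractional part
    result = float(int(x))       # ?(x + n) = ?(x) + n for integer n
    sign = 1.0
    exponent = 0
    while frac != 0 and exponent < max_terms:
        frac = 1 / frac          # next step of the continued fraction
        a = int(frac)            # partial quotient a_k
        frac -= a
        exponent += a
        result += sign * 2.0 ** (1 - exponent)   # 2 * 2^(-(a1+...+ak))
        sign = -sign
    return result
```

For example, 1/3 = [0; 3] gives ?(1/3) = 2 * 2^(-3) = 1/4, and 2/3 = [0; 1, 2] gives ?(2/3) = 2 * (2^(-1) - 2^(-3)) = 3/4, matching the known dyadic values of the function at rationals.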
Resumo:
In models where privately informed agents interact, agents may need to form higher order expectations, i.e. expectations of other agents' expectations. This paper develops a tractable framework for solving and analyzing linear dynamic rational expectations models in which privately informed agents form higher order expectations. The framework is used to demonstrate that the well-known problem of the infinite regress of expectations identified by Townsend (1983) can be approximated to an arbitrary accuracy with a finite-dimensional representation under quite general conditions. The paper is constructive and presents a fixed point algorithm for finding an accurate solution and provides weak conditions that ensure that a fixed point exists. To help intuition, Singleton's (1987) asset pricing model with disparately informed traders is used as a vehicle for the paper.
Resumo:
Game theory is a branch of applied mathematics used to analyze situations where two or more agents are interacting. Originally it was developed as a model for conflicts and collaborations between rational and intelligent individuals. Now it finds applications in social sciences, economics, biology (particularly evolutionary biology and ecology), engineering, political science, international relations, computer science, and philosophy. Networks are an abstract representation of interactions, dependencies or relationships. Networks are extensively used in all the fields mentioned above and in many more. Much useful information about a system can be discovered by analyzing the current state of a network representation of that system. In this work we will apply some of the methods of game theory to populations of agents that are interconnected. A population is in fact represented by a network of players, where one player can only interact with another if there is a connection between them. In the first part of this work we will show that the structure of the underlying network has a strong influence on the strategies that the players will decide to adopt to maximize their utility. We will then introduce a supplementary degree of freedom by allowing the structure of the population to be modified during the simulations. This modification allows the players to modify the structure of their environment to optimize the utility that they can obtain.
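The kind of dynamics described above can be sketched in a few lines. The payoff values, the imitation rule and the ring network below are my own assumptions for illustration, not the thesis's actual model: each player plays a Prisoner's Dilemma against every neighbour and then copies the strategy of its best-scoring neighbour (including itself).

```python
import random

# Payoffs for the row player in a one-shot Prisoner's Dilemma:
# strategy 0 = cooperate, 1 = defect (assumed values: T=5, R=3, P=1, S=0).
PAYOFF = {(0, 0): 3, (0, 1): 0, (1, 0): 5, (1, 1): 1}

def play_round(graph, strategies):
    """Total payoff of each player against all of its neighbours."""
    return {v: sum(PAYOFF[(strategies[v], strategies[u])] for u in graph[v])
            for v in graph}

def imitate_best_neighbour(graph, strategies, payoffs):
    """Each player adopts the strategy of its best-scoring neighbour or keeps its own."""
    new = {}
    for v in graph:
        candidates = [v] + list(graph[v])
        best = max(candidates, key=lambda u: payoffs[u])
        new[v] = strategies[best]
    return new

# A small ring network: each player interacts only with its two neighbours.
n = 10
graph = {v: [(v - 1) % n, (v + 1) % n] for v in range(n)}
random.seed(1)
strategies = {v: random.randint(0, 1) for v in range(n)}

for _ in range(20):  # iterate the update rule until the profile stabilises
    payoffs = play_round(graph, strategies)
    strategies = imitate_best_neighbour(graph, strategies, payoffs)
```

Replacing the ring with another adjacency dictionary (a lattice, a scale-free graph, ...) is enough to explore how the network structure shapes the surviving strategies, which is the first question the thesis addresses; rewiring `graph` between rounds adds the coevolving-structure degree of freedom.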
Resumo:
Albitization is a common process during which hydrothermal fluids convert plagioclase and/or K-feldspar into nearly pure albite; however, its specific mechanism in granitoids is not well understood. The c. 1700 Ma A-type metaluminous ferroan granites in the Khetri complex of Rajasthan, NW India, have been albitized to a large extent by two metasomatic fronts, an initial transformation of oligoclase to nearly pure albite and a subsequent replacement of microcline by albite, with sharp contacts between the microcline-bearing and microcline-free zones. Albitization has bleached the original pinkish grey granite and turned it white. The mineralogical changes include transformation of oligoclase (~An12) and microcline (~Or95) to almost pure albite (~An0.5-2), amphibole from potassian ferropargasite (X_Fe 0.84-0.86) to potassic hastingsite (X_Fe 0.88-0.97) and actinolite (X_Fe 0.32-0.67), and biotite from annite (X_Fe 0.71-0.74) to annite (X_Fe 0.90-0.91). Whole-rock isocon diagrams show that, during albitization, the granites experienced major hydration, a slight gain in Si and a major gain in Na, whereas K, Mg, Fe and Ca were lost along with Rb, Ba, Sr, Zn, light rare earth elements and U. Whole-rock Sm-Nd isotope data plot on an apparent isochron of 1419 ± 98 Ma and reveal significant disturbance and at least partial resetting of the intrusion age. Severe scatter in the whole-rock Rb-Sr isochron plot reflects the extreme Rb loss in the completely albitized samples, effectively freezing 87Sr/86Sr ratios in the albite granites at very high values (0.725-0.735). This indicates either infiltration of highly radiogenic Sr from the country rock or, more likely, radiogenic ingrowth during a considerable time lag (estimated to be at least 300 Myr) between original intrusion and albitization.
The albitization took place at ~350-400 °C. It was caused by the infiltration of an ascending hydrothermal fluid that had acquired high Na/K and Na/Ca ratios during migration through metamorphic rocks at even lower temperatures in the periphery of the plutons. Oxygen isotope ratios increase from δ18O = 7‰ in the original granite to values of 9-10‰ in completely albitized samples, suggesting that the fluid had equilibrated with surrounding metamorphosed crust. A metasomatic model, using chromatographic theory of fluid infiltration, explains the process for generating the observed zonation in terms of a leading metasomatic front where oligoclase of the original granite is converted to albite, and a second, trailing front where microcline is also converted to albite. The temperature gradients driving the fluid infiltration may have been produced by the high heat production of the granites themselves. The confinement of the albitized granites along the NE-SW-trending Khetri lineament and the pervasive nature of the albitization suggest that the albitizing fluids possibly originated during reactivation of the lineament. More generally, steady-state temperature gradients induced by the high internal heat production of A-type granites may provide the driving force for similar metasomatic and ore-forming processes in other highly enriched granitoid bodies.
Resumo:
Purpose: To investigate how prior-to-injury and usual alcohol consumption relate to time of injury. Patients and methods: The associations between injury time of day and day of week and prior-to-injury (labeled as "acute") alcohol intake and hazardous usual alcohol consumption (considered from the point of view of both heavy episodic drinking [HED] and risky volumes of consumption) are assessed using interview data from a randomized sample of 486 injured patients treated in a Swiss emergency department (ED; Lausanne University Hospital). Results: Acute consumption was associated with both injury time of day and day of week, HED with day of week only, and risky volume with none. Conclusions: Acute consumption and HED, but not risky volume of consumption, show specific time distributions for injuries. These findings highlight the potential importance of considering the time dimension of an injury when providing emergency care and have additional implications for interventions aimed at influencing the alcohol consumption of injured patients presenting to the ED.
Resumo:
Acute brain slices are slices of brain tissue that are kept vital in vitro for further recordings and analyses. This tool is of major importance in neurobiology and allows the study of brain cells such as microglia, astrocytes, neurons and their inter/intracellular communications via ion channels or transporters. In combination with light/fluorescence microscopies, acute brain slices enable the ex vivo analysis of specific cells or groups of cells inside the slice, e.g. astrocytes. To bridge ex vivo knowledge of a cell with its ultrastructure, we developed a correlative microscopy approach for acute brain slices. The workflow begins with sampling of the tissue and precise trimming of a region of interest, which contains GFP-tagged astrocytes that can be visualised by fluorescence microscopy of ultrathin sections. The astrocytes and their surroundings are then analysed by high resolution scanning transmission electron microscopy (STEM). An important aspect of this workflow is the modification of a commercial cryo-ultramicrotome to observe the fluorescent GFP signal during the trimming process. It ensured that sections contained at least one GFP astrocyte. After cryo-sectioning, a map of the GFP-expressing astrocytes is established and transferred to correlation software installed on a focused ion beam scanning electron microscope equipped with a STEM detector. Next, the areas displaying fluorescence are selected for high resolution STEM imaging. An overview area (e.g. a whole mesh of the grid) is imaged with an automated tiling and stitching process. In the final stitched image, the local organisation of the brain tissue can be surveyed or areas of interest can be magnified to observe fine details, e.g. vesicles or gold labels on specific proteins. The robustness of this workflow is contingent on the quality of sample preparation, based on Tokuyasu's protocol. 
This method results in a reasonable compromise between preservation of morphology and maintenance of antigenicity. Finally, an important feature of this approach is that the fluorescence of the GFP signal is preserved throughout the entire preparation process until the last step before electron microscopy.
Resumo:
This paper extends existing insurance results on the type of insurance contracts needed for insurance market efficiency to a dynamic setting. It introduces continuously open markets that allow for more efficient asset allocation. It also eliminates the role of preferences and endowments in the classification of risks, which is done primarily in terms of the actuarial properties of the underlying risk process. The paper further extends insurability to include correlated and catastrophic events. Under these very general conditions the paper defines a condition that determines whether a small number of standard insurance contracts (together with aggregate assets) suffices to complete markets or one needs to introduce such assets as mutual insurance.
Resumo:
We incorporate the process of enforcement learning by assuming that the agency's current marginal cost is a decreasing function of its past experience of detecting and convicting. The agency accumulates data and information (on criminals, on opportunities of crime) enhancing the ability to apprehend in the future at a lower marginal cost. We focus on the impact of enforcement learning on optimal stationary compliance rules. In particular, we show that the optimal stationary fine could be less-than-maximal and the optimal stationary probability of detection could be higher-than-otherwise.