890 results for Exponential isotropy


Relevance:

10.00%

Abstract:

The classical theory of collision induced emission (CIE) from pairs of dissimilar rare gas atoms was developed in Paper I [D. Reguera and G. Birnbaum, J. Chem. Phys. 125, 184304 (2006)] from a knowledge of the straight-line collision trajectory and the assumption that the magnitude of the dipole could be represented by an exponential function of the internuclear distance. This theory is extended here to deal with other functional forms of the induced dipole as revealed by ab initio calculations. Accurate analytical expressions for the CIE can be obtained by least-squares fitting of the ab initio values of the dipole as a function of interatomic separation using a sum of exponentials and then proceeding as in Paper I. However, we also show how the multi-exponential fit can be replaced by a simpler fit using only two analytic functions. Our analysis is applied to the polar molecules HF and HBr. Unlike the rare gas atoms considered previously, these atomic pairs form stable bound diatomic molecules. We show that, interestingly, the spectra of these reactive molecules are characterized by the presence of multiple peaks. We also discuss the CIE arising from half collisions in excited electronic states, which in principle could be probed in photo-dissociation experiments.
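
The multi-exponential fitting step described above can be sketched as follows. This is a minimal illustration with synthetic dipole values standing in for the ab initio data; the coefficients and separations are assumptions, not the paper's values.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic dipole magnitudes mu(R) at internuclear separations R,
# standing in for ab initio points (coefficients are illustrative).
R = np.linspace(1.5, 6.0, 40)
mu_true = 0.8 * np.exp(-1.2 * R) + 0.3 * np.exp(-0.4 * R)
rng = np.random.default_rng(0)
mu_data = mu_true + rng.normal(0.0, 1e-4, R.size)

def dipole_biexp(R, a1, b1, a2, b2):
    """Sum of two decaying exponentials: mu(R) = a1*exp(-b1*R) + a2*exp(-b2*R)."""
    return a1 * np.exp(-b1 * R) + a2 * np.exp(-b2 * R)

# Least-squares fit of the sum-of-exponentials form to the sampled dipole.
popt, _ = curve_fit(dipole_biexp, R, mu_data, p0=(1.0, 1.0, 0.1, 0.5))
residual = np.max(np.abs(dipole_biexp(R, *popt) - mu_data))
```

With the fitted coefficients in hand, the analytical CIE expressions of Paper I can be applied term by term to each exponential.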

Relevance:

10.00%

Abstract:

Recent mineralogical studies on archaeological pottery samples report significant variations in alkali metal concentrations due to environmental alterations during burial. Here we examine the effects of potassium (K) leaching on luminescence dating. The effect on the estimation of the dose rate is studied by considering four models of leaching (exponential, linear, early, and late), and their impact on fine- and coarse-grain dating is calculated. The modeling approaches are applied to two cases of pottery in which evidence for alteration was found. Additionally, TL dating performed on pottery from one of the studied cases indicates the importance of leaching effects on absolute dating measurements.
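
Because the dose rate scales with the K content averaged over the burial period, the four leaching histories give different mean doses. The sketch below illustrates this with assumed functional forms and assumed initial/final K concentrations; the model names follow the abstract, but the exact forms used in the study are not given there.

```python
import numpy as np

# Hypothetical initial and present-day K concentrations (wt. %).
K0, Kf = 2.0, 1.0
# Burial time normalized to [0, 1] on a uniform grid.
t = np.linspace(0.0, 1.0, 100_001)

# Assumed leaching histories K(t) for the four named models.
histories = {
    "exponential": K0 * (Kf / K0) ** t,        # smooth exponential decay to Kf
    "linear":      K0 + (Kf - K0) * t,         # steady loss over burial
    "early":       np.where(t < 0.05, K0, Kf), # loss right after burial
    "late":        np.where(t < 0.95, K0, Kf), # loss only in the recent past
}

# Time-averaged K content (uniform grid, so the mean equals the integral).
mean_K = {name: K.mean() for name, K in histories.items()}
```

Early leaching gives the lowest mean K (and hence the lowest inferred dose rate), late leaching the highest, with the exponential and linear models in between.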

Relevance:

10.00%

Abstract:

BACKGROUND: Shared Decision Making (SDM) is increasingly advocated as a model for medical decision making. However, there is still low use of SDM in clinical practice. High-impact-factor journals might represent an efficient way for its dissemination. We aimed to identify and characterize publication trends of SDM in 15 high-impact medical journals. METHODS: We selected the 15 general and internal medicine journals with the highest impact factor publishing original articles, letters, and editorials. We retrieved publications from 1996 to 2011 through the full-text search function on each journal website and abstracted bibliometric data. We included publications of any type containing the phrase "shared decision making" or five other variants in their abstract or full text. These were referred to as SDM publications. A polynomial Poisson regression model with logarithmic link function was used to assess the evolution across the period of the number of SDM publications according to publication characteristics. RESULTS: We identified 1285 SDM publications out of 229,179 publications in 15 journals from 1996 to 2011. The absolute number of SDM publications by journal ranged from 2 to 273 over 16 years. SDM publications increased both in absolute and relative numbers per year, from 46 (0.32% relative to all publications from the 15 journals) in 1996 to 165 (1.17%) in 2011. This growth was exponential (P < 0.01). We found fewer research publications (465, 36.2% of all SDM publications) than non-research publications, which included non-systematic reviews, letters, and editorials. The increase of research publications across time was linear. Full-text search retrieved ten times more SDM publications than a similar PubMed search (1285 vs. 119, respectively). CONCLUSION: This full-text review showed that SDM publications increased exponentially in major medical journals from 1996 to 2011. This growth might reflect an increased dissemination of the SDM concept to the medical community.
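
The reported exponential trend can be illustrated with a simplified stand-in for the paper's polynomial Poisson regression: a least-squares fit of a log-linear model to yearly counts. The counts below are a smooth synthetic series through the two endpoints given in the abstract (46 in 1996, 165 in 2011), not the actual dataset.

```python
import numpy as np

# Synthetic yearly SDM publication counts interpolating 46 -> 165
# exponentially over 1996-2011 (illustrative, not the real data).
years = np.arange(1996, 2012)
counts = 46.0 * (165.0 / 46.0) ** ((years - 1996) / 15.0)

# Log-linear least-squares fit: log(count) = slope * year + intercept.
slope, intercept = np.polyfit(years - 1996, np.log(counts), 1)
annual_growth_pct = 100.0 * (np.exp(slope) - 1.0)
```

For these endpoints the implied average growth is roughly 9% per year; a Poisson regression with log link, as used in the paper, would fit the same log-linear form while modeling the counts as Poisson-distributed.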

Relevance:

10.00%

Abstract:

Business actions do not take place in isolation. In an era of exponential knowledge growth, complementary competencies and capabilities are the most important resources, and these resources are partly accessed via business partners. A company needs partners and the capability to cooperate, but also awareness of competitive tension when operating in a market with multiple actors. Co-opetition research studies the occurrence and forms of simultaneous cooperation and competition between companies or their units. In recent years, governmental and municipal bodies in the public sector have been transformed into companies. Despite their non-profit nature, the public sector and public companies are adopting business doctrines from the private sector in pursuit of efficient operations. This case study aims to show how the co-opetition concept can be observed within public sector companies and in their operations with others, how public companies cooperate but also compete with others, and why this happens. The thesis also explicates the advantages and disadvantages of the co-opetition phenomenon.

Relevance:

10.00%

Abstract:

The most suitable method for estimating size diversity is investigated. Size diversity is computed on the basis of the Shannon diversity expression adapted for continuous variables such as size. It takes the form of an integral involving the probability density function (pdf) of the size of the individuals. Different approaches for the estimation of the pdf are compared: parametric methods, which assume that the data come from a particular family of pdfs, and nonparametric methods, where the pdf is estimated by some kind of local evaluation. Exponential, generalized Pareto, normal, and log-normal distributions have been used to generate simulated samples using parameters estimated from real samples. Nonparametric methods include discrete computation of data histograms based on size intervals and continuous kernel estimation of the pdf. The kernel approach gives an accurate estimation of size diversity, whilst parametric methods are only useful when the reference distribution has a shape similar to the real one. Special attention is given to data standardization. Division of the data by the sample geometric mean is proposed as the most suitable standardization method, which has additional advantages: the same size diversity value is obtained whether original sizes or log-transformed data are used, and size measurements with different dimensionality (lengths, areas, volumes, or biomasses) may be compared directly by simply adding ln k, where k is the dimensionality (1, 2, or 3, respectively). Thus, kernel estimation, after standardizing the data by division by the sample geometric mean, emerges as the most reliable and generalizable method of size diversity evaluation.
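
The recommended procedure can be sketched as follows: standardize sizes by the sample geometric mean, fit a Gaussian kernel density estimate, and evaluate the continuous Shannon diversity H = -∫ p(x) ln p(x) dx numerically. The sizes below are synthetic; the kernel bandwidth is scipy's default, which is one of several reasonable choices.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Synthetic size sample (log-normal, a shape typical of size data).
rng = np.random.default_rng(1)
sizes = rng.lognormal(mean=1.0, sigma=0.6, size=500)

# Standardize by the sample geometric mean, as proposed above.
gm = np.exp(np.mean(np.log(sizes)))
x = sizes / gm

# Kernel estimate of the pdf and numerical Shannon integral.
kde = gaussian_kde(x)
grid = np.linspace(x.min() * 0.5, x.max() * 1.5, 4000)
p = np.clip(kde(grid), 1e-300, None)  # avoid log(0) in empty regions
H = -np.sum(p * np.log(p)) * (grid[1] - grid[0])
```

After geometric-mean standardization the mean of ln x is zero, which is why the same H is obtained from original or log-transformed sizes (the two differential entropies differ exactly by E[ln x]).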

Relevance:

10.00%

Abstract:

The fission yeast Schizosaccharomyces pombe has been an invaluable model system for studying the regulation of mitotic cell cycle progression, the mechanics of cell division, and cell polarity. Furthermore, classical experiments on its sexual reproduction have yielded results pivotal to the current understanding of DNA recombination and meiosis. More recent analysis of fission yeast mating has raised interesting questions about extrinsic stimulus response mechanisms, polarized cell growth, and cell-cell fusion. To study these topics in detail, we have developed a simple protocol for microscopy of the entire sexual lifecycle. The method described here is easily adjusted to study specific mating stages. Briefly, after being grown to exponential phase in a nitrogen-rich medium, cell cultures are shifted to a nitrogen-deprived medium for periods of time suited to the stage of the sexual lifecycle to be explored. Cells are then mounted on custom, easily built agarose-pad chambers for imaging. This approach allows cells to be monitored from the onset of mating to the final formation of spores.

Relevance:

10.00%

Abstract:

A model was created to solve heat and mass balances during off-design load calculations. These equations are complex and nonlinear. The main new ideas used in the off-design model of a kraft recovery boiler are: using heat flows as the torn iteration variables instead of the current practice of using mass flows; vectorizing the equation solving, thus speeding up the process; using nondimensional variables to solve the multiple heat transfer surface problem; and using a new procedure for calculating pressure losses. The recovery boiler heat and mass balances are reduced to vector form, and it is shown that these vectorized equations can be solved virtually without iteration. The iteration speed is enhanced by the derived method of calculating multiple heat transfer surfaces simultaneously; to achieve this quick convergence, the heat flows were used as the torn iteration parameters. A new method to handle pressure loss calculations with linearization is presented, which reduces the time spent on pressure loss calculations. The derived vector representation of the steam generator was used to calculate off-design operating parameters for a 3000 tds/d example recovery boiler. The model was used to study recovery boiler part-load operation and the effect of increased black liquor dry solids on recovery boiler dimensioning. Heat flows to surface elements in part-load calculations can be closely approximated with a previously defined exponent function, and the exponential method can be used for the prediction of fouling in kraft recovery boilers. For similar furnaces, firing 80% dry solids liquor produces a lower hearth heat release rate than 65% dry solids liquor when firing at constant steam flow. The furnace outlet temperatures show that a capacity increase through a higher firing rate produces higher loadings than a capacity increase through higher dry solids. The economizers, boiler banks, and furnaces can be dimensioned smaller if the black liquor dry solids content is increased. The main problem with increased black liquor dry solids content is the decrease in the heat available for superheating. Whenever possible, the furnace exit temperature should be increased by decreasing the furnace height; this increase is usually opposed for fear of increased corrosion.
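
The exponent-function idea for part-load heat flows can be sketched as a power law in load fraction. This is a minimal illustration; the exponent value 0.65 and the design heat flow are assumptions for demonstration, not values from the thesis.

```python
def part_load_heat_flow(q_design_mw: float, load_fraction: float,
                        exponent: float = 0.65) -> float:
    """Approximate heat flow to a surface at part load as a power law:
    Q = Q_design * load**exponent (exponent value is illustrative)."""
    return q_design_mw * load_fraction ** exponent

q_full = part_load_heat_flow(100.0, 1.0)   # design point: 100 MW
q_half = part_load_heat_flow(100.0, 0.5)   # heat flow at 50 % load
```

Because the exponent is below one, the heat flow per unit of steam produced rises as load drops, which is the behavior such correlations are meant to capture.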

Relevance:

10.00%

Abstract:

Nowadays, everyone has an eye on the electricity market in our country. There is no doubt about the importance of the behavior of electricity demand. One of the peculiarities of the electricity we produce is that, as of today, there are still no sufficiently effective methods for storing it, at least in large quantities. Consequently, the quantity demanded and the quantity offered/produced must match almost perfectly. For these reasons, it is quite interesting to try to predict the future behavior of demand by studying a possible trend and/or seasonality. Digging deeper into the historical demand data, it is relatively easy to discover the strong influence that ambient temperature, working-day patterns, and economic activity have on the demand response. Once all this is clear, we can decide which method is best to apply to this type of time series. To this end, the most common analysis methods are presented and explained, highlighting their main characteristics as well as their applications. The methods on which this project focuses are smoothing and moving-average models. Finally, a relationship was sought between peninsular electricity demand and the final price we pay for electricity.
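
The two model families the project focuses on can be sketched in a few lines: a trailing moving average and simple exponential smoothing applied to a demand series. The series below is synthetic hourly demand with a daily cycle, not real peninsular data, and the smoothing constant is an illustrative choice.

```python
import numpy as np

# One synthetic week of hourly demand (MW): daily sinusoid plus noise.
rng = np.random.default_rng(2)
hours = np.arange(168)
demand = 25_000 + 3_000 * np.sin(hours * 2 * np.pi / 24) \
         + rng.normal(0, 300, 168)

def moving_average(y, window):
    """Trailing moving average over a fixed window."""
    return np.convolve(y, np.ones(window) / window, mode="valid")

def exponential_smoothing(y, alpha):
    """Simple exponential smoothing: s_t = alpha*y_t + (1-alpha)*s_{t-1}."""
    s = np.empty_like(y)
    s[0] = y[0]
    for t in range(1, len(y)):
        s[t] = alpha * y[t] + (1 - alpha) * s[t - 1]
    return s

ma = moving_average(demand, 24)              # 24 h window removes the daily cycle
smooth = exponential_smoothing(demand, 0.3)  # alpha = 0.3, illustrative
```

A 24-hour window averages out the daily seasonality, while the smoothed series follows the level with damped noise; both are the building blocks of the forecasting models discussed in the project.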

Relevance:

10.00%

Abstract:

In this paper, we obtain sharp asymptotic formulas with error estimates for the Mellin convolution of functions defined on (0, ∞), and use these formulas to characterize the asymptotic behavior of marginal distribution densities of stock price processes in mixed stochastic models. Special examples of mixed models are jump-diffusion models and stochastic volatility models with jumps. We apply our general results to the Heston model with double exponential jumps, and make a detailed analysis of the asymptotic behavior of the stock price density, the call option pricing function, and the implied volatility in this model. We also obtain similar results for the Heston model with jumps distributed according to the NIG law.
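
For reference, the Mellin convolution of two functions f and g on (0, ∞) is conventionally defined as (a standard definition, stated here for orientation; the paper's precise conventions may differ):

```latex
(f \star g)(x) = \int_0^\infty f\!\left(\frac{x}{y}\right) g(y)\, \frac{dy}{y}, \qquad x > 0.
```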

Relevance:

10.00%

Abstract:

RX J1826.2-1450/LS 5039 has recently been proposed to be a radio-emitting high-mass X-ray binary. In this paper, we present an analysis of its X-ray timing and spectroscopic properties using different instruments on board the RXTE satellite. The timing analysis indicates the absence of pulsed or periodic emission on time scales of 0.02-2000 s and 2-200 d, respectively. The source spectrum is well represented by a power-law model plus a Gaussian component describing a strong iron line at 6.6 keV. Significant emission is seen up to 30 keV, and no exponential cut-off at high energy is required. We also study the radio properties of the system according to the GBI-NASA Monitoring Program. RX J1826.2-1450/LS 5039 continues to display moderate radio variability with a clearly non-thermal spectral index. No strong radio outbursts have been detected after several months.
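
The spectral model described above (power law plus a Gaussian iron line at 6.6 keV) has a simple functional form. The sketch below uses illustrative normalizations, photon index, and line width, not the fitted RXTE parameters.

```python
import numpy as np

def model_spectrum(E_keV, norm=1.0, photon_index=1.8,
                   line_norm=0.1, line_E=6.6, line_sigma=0.3):
    """Power law plus Gaussian line (parameter values are illustrative)."""
    powerlaw = norm * E_keV ** (-photon_index)
    gauss = line_norm * np.exp(-0.5 * ((E_keV - line_E) / line_sigma) ** 2)
    return powerlaw + gauss

# Evaluate over the 3-30 keV band where significant emission is reported.
E = np.linspace(3.0, 30.0, 500)
flux = model_spectrum(E)
```

With no exponential cut-off term, the power law continues unbroken to the top of the band, consistent with the reported spectrum up to 30 keV.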

Relevance:

10.00%

Abstract:

Social, technological, and economic time series are punctuated by events that are usually assumed to be random, albeit with some hierarchical structure. It is well known that the interevent statistics observed in these contexts differ from the Poissonian profile by being long-tailed, with resting and active periods interwoven. Understanding the mechanisms that generate consistent statistics has therefore become a central issue. The approach we present is taken from the continuous-time random-walk formalism and represents an analytical alternative to the models of nontrivial priority that have recently been proposed. Our analysis also goes one step further by looking at the multifractal structure of the interevent times of human decisions. Here we analyze the intertransaction time intervals of several financial markets. We observe that the empirical data display a subtle multifractal behavior. Our model explains this structure by taking the pausing-time density in the form of a superstatistics whose integral kernel quantifies the heterogeneous nature of the executed tasks. A stretched-exponential kernel provides a multifractal profile valid over a certain limited range. A suggested heuristic analytical profile is capable of covering a broader region.
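
The stretched-exponential kernel mentioned above decays more slowly than a pure exponential with the same time scale, which is what produces the long-tailed interevent statistics. A minimal numerical illustration, with assumed parameter values:

```python
import numpy as np

# Stretched exponential psi(t) ~ exp(-(t/tau)**beta) with 0 < beta < 1
# versus a pure exponential with the same tau (values are illustrative).
tau, beta = 1.0, 0.5
t = np.linspace(0.0, 20.0, 2001)

stretched = np.exp(-(t / tau) ** beta)
pure_exp = np.exp(-t / tau)

# Ratio of the two tails at the end of the window: >> 1 means the
# stretched form retains far more probability at long waiting times.
tail_ratio = stretched[-1] / pure_exp[-1]
```

At t = 20τ the stretched form is many orders of magnitude above the exponential, so long resting periods are vastly more likely under the stretched-exponential kernel.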

Relevance:

10.00%

Abstract:

Previous results concerning radiative emission under laser irradiation of silicon nanopowder are reinterpreted in terms of thermal emission. A model is developed that treats the particles in the powder as independent, so that under vacuum the only dissipation mechanism is thermal radiation. The supralinear dependence observed between the intensity of the emitted radiation and the laser power is predicted by the model, as is the exponential quenching when the gas pressure around the sample increases. The analysis allows us to determine the sample temperature. The local heating of the sample has been assessed independently from the position of the transverse optical Raman mode. Finally, it is suggested that the photoluminescence observed in porous silicon and similar materials could, in some cases, be blackbody radiation.
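
The radiative-only balance underlying the model can be sketched with the Stefan-Boltzmann law: in vacuum the absorbed laser power P must equal ε σ A (T⁴ - T₀⁴), which fixes the particle temperature. The emissivity and particle radius below are illustrative assumptions, not the paper's values.

```python
import math

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def particle_temperature(p_abs_w, radius_m=50e-9, eps=0.1, t0=300.0):
    """Steady-state temperature of an isolated particle whose only loss
    channel is thermal radiation: P = eps*SIGMA*A*(T**4 - T0**4)."""
    area = 4.0 * math.pi * radius_m ** 2
    return (p_abs_w / (eps * SIGMA * area) + t0 ** 4) ** 0.25

t1 = particle_temperature(1e-9)  # hypothetical absorbed power, W
t2 = particle_temperature(2e-9)  # doubled absorbed power
```

Because T grows only as the fourth root of the absorbed power, the emitted blackbody intensity in a fixed spectral window rises supralinearly with laser power, which is the behavior the model predicts.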

Relevance:

10.00%

Abstract:

In recent years the construction sector has experienced exponential growth. This growth has had repercussions on many aspects: from the need for more personnel on site and the establishment of offices to manage the accounting and keep control over the works, to the need for specific software that helps carry out the work in the most convenient and agile way possible. The project carried out here covers one of these needs: managing the budgets of the different works that builders undertake. It uses the database of the ITEC (Institut de Tecnologia de la Construcció de Catalunya), on which the vast majority of architects work when designing projects, but it also lets builders enter their own data. The user of the application can prepare budgets for new construction, renovations, and so on, grouping each of them into chapters. These chapters can be understood as the different phases to be carried out, for example: building the foundations, raising the walls, or making the roof. Within the chapters we find the work items, each a set of materials, labor hours, and machinery needed to carry out one part of the work, such as building a partition wall between rooms. In that case we would need the different materials (bricks, mortar), the laborer hours required to build it, the transport of all the material to the site, and so on. All these parameters (materials, hours, transport...) are called articles and are included within the work items. The application is designed to run in a client/server environment, using a Linux OpenSUSE 10.2 server and Windows XP workstations as clients, although other versions of Microsoft operating systems could also be used. The development environment used is that of the FDS language, which comes with an integrated file manager that will be used.

Relevance:

10.00%

Abstract:

Free software has lately been gaining ever more weight in companies, but it is still a great unknown for many people. From its creation in the 1980s until now, there has been exponential growth of high-quality free software, offering tools for all kinds of needs: office suites, mail clients, file systems, operating systems, and more. This movement has not gone unnoticed by many users and companies, who have taken advantage of it to cover their needs. As for companies, more and more of them use free software to a greater or lesser extent, whether for its lower acquisition cost, its great reliability, its easy adaptability, or to avoid technological lock-in; in short, to have more freedom. When a new company is created and starts from zero in all its information technology, that is the least costly moment to implement the IT architecture with free software, since the impact on the company, users, and clients is smallest. Companies that already have an IT system will need to establish a migration plan, whether total or partial. The aim of this project is not to say which software is better than another or which should be installed, but to present the world of free software, show part of this software, make some comparisons of free software with proprietary software, and offer ideas and a set of solutions for companies, so that a company can take implementation ideas from some of the IT solutions presented or follow some of the advice proposed. Many companies already use free software. Some use only a small part of it in their installations; although companies running 100% on free software are beginning to appear, for now I consider that somewhat risky, but before long it will become increasingly common.

Relevance:

10.00%

Abstract:

Automation has been in use for many years, although it began to take on its current definition around the 1960s and 1970s, when the first PLCs were commercialized. Since then, its growth has been exponential. Technology has kept advancing and the number of components involved has increased, so even now we do not know how far it will go and what it will achieve. For industry, all this has meant the automation of processes that until now required a great deal of manual labor, reducing it drastically. One of the industries that has benefited most from these advances is the automotive industry, specifically its large production lines, automated to levels that until recently were unthinkable. This project belongs to that industry, not directly in the construction of the automobile but indirectly, since the company for which the automation was carried out manufactures plastic parts for automobiles. Specifically, these are parts with conductive metal inserts that are mounted in all vehicles and are used to operate the cars' windscreen wipers. This means that the manufacture and design of the part are carefully monitored and controlled by the final client, with extremely demanding quality controls. The manufacturing process starts from plastic parts produced by an injection-molding machine, which are passed through automated stations, each of which performs a specific action to achieve the final assembly.