927 results for duration, functional delta method, gamma kernel, hazard rate.
Abstract:
Fossil pollen data supplemented by tree macrofossil records were used to reconstruct the vegetation of the Former Soviet Union and Mongolia at 6000 years. Pollen spectra were assigned to biomes using the plant-functional-type method developed by Prentice et al. (1996). Surface pollen data and a modern vegetation map provided a test of the method. This is the first time such a broad-scale vegetation reconstruction for the greater part of northern Eurasia has been attempted with objective techniques. The new results confirm previous regional palaeoenvironmental studies of the mid-Holocene while providing a comprehensive synopsis and firmer conclusions. West of the Ural Mountains temperate deciduous forest extended both northward and southward from its modern range. The northern limits of cool mixed and cool conifer forests were also further north than present. Taiga was reduced in European Russia, but was extended into Yakutia where now there is cold deciduous forest. The northern limit of taiga was extended (as shown by increased Picea pollen percentages, and by tree macrofossil records north of the present-day forest limit) but tundra was still present in north-eastern Siberia. The boundary between forest and steppe in the continental interior did not shift substantially, and dry conditions similar to present existed in western Mongolia and north of the Aral Sea.
Abstract:
Humans’ unique cognitive abilities are usually attributed to a greatly expanded neocortex, which has been described as “the crowning achievement of evolution and the biological substrate of human mental prowess” [1]. The human cerebellum, however, contains four times more neurons than the neocortex [2] and is attracting increasing attention for its wide range of cognitive functions. Using a method for detecting evolutionary rate changes along the branches of phylogenetic trees, we show that the cerebellum underwent rapid size increase throughout the evolution of apes, including humans, expanding significantly faster than predicted by the change in neocortex size. As a result, humans and other apes deviated significantly from the general evolutionary trend for neocortex and cerebellum to change in tandem, having significantly larger cerebella relative to neocortex size than other anthropoid primates. These results suggest that cerebellar specialization was a far more important component of human brain evolution than hitherto recognized and that technical intelligence was likely to have been at least as important as social intelligence in human cognitive evolution. Given the role of the cerebellum in sensory-motor control and in learning complex action sequences, cerebellar specialization is likely to have underpinned the evolution of humans’ advanced technological capacities, which in turn may have been a preadaptation for language.
Abstract:
In this paper we consider the strongly damped wave equation with time-dependent terms u_tt − Δu − γ(t)Δu_t + β_ε(t)u_t = f(u), in a bounded domain Ω ⊂ ℝⁿ, under some restrictions on β_ε(t), γ(t) and growth restrictions on the nonlinear term f. The function β_ε(t) depends on a parameter ε, with β_ε(t) → 0. We prove, under suitable assumptions, local and global well-posedness (using the theory of uniformly sectorial operators), the existence and regularity of pullback attractors {A_ε(t) : t ∈ ℝ}, uniform bounds for these pullback attractors, a characterization of these pullback attractors, and their upper and lower semicontinuity at ε = 0. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
In this paper we propose a new lifetime distribution which can handle bathtub-shaped, unimodal, increasing, and decreasing hazard rate functions. The model has three parameters and generalizes the exponential power distribution proposed by Smith and Bain (1975) with the inclusion of an additional shape parameter. The maximum likelihood estimation procedure is discussed. A small-scale simulation study examines the performance of the likelihood ratio statistics under small and moderate sized samples. Three real datasets illustrate the methodology. (C) 2010 Elsevier B.V. All rights reserved.
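The Smith and Bain (1975) exponential power baseline referenced above can be sketched numerically; the parameterization below (shape alpha, scale lam, with survival S(t) = exp(1 − exp((t/lam)^alpha))) is one common convention assumed for illustration, not necessarily the paper's notation:

```python
import math

def ep_hazard(t, alpha, lam):
    """Hazard rate of the exponential power distribution, assuming
    S(t) = exp(1 - exp((t/lam)**alpha)), which gives
    h(t) = (alpha/lam) * (t/lam)**(alpha - 1) * exp((t/lam)**alpha)."""
    z = (t / lam) ** alpha
    return (alpha / lam) * (t / lam) ** (alpha - 1) * math.exp(z)

# For alpha < 1 the hazard is bathtub-shaped: it first decreases, then increases.
rates = [ep_hazard(t, alpha=0.5, lam=1.0) for t in (0.1, 1.0, 9.0)]
```

Checking the three sampled points confirms the bathtub shape for alpha = 0.5: the hazard is higher at both ends of the range than in the middle.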
Abstract:
355 nm light irradiation of fac-[Mn(CO)3(phen)(imH)]+ (fac-1) produces the mer-1 isomer and a long-lived radical which can be efficiently trapped by electron-acceptor molecules. EPR experiments show that, when excited, the manganese(I) complex can be readily oxidized by a one-electron process to produce Mn(II) and phen(.-). In the present study, DFT calculations have been used to investigate the photochemical isomerization of the parent Mn(I) complex and to characterize the electronic structure of the long-lived radical. The calculations have been performed on both the fac-1 and mer-1 species, as well as on their one-electron-oxidized species fac-1+ and mer-1+ in the lowest spin configuration (S = 1/2) and fac-6 and mer-6 (S = 5/2) in the highest one, to characterize these complexes. In particular, we used charge decomposition analysis (CDA) and natural bond orbital (NBO) analysis to gain a better understanding of the chemical bonding in terms of the nature of the electronic interactions. The observed variations in geometry and bond energies with increasing oxidation state of the central metal ion are interpreted in terms of changes in the nature of the metal-ligand bonding interactions. The X-ray structure of fac-1 is also described. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
Relevant results for (sub-)distribution functions related to parallel systems are discussed. The reverse hazard rate is defined using the product integral. Consequently, the restriction of absolute continuity for the distributions involved can be relaxed; the only restriction is that the sets of discontinuity points of the parallel distributions must be disjoint. Nonparametric Bayesian estimators of all survival (sub-)distribution functions are derived. Dual to series systems, which use minimum lifetimes as observations, parallel systems record the maximum lifetimes. Dirichlet multivariate processes forming a class of prior distributions are considered for the nonparametric Bayesian estimation of the component distribution functions and the system reliability. For illustration, two striking numerical examples are presented.
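For a purely discrete lifetime, the reverse hazard at an atom t is r(t) = P(X = t)/P(X ≤ t), and the product integral reduces to a finite product, F(t) = Π_{s>t} (1 − r(s)). A minimal numerical check of this identity, with made-up support points and probabilities:

```python
# Discrete lifetime: support points with probabilities (hypothetical data).
support = [1.0, 2.0, 3.0, 4.0]
probs   = [0.1, 0.2, 0.3, 0.4]

def F(t):
    """Distribution function F(t) = P(X <= t)."""
    return sum(p for s, p in zip(support, probs) if s <= t)

def reverse_hazard(t):
    """Reverse hazard r(t) = P(X = t) / P(X <= t) at an atom t."""
    p_t = dict(zip(support, probs)).get(t, 0.0)
    return p_t / F(t)

def F_from_reverse_hazard(t):
    """Product-integral identity: F(t) = prod over atoms s > t of (1 - r(s)),
    which follows from F(s-) = F(s) * (1 - r(s)) at each atom."""
    out = 1.0
    for s in support:
        if s > t:
            out *= 1.0 - reverse_hazard(s)
    return out

for t in support:
    assert abs(F(t) - F_from_reverse_hazard(t)) < 1e-9
```

The loop verifies, at every atom, that the distribution function recovered from the reverse hazard matches the one computed directly.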
Abstract:
Parkinson's disease is a clinical syndrome manifesting with slowness and instability. As it is a progressive disease with varying symptoms, repeated assessments are necessary to determine the outcome of treatment changes in the patient. In the recent past, a computer-based method was developed to rate impairment in spiral drawings. The downside of this method is that it cannot separate bradykinetic from dyskinetic spiral drawings. This work aims to construct a computer method that overcomes this weakness by using the Hilbert-Huang Transform (HHT) of the tangential velocity. The work is done under supervised learning, with a target class acquired from a neurologist through a web interface. After reducing the dimension of the HHT features with PCA, classification is performed with a C4.5 classifier. The classification results are close to random guessing, which shows that the computer method is unsuccessful in assessing the cause of drawing impairment in spirals when evaluated against human ratings. One plausible reason is that there is no difference between the two classes of spiral drawings. Another possible reason is that the web application displayed the patients' self-ratings alongside the spirals, so the neurologist may have relied too heavily on these in his own ratings.
Abstract:
The objective of the present work is to study the behavior, in terms of natural frequencies, of guyed tower structures under various service conditions. To this end, a formulation for determining these frequencies was developed using the transfer matrix method. The procedure consists of discretizing the structure into bar elements, lumped masses, springs, and viscous dampers to represent the structure. For the cables of the guyed tower, an expression was derived that yields the complete stiffness of cables supported at both ends, with viscous damping and uniform physical and geometric properties. Moreover, the cables may be inclined and subjected to harmonic horizontal excitation at the upper support. In this case, a parabolic deformed shape of the cable is assumed at the static equilibrium position, while the dynamic displacements are assumed to be small. The cable stiffness is valid for inclination angles ranging from zero (0) to ninety (90) degrees. The method is suitable for microcomputers because it requires little memory for data processing. With this in mind, a program was written for 16-bit microcomputers that allows the tower structure to be studied under pure bending, pure torsion, or a coupling of both. Numerical examples of guyed towers and of cable stiffness behavior were developed for a wide range of design situations.
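The transfer matrix method used above can be illustrated on the simplest lumped model, a single spring-mass chain fixed at the base. The state vector, the spring and mass crossing rules, and the bisection search below are a generic textbook sketch, not the program described in the work:

```python
# Transfer matrix method with state vector (displacement x, internal force N).
# Crossing a massless spring of stiffness k:  x -> x + N/k,  N unchanged.
# Crossing a lumped mass m at frequency w:    N -> N - m * w**2 * x.
def tip_force(w, k=1000.0, m=10.0):
    x, N = 0.0, 1.0          # fixed base: x = 0, unit (unknown-scale) force
    x = x + N / k            # transfer through the spring
    N = N - m * w**2 * x     # transfer through the tip mass
    return N                 # a free tip requires N = 0

def natural_frequency(lo=0.1, hi=100.0):
    """Natural frequency = root of the boundary residual, found by bisection."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if tip_force(lo) * tip_force(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

w1 = natural_frequency()     # analytic value is sqrt(k/m) = 10 rad/s
```

For a chain of several masses, the same two crossing rules are simply applied in sequence, which is what keeps the memory footprint of the method small.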
Abstract:
Lucas (1987) has shown a surprising result in business-cycle research: the welfare cost of business cycles is very small. Our paper has several original contributions. First, in computing welfare costs, we propose a novel setup that separates the effects of uncertainty stemming from business-cycle fluctuations and economic-growth variation. Second, we extend the sample from which to compute the moments of consumption: the literature as a whole has chosen primarily to work with post-WWII data. For this period, actual consumption is already a result of counter-cyclical policies, and is potentially smoother than it would otherwise have been in their absence. We therefore also employ pre-WWII data. Third, we take an econometric approach and compute explicitly the asymptotic standard deviation of welfare costs using the delta method. Estimates of welfare costs show major differences between the pre-WWII and the post-WWII eras: they can differ by up to 15 times for reasonable parameter values (β = 0.985 and φ = 5). For example, in the pre-WWII period (1901-1941), welfare cost estimates are 0.31% of consumption if we consider only permanent shocks and 0.61% of consumption if we consider only transitory shocks. In comparison, the post-WWII era is much quieter: welfare costs of economic growth are 0.11% and welfare costs of business cycles are 0.037%, the latter being very close to the estimate in Lucas (0.040%). Estimates of marginal welfare costs are roughly twice the size of the total welfare costs. For the pre-WWII era, marginal welfare costs of economic-growth and business-cycle fluctuations are respectively 0.63% and 1.17% of per-capita consumption. The same figures for the post-WWII era are, respectively, 0.21% and 0.07% of per-capita consumption.
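The delta-method standard errors mentioned above follow the generic first-order rule se(g(θ̂)) ≈ |g′(θ̂)| · se(θ̂). A minimal sketch with a Lucas-style quadratic cost, λ(v) ≈ (φ/2)v in the consumption variance v; the variance estimate and its standard error below are made up for illustration:

```python
# Delta method: se of g(theta_hat) ~= |g'(theta_hat)| * se(theta_hat).
def delta_method_se(g_prime, theta_hat, se_theta):
    return abs(g_prime(theta_hat)) * se_theta

phi = 5.0                        # curvature parameter, as in the abstract
g = lambda v: 0.5 * phi * v      # welfare cost as a function of consumption variance v
g_prime = lambda v: 0.5 * phi    # derivative, constant in v for this quadratic cost

v_hat, se_v = 0.001, 0.0004      # hypothetical variance estimate and its standard error
cost = g(v_hat)                  # point estimate of the welfare cost
se_cost = delta_method_se(g_prime, v_hat, se_v)
```

Because g is linear in v here, the approximation is exact; for nonlinear g the same one-line rule gives the asymptotic standard deviation used in the paper's tests.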
Abstract:
This paper develops a methodology for testing the term structure of volatility forecasts derived from stochastic volatility models, and implements it to analyze models of S&P500 index volatility. Using measurements of the ability of volatility models to hedge and value term-structure-dependent option positions, we find that hedging tests support the Black-Scholes delta and gamma hedges, but not the simple vega hedge when there is no model of the term structure of volatility. With various models, it is difficult to improve on a simple gamma hedge assuming constant volatility. Of the volatility models, the GARCH components estimate of the term structure is preferred. Valuation tests indicate that all the models contain term structure information not incorporated in market prices.
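The delta and gamma hedge ratios referred to above are the standard Black-Scholes Greeks; a minimal sketch for a European call under constant volatility and no dividends:

```python
import math

def bs_delta_gamma(S, K, r, sigma, T):
    """Black-Scholes call delta N(d1) and gamma n(d1) / (S * sigma * sqrt(T))."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    pdf = math.exp(-0.5 * d1**2) / math.sqrt(2.0 * math.pi)   # standard normal density
    cdf = 0.5 * (1.0 + math.erf(d1 / math.sqrt(2.0)))          # standard normal CDF
    return cdf, pdf / (S * sigma * math.sqrt(T))

# At the money: S = K = 100, r = 0, sigma = 0.2, T = 1 year.
delta, gamma = bs_delta_gamma(100.0, 100.0, 0.0, 0.2, 1.0)
```

A delta hedge holds `delta` units of the underlying per short call; a gamma hedge additionally neutralizes the second-order exposure, which is the constant-volatility benchmark the paper finds hard to beat.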
Abstract:
Environmental concerns are currently leading companies to reduce the environmental impacts they cause while improving product quality and manufacturing processes. Accordingly, much research is being carried out in the machining field to assess the actual environmental damage of different cooling-lubrication methods. This work aimed to analyze the quality of the produced part and the wear of the cutting tool of a surface grinder when machining alumina ceramic with two distinct methods of cutting-fluid application: the conventional method, with a flow rate of 458.3 mL/h, and the minimum quantity lubrication (MQL) method, with 100 mL/h. The results show that, for the same machining parameters, the MQL technique used a much smaller amount of fluid and ensured good results for diametral wheel wear. However, part quality was considerably worse with MQL than with the conventional cooling technique. These results show that alternative lubrication approaches that reduce cutting-fluid use are feasible, depending on which factors matter most for the intended process. In this sense, if the MQL method were adopted by companies that depend on grinding, it would certainly bring benefits regarding cutting-fluid disposal and recycling problems, but on the other hand it would lead to lower surface quality of the parts.
Abstract:
OBJECTIVE: To analyze, from a phonetic standpoint and by means of acoustic and perceptual measures, the temporal prosodic aspects present in the oral reading of students with and without dyslexia, in order to identify performance differences between the two types of readers that may point to characteristics peculiar to dyslexia. METHODS: Recording of the reading of a text by 40 students (aged nine to 14 years, in the 3rd to 5th grades), 10 with dyslexia (clinical group) and 30 without complaints of learning disorders (non-clinical group). The data were analyzed perceptually and acoustically using the WinPitch software. The following measures were taken: duration and location of pauses, total speech time, speech rate, articulation time, and articulation rate. RESULTS: Compared with the non-clinical group, the clinical group produced more pauses and longer pauses; the values obtained for speech and articulation rates indicated, respectively, lower reading speed and slowness in the production of each articulatory gesture. CONCLUSION: The difficulties identified in reading processing by children with dyslexia hinder prosodic organization in the reading of a text.
Abstract:
Objective: Transitional implants are indicated for cases in which immediate loading is contraindicated because a healing period is necessary for osseointegration of the definitive implants. These provisional implants were developed to support an implant-supported fixed prosthesis or overdenture, providing retention, stability, and support. The aim of this article was to conduct a literature review on transitional implants to highlight their characteristics, advantages, indications, and contraindications, including the level of osseointegration of such implants according to the functional period. Method and Materials: The present literature review was based on the OldMedline and Medline databases from 1999 to 2010 using the keywords "transitional implants" and "temporary implants." Fourteen articles were found: 11 clinical studies or techniques and three histologic and histomorphometric studies. Results: The transitional immediate prostheses were worn by completely and partially edentulous patients. Advantages of transitional implants include complete denture retention, stability, and support; maintenance of chewing, phonetics, and patient comfort; protection of bone grafts; a vertical stop during the healing period; easy and fast surgical and prosthetic procedures; lower cost in comparison to the definitive implant; and reestablishment of esthetics. The success of transitional implants as a conservative treatment alternative to conventional immediate loading is a reality if correctly indicated. Conclusion: Transitional implants are a provisional treatment alternative for completely and partially edentulous patients. However, additional studies are required to evaluate the level of remodeling and repair of the transitional implants under loading. (Quintessence Int 2011; 42: 19-24)
Abstract:
Alterations involving globins arise from modifications in the genes responsible for the sequence and structure of the polypeptide chains, as well as in the genes regulating the synthesis of these chains. Variant hemoglobins have a chemical structure different from that of the corresponding normal hemoglobin, resulting from mutations in one or more nitrogenous bases and leading to amino-acid substitutions in the alpha, beta, delta, or gamma globins. Hemoglobin N-Baltimore is a beta-globin variant in which lysine at position 95 is replaced by glutamic acid, showing faster electrophoretic mobility than hemoglobin A at alkaline pH. In electrophoretic analyses at alkaline pH performed on blood donors at the Hemocentro de São José do Rio Preto (SP), we identified a carrier of a fast hemoglobin in heterozygosis, later confirmed by isoelectric focusing and high-performance liquid chromatography (HPLC). Studies of abnormal hemoglobins in blood donors allow the identification of rare variants and enable appropriate genetic counseling for each case, together with a family study.