993 results for Single-file Diffusion
Abstract:
The species abundance distribution (SAD) has been a central focus of community ecology for over fifty years, and is currently the subject of widespread renewed interest. The gambin model has recently been proposed as a model that provides a superior fit to commonly preferred SAD models. It has also been argued that the model's single parameter (α) presents a potentially informative ecological diversity metric, because it summarises the shape of the SAD in a single number. Despite this potential, few empirical tests of the model have been undertaken, perhaps because the necessary methods and software for fitting the model have not existed. Here, we derive a maximum likelihood method to fit the model, and use it to undertake a comprehensive comparative analysis of the fit of the gambin model. The functions and computational code to fit the model are incorporated in a newly developed free-to-download R package (gambin). We test the gambin model using a variety of datasets and compare the fit of the gambin model to fits obtained using the Poisson lognormal, logseries and zero-sum multinomial distributions. We found that gambin almost universally provided a better fit to the data and that the fit was consistent for a variety of sample grain sizes. We demonstrate how α can be used to differentiate intelligibly between community structures of Azorean arthropods sampled in different land use types. We conclude that gambin presents a flexible model capable of fitting a wide variety of observed SAD data, while providing a useful index of SAD form in its single fitted parameter. As such, gambin has wide potential applicability in the study of SADs, and ecology more generally.
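As an illustration of the maximum-likelihood approach the abstract describes, the sketch below fits the logseries distribution (one of the comparison models) to a hypothetical abundance vector with scipy. The gambin likelihood itself lives in the authors' R package (gambin), so this is a stand-in for the general fitting procedure, not that package's API.

```python
# Minimal sketch: maximum-likelihood fit of a one-parameter SAD model
# (here the logseries, one of the comparison distributions named above).
# The abundance vector is hypothetical illustration data.
import numpy as np
from scipy.stats import logser
from scipy.optimize import minimize_scalar

abundances = np.array([1, 1, 1, 2, 2, 3, 5, 8, 13, 40, 95])  # hypothetical SAD

def neg_log_likelihood(p):
    # Negative sum of log-probabilities of the observed abundances.
    return -logser.logpmf(abundances, p).sum()

# The logseries parameter p lies in (0, 1); a bounded scalar search suffices.
result = minimize_scalar(neg_log_likelihood,
                         bounds=(1e-6, 1 - 1e-6), method="bounded")
print(f"ML estimate of p: {result.x:.4f}, log-likelihood: {-result.fun:.2f}")
```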
Abstract:
Funding agencies: FCT and MIUR
Abstract:
The electroactivity of butylate (BTL) is studied by cyclic voltammetry (CV) and square wave voltammetry (SWV) at a glassy carbon electrode (GCE) and a hanging mercury drop electrode (HMDE). Britton–Robinson buffer solutions of pH 1.9–11.5 are used as supporting electrolyte. CV voltammograms at the GCE show a single anodic peak, corresponding to the oxidation of BTL at +1.7 V versus AgCl/Ag, an irreversible process controlled by diffusion. At the HMDE, a single cathodic peak is observed at 1.0 V versus AgCl/Ag. The reduction of BTL is irreversible and controlled by adsorption. Mechanisms are proposed for these redox transformations. Optimisation was carried out one variable at a time. Linearity ranges were 0.10–0.50 mmol L-1 and 2.0–9.0 µmol L-1 for the anodic and cathodic peaks, respectively. The proposed method is applied to the determination of BTL in waters. Analytical results compare well with those obtained by an HPLC method.
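As a sketch of the quantification step implied by the stated linearity ranges, the snippet below fits a least-squares calibration line of peak current against BTL concentration and inverse-predicts a sample concentration; all current values are hypothetical illustration numbers, not data from the study.

```python
# Minimal calibration sketch over the anodic linearity range above.
# Peak currents are hypothetical illustration values.
import numpy as np

conc = np.array([0.10, 0.20, 0.30, 0.40, 0.50])      # mmol L-1
current = np.array([0.42, 0.81, 1.22, 1.60, 2.01])   # hypothetical peak currents, uA

slope, intercept = np.polyfit(conc, current, 1)       # least-squares line

sample_current = 1.05                                 # hypothetical sample reading, uA
sample_conc = (sample_current - intercept) / slope    # inverse prediction
print(f"Estimated BTL concentration: {sample_conc:.3f} mmol L-1")
```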
Abstract:
Funding agencies: FCT - PEstOE/FIS/UI0618/2011; PTDC/FIS/098254/2008; ERC-PATCHYCOLLOIDS and MIUR-PRIN
Abstract:
OBJECTIVE: To estimate the validity of three single questions used to assess self-reported hearing loss as compared to pure-tone audiometry in an adult population. METHODS: A validity study was performed with a random sub-sample of 188 subjects aged 30 to 65 years, drawn from the fourth wave of a population-based cohort study carried out in Salvador, Northeastern Brazil. Data were collected in household visits using questionnaires. Three questions were used to separately assess self-reported hearing loss: Q1, "Do you feel you have a hearing loss?"; Q2, "In general, would you say your hearing is 'excellent,' 'very good,' 'good,' 'fair,' 'poor'?"; Q3, "Currently, do you think you can hear 'the same as before', 'less than before only in the right ear', 'less than before only in the left ear', 'less than before in both ears'?". Responses to each question were compared to the results of pure-tone audiometry to estimate seven accuracy measures, including the Youden index. RESULTS: Sensitivity and specificity were 79.6% and 77.4% for Q1; 66.9% and 85.1% for Q2; and 81.5% and 76.4% for Q3. The Youden index was 57.0% for Q1, 51.9% for Q2 and 57.9% for Q3. CONCLUSIONS: Each of the three questions provides responses accurate enough to support their use to assess self-reported hearing loss in epidemiological studies of adult populations when pure-tone audiometry is not feasible.
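The Youden index used above is simply sensitivity + specificity − 1; recomputing it from the reported values reproduces the stated figures (Q2 comes out at 52.0% against the reported 51.9%, presumably a rounding difference in the underlying counts).

```python
# Youden index J = sensitivity + specificity - 1, here in percentage points,
# recomputed from the sensitivities and specificities reported above.
reported = {
    "Q1": (79.6, 77.4),
    "Q2": (66.9, 85.1),
    "Q3": (81.5, 76.4),
}
for q, (sens, spec) in reported.items():
    youden = sens + spec - 100.0
    print(f"{q}: Youden index = {youden:.1f}%")
# Q1: 57.0%, Q2: 52.0% (reported 51.9%), Q3: 57.9%
```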
Abstract:
PhD in Management
Abstract:
This paper studies the effects of the diffusion of a General Purpose Technology (GPT) that spreads first within the developed North country of its origin, and then to a developing South country. In the general equilibrium growth model developed here, each final good can be produced by one of two technologies. Each technology is characterized by a specific labor type complemented by a specific set of intermediate goods, which are enhanced periodically by Schumpeterian R&D activities. When quality reaches a threshold level, a GPT arises in one of the technologies and spreads first to the other technology within the North. It then propagates to the South, following a similar sequence. Since diffusion is uneven, both within and between countries, the GPT produces successive changes in the direction of technological knowledge and in inter- and intra-country wage inequality. Through this mechanism, the different observed paths of wage inequality can be accommodated.
Abstract:
The major lipid components of food are usually analyzed by separate methodologies, with a different extractive procedure for each class. A simple and fast extractive procedure was devised for the sequential analysis of vitamin E, cholesterol, fatty acids, and total fat estimation in seafood, reducing analysis time and organic solvent consumption. Several liquid/liquid-based extractive methodologies using chlorinated and non-chlorinated organic solvents were tested. The extract obtained is used for vitamin E quantification (normal-phase HPLC with fluorescence detection), total cholesterol (normal-phase HPLC with UV detection), fatty acid profile, and total fat estimation (GC-FID), all accomplished in <40 min. The final methodology presents an adequate linearity range and sensitivity for tocopherol and cholesterol, with intra- and inter-day precisions (RSD) from 3 to 11% for all components. The developed methodology was applied to diverse seafood samples with positive outcomes, making it a very attractive technique for routine analyses in laboratories with standard equipment in the food quality control field.
Abstract:
Transdermal biotechnologies are a field of ever-increasing interest, owing to the medical and pharmaceutical applications they underlie. Several mathematical models are in use that permit a more comprehensive view of raw experimental data and even allow practical extrapolation for new dermal diffusion methodologies. However, they rest on a complex variety of theories and assumptions that restrict each of them to specific situations. Models based on Fick's first law are preferable in contexts where scaled particle theory models would be too time-consuming, but the reverse also holds as the context of transdermal diffusion of particular active compounds changes. This article extensively reviews the various theoretical methodologies for studying diffusion across the rate-limiting dermal barrier, the stratum corneum, and systematizes their characteristics, proper contexts of application, advantages and limitations, as well as future perspectives.
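As a sketch of the Fick's-first-law modelling the review discusses, the snippet below computes a steady-state flux across the stratum corneum as J = Kp·ΔC with Kp = K·D/h; every parameter value is a hypothetical illustration number, not a property of any specific compound.

```python
# Minimal sketch of steady-state transdermal flux under Fick's first law
# applied to a membrane: J = Kp * dC, with Kp = K * D / h.
# All values are hypothetical illustration numbers.
D = 1.0e-9   # diffusion coefficient in the stratum corneum, cm^2/s (hypothetical)
K = 5.0      # vehicle/stratum-corneum partition coefficient (hypothetical)
h = 15e-4    # stratum corneum thickness, cm (~15 um)
dC = 10.0    # concentration difference across the barrier, mg/cm^3 (hypothetical)

Kp = K * D / h   # permeability coefficient, cm/s
J = Kp * dC      # steady-state flux, mg/(cm^2 s)
print(f"Permeability coefficient Kp = {Kp:.2e} cm/s")
print(f"Steady-state flux J = {J:.2e} mg/(cm^2 s)")
```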
Abstract:
In this paper we survey the most relevant results for the priority-based schedulability analysis of real-time tasks, for both fixed and dynamic priority assignment schemes. We emphasize worst-case response time analysis in non-preemptive contexts, which is fundamental for communication schedulability analysis. We define an architecture to support priority-based scheduling of messages at the application process level of a specific fieldbus communication network, the PROFIBUS. The proposed architecture improves worst-case message response times, overcoming the limitation of first-come-first-served (FCFS) PROFIBUS queue implementations.
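For context, the sketch below iterates the classical worst-case response-time recurrence R_i = C_i + B_i + Σ_{j∈hp(i)} ⌈R_i/T_j⌉·C_j to a fixed point for a hypothetical task set; the paper's non-preemptive PROFIBUS analysis refines this standard scheme, so this is background, not the paper's exact formulation.

```python
# Minimal sketch of classical fixed-priority response-time analysis:
# iterate R_i = C_i + B_i + sum over higher-priority tasks of
# ceil(R_i / T_j) * C_j until a fixed point. Task set is hypothetical.
import math

def response_time(C, T, B, i):
    """Worst-case response time of task i (tasks sorted by priority,
    index 0 = highest). Returns None if the iteration exceeds T[i]."""
    R = C[i] + B[i]
    while True:
        interference = sum(math.ceil(R / T[j]) * C[j] for j in range(i))
        R_next = C[i] + B[i] + interference
        if R_next == R:
            return R
        if R_next > T[i]:   # deadline (= period) missed
            return None
        R = R_next

C = [1, 2, 3]     # worst-case execution times (hypothetical)
T = [5, 10, 20]   # periods = deadlines (hypothetical)
B = [0, 1, 1]     # blocking from lower-priority tasks (hypothetical)
for i in range(3):
    print(f"task {i}: R = {response_time(C, T, B, i)}")
```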
Abstract:
Graphics processors were originally developed for rendering graphics but have recently evolved into an architecture for general-purpose computation. They are also expected to become important parts of embedded systems hardware, and not just for graphics. This necessitates the development of appropriate timing analysis techniques, because techniques developed for CPU scheduling are not applicable: the question is not how long any given GPU thread takes to complete, but how long it takes for all of them to complete. We therefore develop a simple method for finding an upper bound on the makespan of a group of GPU threads executing the same program and competing for the resources of a single streaming multiprocessor (whose architecture is based on NVIDIA Fermi, with some simplifying assumptions). We then build upon this method to formulate the derivation of the exact worst-case makespan (and corresponding schedule) as an optimization problem. Addressing the issue of tractability, we also present a technique for efficiently computing a safe estimate of the worst-case makespan with minimal pessimism, which may be used when finding an exact value would take too long.
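As a rough illustration of the flavour of such bounds (not the paper's Fermi-specific method), the sketch below compares a greedy makespan simulation against the classic list-scheduling bound sum(c)/m + max(c) for independent work items on m identical execution units; all numbers are hypothetical.

```python
# Minimal sketch: greedy makespan vs. the classic list-scheduling upper
# bound for independent work items on m identical units. This is a
# generic illustration, not the paper's GPU-specific analysis.
import heapq

def makespan_upper_bound(costs, m):
    return sum(costs) / m + max(costs)

def greedy_makespan(costs, m):
    # Each work item goes to the unit that becomes free earliest.
    units = [0.0] * m
    heapq.heapify(units)
    for c in sorted(costs, reverse=True):
        free_at = heapq.heappop(units)
        heapq.heappush(units, free_at + c)
    return max(units)

costs = [3, 7, 2, 5, 4, 6, 1, 8]   # hypothetical per-thread execution times
m = 4                              # hypothetical number of execution units
print("greedy makespan:", greedy_makespan(costs, m))
print("upper bound    :", makespan_upper_bound(costs, m))
```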
Abstract:
There is no single definition of a long-memory process. Such a process is usually defined as a series with a slowly decaying correlogram or an infinite spectrum at zero frequency. It is also said that a series with this property is characterized by long-range dependence and non-periodic long cycles, or that this feature describes the correlation structure of a series at long lags, or that it is conventionally expressed in terms of a power-law decay of the autocovariance function. The growing international research interest in the topic is justified by the search for a better understanding of the dynamic nature of the time series of financial asset prices. First, the lack of consistency among published results calls for new studies and the use of several complementary methodologies. Second, the confirmation of long-memory processes has relevant implications for (1) theoretical and econometric modelling (i.e., martingale price models and technical trading rules), (2) statistical tests of equilibrium and pricing models, (3) optimal consumption/saving and portfolio decisions, and (4) the measurement of efficiency and rationality. Third, empirical scientific questions remain concerning the identification of the general theoretical market model best suited to modelling the diffusion of the series. Fourth, regulators and risk managers need to know whether there are persistent, and therefore inefficient, markets that may consequently produce abnormal returns. The research objective of this dissertation is twofold. On the one hand, it seeks to add to the long-memory debate by examining the behaviour of the daily return series of the main EURONEXT stock indices. On the other hand, it seeks to contribute to the improvement of the capital asset pricing model (CAPM) by considering an alternative risk measure capable of overcoming the constraints of the efficient market hypothesis (EMH) in the presence of financial series whose processes lack independent and identically distributed (i.i.d.) increments. The empirical study indicates that long-maturity treasury bonds (OTs) can be used as an alternative in the computation of market returns, given that their behaviour in sovereign debt markets reflects investors' confidence in the financial condition of states and measures how investors assess the respective economies based on the performance of their assets in general. Although the price diffusion model defined by geometric Brownian motion (gBm) is claimed to provide a good fit to financial time series, its assumptions of normality, stationarity and independence of the residual innovations are contradicted by the empirical data analysed. Therefore, in the search for evidence of long memory in the markets, rescaled-range analysis (R/S) and detrended fluctuation analysis (DFA) are applied, under the fractional Brownian motion (fBm) framework, to estimate the Hurst exponent H for the complete data series and to compute the "local" Hurst exponent H_t in moving windows. In addition, statistical hypothesis tests are performed using the rescaled-range test (R/S), the modified rescaled-range test (M-R/S) and the fractional differencing test (GPH).
In terms of a single conclusion from all the methods about the nature of dependence in the stock market in general, the empirical results are inconclusive. This means that the degree of long memory, and hence any classification, depends on each particular market. Nevertheless, the mostly positive overall results support the presence of long memory, in the form of persistence, in the stock returns of Belgium, the Netherlands and Portugal. This suggests that these markets are subject to greater predictability (the "Joseph effect"), but also to trends that can be unexpectedly interrupted by discontinuities (the "Noah effect"), and therefore tend to be riskier to trade. Although the evidence of fractal dynamics has weak statistical support, in line with most international studies it refutes the random walk hypothesis with i.i.d. increments, which is the basis of the weak form of the EMH. In view of this, contributions to the improvement of the CAPM are proposed, in the form of a new fractal capital market line (FCML) and a new fractal security market line (FSML). The proposal suggests that the risk element (for the market and for an asset) be given by the Hurst exponent H for long lags of stock returns. The exponent H measures the degree of long memory in stock indices both when the return series follow an uncorrelated i.i.d. process, described by gBm (where H = 0.5, confirming the EMH and leaving the CAPM adequate), and when they follow a process with statistical dependence, described by fBm (where H differs from 0.5, rejecting the EMH and rendering the CAPM inadequate). The advantage of the FCML and the FSML is that the long-memory measure, defined by H, is an adequate reference for expressing risk in models applicable to data series that follow i.i.d. processes as well as processes with nonlinear dependence. These formulations therefore encompass the EMH as a possible special case.
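As an illustration of the R/S estimation of the Hurst exponent used in the dissertation, the sketch below estimates H as the slope of log(R/S) against log(n) over a range of window sizes; the input is simulated i.i.d. noise, for which H should come out near 0.5.

```python
# Minimal sketch of rescaled-range (R/S) Hurst estimation: E[R/S] for
# window size n scales as c * n^H, so H is the slope of log(R/S)
# against log(n). Input is simulated white noise (H ~ 0.5 expected).
import numpy as np

def rs_hurst(x, window_sizes):
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_values = []
        for start in range(0, len(x) - n + 1, n):   # non-overlapping windows
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())           # cumulative deviations
            R = dev.max() - dev.min()               # range of deviations
            S = w.std(ddof=0)                       # standard deviation
            if S > 0:
                rs_values.append(R / S)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_values)))
    H, _ = np.polyfit(log_n, log_rs, 1)             # slope = Hurst exponent
    return H

rng = np.random.default_rng(42)
returns = rng.standard_normal(4096)                 # i.i.d. noise
print(f"Estimated H = {rs_hurst(returns, [16, 32, 64, 128, 256, 512]):.3f}")
```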
Abstract:
WiDom is a previously proposed prioritized medium access control protocol for wireless channels. We present a modification to this protocol that improves its reliability. The modification has similarities with cooperative relaying schemes, but in our protocol all nodes can relay a carrier wave. A preliminary evaluation shows that, under transmission errors, a significant reduction in the number of failed tournaments can be achieved.
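For context, the sketch below simulates the dominance-based, bit-by-bit tournament that WiDom-style protocols run over the priority field (nodes that send a recessive bit and hear a dominant one withdraw); it abstracts away the carrier-wave relaying that the proposed modification adds, and the priorities are hypothetical.

```python
# Minimal sketch of a dominance-based tournament (CAN-style arbitration,
# which WiDom adapts to wireless channels): priorities are contended
# bit by bit, MSB first, and 0 is the dominant bit, so the lowest
# priority value wins. Priorities are hypothetical and assumed unique.
def tournament(priorities, n_bits=8):
    contenders = set(range(len(priorities)))
    for bit in reversed(range(n_bits)):             # MSB first
        bits = {i: (priorities[i] >> bit) & 1 for i in contenders}
        if 0 in bits.values():
            # Nodes sending a recessive bit (1) hear the dominant bit
            # (0) and withdraw from the tournament.
            contenders = {i for i in contenders if bits[i] == 0}
    return min(contenders)   # unique priorities leave exactly one contender

prios = [173, 41, 96, 200]   # hypothetical priority values
print("winner:", tournament(prios))   # node 1: lowest value = highest priority
```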
Abstract:
This paper presents a single-precision floating-point arithmetic unit with support for multiplication, addition, fused multiply-add, reciprocal, square-root and inverse square-root, with high performance and low resource usage. The design uses a piecewise 2nd-order polynomial approximation to implement reciprocal, square-root and inverse square-root. The unit can be configured with any number of operations and can compute any supported function with a throughput of one operation per cycle. The floating-point multiplier of the unit is also used to implement the polynomial approximation and the fused multiply-add operation. We have compared our implementation with other state-of-the-art proposals, including the Xilinx Core-Gen operators, and conclude that the approach has high relative performance/area efficiency. © 2014 Technical University of Munich (TUM).
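As a sketch of the piecewise 2nd-order polynomial technique named above, the snippet below tabulates a quadratic per segment of the mantissa interval [1, 2) for the reciprocal function and evaluates it with two multiply-add steps, mirroring how such approximations map onto an FMA datapath; the segment count and fitting method are illustrative choices, not the paper's.

```python
# Minimal sketch of piecewise 2nd-order polynomial approximation of 1/x:
# split [1, 2) into equal segments, fit a quadratic per segment, and
# evaluate with two multiply-add (Horner) steps.
import numpy as np

SEGMENTS = 64
edges = np.linspace(1.0, 2.0, SEGMENTS + 1)
coeffs = []
for lo, hi in zip(edges[:-1], edges[1:]):
    xs = np.linspace(lo, hi, 32)
    coeffs.append(np.polyfit(xs, 1.0 / xs, 2))   # [c2, c1, c0] per segment

def reciprocal(x):
    """Approximate 1/x for x in [1, 2) via the per-segment quadratic."""
    seg = min(int((x - 1.0) * SEGMENTS), SEGMENTS - 1)
    c2, c1, c0 = coeffs[seg]
    return (c2 * x + c1) * x + c0                # Horner: two FMA-shaped steps

x = 1.37
print(f"approx 1/{x} = {reciprocal(x):.9f}, exact = {1.0 / x:.9f}")
```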
Abstract:
Joining of components with structural adhesives is currently one of the most widespread techniques for advanced structures (e.g., aerospace or aeronautical). Adhesive bonding does not involve drilling operations, and it distributes the load over a larger area than mechanical joints. However, peak stresses tend to develop near the overlap edges because of differential straining of the adherends and load asymmetry. As a result, premature failures can be expected, especially for brittle adhesives. Moreover, bonded joints are very sensitive to the surface treatment of the material, service temperature, humidity and ageing. To overcome these limitations, combining adhesive bonding with spot-welding is a choice to be considered, adding advantages such as superior static strength and stiffness, higher peel and fatigue strength, and easier fabrication, as fixtures are not needed during adhesive curing. The experimental and numerical study presented here evaluates hybrid spot-welded/bonded single-lap joints in comparison with their purely spot-welded and purely bonded equivalents. A parametric study on the overlap length (LO) revealed strength advantages of up to 58% over spot-welded joints and 24% over bonded joints. The Finite Element Method (FEM) and Cohesive Zone Models (CZM) for damage growth were also employed in Abaqus® to evaluate this technique for strength prediction, yielding accurate estimates for all joint types.
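As a sketch of the traction-separation behaviour behind CZM damage-growth analyses like the one above, the snippet below implements the common triangular (bilinear) law: linear elastic loading to a peak traction, then linear softening to full failure; the parameter values are hypothetical, not the paper's calibrated adhesive properties.

```python
# Minimal sketch of a triangular (bilinear) CZM traction-separation law:
# linear elastic loading up to peak traction t0, then linear softening
# to complete failure at separation df. Values are hypothetical.
def traction(d, K=1.0e4, t0=20.0, df=0.1):
    """Traction (MPa) at separation d (mm); K is the initial stiffness
    in MPa/mm, t0 the peak traction, df the failure separation."""
    d0 = t0 / K                        # separation at damage onset
    if d <= d0:
        return K * d                   # undamaged elastic branch
    if d >= df:
        return 0.0                     # complete failure
    return t0 * (df - d) / (df - d0)   # linear softening branch

for d in (0.001, 0.002, 0.05, 0.12):
    print(f"d = {d:.3f} mm -> t = {traction(d):6.2f} MPa")
```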