918 results for Continuous random network


Relevance:

80.00%

Publisher:

Abstract:

Field emission properties of single-walled carbon nanotubes (SWCNTs), which were prepared through alcohol catalytic chemical vapor deposition for 10-60 s, were characterized in a diode configuration. Protrusive bundles at the top surface of samples act selectively as emission sites. The number of emission sites was controlled by emitter morphologies combined with texturing of Si substrates. SWCNTs grown on a textured Si substrate exhibited a turn-on field as low as 2.4 V/μm at a field emission current density of 1 μA/cm². Uniform spatial luminescence (0.5 cm²) from the rear surface of the anode was revealed for SWCNTs prepared on the textured Si substrate. Deterioration of field emission properties through repetitive measurements was reduced for the textured samples in comparison with vertically aligned SWCNTs and a random network of SWCNTs prepared on flat Si substrates. Emitter morphology resulting in improved field emission properties is a crucial factor for the fabrication of SWCNT electron sources. Morphologically controlled SWCNTs with promising emitter performance are expected to be practical electron sources. © 2008 The Japan Society of Applied Physics.

Relevance:

80.00%

Publisher:

Abstract:

The low-frequency vibrational spectrum of cluster-beam-deposited carbon films was studied by Brillouin light scattering. In thin films, the values of both the bulk modulus and the shear modulus have been estimated from the shifts of surface phonon peaks. The values found indicate a mainly sp²-coordinated random network with low density. In thick films, a bulk longitudinal phonon peak was detected in a spectral range compatible with the value of the index of refraction and the elastic constants of thin films. High surface roughness, combined with a rather strong bulk central peak, prevented the observation of surface phonon features. © 1998 Elsevier Science Ltd. All rights reserved.

Relevance:

80.00%

Publisher:

Abstract:

Polymeric fibrous scaffolds have been considered as replacements for load-bearing soft tissues, because of their ability to mimic the microstructure of natural tissues. Poor toughness of fibrous materials results in failure, which is an issue of importance to both engineering and medical practice. The toughness of fibrous materials depends on the ability of the microstructure to develop toughening mechanisms. However, such toughening mechanisms are still not well understood, because the detailed evolution at the microscopic level is difficult to visualize. A novel and simple method was developed, namely, a sample-taping technique, to examine the detailed failure mechanisms of fibrous microstructures. This technique was compared with in situ fracture testing by scanning electron microscopy. Examination of three types of fibrous networks showed that two different failure modes occurred in fibrous scaffolds. For brittle cracking in gelatin electrospun scaffolds, the random network morphology around the crack tip remained during crack propagation. For ductile failure in polycaprolactone electrospun scaffolds and nonwoven fabrics, the random network deformed via fiber rearrangement, and a large number of fiber bundles formed across the region in front of the notch tip. These fiber bundles not only accommodated mechanical strain, but also resisted crack propagation and thus toughened the fibrous scaffolds. Such understanding provides insight for the production of fibrous materials with enhanced toughness. © 2013 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

Relevance:

80.00%

Publisher:

Abstract:

To address the poor throughput of the automatic repeat request (ARQ) mechanism in wireless broadcast systems, a broadcast retransmission scheme based on random network coding, RNC-ARQ, is proposed. The broadcast node uses random linear codes to encode all lost packets into combined retransmissions. A receiving node can recover the original data by decoding once it has accumulated a sufficient number of coded packets. The scheme effectively reduces the number of retransmissions and improves the throughput of wireless broadcasting. For the burst-error channel described by the Gilbert-Elliott model, a multi-state Markov model combining the channel state with the node's reception process is established, and on this basis a closed-form expression for the throughput of the RNC-ARQ scheme is derived. Finally, the NS-2 simulator is used to evaluate the performance of the RNC-ARQ scheme; the results show that, on burst-error channels, the throughput of the retransmission scheme based on random network coding is better than that of the traditional selective-repeat ARQ scheme and of the XOR-coding-based retransmission scheme.
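
The core of such a scheme can be sketched in a few lines of Python. The fragment below is not from the paper: it assumes GF(2) (plain XOR) arithmetic and a toy packet length, whereas the scheme above uses general random linear codes, and the names encode and decode are illustrative. The broadcast side emits random combinations of the lost packets; a receiver recovers the originals by Gaussian elimination once it holds enough linearly independent combinations.

import random

PKT_LEN = 8  # illustrative packet length in bytes

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(lost_packets):
    # One coded retransmission: a random GF(2) combination of the lost packets.
    coeffs = [random.randint(0, 1) for _ in lost_packets]
    if not any(coeffs):
        coeffs[random.randrange(len(coeffs))] = 1  # skip the useless all-zero row
    coded = bytes(PKT_LEN)
    for c, pkt in zip(coeffs, lost_packets):
        if c:
            coded = xor_bytes(coded, pkt)
    return coeffs, coded

def decode(received, n):
    # Gaussian elimination over GF(2); returns the n originals, or None while
    # the accumulated combinations are still rank deficient.
    pivots = [None] * n
    for vec, pay in received:
        vec = list(vec)
        for j in range(n):
            if vec[j] == 0:
                continue
            if pivots[j] is None:
                pivots[j] = (vec, pay)
                break
            pvec, ppay = pivots[j]
            vec = [a ^ b for a, b in zip(vec, pvec)]
            pay = xor_bytes(pay, ppay)
    if any(p is None for p in pivots):
        return None
    for j in reversed(range(n)):           # back-substitution
        vec, pay = pivots[j]
        for k in range(j + 1, n):
            if vec[k]:
                kvec, kpay = pivots[k]
                vec = [a ^ b for a, b in zip(vec, kvec)]
                pay = xor_bytes(pay, kpay)
        pivots[j] = (vec, pay)
    return [pay for _, pay in pivots]

lost = [bytes([i]) * PKT_LEN for i in range(4)]     # four lost packets
received = []
while True:
    received.append(encode(lost))                   # coded retransmissions arrive
    recovered = decode(received, len(lost))
    if recovered is not None:
        break
assert recovered == lost
print("recovered after", len(received), "coded retransmissions")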

Relevance:

80.00%

Publisher:

Abstract:

Hierarchical Fe/ZSM-5 zeolites were synthesized with a diquaternary ammonium surfactant containing a hydrophobic tail and extensively characterized by XRD, Ar porosimetry, TEM, DRUV-Vis, and UV-Raman spectroscopy. Their catalytic activities in the catalytic decomposition of NO and in the oxidation of benzene to phenol with NO as the oxidant were also determined. The hierarchical zeolites consist of thin sheets limited in growth in the b-direction (along the straight channels of the MFI network) and exhibit similarly high hydrothermal stability to a reference Fe/ZSM-5 zeolite. Spectroscopic and catalytic investigations point to subtle differences in the extent of Fe agglomeration, with the sheet-like zeolites having a higher proportion of isolated Fe centers than the reference zeolite. As a consequence, these zeolites have a somewhat lower activity in catalytic NO decomposition (catalyzed by oligomeric Fe), but display higher activity in benzene oxidation (catalyzed by monomeric Fe). The sheet-like zeolites deactivate much more slowly than bulk Fe/ZSM-5, which is attributed to the much lower probability of secondary reactions of phenol in the short straight channels of the sheets. The deactivation rate decreases with decreasing Fe content of the Fe/ZSM-5 nanosheets. It is found that carbonaceous materials are deposited mainly in the mesopores between the nanosheets and much less so in the micropores. This contrasts with the strong decrease in the micropore volume of bulk Fe/ZSM-5 due to rapid clogging of the continuous micropore network. The formation of coke deposits is limited in the nanosheet zeolites because of the short molecular trafficking distances. It is argued that at high Si/Fe content, coke deposits form mainly on the external surface of the nanosheets. © 2012 Elsevier Inc. All rights reserved.

Relevance:

80.00%

Publisher:

Abstract:

This thesis presents methods for processing count data in particular and discrete data in general. It is part of an NSERC (CRSNG) strategic project, named CC-Bio, whose objective is to evaluate the impact of climate change on the distribution of animal and plant species. After a brief introduction to the notions of biogeography and to generalized linear mixed models in chapters 1 and 2 respectively, the thesis is organized around three major ideas. First, in chapter 3 we introduce a new form of distribution whose components have Poisson or Skellam marginal distributions. This new specification makes it possible to incorporate relevant information about the nature of the correlations between all the components. We also present some properties of this distribution. Unlike the multivariate Poisson distribution that it generalizes, it can handle variables with positive and/or negative correlations. A simulation illustrates the estimation methods in the bivariate case. The results obtained with Bayesian methods via Markov chain Monte Carlo (MCMC) indicate a fairly small relative bias, below 5%, for the regression coefficients of the means, whereas those of the covariance term appear somewhat more volatile. Second, chapter 4 presents an extension of multivariate Poisson regression with gamma-distributed random effects. Aware that species abundance data exhibit strong overdispersion, which would make the resulting estimators and standard errors misleading, we favour an approach based on Monte Carlo integration with importance sampling. The approach remains the same as in the previous chapter: the idea is to simulate independent latent variables and to return to the framework of a conventional generalized linear mixed model (GLMM) with gamma random effects. Even if the assumption of prior knowledge of the dispersion parameters seems too strong, a sensitivity analysis based on goodness of fit demonstrates the robustness of our method. Third, in the last chapter, we focus on the definition and construction of a measure of concordance, and hence of correlation, for zero-inflated data through Gaussian copula modelling. Unlike Kendall's tau, whose values lie in an interval whose bounds vary with the frequency of ties between pairs, this measure has the advantage of taking its values on (-1, 1). Initially introduced to model correlations between continuous variables, its extension to the discrete case involves certain restrictions. Indeed, the new measure can be interpreted as the correlation between the continuous random variables whose discretization constitutes our non-negative discrete observations. Two estimation methods for zero-inflated models are presented, in the frequentist and Bayesian settings, based respectively on maximum likelihood and Gauss-Hermite integration. Finally, a simulation study demonstrates the robustness and the limits of our approach.
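
As a toy illustration of how correlation between count components can arise, the short Python sketch below simulates the classical "common shock" bivariate Poisson construction. It is only a simplified stand-in: the distribution proposed in chapter 3, with Poisson or Skellam margins, also accommodates negative correlations, which this construction cannot produce; the function name bivariate_poisson and the parameter values are illustrative.

import numpy as np

def bivariate_poisson(lam1, lam2, lam0, size, seed=0):
    # (Y1, Y2) = (X1 + X0, X2 + X0) with independent Poisson X0, X1, X2:
    # the shared component X0 induces a positive correlation.
    rng = np.random.default_rng(seed)
    x0 = rng.poisson(lam0, size)
    y1 = rng.poisson(lam1, size) + x0
    y2 = rng.poisson(lam2, size) + x0
    return y1, y2

y1, y2 = bivariate_poisson(lam1=2.0, lam2=3.0, lam0=1.5, size=100_000)
# theoretical correlation: lam0 / sqrt((lam1 + lam0) * (lam2 + lam0)) ~= 0.38
print(np.corrcoef(y1, y2)[0, 1])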

Relevance:

80.00%

Publisher:

Abstract:

In many practical situations where spatial rainfall estimates are needed, rainfall occurs as a spatially intermittent phenomenon. An efficient geostatistical method for rainfall estimation in the case of intermittency has previously been published and comprises the estimation of two independent components: a binary random function for modeling the intermittency and a continuous random function that models the rainfall inside the rainy areas. The final rainfall estimates are obtained as the product of the estimates of these two random functions. However, the published approach does not contain a method for estimating uncertainties. The contribution of this paper is the presentation of the indicator maximum likelihood estimator, from which the local conditional distribution of the rainfall value at any location may be derived using an ensemble approach. From the conditional distribution, representations of uncertainty such as the estimation variance and confidence intervals can be obtained. An approximation to the variance can be calculated more simply by assuming that rainfall intensity is independent of location within the rainy area. The methodology has been validated using simulated and real rainfall data sets. The results of these case studies show good agreement between predicted uncertainties and measured errors obtained from the validation data.
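
A minimal numerical sketch of the product construction and of the simplified variance approximation is given below. It assumes, as in the approximation mentioned above, that the rain/no-rain indicator I, with P(I = 1) = p, is independent of the in-rain intensity R, which has conditional mean m and variance v; the function name and the numbers are illustrative, not the paper's estimator.

def rainfall_mean_and_variance(p, m, v):
    # Z = I * R with I ~ Bernoulli(p) independent of R (mean m, variance v):
    #   E[Z]   = p * m
    #   Var[Z] = p * v + p * (1 - p) * m**2
    mean_z = p * m
    var_z = p * v + p * (1.0 - p) * m ** 2
    return mean_z, var_z

# e.g. 60% chance of rain, in-rain mean 4.0 mm with variance 2.5 mm^2
print(rainfall_mean_and_variance(p=0.6, m=4.0, v=2.5))   # (2.4, 5.34)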

Relevance:

80.00%

Publisher:

Abstract:

In survival analysis, frailty is often used to model heterogeneity between individuals or correlation within clusters. Typically, frailty is taken to be a continuous random effect, yielding a continuous mixture distribution for survival times. A Bayesian analysis of a correlated frailty model is discussed in the context of inverse Gaussian frailty. An MCMC approach is adopted and the deviance information criterion is used to compare models. As an illustration of the approach, a bivariate data set of corneal graft survival times is analysed. © 2006 Elsevier B.V. All rights reserved.
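
The kind of clustered data described above can be mimicked with a short simulation. The sketch below is illustrative only and is not the paper's correlated-frailty corneal-graft model or its MCMC fit: it assumes an exponential baseline hazard and a shared (rather than correlated) inverse Gaussian frailty per cluster, drawn with numpy's Wald generator; names and parameter values are made up.

import numpy as np

def simulate_shared_frailty(n_clusters, cluster_size, base_rate=0.1,
                            frailty_mean=1.0, frailty_scale=2.0, seed=0):
    rng = np.random.default_rng(seed)
    # one inverse Gaussian frailty per cluster, acting multiplicatively on the hazard
    w = rng.wald(frailty_mean, frailty_scale, size=n_clusters)
    rates = base_rate * np.repeat(w, cluster_size)      # conditional hazards
    times = rng.exponential(1.0 / rates)                # conditional survival times
    clusters = np.repeat(np.arange(n_clusters), cluster_size)
    return clusters, times

clusters, times = simulate_shared_frailty(n_clusters=200, cluster_size=2)
# members of the same cluster share a frailty, so their times are correlated
print(times[:4])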

Relevance:

80.00%

Publisher:

Abstract:

GPS active networks are increasingly used in geodetic surveying and in scientific experiments such as monitoring water vapor in the atmosphere and lithospheric plate movement. Among the methods of GPS positioning, Precise Point Positioning (PPP) has provided very good results. A characteristic of PPP is the modeling and/or estimation of the errors involved in this method. The accuracy obtained for the coordinates can reach a few millimeters. Seasonal effects can degrade such accuracy if they are not consistently treated during data processing. Coordinate time series have been analyzed using Fourier or harmonic spectral analysis, wavelets, and least squares estimation, among other techniques. An approach is presented in this paper aiming to investigate the seasonal effects present in station coordinate time series. Experiments were carried out using data from the stations Manaus (NAUS) and Fortaleza (BRFT), which belong to the Brazilian Continuous GPS Network (RBMC). The coordinates of these stations were estimated daily using PPP and were analyzed through wavelets to identify the periods of the seasonal effects (annual and semi-annual) in each time series. These effects were then removed by means of a filtering process applied to the series via the least squares adjustment (LSQ) of a periodic function. The results showed that the combination of these two mathematical tools, wavelets and LSQ, is an interesting and efficient technique for the removal of seasonal effects from time series.
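
The filtering step can be sketched as an ordinary least squares fit of a trend plus annual and semi-annual sinusoids, whose residuals form the filtered series. The Python fragment below is an illustration under those assumptions only (daily sampling, fixed periods of 365.25 and 182.625 days); it does not reproduce the wavelet identification step or the RBMC processing, and all names and values are illustrative.

import numpy as np

def remove_seasonal(t_days, coord):
    # design matrix: offset, linear trend, annual and semi-annual sinusoids
    w1 = 2.0 * np.pi / 365.25
    w2 = 2.0 * np.pi / 182.625
    A = np.column_stack([np.ones_like(t_days, dtype=float), t_days,
                         np.cos(w1 * t_days), np.sin(w1 * t_days),
                         np.cos(w2 * t_days), np.sin(w2 * t_days)])
    params, *_ = np.linalg.lstsq(A, coord, rcond=None)
    return coord - A @ params, params

# synthetic daily coordinate series: trend + 5 mm annual signal + noise
t = np.arange(4 * 365, dtype=float)
series = 0.001 * t + 5.0 * np.cos(2.0 * np.pi * t / 365.25) + np.random.normal(0.0, 1.0, t.size)
filtered, params = remove_seasonal(t, series)
print(params[2:4])   # estimated annual cosine/sine amplitudes, close to (5, 0)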

Relevance:

80.00%

Publisher:

Abstract:

Biological processes are complex and possess emergent properties that cannot be explained or predicted by reductionist methods. To overcome the limitations of reductionism, researchers have used a group of methods known as systems biology, a new interdisciplinary field of study aiming to understand the non-linear interactions among components embedded in biological processes. These interactions can be represented by a mathematical object called a graph or network, where the elements are represented by nodes and the interactions by edges that link pairs of nodes. Networks can be classified according to their topologies: if node degrees follow a Poisson distribution in a given network, i.e. most nodes have approximately the same number of links, this is a random network; if node degrees follow a power-law distribution, i.e. a small number of high-degree nodes and a high number of low-degree nodes, this is a scale-free network. Moreover, networks can be classified as hierarchical or non-hierarchical. In this study, we analysed the integrated molecular networks of Escherichia coli and Saccharomyces cerevisiae, which comprise protein-protein, metabolic, and transcriptional regulation interactions. Using computational methods, such as Mathematica, and data collected from public databases, we calculated four topological parameters: the degree distribution P(k), the clustering coefficient C(k), the closeness centrality CC(k), and the betweenness centrality CB(k). P(k) is a function giving the total number of nodes with degree k and is used to classify the network as random or scale-free. C(k) shows whether a network is hierarchical, i.e. whether the clustering coefficient depends on node degree. CC(k) is an indicator of how close a node is, along the network's paths, to the other nodes, and CB(k) is an indicator of how often a particular node lies on the paths among several other nodes... (Complete abstract available via electronic access.)
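
For reference, the four quantities named above can be computed on any network with a few lines of Python. The sketch below uses the networkx library and a small Erdős-Rényi random graph; it is not the Mathematica workflow or the E. coli / S. cerevisiae data of the study.

import collections
import networkx as nx

G = nx.erdos_renyi_graph(n=500, p=0.02, seed=1)

# P(k): number of nodes of degree k (approximately Poisson for a random graph)
p_of_k = collections.Counter(k for _, k in G.degree())

# C(k): average clustering coefficient of the nodes of degree k
clustering = nx.clustering(G)
c_of_k = collections.defaultdict(list)
for node, k in G.degree():
    c_of_k[k].append(clustering[node])
c_of_k = {k: sum(v) / len(v) for k, v in c_of_k.items()}

closeness = nx.closeness_centrality(G)      # CC: closeness of each node
betweenness = nx.betweenness_centrality(G)  # CB: shortest-path betweenness

print(sorted(p_of_k.items())[:5])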

Relevance:

80.00%

Publisher:

Abstract:

In epidemiology, the basic reproduction number R0 is usually defined as the average number of new infections caused by a single infective individual introduced into a completely susceptible population. According to this definition, R0 is related to the initial stage of the spreading of a contagious disease. However, in epidemiological models based on ordinary differential equations (ODE), R0 is commonly derived from a linear stability analysis and interpreted as a bifurcation parameter: typically, when R0 > 1, the contagious disease tends to persist in the population because the endemic stationary solution is asymptotically stable; when R0 < 1, the corresponding pathogen tends to naturally disappear because the disease-free stationary solution is asymptotically stable. Here we intend to answer the following question: Do these two different approaches for calculating R0 give the same numerical values? In other words, is the number of secondary infections caused by a unique sick individual equal to the threshold obtained from stability analysis of the steady states of the ODE? To find the answer, we use a susceptible-infective-recovered (SIR) model described in terms of ODE and also in terms of a probabilistic cellular automaton (PCA), where each individual (corresponding to a cell of the PCA lattice) is connected to others by a random network favoring local contacts. The values of R0 obtained from both approaches are compared, showing good agreement. © 2012 Elsevier B.V. All rights reserved.
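
The ODE side of the comparison can be reproduced in a few lines. The sketch below is illustrative only: it is the standard SIR system with the population normalized to 1, integrated by a simple Euler step, not the paper's probabilistic cellular automaton. It shows the threshold behaviour at R0 = beta/gamma obtained from the linearization around the disease-free state.

def simulate_sir(beta, gamma, i0=1e-3, dt=0.01, t_max=200.0):
    # dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I, dR/dt = gamma*I
    s, i, r = 1.0 - i0, i0, 0.0
    peak_i, t = i, 0.0
    while t < t_max:
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        dr = gamma * i
        s, i, r = s + dt * ds, i + dt * di, r + dt * dr
        peak_i = max(peak_i, i)
        t += dt
    return peak_i

gamma = 0.3
for beta in (0.15, 0.45):
    # R0 < 1: the outbreak dies out; R0 > 1: a sizeable epidemic develops
    print(f"R0 = {beta / gamma:.2f}, peak infective fraction = {simulate_sir(beta, gamma):.4f}")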

Relevance:

80.00%

Publisher:

Abstract:

The generalized failure rate of a continuous random variable has demonstrable importance in operations management. If the valuation distribution of a product has an increasing generalized failure rate (that is, the distribution is IGFR), then the associated revenue function is unimodal, and when the generalized failure rate is strictly increasing, the global maximum is uniquely specified. The assumption that the distribution is IGFR is thus useful and frequently held in recent pricing, revenue, and supply chain management literature. This note contributes to the IGFR literature in several ways. First, it investigates the prevalence of the IGFR property for the left and right truncations of valuation distributions. Second, we extend the IGFR notion to discrete distributions and contrast it with the continuous distribution case. The note also addresses two errors in the previous IGFR literature. Finally, for future reference, we analyze all common (continuous and discrete) distributions for the prevalence of the IGFR property, and derive and tabulate their generalized failure rates.
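
As a concrete illustration (an assumption-laden sketch, not part of the note itself): for an exponential valuation distribution with rate lambda, the generalized failure rate g(x) = x f(x) / (1 - F(x)) reduces to lambda*x, which is strictly increasing, so the distribution is IGFR and the revenue p(1 - F(p)) is unimodal with its maximum where g(p) = 1, i.e. p = 1/lambda. The short Python check below verifies this numerically; function names and values are illustrative.

import math

def gfr_exponential(x, lam):
    # generalized failure rate x * f(x) / (1 - F(x)); simplifies to lam * x
    f = lam * math.exp(-lam * x)
    survival = math.exp(-lam * x)
    return x * f / survival

def revenue_exponential(p, lam):
    return p * math.exp(-lam * p)            # p * (1 - F(p))

lam = 2.0
prices = [0.05 * k for k in range(1, 41)]    # grid on (0, 2]
best = max(prices, key=lambda p: revenue_exponential(p, lam))
print(best, 1.0 / lam)                       # grid optimum sits at 1/lam = 0.5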

Relevance:

80.00%

Publisher:

Abstract:

Companies that aim to secure and improve their position in an increasingly competitive market need to stay up to date and evolve continuously. In the ongoing pursuit of this evolution, they invest in Research & Development (R&D) projects and in their human capital to promote creativity and organizational innovation. People play a fundamental role in the development of innovation, but for it to flourish consistently, commitment and creativity are needed for the generation of ideas. Creativity is thinking the new; innovation is making it happen. However, finding people with these qualities is not always easy, and it is often necessary to stimulate these skills and characteristics so that people become effectively creative. Undergraduate programs can be an important tool for working on these aspects, characteristics, and skills, using teaching methods and practices that help develop creativity, since the teaching-learning environment weighs significantly on people's formation. The objective of this study is to identify which factors have the greatest influence on the development of creativity in an undergraduate business administration program, analysing the influence of the teachers' pedagogical practices and the students' internal barriers. The theoretical framework is based mainly on the work of Alencar, Fleith, Torrance, and Wechsler. The cross-sectional, quantitative survey targeted students of the Business Administration program of a confessional university in Greater São Paulo, who answered 465 questionnaires composed of three scales. For teaching practices, the Teaching Practices in Relation to Creativity scale was adapted. For internal barriers, the Barriers to Personal Creativity scale was adapted. For the analysis of the perceived development of creativity, a scale based on reference characteristics of a creative person was constructed and validated. Descriptive statistics and exploratory factor analyses were performed in the Statistical Package for the Social Sciences (SPSS) software, while confirmatory factor analyses and the measurement of the influence of pedagogical practices and internal barriers on the perceived development of creativity were carried out by structural equation modelling using the Partial Least Squares (PLS) algorithm in the Smart PLS 2.0 software. The results showed that the pedagogical practices and the students' internal barriers explain 40% of the perceived development of creativity, with the pedagogical practices exerting the greater influence. The research also showed that the thematic track and the term in which the student is enrolled have no influence on any of the three constructs; only the teacher influences the pedagogical practices.