Abstract:
Most cellular solids are random materials, while practically all theoretical structure-property results are for periodic models. To be able to generate theoretical results for random models, the finite element method (FEM) was used to study the elastic properties of solids with a closed-cell cellular structure. We have computed the density (ρ) and microstructure dependence of the Young's modulus (E) and Poisson's ratio (PR) for several different isotropic random models based on Voronoi tessellations and level-cut Gaussian random fields. The effect of partially open cells is also considered. The results, which are best described by a power law E ∝ ρ^n (1 < n < 2), show the influence of randomness and isotropy on the properties of closed-cell cellular materials, and are found to be in good agreement with experimental data. (C) 2001 Acta Materialia Inc. Published by Elsevier Science Ltd. All rights reserved.
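The power law above can be illustrated numerically: the snippet below recovers the exponent n by log-log regression on synthetic (ρ, E) data. The data, prefactor C = 2 and exponent n = 1.5 are made up for demonstration; they are not values from the paper.

```python
import numpy as np

# Synthetic density/modulus data following E = C * rho**n with C = 2, n = 1.5
# (illustrative only -- not measurements from the paper).
rho = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
E = 2.0 * rho**1.5

# Linear regression in log-log space: log E = log C + n * log rho
n, logC = np.polyfit(np.log(rho), np.log(E), 1)
print(f"fitted exponent n = {n:.3f}, prefactor C = {np.exp(logC):.3f}")
```

On real data the fitted exponent would fall somewhere in the 1 < n < 2 range reported in the abstract.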
Abstract:
A mixture model incorporating long-term survivors has been adopted in the field of biostatistics where some individuals may never experience the failure event under study. The surviving fractions may be considered as cured. In most applications, the survival times are assumed to be independent. However, when the survival data are obtained from a multi-centre clinical trial, it is conceived that the environmental conditions and facilities shared within a clinic affect the proportion cured as well as the failure risk for the uncured individuals. This necessitates a long-term survivor mixture model with random effects. In this paper, the long-term survivor mixture model is extended for the analysis of multivariate failure time data using the generalized linear mixed model (GLMM) approach. The proposed model is applied to analyse a numerical data set from a multi-centre clinical trial of carcinoma as an illustration. Some simulation experiments are performed to assess the applicability of the model based on the average biases of the estimates obtained. Copyright (C) 2001 John Wiley & Sons, Ltd.
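The core of a long-term survivor mixture model is the population survival function S(t) = π + (1 − π)·S_u(t), where π is the cured fraction and S_u is the survival of the uncured group. A minimal sketch, assuming an exponential survival for the uncured individuals (the paper's GLMM with clinic-level random effects is not reproduced here):

```python
import math

def mixture_survival(t, pi_cured, hazard):
    """Population survival in a long-term survivor mixture model:
    a fraction pi_cured never fails; the uncured fraction is assumed
    here to have exponential survival exp(-hazard * t)."""
    return pi_cured + (1.0 - pi_cured) * math.exp(-hazard * t)

# Survival starts at 1 and plateaus at the cured fraction as t grows.
print(mixture_survival(0.0, 0.3, 0.5))   # 1.0 at time zero
print(mixture_survival(50.0, 0.3, 0.5))  # approaches 0.3
```

The plateau is what distinguishes cure models from standard survival models, whose survival functions decay to zero.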
Abstract:
The known permutation behaviour of the Dickson polynomials of the second kind in characteristic 3 is expanded and simplified. (C) 2002 Elsevier Science (USA).
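As a small illustration of the objects involved, the sketch below evaluates Dickson polynomials of the second kind E_n(x, a) by their standard recurrence and brute-force checks permutation behaviour over the prime field F_3. Extension fields of characteristic 3, which the paper's results also concern, are omitted here.

```python
def dickson2(n, x, a, p):
    """Dickson polynomial of the second kind E_n(x, a) mod p, via the
    recurrence E_0 = 1, E_1 = x, E_n = x*E_{n-1} - a*E_{n-2}."""
    if n == 0:
        return 1 % p
    e_prev, e_curr = 1, x % p
    for _ in range(n - 1):
        e_prev, e_curr = e_curr, (x * e_curr - a * e_prev) % p
    return e_curr

def is_permutation(n, a, p):
    """Brute-force check that x -> E_n(x, a) permutes the prime field F_p."""
    return len({dickson2(n, x, a, p) for x in range(p)}) == p

# E_1(x, 1) = x trivially permutes F_3; E_2(x, 1) = x**2 - 1 does not.
print(is_permutation(1, 1, 3), is_permutation(2, 1, 3))
```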
Abstract:
A new class of bilinear permutation polynomials was recently identified. In this note we determine the class of permutation polynomials which represents the functional inverse of the bilinear class.
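The notion of a functional (compositional) inverse can be illustrated by brute force over a small prime field; the closed-form inverse of the bilinear class derived in the note is not reproduced here. The polynomial x^3, which permutes F_5 since gcd(3, 4) = 1, serves as a stand-in example:

```python
def poly_eval(coeffs, x, p):
    """Evaluate a polynomial (coeffs[i] is the coefficient of x**i) mod p."""
    return sum(c * pow(x, i, p) for i, c in enumerate(coeffs)) % p

def inverse_map(coeffs, p):
    """For a permutation polynomial f over F_p, return the inverse mapping
    g with g(f(x)) = x, as a lookup table rather than in closed form."""
    table = {poly_eval(coeffs, x, p): x for x in range(p)}
    assert len(table) == p, "polynomial is not a permutation of F_p"
    return table

p = 5
f = [0, 0, 0, 1]                # f(x) = x**3, a permutation of F_5
inv = inverse_map(f, p)
# Composing the inverse map with f gives the identity on F_5.
assert all(inv[poly_eval(f, x, p)] == x for x in range(p))
```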
Abstract:
A finite-element method is used to study the elastic properties of random three-dimensional porous materials with highly interconnected pores. We show that Young's modulus, E, is practically independent of Poisson's ratio of the solid phase, nu(s), over the entire solid fraction range, and Poisson's ratio, nu, becomes independent of nu(s) as the percolation threshold is approached. We represent this behaviour of nu in a flow diagram. This interesting but approximate behaviour is very similar to the exactly known behaviour in two-dimensional porous materials. In addition, the behaviour of nu versus nu(s) appears to imply that information in the dilute porosity limit can affect behaviour in the percolation threshold limit. We summarize the finite-element results in terms of simple structure-property relations, instead of tables of data, to make it easier to apply the computational results. Without using accurate numerical computations, one is limited to various effective medium theories and rigorous approximations like bounds and expansions. The accuracy of these equations is unknown for general porous media. To verify a particular theory it is important to check that it predicts both isotropic elastic moduli, i.e. prediction of Young's modulus alone is necessary but not sufficient. The subtleties of Poisson's ratio behaviour actually provide a very effective method for showing differences between the theories and demonstrating their ranges of validity. We find that for moderate- to high-porosity materials, none of the analytical theories is accurate and, at present, numerical techniques must be relied upon.
Abstract:
Recently, several groups have investigated quantum analogues of random walk algorithms, both on a line and on a circle. It has been found that the quantum versions have markedly different features from the classical versions. Namely, the variance on the line, and the mixing time on the circle, increase quadratically faster in the quantum versions as compared to the classical versions. Here, we propose a scheme to implement the quantum random walk on a line and on a circle in an ion trap quantum computer. With current ion trap technology, the number of steps that could be experimentally implemented will be relatively small. However, we show how the enhanced features of these walks could be observed experimentally. In the limit of strong decoherence, the quantum random walk tends to the classical random walk. By measuring the degree to which the walk "remains quantum", this algorithm could serve as an important benchmarking protocol for ion trap quantum computers.
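The quadratically faster spreading can be checked with a short simulation of the discrete-time Hadamard walk on a line. This is a minimal sketch; the ion-trap implementation and decoherence are not modelled.

```python
import numpy as np

def hadamard_walk(steps):
    """Discrete-time quantum walk on the line with a Hadamard coin.
    psi[pos, coin] holds amplitudes on positions -steps .. +steps."""
    size = 2 * steps + 1
    psi = np.zeros((size, 2), dtype=complex)
    psi[steps, 0] = 1.0                          # walker at origin, coin "up"
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    for _ in range(steps):
        psi = psi @ H.T                          # coin toss on every site
        shifted = np.zeros_like(psi)
        shifted[1:, 0] = psi[:-1, 0]             # coin 0 moves right
        shifted[:-1, 1] = psi[1:, 1]             # coin 1 moves left
        psi = shifted
    return (np.abs(psi) ** 2).sum(axis=1)        # position distribution

steps = 50
prob = hadamard_walk(steps)
x = np.arange(-steps, steps + 1)
variance = (prob * x**2).sum() - (prob * x).sum() ** 2
# The spread is ballistic (variance ~ steps**2), unlike the classical
# symmetric walk, whose variance equals the number of steps.
print(variance)
```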
Abstract:
Aim: The pseudo-Pelger-Huet (PH) anomaly has been associated with a variety of primary haematological disorders, infections and drugs. Recently, the development of dysgranulopoiesis characterised by a pseudo-PH anomaly has been reported in two patients with the use of mycophenolate mofetil (MMF) in the setting of heart and/or lung transplantation. We present a further five cases of MMF-related dysgranulopoiesis characterised by a pseudo-PH anomaly occurring after renal transplantation. Methods: All patients were receiving standard immunosuppression protocols for renal transplantation, including a combination of MMF, steroids and either cyclosporin or tacrolimus. Oral ganciclovir was also used for cytomegalovirus prophylaxis in each case. Results: Development of dysplastic granulopoiesis occurred a median of 96 days (range 66-196 days) after transplantation. Moderate or severe neutropaenia (
Abstract:
This paper presents a new approach to the LU decomposition method for the simulation of stationary and ergodic random fields. The approach overcomes the size limitations of LU and is suitable for any size simulation. The proposed approach can facilitate fast updating of generated realizations with new data, when appropriate, without repeating the full simulation process. Based on a novel column partitioning of the L matrix, expressed in terms of successive conditional covariance matrices, the approach presented here demonstrates that LU simulation is equivalent to the successive solution of kriging residual estimates plus random terms. Consequently, it can be used for the LU decomposition of matrices of any size. The simulation approach is termed conditional simulation by successive residuals since, at each step, a small set (group) of random variables is simulated with an LU decomposition of a matrix of updated conditional covariances of residuals. The simulated group is then used to estimate residuals without the need to solve large systems of equations.
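The classical full-size LU (here Cholesky) simulation that the paper generalizes can be sketched as follows; the successive-residual partitioning itself is not reproduced. The exponential covariance model and the 1-D grid are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stationary Gaussian random field on a 1-D grid by
# factoring its covariance matrix: C = L @ L.T, field = L @ z.
n = 100
x = np.arange(n)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 10.0)  # exponential covariance
L = np.linalg.cholesky(C)
z = rng.standard_normal(n)        # independent standard normals
field = L @ z                     # realization with E[field field^T] = C
```

The size limitation the paper addresses is visible here: C is a dense matrix over all grid nodes, so factoring it directly becomes infeasible for large grids, which is what the successive-residuals partitioning avoids.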
Abstract:
Sensitivity of output of a linear operator to its input can be quantified in various ways. In Control Theory, the input is usually interpreted as disturbance and the output is to be minimized in some sense. In stochastic worst-case design settings, the disturbance is considered random with imprecisely known probability distribution. The prior set of probability measures can be chosen so as to quantify how far the disturbance deviates from the white-noise hypothesis of Linear Quadratic Gaussian control. Such deviation can be measured by the minimal Kullback-Leibler informational divergence from the Gaussian distributions with zero mean and scalar covariance matrices. The resulting anisotropy functional is defined for finite power random vectors. Originally, anisotropy was introduced for directionally generic random vectors as the relative entropy of the normalized vector with respect to the uniform distribution on the unit sphere. The associated a-anisotropic norm of a matrix is then its maximum root mean square or average energy gain with respect to finite power or directionally generic inputs whose anisotropy is bounded above by a≥0. We give a systematic comparison of the anisotropy functionals and the associated norms. These are considered for unboundedly growing fragments of homogeneous Gaussian random fields on multidimensional integer lattice to yield mean anisotropy. Correspondingly, the anisotropic norms of finite matrices are extended to bounded linear translation invariant operators over such fields.
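For a zero-mean Gaussian vector in R^m with covariance Σ, minimizing the Kullback-Leibler divergence from the family N(0, λI) over λ gives λ = tr(Σ)/m, and the anisotropy works out to (1/2)·ln((tr(Σ)/m)^m / det Σ). A small sketch of this computation (the covariances used are illustrative, not from the paper):

```python
import numpy as np

def anisotropy(Sigma):
    """Anisotropy of a zero-mean Gaussian vector with covariance Sigma:
    the minimal KL divergence from N(0, lambda*I) over lambda > 0,
    attained at lambda = tr(Sigma)/m."""
    m = Sigma.shape[0]
    lam = np.trace(Sigma) / m
    return 0.5 * (m * np.log(lam) - np.log(np.linalg.det(Sigma)))

# Isotropic (scalar-covariance) vectors have zero anisotropy;
# unequal variances give a strictly positive value.
print(anisotropy(np.eye(3)))
print(anisotropy(np.diag([1.0, 2.0, 4.0])))
```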
Abstract:
The first genetic linkage map of macadamia (Macadamia integrifolia and M. tetraphylla) is presented. The map is based on 56 F-1 progeny of cultivars 'Keauhou' and 'A16'. Eighty-four percent of the 382 markers analysed segregated as Mendelian loci. The two-way pseudo-testcross mapping strategy allowed construction of separate parental cultivar maps. Ninety bridging loci enabled merging of these maps to produce a detailed genetic map of macadamia, 1100 cM in length and spanning 70-80% of the genome. The combined map comprised 24 linkage groups with 265 framework markers: 259 markers from randomly amplified DNA fingerprinting (RAF), five random amplified polymorphic DNA (RAPD), and one sequence-tagged microsatellite site (STMS). The RAF marker system unexpectedly revealed 16 codominant markers, one of them a putative microsatellite locus and exhibiting four distinct alleles in the cross. This molecular study is the most comprehensive examination to date of genetic loci of macadamia, and is a major step towards developing marker-assisted selection for this crop.
Abstract:
This thesis set out to investigate the inferential logic of actions and their significations in situations that mobilize the notions of probabilistic composition and chance, as well as the role of signification models in the cognitive functioning of adults. The participants were 12 young adult students from the working class, volunteers of both sexes, enrolled in a technical course integrated with the secondary level of the Youth and Adult Education programme. Three individual sessions were held, recorded in audio and in a spreadsheet, using two games, Likid Gaz and Lucky Cassino, from the software Missão Cognição (Haddad-Zubel, Pinkas & Pécaut, 2006), and the game Soma dos Dados (Silva, Rossetti & Cristo, 2012). The task procedures were adapted from Silva and Frezza (2011): 1) presentation of the game; 2) playing the game; 3) semi-structured interview; 4) administration of three problem situations with intervention according to the Clinical Method; 5) a new round of the game; and 6) completion of two further problem situations without Clinical Method intervention. Levels of heuristic analysis, game comprehension and signification models were devised from the identification of particular procedures and significations in the games. The first study examined the implications of signification models and prior representations for adult thought, considering that the subject organizes his or her prior representations or schemes concerning an object in the form of signification models, according to the degree of complexity and novelty of the task and of its logical-mathematical structure, which evolve through the process of equilibration; for this, the subject must be prompted to signify that aspect of reality. The second study investigated the notion of deducible combination as evidenced in the game Likid Gaz, identifying the role of signification models in the choice of procedures, which entailed the rejection of systematization or enumeration strategies. The initial levels of heuristic analysis of the game predominated.
The third study examined the notion of probability as observed in the game Lucky Cassino, in which most participants showed an intermediate level of game comprehension, with a greater diversity of signification models than in the other games, although the most elementary ones predominated. The synthesis of the notions of combination, probability and chance was explored in the fourth study through the game Soma dos Dados (Silva, Rossetti & Cristo, 2012); it was identified that one limitation to an adequate understanding of the connections interwoven in these notions is the signifying implication (if random A, then undetermined D; notation A → D), with the construction of pseudo-necessities and pseudo-obligations, or even local necessities generalized inappropriately. The resistance or obstacles of the object should provoke perturbations, but the cognitive structure, the social environment and cultural models, and affectivity may interfere in this process.
Abstract:
Topology optimization consists in finding the spatial distribution of a given total volume of material such that the resulting structure has some optimal property, for instance, maximization of structural stiffness or maximization of the fundamental eigenfrequency. In this paper a Genetic Algorithm (GA) employing a representation method based on trees is developed to generate initial feasible individuals that remain feasible upon crossover and mutation and as such do not require any repairing operator to ensure feasibility. Several application examples are studied involving the topology optimization of structures where the objective function is the maximization of the stiffness or the maximization of the first or the second eigenfrequency of a plate, all cases having a prescribed material volume constraint.
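The repair-free idea can be hinted at with a toy encoding: if an individual is a set of exactly k filled cells, a mutation that swaps a filled cell for an empty one can never violate the volume constraint. This set encoding is only an analogue for illustration; the paper's tree-based representation and the structural objective functions are not reproduced here.

```python
import random

random.seed(0)

# Toy feasibility-preserving representation: a design is a set of
# exactly K_FILLED cells out of N_CELLS (the volume constraint).
N_CELLS, K_FILLED = 20, 8

def random_individual():
    """A random feasible design: exactly K_FILLED cells are material."""
    return set(random.sample(range(N_CELLS), K_FILLED))

def mutate(ind):
    """Swap one filled cell for one empty cell, so the number of
    filled cells -- and hence feasibility -- is preserved by construction."""
    filled = random.choice(sorted(ind))
    empty = random.choice(sorted(set(range(N_CELLS)) - ind))
    return (ind - {filled}) | {empty}

ind = random_individual()
child = mutate(ind)
# No repair operator is needed: the child satisfies the volume
# constraint automatically.
print(len(ind), len(child))
```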