146 results for R-Statistical computing
in University of Queensland eSpace - Australia
Abstract:
Although many of the molecular interactions in kidney development are now well understood, the molecules involved in the specification of the metanephric mesenchyme from the surrounding intermediate mesoderm, and hence in the formation of the renal progenitor population, are poorly characterized. In this study, cDNA microarrays were used to identify genes enriched in the murine embryonic day 10.5 (E10.5) uninduced metanephric mesenchyme, the renal progenitor population, in comparison with more rostral derivatives of the intermediate mesoderm. Microarray data were analyzed using R statistical software to accurately determine genes differentially expressed between these populations. Microarray outliers were biologically verified, and the spatial expression patterns of these genes at E10.5 and subsequent stages of early kidney development were determined by RNA in situ hybridization. This approach identified 21 genes preferentially expressed by the E10.5 metanephric mesenchyme, including Ewing sarcoma homolog, 14-3-3 theta, retinoic acid receptor-alpha, stearoyl-CoA desaturase 2, CD24, and cadherin-11, which may be important in the formation of renal progenitor cells. Cell surface proteins such as CD24 and cadherin-11, which were strongly and specifically expressed in the uninduced metanephric mesenchyme and mark the renal progenitor population, may prove useful in the purification of renal progenitor cells by FACS. These findings may assist in the isolation and characterization of potential renal stem cells for use in cellular therapies for kidney disease.
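The analysis pipeline described here (per-gene tests between two tissue populations, followed by selection of differentially expressed outliers) can be sketched in code. The study used R; the Python sketch below is an illustration only, and the replicate counts, simulated intensities, and the Benjamini-Hochberg cutoff are assumptions rather than details from the paper.

    import numpy as np
    from scipy import stats

    def differential_expression(mm, rostral, fdr=0.05):
        """Per-gene two-sample t-test between metanephric mesenchyme (mm)
        and rostral intermediate-mesoderm arrays, with Benjamini-Hochberg
        false-discovery-rate control. Rows are genes, columns replicates."""
        t, p = stats.ttest_ind(mm, rostral, axis=1)
        order = np.argsort(p)
        n = len(p)
        # Benjamini-Hochberg: reject the k smallest p-values, where k is
        # the largest rank with p_(k) <= (k/n) * fdr.
        passed = p[order] <= (np.arange(1, n + 1) / n) * fdr
        k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
        enriched = order[:k][t[order[:k]] > 0]   # higher in the mesenchyme
        return enriched, t, p

    # Toy usage with simulated log-intensities (3 replicates per population).
    rng = np.random.default_rng(0)
    mm = rng.normal(0, 1, (1000, 3))
    mm[:20] += 2.0                               # 20 genes truly enriched
    rostral = rng.normal(0, 1, (1000, 3))
    genes, t, p = differential_expression(mm, rostral)
    print(len(genes), "genes called enriched in the mesenchyme arrays")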
Abstract:
The main problem with current approaches to quantum computing is the difficulty of establishing and maintaining entanglement. A Topological Quantum Computer (TQC) aims to overcome this by using different physical processes that are topological in nature and therefore less susceptible to disturbance by the environment. In a (2+1)-dimensional system, pseudoparticles called anyons have statistics that fall somewhere between bosons and fermions. The exchange of two anyons, an effect called braiding from knot theory, can occur in two different ways. The quantum states corresponding to the two elementary braids constitute a two-state system, allowing the definition of a computational basis. Quantum gates can be built up from patterns of braids, and for quantum computing it is essential that the operator describing the braiding, the R-matrix, be unitary. The physics of anyonic systems is governed by quantum groups, in particular the quasi-triangular Hopf algebras obtained from finite groups by the application of the Drinfeld quantum double construction. Their representation theory has been described in detail by Gould and Tsohantjis, and in this review article we relate the work of Gould to TQC schemes, particularly that of Kauffman.
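For reference, the conditions on the R-matrix alluded to above can be stated compactly. This is the standard braid-group formulation rather than anything specific to the Gould or Kauffman papers: the braiding operator must be unitary and must satisfy the Yang-Baxter equation so that its action respects the braid-group relations.

    % Unitarity of the braiding operator:
    R R^{\dagger} = I .
    % Yang-Baxter consistency condition on a triple tensor product:
    (R \otimes I)(I \otimes R)(R \otimes I)
      = (I \otimes R)(R \otimes I)(I \otimes R) .
    % These reproduce the braid-group relations for the generators
    % \sigma_i \mapsto I^{\otimes(i-1)} \otimes R \otimes I^{\otimes(n-i-1)}:
    \sigma_i \sigma_{i+1} \sigma_i = \sigma_{i+1} \sigma_i \sigma_{i+1},
    \qquad
    \sigma_i \sigma_j = \sigma_j \sigma_i \quad (|i-j| \ge 2).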
Abstract:
We present a scheme which offers a significant reduction in the resources required to implement linear optics quantum computing. The scheme is a variation of the proposal of Knill, Laflamme and Milburn, and makes use of an incremental approach to the error encoding to boost the probability of success.
Abstract:
The effect of the number of samples and of the selection of data for analysis on the calculation of surface motor unit potential (SMUP) size in the statistical method of motor unit number estimates (MUNE) was determined in 10 normal subjects and 10 with amyotrophic lateral sclerosis (ALS). We recorded 500 sequential compound muscle action potentials (CMAPs) at three different stable stimulus intensities (10–50% of maximal CMAP). Estimated mean SMUP sizes were calculated using Poisson statistical assumptions from the variance of the 500 sequential CMAPs obtained at each stimulus intensity. The results with the 500 data points were compared with smaller subsets (50–80% of the 500 data points) from the same data set. The effect of restricting analysis to data between 5–20% of the CMAP and to standard deviation limits was also assessed. No differences in mean SMUP size were found with stimulus intensity or with the use of different ranges of data. Consistency improved with greater sample numbers. Restricting data to within 5% of CMAP size gave both increased consistency and reduced mean SMUP size in many subjects, but excluded valid responses present at that stimulus intensity. These changes were more prominent in ALS patients, in whom the presence of isolated SMUP responses was a striking difference from normal subjects. Noise, spurious data, and large SMUPs limited the Poisson assumptions. When these factors are considered, consistent statistical MUNE can be calculated from a continuous sequence of data points. A 2 to 2.5 SD or 10% window is a reasonable method of limiting data for analysis. Muscle Nerve 27: 320–331, 2003
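Under the Poisson assumption described above, the mean SMUP size follows from the variance-to-mean ratio of the sequential CMAP amplitudes, and the MUNE is the maximal CMAP divided by the mean SMUP. The sketch below is a minimal illustration of that calculation, including the 2-2.5 SD window discussed in the abstract; the baseline handling, units, and toy numbers are assumptions, not the paper's protocol.

    import numpy as np

    def poisson_mune(cmaps, max_cmap, sd_limit=2.5):
        """Statistical (Poisson) motor unit number estimate.

        cmaps    : sequential CMAP amplitudes (mV) at one stable submaximal
                   stimulus intensity, measured relative to baseline.
        max_cmap : maximal CMAP amplitude (mV).
        sd_limit : keep only data within this many SDs of the mean, one of
                   the windowing rules assessed above.
        """
        x = np.asarray(cmaps, dtype=float)
        mu, sd = x.mean(), x.std(ddof=1)
        x = x[np.abs(x - mu) <= sd_limit * sd]      # SD window
        # Poisson firing: variance = s^2 * lam, mean = s * lam  =>  s = var/mean
        smup = x.var(ddof=1) / x.mean()
        return max_cmap / smup, smup

    # Toy usage: 500 sweeps in which ~Poisson(40) units of ~0.05 mV each fire.
    rng = np.random.default_rng(1)
    sweeps = rng.poisson(40, 500) * 0.05
    mune, smup = poisson_mune(sweeps, max_cmap=10.0)
    print(f"mean SMUP ~ {smup:.3f} mV, MUNE ~ {mune:.0f}")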
Abstract:
An important aspect of manufacturing design is the distribution of geometrical tolerances so that an assembly functions with a given probability while minimising the manufacturing cost. This requires a complex search over a multidimensional domain, much of which leads to infeasible solutions and which can have many local minima. In addition, Monte Carlo methods are often required to determine the probability that the assembly functions as designed. This paper describes a genetic algorithm for carrying out this search and successfully applies it to two specific mechanical designs, enabling comparisons of a new statistical tolerancing design method with existing methods.
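The search described above can be sketched as a genetic algorithm in which each chromosome is a vector of tolerances, tighter tolerances cost more, and Monte Carlo sampling estimates the probability that the assembly meets its functional requirement. The cost model, the linear stack-up, and all GA parameters below are invented placeholders, not the paper's method.

    import numpy as np

    rng = np.random.default_rng(2)

    N_DIMS, SPEC = 5, 0.5      # toy stack-up: |sum of deviations| <= SPEC
    TARGET_PROB = 0.99         # required probability the assembly functions
    LO, HI = 0.01, 0.3         # allowed tolerance range per dimension

    def cost(tol):
        # Placeholder cost model: tighter tolerances cost more.
        return np.sum(1.0 / tol)

    def assembly_prob(tol, n_mc=2000):
        # Monte Carlo: deviations ~ N(0, (tol/3)^2), tol read as a 3-sigma band.
        dev = rng.normal(0.0, tol / 3.0, size=(n_mc, tol.size))
        return np.mean(np.abs(dev.sum(axis=1)) <= SPEC)

    def fitness(tol):
        p = assembly_prob(tol)
        penalty = 0.0 if p >= TARGET_PROB else 1e4 * (TARGET_PROB - p)
        return cost(tol) + penalty

    def ga(pop_size=40, gens=60, mut=0.1):
        pop = rng.uniform(LO, HI, size=(pop_size, N_DIMS))
        for _ in range(gens):
            f = np.array([fitness(ind) for ind in pop])
            # Tournament selection between random pairs
            i, j = rng.integers(0, pop_size, (2, pop_size))
            parents = pop[np.where(f[i] < f[j], i, j)]
            # Uniform crossover between consecutive parents
            mask = rng.random((pop_size, N_DIMS)) < 0.5
            children = np.where(mask, parents, np.roll(parents, 1, axis=0))
            # Occasional Gaussian mutation, clipped to the allowed range
            children += rng.normal(0.0, mut, children.shape) \
                        * (rng.random(children.shape) < 0.2)
            pop = np.clip(children, LO, HI)
        f = np.array([fitness(ind) for ind in pop])
        return pop[np.argmin(f)], f.min()

    best, best_cost = ga()
    print("best tolerances:", np.round(best, 3), "cost:", round(best_cost, 1))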
Abstract:
The compelling quality of the Global Change simulation study (Altemeyer, 2003), in which high RWA (right-wing authoritarianism)/high SDO (social dominance orientation) individuals produced poor outcomes for the planet, rests on the inference that the link between high RWA/SDO scores and disaster in the simulation can be generalized to real environmental and social situations. However, we argue that studies of the Person × Situation interaction are biased to overestimate the role of individual variability. When variables are operationalized, strongly normative items are excluded because they are skewed and kurtotic. This occurs both in the measurement of predictor constructs, such as RWA, and in outcome constructs, such as prejudice and war. Analyses based on normal linear statistics highlight personality variables such as RWA, which produce variance, and overlook the role of norms, which produce invariance. Where both normative and personality forces are operating, as in intergroup contexts, the linear analysis generates statistics for the sample that disproportionately reflect the behavior of the deviant, antinormative minority and direct attention away from the baseline, normative position. The implications of these findings for the link between high RWA and disaster are discussed.
Abstract:
We show how the measurement-induced model of quantum computation proposed by Raussendorf and Briegel (2001, Phys. Rev. Lett., 86, 5188) can be adapted to a nonlinear optical interaction. This optical implementation requires a Kerr nonlinearity, a single-photon source, a single-photon detector, and fast feed-forward. Although nondeterministic optical quantum information proposals such as that suggested by KLM (2001, Nature, 409, 46) do not require a Kerr nonlinearity, they do require complex reconfigurable optical networks. The proposal in this paper has the benefit of a single static optical layout with fixed device parameters, where the algorithm is defined by the final measurement procedure.
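As background on the role of the Kerr nonlinearity (this is the textbook cross-Kerr interaction, stated for orientation rather than as the paper's exact construction): two optical modes coupled by a cross-Kerr medium acquire a photon-number-dependent phase,

    H = \hbar \chi \, a^{\dagger} a \, b^{\dagger} b ,
    \qquad
    U(t) = e^{-iHt/\hbar} :\;
    |n\rangle_a |m\rangle_b \;\mapsto\; e^{-i \chi t \, n m} \, |n\rangle_a |m\rangle_b ,

so for single-rail photonic qubits a phase of χt = π on the |1⟩|1⟩ component implements a controlled-Z gate, the entangling operation from which cluster states for measurement-based computation are built.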
Abstract:
Solid-state quantum computer architectures with qubits encoded using single atoms are now feasible given recent advances in the atomic doping of semiconductors. Here we present a charge qubit consisting of two dopant atoms in a semiconductor crystal, one of which is singly ionized. Surface electrodes control the qubit and a radio-frequency single-electron transistor provides fast readout. The calculated single gate times, of order 50 ps or less, are much shorter than the expected decoherence time. We propose universal one- and two-qubit gate operations for this system and discuss prospects for fabrication and scale up.
Abstract:
A parallel computing environment to support the optimization of large-scale engineering systems is designed and implemented on Windows-based personal computer networks, using the master-worker model and the Parallel Virtual Machine (PVM). It involves decomposing a large engineering system into a number of smaller subsystems that are optimized in parallel on worker nodes, and coordinating the subsystem optimization results on the master node. The environment consists of six functional modules: the master control, the optimization model generator, the optimizer, the data manager, the monitor, and the post-processor. An object-oriented design of these modules is presented. The environment supports all steps from the generation of optimization models to their solution and visualization on networks of computers. User-friendly graphical interfaces make it easy to define the problem and to monitor and steer the optimization process. The environment has been verified on an example of a large space truss optimization.
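The master-worker decomposition described above can be illustrated in miniature. The sketch below uses Python's multiprocessing in place of PVM, and the subsystem objectives and the coordination rule are invented placeholders that only demonstrate the pattern: the master farms out subsystem optimizations, then reconciles the results.

    import numpy as np
    from multiprocessing import Pool
    from scipy.optimize import minimize

    def optimize_subsystem(args):
        """Worker node: optimize one subsystem for fixed coupling variables."""
        sub_id, coupling = args
        # Placeholder sub-objective: quadratic bowl shifted by the coupling value.
        obj = lambda x: np.sum((x - coupling) ** 2) + sub_id
        res = minimize(obj, x0=np.zeros(3))
        return sub_id, res.x, res.fun

    if __name__ == "__main__":
        coupling = 1.0                       # master's coordination variable
        for it in range(3):                  # master iterations
            tasks = [(k, coupling) for k in range(4)]
            with Pool(processes=4) as pool:  # worker nodes
                results = pool.map(optimize_subsystem, tasks)
            # Coordination step (placeholder): move the coupling variable
            # toward the mean of the subsystem optima.
            coupling = np.mean([x.mean() for _, x, _ in results])
            total = sum(f for _, _, f in results)
            print(f"iteration {it}: total objective {total:.3f}")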
Abstract:
The Accelerating Moment Release (AMR) preceding earthquakes with magnitude above 5 in Australia that occurred during the last 20 years was analyzed to test the Critical Point Hypothesis. Twelve earthquakes in the catalog were chosen based on a criterion for the number of nearby events. Results show that seven sequences with numerous events recorded leading up to the main earthquake exhibited accelerating moment release. Two occurred close in time and space to other earthquakes that were themselves preceded by AMR. The remaining three sequences had very few events in the catalog, so the lack of AMR detected in the analysis may be related to catalog incompleteness. Spatio-temporal scanning of AMR parameters shows that 80% of the areas in which AMR occurred experienced large events. In areas of similar background seismicity with no large events, 10 out of 12 cases exhibited no AMR; the two others are false alarms in which AMR was observed but no large event followed. The relationship between AMR and the Load-Unload Response Ratio (LURR) was also studied. Both methods predict similar critical region sizes; however, the critical point time using AMR is slightly earlier than the time of the critical point LURR anomaly.
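AMR analyses of this kind conventionally fit the cumulative Benioff strain to a power-law time-to-failure relation, ε(t) = A + B(t_f − t)^m with 0 < m < 1, and flag acceleration when the power law fits markedly better than a straight line (curvature ratio C < 1). The abstract does not spell out its fitting procedure, so the sketch below shows that standard formulation on invented data.

    import numpy as np
    from scipy.optimize import curve_fit

    def benioff_model(t, A, B, tf, m):
        # Cumulative Benioff strain: eps(t) = A + B * (tf - t)^m, 0 < m < 1.
        return A + B * np.sign(tf - t) * np.abs(tf - t) ** m

    def curvature_ratio(t, eps):
        """Curvature test: rms(power-law fit) / rms(linear fit); a ratio
        well below 1 indicates accelerating moment release."""
        p0 = [eps[0], -1.0, t[-1] + 1.0, 0.3]
        popt, _ = curve_fit(benioff_model, t, eps, p0=p0, maxfev=20000)
        rms_pl = np.sqrt(np.mean((eps - benioff_model(t, *popt)) ** 2))
        lin = np.polyval(np.polyfit(t, eps, 1), t)
        rms_ln = np.sqrt(np.mean((eps - lin) ** 2))
        return rms_pl / rms_ln, popt

    # Toy accelerating sequence ending near tf = 10 years.
    t = np.linspace(0.0, 9.5, 60)
    eps = (12.0 - 3.0 * (10.0 - t) ** 0.3
           + np.random.default_rng(3).normal(0, 0.05, t.size))
    C, popt = curvature_ratio(t, eps)
    print(f"curvature ratio C = {C:.2f} (C < 1 suggests AMR), "
          f"fitted tf = {popt[2]:.2f}")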
Abstract:
In this paper we explore the possibility of fundamental tests for coherent-state optical quantum computing gates [T. C. Ralph et al., Phys. Rev. A 68, 042319 (2003)] using sophisticated but not unrealistic quantum states. The major resource required in these gates is a state diagonal to the basis states. We use the recent observation that a squeezed single-photon state [S(r)∣1⟩] approximates well an odd superposition of coherent states (∣α⟩−∣−α⟩) to address the diagonal resource problem. The approximation only holds for relatively small α, and hence these gates cannot be used in a scalable scheme. We explore the effects on fidelities and probabilities in teleportation and a rotated Hadamard gate.
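Why the squeezed single photon works as the diagonal resource can be made explicit with one Fock-space observation (a standard calculation, stated here for clarity rather than drawn from the paper). Expanding the odd cat state in the number basis,

    |\alpha\rangle - |{-\alpha}\rangle
    = 2 e^{-|\alpha|^2/2} \sum_{n\ \mathrm{odd}} \frac{\alpha^{n}}{\sqrt{n!}}\, |n\rangle
    = 2 e^{-|\alpha|^2/2}
      \left( \alpha\,|1\rangle + \frac{\alpha^{3}}{\sqrt{6}}\,|3\rangle + \cdots \right),

it has support only on odd photon numbers; S(r)∣1⟩ shares this property, since the squeeze operator (in the convention S(r) = exp[(r/2)(a² − a†²)]) changes photon number only in pairs. Matching the |1⟩ and |3⟩ amplitudes fixes r as a function of α, and the neglected higher odd terms grow with α, which is why the approximation holds only for relatively small α.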