796 results for Halliday, Jon: Mao


Relevance: 10.00%

Abstract:

In this paper we present an error analysis for a Monte Carlo algorithm for evaluating bilinear forms of matrix powers. An almost Optimal Monte Carlo (MAO) algorithm for solving this problem is formulated. Results for the structure of the probability error are presented, and the construction of robust and interpolation Monte Carlo algorithms is discussed. Results are presented comparing the performance of the Monte Carlo algorithm with that of a corresponding deterministic algorithm. The two algorithms are tested on a well-balanced matrix, and the effects of perturbing this matrix, by small and large amounts, are then studied.
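
The abstract describes the MAO construction only at a high level; as an illustration, a bilinear form (v, A^k h) can be estimated by Markov chains whose initial state is drawn with probability proportional to |v_i| and whose transitions are drawn with probability proportional to |a_ij|. The following is a minimal Python sketch of that idea, with invented names (mao_bilinear_form, n_chains) and no claim to match the authors' implementation:

```python
import numpy as np

def mao_bilinear_form(A, v, h, k, n_chains=100_000, rng=None):
    """Estimate (v, A^k h) with weighted random walks.

    MAO-style densities are assumed: the start state is drawn with probability
    proportional to |v_i|, transitions with probability proportional to |a_ij|.
    Illustrative sketch only, not the authors' code."""
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    p0 = np.abs(v) / np.abs(v).sum()                       # initial density
    P = np.abs(A) / np.abs(A).sum(axis=1, keepdims=True)   # transition densities

    total = 0.0
    for _ in range(n_chains):
        i = rng.choice(n, p=p0)
        W = v[i] / p0[i]                                   # weight of the start state
        for _ in range(k):
            j = rng.choice(n, p=P[i])
            W *= A[i, j] / P[i, j]                         # weight update for one transition
            i = j
        total += W * h[i]
    return total / n_chains

# Sanity check against the exact value v^T A^k h on a small well-balanced matrix.
rng = np.random.default_rng(0)
A = rng.random((20, 20)) / 20
v, h = rng.random(20), rng.random(20)
print(mao_bilinear_form(A, v, h, k=3, rng=1), v @ np.linalg.matrix_power(A, 3) @ h)
```

The perturbation experiments mentioned above then amount to re-running such an estimator after perturbing A by small or large amounts and observing how the probability error behaves.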

Relevance: 10.00%

Abstract:

In this paper we deal with the performance analysis of a Monte Carlo algorithm for large linear algebra problems. We consider the applicability and efficiency of Markov chain Monte Carlo for large problems, i.e., problems involving matrices with a number of non-zero elements ranging between one million and one billion. We concentrate on the analysis of the almost Optimal Monte Carlo (MAO) algorithm for evaluating bilinear forms of matrix powers, since these form the so-called Krylov subspaces. Results are presented comparing the performance of the robust and non-robust Monte Carlo algorithms. The algorithms are tested on large dense matrices as well as on large unstructured sparse matrices.
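
One practical point behind the "one million to one billion non-zero elements" range is that each Markov chain step only needs the non-zero entries of the current row, and the same set of walks yields all the Krylov-related quantities (v, A^m h) for m = 0..k at once. A hedged sketch of this, using SciPy's CSR format and assuming every row has at least one non-zero entry (names such as mao_krylov_bilinear_forms are invented for illustration):

```python
import numpy as np
from scipy.sparse import random as sprandom, identity

def mao_krylov_bilinear_forms(A_csr, v, h, k, n_chains=50_000, rng=None):
    """Estimate (v, A^m h) for m = 0..k from the same set of walks.

    Transitions are sampled only over the non-zeros of the current CSR row,
    which keeps the per-step cost low for large sparse matrices.
    Assumes every row has at least one non-zero; illustrative sketch only."""
    rng = np.random.default_rng(rng)
    n = A_csr.shape[0]
    p0 = np.abs(v) / np.abs(v).sum()
    estimates = np.zeros(k + 1)

    for _ in range(n_chains):
        i = rng.choice(n, p=p0)
        W = v[i] / p0[i]
        estimates[0] += W * h[i]
        for m in range(1, k + 1):
            start, end = A_csr.indptr[i], A_csr.indptr[i + 1]
            cols, vals = A_csr.indices[start:end], A_csr.data[start:end]
            p = np.abs(vals) / np.abs(vals).sum()   # MAO density over this row's non-zeros
            idx = rng.choice(len(cols), p=p)
            W *= vals[idx] / p[idx]
            i = cols[idx]
            estimates[m] += W * h[i]
    return estimates / n_chains

# Toy usage on a random sparse matrix; the added diagonal guarantees non-empty rows.
n = 500
A = (sprandom(n, n, density=0.01, format="csr", random_state=0)
     + identity(n, format="csr")).tocsr()
v, h = np.ones(n), np.ones(n)
print(mao_krylov_bilinear_forms(A, v, h, k=3, n_chains=2_000, rng=1))
```

The robust versus non-robust comparison in the abstract concerns additional conditions on these densities; the sketch above is only the basic weighting scheme.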

Relevance: 10.00%

Abstract:

In this paper we analyse the applicability and robustness of Markov chain Monte Carlo algorithms for eigenvalue problems. We restrict our consideration to real symmetric matrices. Almost Optimal Monte Carlo (MAO) algorithms for solving eigenvalue problems are formulated. Results for the structure of both the systematic and the probability error are presented. It is shown that the values of both errors can be controlled independently by different algorithmic parameters. The results show how the systematic error depends on the matrix spectrum. An analysis of the probability error is also presented; it shows that the closer (in some sense) the matrix under consideration is to a stochastic matrix, the smaller this error is. Sufficient conditions for constructing robust and interpolation Monte Carlo algorithms are obtained. For stochastic matrices an interpolation Monte Carlo algorithm is constructed. A number of numerical tests for large symmetric dense matrices are performed in order to study experimentally the dependence of the systematic error on the structure of the matrix spectrum. We also study how the probability error depends on the balancing of the matrix. (c) 2007 Elsevier Inc. All rights reserved.
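
The eigenvalue estimator alluded to here can be illustrated with the standard power-method identity lambda_max ≈ (h, A^(k+1) h) / (h, A^k h), each bilinear form being estimated by weighted walks; the systematic error then comes from truncating at a finite k (hence its dependence on the spectrum), while the probability error comes from the finite number of walks. A rough Python sketch under those assumptions (mc_dominant_eigenvalue is an invented name):

```python
import numpy as np

def mc_dominant_eigenvalue(A, k=8, n_chains=200_000, rng=None):
    """Estimate the dominant eigenvalue of a real symmetric matrix as the
    ratio of Monte Carlo estimates of (h, A^(k+1) h) and (h, A^k h).
    Illustrative sketch only; k controls the systematic error, n_chains
    the probability error."""
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    h = np.ones(n)                                        # arbitrary non-degenerate vector
    p0 = np.full(n, 1.0 / n)
    P = np.abs(A) / np.abs(A).sum(axis=1, keepdims=True)

    num = den = 0.0
    for _ in range(n_chains):
        i = rng.choice(n, p=p0)
        W = h[i] / p0[i]
        for _ in range(k):
            j = rng.choice(n, p=P[i])
            W *= A[i, j] / P[i, j]
            i = j
        den += W * h[i]                                   # contribution to (h, A^k h)
        j = rng.choice(n, p=P[i])
        W *= A[i, j] / P[i, j]
        num += W * h[j]                                   # one extra step: (h, A^(k+1) h)
    return num / den

# Quick check on a random symmetric matrix with positive entries.
rng = np.random.default_rng(0)
M = rng.random((30, 30))
A = (M + M.T) / 2
print(mc_dominant_eigenvalue(A, rng=1), np.max(np.linalg.eigvalsh(A)))
```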

Relevance: 10.00%

Abstract:

In this paper we consider bilinear forms of matrix polynomials and show that these polynomials can be used to construct solutions for the problems of solving systems of linear algebraic equations, matrix inversion and finding extremal eigenvalues. An almost Optimal Monte Carlo (MAO) algorithm for computing bilinear forms of matrix polynomials is presented. Results are presented for the computational cost of a balanced algorithm for computing the bilinear form of a matrix power, i.e., an algorithm for which the probability and systematic errors are of the same order, and this cost is compared with that of a corresponding deterministic method.
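
The reduction from a matrix polynomial to matrix powers is the key point: (v, p(A) h) with p(x) = sum_m c_m x^m is just sum_m c_m (v, A^m h), and choosing p as a truncated Neumann series turns this into an approximate solver for linear systems. A small deterministic Python sketch of that reduction (the Monte Carlo version would replace each (v, A^m h) by a random-walk estimate; names are illustrative):

```python
import numpy as np

def bilinear_form_of_polynomial(A, v, h, coeffs):
    """Evaluate (v, p(A) h) for p(x) = sum_m coeffs[m] * x**m by accumulating
    successive powers applied to h. Deterministic illustration of the
    reduction described in the abstract, not the authors' Monte Carlo code."""
    result, w = 0.0, h.astype(float)
    for c in coeffs:
        result += c * (v @ w)      # c_m * (v, A^m h)
        w = A @ w                  # advance h to the next power of A
    return result

# Truncated Neumann series: with C = I - A and spectral radius of C below 1,
# p(C) = I + C + ... + C^K approximates A^{-1}, so (v, p(C) b) approximates
# (v, x) for the linear system A x = b.
rng = np.random.default_rng(0)
A = np.eye(30) + 0.01 * rng.random((30, 30))
b, v = rng.random(30), rng.random(30)
C = np.eye(30) - A
print(bilinear_form_of_polynomial(C, v, b, coeffs=np.ones(40)),
      v @ np.linalg.solve(A, b))
```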

Relevance: 10.00%

Abstract:

There is remarkable agreement in expectations today for vastly improved ocean data management a decade from now -- capabilities that will help to bring significant benefits to ocean research and to society. Advancing data management to such a degree, however, will require cultural and policy changes that are slow to effect. The technological foundations upon which data management systems are built are certain to continue advancing rapidly in parallel. These considerations argue for adopting attitudes of pragmatism and realism when planning data management strategies. In this paper we adopt those attitudes as we outline opportunities for progress in ocean data management. We begin with a synopsis of expectations for integrated ocean data management a decade from now. We discuss factors that should be considered by those evaluating candidate “standards”. We highlight challenges and opportunities in a number of technical areas, including “Web 2.0” applications, data modeling, data discovery and metadata, real-time operational data, archival of data, biological data management and satellite data management. We discuss the importance of investments in the development of software toolkits to accelerate progress. We conclude the paper by recommending a few specific, short-term targets for implementation that we believe to be both significant and achievable, and by calling for action by community leadership to effect these advancements.

Relevance: 10.00%

Abstract:

We explore the potential for making statistical decadal predictions of sea surface temperatures (SSTs) in a perfect model analysis, with a focus on the Atlantic basin. Various statistical methods (Lagged correlations, Linear Inverse Modelling and Constructed Analogue) are found to have significant skill in predicting the internal variability of Atlantic SSTs for up to a decade ahead in control integrations of two different global climate models (GCMs), namely HadCM3 and HadGEM1. Statistical methods which consider non-local information tend to perform best, but which is the most successful statistical method depends on the region considered, GCM data used and prediction lead time. However, the Constructed Analogue method tends to have the highest skill at longer lead times. Importantly, the regions of greatest prediction skill can be very different to regions identified as potentially predictable from variance explained arguments. This finding suggests that significant local decadal variability is not necessarily a prerequisite for skillful decadal predictions, and that the statistical methods are capturing some of the dynamics of low-frequency SST evolution. In particular, using data from HadGEM1, significant skill at lead times of 6–10 years is found in the tropical North Atlantic, a region with relatively little decadal variability compared to interannual variability. This skill appears to come from reconstructing the SSTs in the far north Atlantic, suggesting that the more northern latitudes are optimal for SST observations to improve predictions. We additionally explore whether adding sub-surface temperature data improves these decadal statistical predictions, and find that, again, it depends on the region, prediction lead time and GCM data used. Overall, we argue that the estimated prediction skill motivates the further development of statistical decadal predictions of SSTs as a benchmark for current and future GCM-based decadal climate predictions.
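
Of the statistical methods compared, the constructed analogue is perhaps the least familiar: the current anomaly field is expressed as a (regularised) linear combination of previously observed fields, and the same combination weights are then applied to the fields observed a given lead time later. A hedged Python sketch of that idea on a plain time-by-space anomaly array (EOF truncation, detrending and the verification procedure used in the paper are deliberately omitted):

```python
import numpy as np

def constructed_analogue_forecast(library, target, lead, ridge=1e-3):
    """Constructed-analogue sketch: fit weights w so that w @ library[:-lead]
    reproduces `target`, then apply w to the library states `lead` steps
    later. `ridge` is an illustrative regularisation parameter, not a value
    from the paper."""
    X = library[:-lead]                     # candidate analogue states (time x space)
    Y = library[lead:]                      # the same states, `lead` steps later
    G = X @ X.T + ridge * np.eye(X.shape[0])
    w = np.linalg.solve(G, X @ target)      # ridge-regularised combination weights
    return w @ Y                            # forecast anomaly field

# Toy usage: 200 "years" of SST anomalies on 50 grid points, 10-year lead.
rng = np.random.default_rng(0)
library = rng.standard_normal((200, 50))
forecast = constructed_analogue_forecast(library, library[-1], lead=10)
```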

Relevance: 10.00%

Abstract:

Measurements of body weight, total body water and total body potassium (40K) were made serially on three occasions during pregnancy and once post partum in 27 normal pregnant women. Skinfold thickness and fat cell diameter were also measured. A model of body composition was formulated to permit the estimation of changes in fat, lean tissue and water content of the maternal body. Total maternal body fat increased during pregnancy, reaching a peak towards the end of the second trimester before diminishing. Serial measurements of fat cell diameter showed poor correlation, whilst total body fat calculated from skinfold thickness correlated well with our estimated values for total body fat in pregnancy.
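
Although the paper's model is not reproduced here, the general shape of such a body-composition calculation is easy to sketch: potassium counting gives an estimate of lean tissue mass, total body water constrains how much water lies outside that lean tissue, and fat is taken as the remainder of body weight. The constants and the compartment structure below are illustrative placeholders only, not the authors' model:

```python
def maternal_body_composition(weight_kg, total_body_water_l, total_body_k_mmol,
                              k_per_kg_lean=60.0, lean_hydration=0.73):
    """Crude compartment sketch of this kind of calculation: lean mass from
    40K counting, 'excess' water beyond the assumed hydration of lean tissue,
    and fat as the remainder of body weight (taking 1 L of water as 1 kg).
    k_per_kg_lean and lean_hydration are illustrative values, NOT parameters
    from the paper."""
    lean_kg = total_body_k_mmol / k_per_kg_lean
    excess_water_l = total_body_water_l - lean_hydration * lean_kg
    fat_kg = weight_kg - lean_kg - excess_water_l
    return {"lean_kg": lean_kg, "excess_water_l": excess_water_l, "fat_kg": fat_kg}
```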

Relevance: 10.00%

Abstract:

Liver X receptors (LXRs) are transcription factors involved in the regulation of cholesterol homeostasis. LXR ligands have athero-protective properties independent of their effects on cholesterol metabolism. Platelets are involved in the initiation of atherosclerosis and, despite being anucleate, express nuclear receptors. We hypothesized that the athero-protective effects of LXR ligands could be in part mediated through platelets and therefore explored the potential role of LXR in platelets. Our results show that LXR-β is present in human platelets and that the LXR ligands GW3965 and T0901317 nongenomically modulated platelet aggregation stimulated by a range of agonists. GW3965 caused LXR to associate with signaling components proximal to the collagen receptor GPVI, suggesting a potential mechanism of LXR action in platelets that leads to diminished platelet responses. Activation of platelets at sites of atherosclerotic lesions results in thrombosis preceding myocardial infarction and stroke. Using an in vivo model of thrombosis in mice, we show that GW3965 has antithrombotic effects, reducing both the size and the stability of thrombi. The athero-protective effects of GW3965, together with its novel antiplatelet/antithrombotic effects, indicate LXR as a potential target for the prevention of athero-thrombotic disease.

Relevance: 10.00%

Abstract:

Background: Serine proteases are major components of viper venom and target various stages of the blood coagulation system in victims and prey. A better understanding of the diversity of serine proteases and other enzymes present in snake venom will help to explain how the complexity of snake venom has evolved and will aid the development of novel therapeutics for treating snake bites. Methodology and Principal Findings: Four serine protease-encoding genes from the venom gland transcriptome of Bitis gabonica rhinoceros were amplified and sequenced. Mass spectrometry suggests that the four enzymes corresponding to these genes are present in the venom of B. g. rhinoceros. Two of the enzymes, rhinocerases 2 and 3, have substitutions at two of the serine protease catalytic triad residues and are thus unlikely to be catalytically active, though they may have evolved other toxic functions. The other two enzymes, rhinocerases 4 and 5, have classical serine protease catalytic triad residues and thus are likely to be catalytically active; however, they have glycine rather than the more typical aspartic acid at the base of the primary specificity pocket (position 189). Based on a detailed analysis of these sequences, we suggest that alternative splicing together with individual amino acid mutations may have been involved in their evolution. Changes within amino acid segments that were previously proposed to undergo accelerated change in venom serine proteases have also been observed. Conclusions and Significance: Our study provides further insight into the diversity of serine protease isoforms present within snake venom and discusses their possible functions and how they may have evolved. These multiple serine protease isoforms with different substrate specificities may enhance the envenomation effects and help the snake to adapt to new habitats and diets. Our findings have potential for helping the future development of improved therapeutics for snake bites.

Relevance: 10.00%

Abstract:

Existing research on synchronous remote working in CSCW has highlighted the troubles that can arise because actions at one site are (partially) unavailable to remote colleagues. Such ‘local action’ is routinely characterised as a nuisance, a distraction, subordinate and the like. This paper explores interconnections between ‘local action’ and ‘distributed work’ in the case of a research team virtually collocated through ‘MiMeG’. MiMeG is an e-Social Science tool that facilitates ‘distributed data sessions’ in which social scientists are able to remotely collaborate on the real-time analysis of video data. The data are visible and controllable in a shared workspace and participants are additionally connected via audio conferencing. The findings reveal that whilst the (partial) unavailability of local action is at times problematic, it is also used as a resource for coordinating work. The paper considers how local action is interactionally managed in distributed data sessions and concludes by outlining implications of the analysis for the design and study of technologies to support group-to-group collaboration.