985 results for Pseudo-marginal method


Relevance:

30.00%

Abstract:

A chaotic encryption algorithm is proposed based on "Life-like" cellular automata (CA), which act as a pseudo-random number generator (PRNG). The paper's main focus is the application of chaos theory to cryptography; thus, CA were explored in search of this "chaos" property. Accordingly, the manuscript concentrates on tests such as the Lyapunov exponent, entropy and Hamming distance to measure chaos in CA, as well as statistical analyses such as the DIEHARD and ENT suites. Our results achieved higher randomness quality than other ciphers in the literature. These results reinforce the supposition of a strong relationship between chaos and randomness quality. The "chaos" property of CA is therefore a good reason to employ them in cryptography, along with their simplicity, low implementation cost and respectable encryption power. (C) 2012 Elsevier Ltd. All rights reserved.
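As an illustration of the chaos measures named above, the sketch below uses the 1-D Rule 30 elementary CA (a simpler stand-in for the paper's 2-D Life-like automata, chosen purely for brevity) as a bit generator, then computes the Shannon entropy of its output and the Hamming distance between streams started from seeds differing in a single cell:

```python
import math

def rule30_step(cells):
    """One synchronous update of the Rule 30 elementary CA (periodic boundary)."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n]) for i in range(n)]

def ca_bitstream(seed, steps):
    """Use the centre cell of each generation as one pseudo-random bit."""
    cells, bits = list(seed), []
    for _ in range(steps):
        cells = rule30_step(cells)
        bits.append(cells[len(cells) // 2])
    return bits

def shannon_entropy(bits):
    """Entropy in bits per symbol of a binary stream (1.0 is ideal for a PRNG)."""
    p1 = sum(bits) / len(bits)
    if p1 in (0.0, 1.0):
        return 0.0
    return -(p1 * math.log2(p1) + (1 - p1) * math.log2(1 - p1))

def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

seed = [0] * 101
seed[50] = 1                  # single live cell
flipped = list(seed)
flipped[51] ^= 1              # perturb one cell to probe sensitivity to initial conditions

s1 = ca_bitstream(seed, 2000)
s2 = ca_bitstream(flipped, 2000)
print("entropy (bits/symbol):", round(shannon_entropy(s1), 3))
print("normalised Hamming distance:", round(hamming_distance(s1, s2) / len(s1), 3))
```

An entropy near 1.0 and a Hamming distance near 0.5 are the signatures the abstract associates with chaotic, cryptographically useful behaviour.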

Relevance:

30.00%

Abstract:

The importance of mechanical aspects related to cell activity and its environment is becoming more evident due to their influence in stem cell differentiation and in the development of diseases such as atherosclerosis. Mechanical tension homeostasis is related to normal tissue behavior, and its absence may be related to the formation of cancer, which shows a higher mechanical tension. Due to the complexity of cellular activity, the application of simplified models may elucidate which factors are really essential and which have a marginal effect. The development of a systematic method to reconstruct the elements involved in the perception of mechanical aspects by the cell may substantially accelerate the validation of these models. This work proposes the development of a routine capable of reconstructing the topology of focal adhesions and the actomyosin portion of the cytoskeleton from the displacement field generated by the cell on a flexible substrate. Another way to frame this problem is as an algorithm that reconstructs the forces applied by the cell from measurements of the substrate displacement, which characterizes it as an inverse problem. For this kind of problem, the Topology Optimization Method (TOM) is suitable for finding a solution. TOM consists of the iterative application of an optimization method and an analysis method to obtain an optimal distribution of material in a fixed domain. One way to obtain the substrate displacement experimentally is through Traction Force Microscopy (TFM), which also provides the forces applied by the cell. Along with systematically generating the distributions of focal adhesions and actin-myosin for the validation of simplified models, the algorithm also represents a complementary and more phenomenological approach to TFM. As a first approximation, the actin fibers and the flexible substrate are represented by a two-dimensional linear Finite Element Method. Actin contraction is modeled as an initial stress of the FEM elements. Focal adhesions connecting actin and substrate are represented by springs. The algorithm was applied to data obtained from experiments on cytoskeletal prestress and micropatterning, comparing the numerical results to the experimental ones.
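The force-reconstruction step described above is an inverse problem. A minimal numpy sketch of that idea, using Tikhonov-regularised least squares on a hypothetical 1-D influence matrix (invented for illustration; the thesis's actual TOM/FEM pipeline is far richer), is:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear substrate model: displacements u = G @ f, where G is a
# discretised influence (Green's-function-like) matrix and f the nodal forces.
n = 30
x = np.linspace(0.0, 1.0, n)
G = 1.0 / (1.0 + 25.0 * (x[:, None] - x[None, :]) ** 2)  # smooth, ill-conditioned kernel

f_true = np.zeros(n)
f_true[8], f_true[21] = 1.0, -1.0          # two point-like "focal adhesion" forces
u = G @ f_true + 1e-3 * rng.standard_normal(n)  # noisy measured displacement field

# Tikhonov-regularised inverse: minimise ||G f - u||^2 + alpha ||f||^2
alpha = 1e-2
f_rec = np.linalg.solve(G.T @ G + alpha * np.eye(n), G.T @ u)

print("recovered force peaks at nodes:", int(np.argmax(f_rec)), int(np.argmin(f_rec)))
```

The regularisation term plays the same stabilising role that the fixed material-distribution parameterisation plays in TOM: without it, the smooth kernel makes the naive inverse blow up on measurement noise.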

Relevance:

30.00%

Abstract:

Geometric nonlinearities of flexure hinges introduced by large deflections often complicate the analysis of compliant mechanisms containing such members. Pseudo-Rigid-Body Models (PRBMs) were therefore proposed and developed by Howell [1994] to analyze the characteristics of slender beams under large deflection. These models, however, fail to approximate the characteristics of deep (short) beams or of other flexure hinges. Lobontiu's work [2001] contributed diverse flexure hinge analyses built on the assumption of small deflection, which limits the application range of these flexure hinges and cannot capture their stiffness and stress characteristics under large deflection. The objective of this thesis is therefore to analyze flexure hinges considering the effects of both large deflection and shear force, to guide the design of flexure-based compliant mechanisms. The main work conducted in the thesis is outlined as follows. 1. Three popular types of flexure hinges (circular, elliptical and corner-filleted flexure hinges) are chosen for analysis. 2. A Finite Element Analysis (FEA) method based on commercial software (Comsol) is then used to correct the errors produced by the equations proposed by Lobontiu when the chosen flexure hinges undergo large deformation. 3. Three sets of generic design equations for the three types of flexure hinges are further proposed on the basis of the stiffness and stress characteristics from the FEA results. 4. A flexure-based four-bar compliant mechanism is finally studied and modeled using the proposed generic design equations. The load-displacement relationships are verified by a numerical example. The results show that the maximum error in the moment-rotation relationship is less than 3.4% for a flexure hinge, and lower than 5% for the four-bar compliant mechanism, compared with the FEA results.

Relevance:

30.00%

Abstract:

In the large-maturity limit, we compute explicitly the Local Volatility surface for the Heston model through Dupire's formula, with Fourier pricing of the respective derivatives of the call price. We then verify that the prices of European call options produced by the Heston model coincide with those given by the local volatility model where the Local Volatility is computed as described above.
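For reference, Dupire's formula referred to above can be stated (for a constant rate \(r\) and no dividends; a standard result, not specific to this paper) as:

```latex
\sigma_{\mathrm{loc}}^{2}(K,T)
  = \frac{\dfrac{\partial C}{\partial T} + r K \dfrac{\partial C}{\partial K}}
         {\dfrac{1}{2}\, K^{2}\, \dfrac{\partial^{2} C}{\partial K^{2}}}
```

where \(C = C(K,T)\) is the European call price as a function of strike and maturity; the strike and maturity derivatives of \(C\) are exactly the quantities obtained here by Fourier pricing under Heston.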

Relevance:

30.00%

Abstract:

In this paper, the well-known method-of-frames approach to the signal decomposition problem is reformulated as a certain bilevel goal-attainment linear least squares problem. As a consequence, a numerically robust variant of the method, named the approximating method of frames, is proposed on the basis of a certain minimal-Euclidean-norm approximating splitting pseudo-iteration-wise method.
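The minimal-Euclidean-norm core of the classical method of frames can be sketched in a few lines (a generic numpy illustration of pseudoinverse decomposition over an invented random frame, not the paper's bilevel reformulation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical overcomplete frame: 8 signal dimensions, 16 atoms (columns of D).
D = rng.standard_normal((8, 16))
s = rng.standard_normal(8)          # signal to decompose

# Method-of-frames coefficients: the minimal Euclidean-norm solution of D c = s,
# given by the Moore-Penrose pseudoinverse of the frame matrix.
c = np.linalg.pinv(D) @ s

print("reconstruction error:", float(np.linalg.norm(D @ c - s)))
print("coefficient norm:", float(np.linalg.norm(c)))
```

Among the infinitely many coefficient vectors that reproduce the signal exactly, this one has the smallest Euclidean norm; the paper's contribution is a numerically robust way of approximating this computation.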

Relevance:

30.00%

Abstract:

To improve our understanding of the Asian monsoon system, we developed a hydroclimate reconstruction in a marginal monsoon shoulder region for the period prior to the industrial era. Here, we present the first moisture-sensitive tree-ring chronology, spanning 501 years, for the Dieshan Mountain area, a boundary region of the Asian summer monsoon in the northeastern Tibetan Plateau. This reconstruction was derived from 101 cores of 68 old-growth Chinese pine (Pinus tabulaeformis) trees. We introduce a Hilbert–Huang Transform (HHT) based standardization method to develop the tree-ring chronology, which has the advantage of excluding non-climatic disturbances from individual tree-ring series. Based on the reliable portion of the chronology, we reconstructed the annual (prior July to current June) precipitation history since 1637 for the Dieshan Mountain area and were able to explain 41.3% of the variance. The extremely dry years in this reconstruction were also found in historical documents and are also associated with El Niño episodes. Dry periods were reconstructed for 1718–1725, 1766–1770 and 1920–1933, whereas 1782–1788 and 1979–1985 were wet periods. The spatial signatures of these events were supported by data from other marginal regions of the Asian summer monsoon. Over the past four centuries, out-of-phase relationships between hydroclimate variations in the Dieshan Mountain area and far western Mongolia were observed during the 1718–1725 and 1766–1770 dry periods and the 1979–1985 wet period.

Relevance:

30.00%

Abstract:

Five test runs were performed to assess possible bias when performing the loss on ignition (LOI) method to estimate organic matter and carbonate content of lake sediments. An accurate and stable weight loss was achieved after 2 h of burning pure CaCO3 at 950 °C, whereas LOI of pure graphite at 530 °C showed a direct relation to sample size and exposure time, with only 40-70% of the possible weight loss reached after 2 h of exposure and smaller samples losing weight faster than larger ones. Experiments with a standardised lake sediment revealed a strong initial weight loss at 550 °C, but samples continued to lose weight at a slow rate at exposure of up to 64 h, which was likely the effect of loss of volatile salts, structural water of clay minerals or metal oxides, or of inorganic carbon after the initial burning of organic matter. A further test-run revealed that at 550 °C samples in the centre of the furnace lost more weight than marginal samples. At 950 °C this pattern was still apparent but the differences became negligible. Again, LOI was dependent on sample size. An analytical LOI quality control experiment including ten different laboratories was carried out using each laboratory's own LOI procedure as well as a standardised LOI procedure to analyse three different sediments. The range of LOI values between laboratories measured at 550 °C was generally larger when each laboratory used its own method than when using the standard method. This was similar for 950 °C, although the range of values tended to be smaller. The within-laboratory range of LOI measurements for a given sediment was generally small. Comparisons of the results of the individual and the standardised method suggest that there is a laboratory-specific pattern in the results, probably due to differences in laboratory equipment and/or handling that could not be eliminated by standardising the LOI procedure. 
Factors such as sample size, exposure time, position of samples in the furnace and the laboratory measuring affected LOI results, with LOI at 550 °C being more susceptible to these factors than LOI at 950 °C. We, therefore, recommend that analysts be consistent in the LOI method used with respect to ignition temperatures, exposure times and sample size, and that they include information on these three parameters when referring to the method.
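The LOI calculation itself is simple arithmetic on the weighings; a minimal sketch (sample masses invented for illustration) is:

```python
def loi(dry_mass_g, mass_after_550_g, mass_after_950_g):
    """Loss on ignition as a percentage of dry mass.

    LOI550 estimates organic matter burnt off at 550 degC; LOI950 is the
    additional loss between 550 degC and 950 degC, attributed to CO2 driven
    off carbonates.
    """
    loi550 = 100.0 * (dry_mass_g - mass_after_550_g) / dry_mass_g
    loi950 = 100.0 * (mass_after_550_g - mass_after_950_g) / dry_mass_g
    return loi550, loi950

# Hypothetical sample: 2.000 g dry, 1.700 g after 550 degC, 1.480 g after 950 degC.
organic, carbonate_co2 = loi(2.000, 1.700, 1.480)
print(round(organic, 3), round(carbonate_co2, 3))  # -> 15.0 11.0
```

The abstract's point is that the two weighings feeding this formula depend on furnace position, exposure time and sample size, so the three parameters must be reported alongside the percentages.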

Relevance:

30.00%

Abstract:

This paper examines the repercussion effects on the production cost of industries in Asian countries when some countries eliminate tariffs and import commodity taxes on all imports. This kind of analysis is related in some sense to that measuring the effects of FTAs on economies, and thus may be considered as an analysis of “pseudo FTAs.” Examining a number of combinations of “pseudo FTAs” between China, Japan, and ASEAN, it is found that the case of China plus Japan plus ASEAN is the most effective “pseudo FTA” of the combinations in terms of production cost reduction. The method is a form of price model based on the Asian International Input-Output Table. Almost no studies on price models related to multilateral I/O tables have been implemented thus far.
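The cost-push mechanism behind such a price model can be sketched with an illustrative 3-sector Leontief price system (coefficients and tariff figures invented; the paper uses the multilateral Asian International Input-Output Table):

```python
import numpy as np

# Cost-push (Leontief) price model: unit prices satisfy p = A^T p + v, so
# p = (I - A^T)^{-1} v, where A is the input-coefficient matrix and v the
# value-added (including tariffs and import commodity taxes) per unit of output.
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.20],
              [0.05, 0.10, 0.10]])
v_with_tariff = np.array([0.70, 0.60, 0.65])
v_free = v_with_tariff - np.array([0.02, 0.03, 0.01])  # hypothetical tariff removal

I = np.eye(3)
p_before = np.linalg.solve(I - A.T, v_with_tariff)
p_after = np.linalg.solve(I - A.T, v_free)
print("production-cost reduction by sector (%):",
      np.round(100 * (p_before - p_after) / p_before, 2))
```

Because the inverse propagates each sector's cost change through every input chain, a tariff cut in one country lowers production costs in all sectors that use its output, directly or indirectly; this is the repercussion effect the paper measures for the "pseudo FTA" combinations.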

Relevance:

30.00%

Abstract:

Three methodologies to assess As bioaccessibility were evaluated using playground soil collected from 16 playgrounds in Madrid, Spain: two (Simplified Bioaccessibility Extraction Test: SBET, and hydrochloric acid-extraction: HCl) assess gastric-only bioaccessibility and the third (Physiologically Based Extraction Test: PBET) evaluates mouth–gastric–intestinal bioaccessibility. Aqua regia-extractable (pseudo-total) As contents, which are routinely employed in risk assessments, were used as the reference to establish the following percentages of bioaccessibility: SBET – 63.1; HCl – 51.8; PBET – 41.6, the highest values associated with the gastric-only extractions. For Madrid playground soils – characterised by a very uniform, weakly alkaline pH, and low Fe oxide and organic matter contents – the statistical analysis of the results indicates that, in contrast with other studies, the highest percentage of As in the samples was bound to carbonates and/or present as calcium arsenate. As opposed to the As bound to Fe oxides, this As is readily released in the gastric environment as the carbonate matrix is decomposed and calcium arsenate is dissolved, but some of it is subsequently sequestered in unavailable forms as the pH is raised to 5.5 to mimic intestinal conditions. The HCl extraction can be used as a simple and reliable (i.e. low residual standard error) proxy for the more expensive, time-consuming, and error-prone PBET methodology. The HCl method would essentially halve the estimate of carcinogenic risk for children playing in Madrid playground soils, providing a more representative value of associated risk than the pseudo-total concentrations used at present.
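The bioaccessibility percentages quoted above are simple ratios against the pseudo-total content; a one-line sketch (soil concentrations invented for illustration) is:

```python
def bioaccessible_pct(extracted_mg_kg, pseudo_total_mg_kg):
    """Bioaccessible As as a percentage of the aqua regia (pseudo-total) content."""
    return 100.0 * extracted_mg_kg / pseudo_total_mg_kg

# Hypothetical soil: 20 mg/kg pseudo-total As, of which a gastric-phase
# extraction releases 12.62 mg/kg (values invented; the paper reports mean
# percentages of SBET 63.1, HCl 51.8 and PBET 41.6).
print(round(bioaccessible_pct(12.62, 20.0), 1))  # -> 63.1
```

Substituting the bioaccessible concentration for the pseudo-total one in a risk equation is what "essentially halves" the carcinogenic-risk estimate when the HCl fraction is about half of the pseudo-total content.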

Relevance:

30.00%

Abstract:

The Pseudo-Dynamic Test Method (PDTM) is currently being developed as an alternative to shaking-table testing of large-size models. However, the stepped, slow execution of the former type of test has been found to be a source of important errors arising from stress relaxation. A new continuous test method, which allows the selection of a suitable time-scale factor in the response in order to control these errors, is proposed here. Such a scaled-time response is theoretically obtained by simply augmenting the mass of the structure, for which some practical solutions are proposed.

Relevance:

30.00%

Abstract:

The stepped and excessively slow execution of pseudo-dynamic tests has been found to be the source of some errors arising from strain-rate effects and stress relaxation. In order to control those errors, a new continuous test method which allows the selection of a more suitable time-scale factor in the response is proposed in this work. By dimensional analysis, such a scaled-time response is obtained theoretically by augmenting the inertial and damping properties of the structure, for which we propose the use of hydraulic pistons that are servo-controlled to produce active mass and damping, while using equipment similar to that required in a pseudo-dynamic test. The results of the successful implementation of this technique for a simple specimen are shown here.
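The time-scaling argument can be sketched as follows (a standard dimensional-analysis step, stated here for completeness rather than quoted from the paper). Writing the equation of motion and stretching time by a factor \(\lambda\),

```latex
M\ddot{u}(t) + C\dot{u}(t) + K u(t) = f(t),
\qquad \tau = \lambda t, \qquad \hat{u}(\tau) = u(\tau/\lambda),
```

substitution of \(\dot{u} = \lambda\,\mathrm{d}\hat{u}/\mathrm{d}\tau\) and \(\ddot{u} = \lambda^{2}\,\mathrm{d}^{2}\hat{u}/\mathrm{d}\tau^{2}\) gives

```latex
(\lambda^{2} M)\,\frac{\mathrm{d}^{2}\hat{u}}{\mathrm{d}\tau^{2}}
 + (\lambda C)\,\frac{\mathrm{d}\hat{u}}{\mathrm{d}\tau}
 + K\,\hat{u} = f(\tau/\lambda),
```

so running the test \(\lambda\) times slower is equivalent to testing a structure whose mass is \(\lambda^{2}M\) and whose damping is \(\lambda C\); this is why servo-controlled pistons emulating added mass and damping can reproduce the prototype response in scaled time.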

Relevance:

30.00%

Abstract:

Non-parametric belief propagation (NBP) is a well-known message-passing method for cooperative localization in wireless networks. However, due to the over-counting problem in networks with loops, NBP's convergence is not guaranteed, and its estimates are typically less accurate. One solution to this problem is non-parametric generalized belief propagation based on a junction tree. However, this method is intractable in large-scale networks due to the high complexity of the junction-tree formation and the high dimensionality of the particles. Therefore, in this article, we propose non-parametric generalized belief propagation based on a pseudo-junction tree (NGBP-PJT). The main difference compared with the standard method is the formation of the pseudo-junction tree, which represents an approximated junction tree based on a thin graph. In addition, in order to decrease the number of high-dimensional particles, we use a more informative importance density function and reduce the dimensionality of the messages. As a by-product, we also propose NBP based on a thin graph (NBP-TG), a cheaper variant of NBP, which runs on the same graph as NGBP-PJT. According to our simulation and experimental results, the NGBP-PJT method outperforms NBP and NBP-TG in terms of accuracy, computational cost, and communication cost in reasonably sized networks.

Relevance:

30.00%

Abstract:

Pseudo-ternary diagrams for Quil A, phospholipid (phosphatidylcholine (PC) or phosphatidylethanolamine (PE)) and cholesterol were established in order to identify combinations that result in the formation of immune-stimulating complex (ISCOM) matrices and other colloidal structures produced by these three components in aqueous systems following lipid-film hydration or dialysis (methods that can be used to produce ISCOMs). In addition, the effect of equilibration time (1 month at 4 °C) on the structures formed by the various combinations of the three components was investigated. Depending on the ratio of Quil A, cholesterol and phospholipid, different colloidal particles, including ISCOM matrices, liposomes and ring-like micelles, were found irrespective of the preparation method used. In contrast, worm-like micelles were only observed in systems prepared by lipid-film hydration. For samples prepared by dialysis, ISCOM matrices were predominantly found near the Quil A apex of the pseudo-ternary diagram (> 50% Quil A). On the other hand, for samples prepared by lipid-film hydration, ISCOM matrices were predominantly found near the phospholipid apex of the pseudo-ternary diagram (> 50% phospholipid). The regions in the pseudo-ternary diagrams in which ISCOM matrices were observed increased following an extended equilibration time, particularly for samples prepared by lipid-film hydration. Differences were also observed between pseudo-ternary diagrams prepared using either PE or PC as the phospholipid.

Relevance:

30.00%

Abstract:

The existence of undesirable electricity price spikes in a competitive electricity market requires an efficient auction mechanism. However, many existing auction mechanisms have difficulty suppressing such unreasonable price spikes effectively. A new auction mechanism is proposed to suppress unreasonable price spikes effectively in a competitive electricity market. It optimally combines the system-marginal-price auction and the pay-as-bid auction mechanisms. A threshold value is determined to activate the switching between the marginal-price auction and the proposed composite auction. Basically, when the system marginal price is higher than the threshold value, the composite auction for a high-price electricity market is activated. The winning electricity sellers will sell their electricity at the system marginal price or their own bid prices, depending on their rights to be paid at the system marginal price and their offers' impact on suppressing undesirable price spikes. Such economic stimuli discourage sellers from practising economic and physical withholding. Multiple price caps are proposed to regulate strong market power. We also compare other auction mechanisms to highlight the characteristics of the proposed one. A numerical simulation using the proposed auction mechanism is given to illustrate the procedure of this new auction mechanism.
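The threshold-switching idea can be sketched as a toy market-clearing routine (illustrative only: the rule here simply switches the whole market to pay-as-bid above the threshold, whereas the paper's composite auction decides per seller who retains the right to the marginal price):

```python
def settle(bids, demand_mw, threshold):
    """Toy composite auction: dispatch sellers in bid order until demand is met.

    If the system marginal price (SMP) is at or below the threshold, all
    winners are paid the SMP (uniform-price auction); above the threshold,
    each winner is paid its own bid (pay-as-bid) to blunt the price spike.
    Returns (SMP, list of (payment_price, quantity) per winner).
    """
    supplied, winners = 0.0, []
    for price, qty in sorted(bids):
        if supplied >= demand_mw:
            break
        take = min(qty, demand_mw - supplied)
        winners.append((price, take))
        supplied += take
    smp = winners[-1][0]
    if smp <= threshold:
        return smp, [(smp, q) for _, q in winners]
    return smp, winners                       # pay as bid

bids = [(30.0, 100.0), (45.0, 80.0), (300.0, 50.0)]  # (price $/MWh, quantity MW)
smp, payments = settle(bids, demand_mw=200.0, threshold=100.0)
print("SMP:", smp, "payments:", payments)
```

With the spiky 300 $/MWh offer setting the marginal price, the switch to pay-as-bid means the low-cost sellers no longer collect the spike price, which is the economic stimulus the abstract says discourages withholding.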

Relevance:

30.00%

Abstract:

In efficiency studies using the stochastic frontier approach, the main focus is to explain inefficiency in terms of some exogenous variables and to compute the marginal effect of each of these determinants. Although inefficiency is estimated by its mean conditional on the composed error term (the Jondrow et al., 1982 estimator), the marginal effects are computed from the unconditional mean of inefficiency (Wang, 2002). In this paper we derive the marginal effects based on the Jondrow et al. estimator and use the bootstrap method to compute confidence intervals for the marginal effects.
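For reference, in the normal-half-normal specification with composed error \(\varepsilon_i = v_i - u_i\), the Jondrow et al. (1982) estimator referred to above is the conditional mean (a standard textbook form, stated here for completeness):

```latex
\mathbb{E}\left[u_i \mid \varepsilon_i\right]
  = \mu_{*i} + \sigma_{*}\,
    \frac{\phi\!\left(\mu_{*i}/\sigma_{*}\right)}
         {\Phi\!\left(\mu_{*i}/\sigma_{*}\right)},
\qquad
\mu_{*i} = -\frac{\sigma_u^{2}\,\varepsilon_i}{\sigma^{2}},
\quad
\sigma_{*}^{2} = \frac{\sigma_u^{2}\sigma_v^{2}}{\sigma^{2}},
\quad
\sigma^{2} = \sigma_u^{2} + \sigma_v^{2},
```

where \(\phi\) and \(\Phi\) are the standard normal density and distribution functions. Marginal effects of the determinants then follow by differentiating this conditional mean with respect to the exogenous variables entering the inefficiency distribution, rather than differentiating the unconditional mean.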