190 results for “dipole approximation technique”


Relevance:

20.00%

Publisher:

Abstract:

As the boundaries between public and private, human and technology, digital and social, mediated and natural, online and offline become increasingly blurred in modern techno-social hybrid societies, sociology as a discipline needs to adapt and adopt new ways of accounting for these digital cultures. In this paper I use the social networking site Pinterest to demonstrate how people today are shaped by, and in turn shape, the digital tools they are assembled with. Digital sociology is emerging as a sociological subdiscipline that engages with the convergence of the digital and the social. However, the subdiscipline has so far focused on developing new methods for studying digital social life while neglecting concrete explorations of its culture. I argue for the need for critical socio-cultural ‘thick description’ to account for the interrelations between humans and technologies in modern digitally mediated cultures.

Relevance:

20.00%

Publisher:

Abstract:

A condensation-of-degrees-of-freedom technique is proposed to improve the computational efficiency of the meshfree method with Galerkin weak form for elastic dynamic analysis. In the present method, scattered nodes without connectivity are divided into several subsets by cells of arbitrary shape. A local discrete equation is established over each cell using moving Kriging interpolation, in which the nodes located in the cell are used for approximation. The local discrete equations are then simplified by condensation of degrees of freedom, which transfers the equations of inner nodes to equations of boundary nodes on a cell-by-cell basis. The global dynamic system equations are obtained by assembling all local discrete equations and are solved with the standard implicit Newmark time-integration scheme. In the present scheme, the calculation over each cell is carried out by the meshfree method, and only a local search is needed in the interpolation. Numerical examples show that the present method achieves high computational efficiency and good accuracy in solving elastic dynamic problems.
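The condensation step, transferring the equations of inner nodes to equations of boundary nodes, is algebraically the classic static condensation. A minimal NumPy sketch on an invented 4-DOF cell (the moving Kriging shape functions and the cell search are omitted):

```python
import numpy as np

def condense(K, f, boundary, inner):
    """Static condensation: eliminate the inner DOFs of a cell system.

    Returns the condensed stiffness and load acting on boundary DOFs only:
        Kc = Kbb - Kbi * Kii^{-1} * Kib
        fc = fb  - Kbi * Kii^{-1} * fi
    """
    Kbb = K[np.ix_(boundary, boundary)]
    Kbi = K[np.ix_(boundary, inner)]
    Kib = K[np.ix_(inner, boundary)]
    Kii = K[np.ix_(inner, inner)]
    X = np.linalg.solve(Kii, np.column_stack([Kib, f[inner]]))
    Kc = Kbb - Kbi @ X[:, :-1]
    fc = f[boundary] - Kbi @ X[:, -1]
    return Kc, fc

# Toy 4-DOF "cell": DOFs 0, 1 sit on the cell boundary, DOFs 2, 3 are inner.
K = np.array([[ 4., -1., -1.,  0.],
              [-1.,  4.,  0., -1.],
              [-1.,  0.,  4., -1.],
              [ 0., -1., -1.,  4.]])
f = np.array([1., 0., 0., 0.])
Kc, fc = condense(K, f, [0, 1], [2, 3])
```

Solving `Kc u_b = fc` gives the boundary displacements exactly as the full system would; inner values can then be recovered cell by cell.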

Relevance:

20.00%

Publisher:

Abstract:

Unbalanced or non-linear loads cause distorted stator currents and electromagnetic torque pulsations in stand-alone doubly fed induction generators (DFIGs). This study proposes a proportional-integral repetitive control (PIRC) scheme to mitigate the harmonic and unbalance levels at the stator terminals of the DFIG. The PIRC is structurally simpler and requires much less computation than existing methods. An analysis of the PIRC operation and a methodology for determining the control parameters are included. Simulation studies as well as laboratory test measurements clearly demonstrate the effectiveness of the proposed PIRC scheme.
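The paper's PIRC structure is not reproduced here; the sketch below combines a discrete PI controller with a generic plug-in repetitive term on an assumed first-order plant, to show how a period-delayed correction drives a periodic tracking error down over successive periods (all gains and the plant model are illustrative):

```python
import math

def simulate_pirc(periods=40, N=50, kp=0.3, ki=0.02, krc=0.3):
    """PI control plus a plug-in repetitive term tracking a periodic reference.

    Plant (assumed, illustrative first-order model): y[k+1] = 0.5*y[k] + u[k].
    Repetitive update: the error seen at phase k is fed back, one period
    later, into the control applied at phase k-1 (which produced it).
    Returns the peak |error| observed in each period.
    """
    y, integ = 0.0, 0.0
    u_rc = [0.0] * N                  # period-indexed repetitive correction
    peaks = []
    for _ in range(periods):
        peak = 0.0
        for k in range(N):
            r = math.sin(2.0 * math.pi * k / N)   # periodic reference
            e = r - y
            peak = max(peak, abs(e))
            integ += e
            u = kp * e + ki * integ + u_rc[k]
            u_rc[(k - 1) % N] += krc * e          # learn for the next period
            y = 0.5 * y + u                        # plant step
        peaks.append(peak)
    return peaks

peaks = simulate_pirc()
```

The PI terms alone leave a residual sinusoidal error; the repetitive buffer accumulates a per-phase correction each period, so the peak error shrinks geometrically across periods.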

Relevance:

20.00%

Publisher:

Abstract:

A microscopic surface diffusivity theory based on the atomic ionization energy concept is developed to explain the variation of the atomic and displacement polarizations with the surface diffusion activation energy of adatoms during the self-assembly of quantum dots on plasma-exposed surfaces. These polarizations are derived classically, while the atomic polarization is quantized to obtain the microscopic atomic polarizability. The surface diffusivity equation is derived as a function of the ionization energy. The results of this work can be used to fine-tune the delivery rates of different adatoms onto nanostructure growth surfaces and to optimize low-temperature plasma-based nanoscale synthesis processes.
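The closed form of the derived diffusivity equation is not given in the abstract; a generic Arrhenius-type surface diffusion sketch, with an assumed attempt frequency and hop length, illustrates how the activation energy controls adatom delivery rates:

```python
import math

K_B = 8.617333e-5  # Boltzmann constant [eV/K]

def surface_diffusivity(T, E_d, nu=1e13, a=3e-10):
    """Arrhenius-type adatom surface diffusivity D = nu * a^2 * exp(-E_d / kT).

    T   : surface temperature [K]
    E_d : surface diffusion activation energy [eV]
    nu  : attempt frequency [1/s]  (assumed value)
    a   : hop length [m]           (assumed value)
    """
    return nu * a * a * math.exp(-E_d / (K_B * T))

# Illustration: a hypothetically reduced activation energy (e.g. through the
# polarization of adatoms on a plasma-exposed surface) speeds up diffusion.
D_neutral = surface_diffusivity(600.0, 1.0)
D_plasma  = surface_diffusivity(600.0, 0.8)   # assumed lowered E_d
```

Because the dependence on the activation energy is exponential, even modest shifts in E_d tuned via the ionization-energy-dependent polarization would change adatom delivery rates by orders of magnitude.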

Relevance:

20.00%

Publisher:

Abstract:

This paper addresses an advanced computational technique for steel structures that provides both simulation capacities simultaneously: a higher-order element formulation with element load effect (geometric nonlinearities) and the refined plastic hinge method (material nonlinearities). This advanced computational technique can capture the real second-order inelastic behaviour of a whole structure, which in turn ensures the structural safety and adequacy of the structure. The emphasis of this paper is therefore to advocate that the advanced computational technique can replace the traditional empirical design approach. At the same time, practitioners should be educated in how to make use of the advanced computational technique for the second-order inelastic design of a structure, as this approach is the future of structural engineering design. The future engineer should understand the computational technique clearly, grasp the behaviour of a structure with respect to the numerical analysis thoroughly, and justify the numerical results correctly, especially since a fool-proof, ultimate finite element, one that is competent in modelling behaviour, user-friendly in numerical modelling, and versatile for all structural forms and materials, is yet to come. Hence the high-quality engineer is required: one who can confidently command the advanced computational technique for the design of a complex structure, and not vice versa.
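The interplay of the two nonlinearities can be illustrated with a deliberately small sketch (not the paper's formulation): a one-degree-of-freedom sway column in which P-Delta softening (geometric nonlinearity) and a plastic hinge capacity check (material nonlinearity) together set the ultimate load. All member properties are invented for illustration:

```python
def ultimate_axial_load(Ke=2.0e6, L=3.0, H=5.0e3, Mp=6.0e4,
                        dP=1.0e3, second_order=True):
    """Ramp the axial load P on a 1-DOF sway column until a plastic hinge
    forms (M >= Mp) or the P-Delta-reduced stiffness vanishes (instability).

    Ke : elastic lateral stiffness [N/m]  (illustrative)
    L  : column height [m], H : lateral load [N], Mp : hinge capacity [N*m]
    """
    P = 0.0
    while True:
        K = Ke - P / L if second_order else Ke   # geometric (P-Delta) softening
        if K <= 0.0:
            return P                             # elastic instability limit
        d = H / K                                # lateral drift
        M = H * L + P * d                        # second-order base moment
        if M >= Mp:
            return P                             # plastic hinge: capacity reached
        P += dP

P_second = ultimate_axial_load(second_order=True)
P_first  = ultimate_axial_load(second_order=False)
```

The second-order inelastic capacity is well below the first-order estimate, which is precisely why the abstract argues that second-order inelastic analysis, not the empirical first-order approach, reflects the real behaviour of the structure.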

Relevance:

20.00%

Publisher:

Abstract:

Angular distribution of microscopic ion fluxes around nanotubes arranged into a dense ordered pattern on the surface of the substrate is studied by means of multiscale numerical simulation. The Monte Carlo technique was used to show that the ion current density is distributed nonuniformly around the carbon nanotubes arranged into a dense rectangular array. The nonuniformity factor of the ion current flux reaches 7 in dense (5 × 10¹⁸ m⁻³) plasmas for a nanotube radius of 25 nm, and tends to 1 at plasma densities below 1 × 10¹⁷ m⁻³. The results obtained suggest that the local density of carbon adatoms on the nanotube side surface, at areas facing the adjacent nanotubes of the pattern, can be high enough to lead to the additional wall formation and thus cause the single- to multiwall structural transition, and other as yet unexplained nanoscience phenomena.
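The multiscale plasma simulation itself is not reproduced here; a toy two-dimensional line-of-sight Monte Carlo (isotropic straight-line ion arrivals, purely geometric shadowing) is enough to show that neighbours in a rectangular array make the collected flux azimuthally nonuniform. The 25 nm radius comes from the abstract; the 100 nm pitch and all sampling choices are assumptions:

```python
import math
import random

def azimuthal_flux(r=25e-9, pitch=100e-9, bins=8, samples=4000, seed=1):
    """Monte Carlo estimate of the relative ion flux versus azimuth on a
    central nanotube of radius r, shadowed by four neighbours at distance
    'pitch' on the x/y axes (top view, straight-line trajectories).
    """
    rng = random.Random(seed)
    neighbours = [(pitch, 0.0), (-pitch, 0.0), (0.0, pitch), (0.0, -pitch)]
    flux = []
    for b in range(bins):
        theta = 2.0 * math.pi * b / bins              # surface azimuth of bin
        px, py = r * math.cos(theta), r * math.sin(theta)
        hits = 0
        for _ in range(samples):
            # incoming direction sampled over the outward half-space
            psi = theta + (rng.random() - 0.5) * math.pi
            dx, dy = math.cos(psi), math.sin(psi)     # unit ray, surface -> plasma
            blocked = False
            for cx, cy in neighbours:
                t = (cx - px) * dx + (cy - py) * dy   # projection onto the ray
                if t > 0.0:
                    ex, ey = cx - px - t * dx, cy - py - t * dy
                    if ex * ex + ey * ey < r * r:     # ray intersects neighbour
                        blocked = True
                        break
            if not blocked:
                hits += 1
        flux.append(hits / samples)
    return flux

flux = azimuthal_flux()
nonuniformity = max(flux) / min(flux)
```

Even this crude geometric model yields a nonuniformity factor above 1; the full simulation in the paper adds plasma sheath physics and density dependence, which is where the factor of 7 arises.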

Relevance:

20.00%

Publisher:

Abstract:

Dual-energy X-ray absorptiometry (DXA) and the isotope dilution technique have been used as reference methods to validate the estimates of body composition by simple field techniques; however, very few studies have compared these two methods. We compared the estimates of body composition by DXA and the isotope dilution (¹⁸O) technique in apparently healthy Indian men and women (aged 19–70 years, n 152, 48 % men) with a wide range of BMI (14–40 kg/m²). Isotopic enrichment was assessed by isotope ratio mass spectrometry. The agreement between the estimates of body composition measured by the two techniques was assessed by the Bland–Altman method. The mean age and BMI were 37 (SD 15) years and 23·3 (SD 5·1) kg/m², respectively, for men and 37 (SD 14) years and 24·1 (SD 5·8) kg/m², respectively, for women. The estimates of fat-free mass were higher by about 7 (95 % CI 6, 9) %, those of fat mass were lower by about 21 (95 % CI −18, −23) %, and those of body fat percentage (BF%) were lower by about 7·4 (95 % CI −8·2, −6·6) % as obtained by DXA compared with the isotope dilution technique. The Bland–Altman analysis showed wide limits of agreement, indicating poor agreement between the methods. The bias in the estimates of BF% was higher at the lower values of BF%. Thus, the two commonly used reference methods showed substantial differences in the estimates of body composition, with wide limits of agreement. As the estimates of body composition are method-dependent, the two methods cannot be used interchangeably.
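The Bland–Altman statistics used above are straightforward to reproduce; a minimal sketch with invented paired BF% values (not the study's data):

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bland-Altman agreement statistics for paired measurements a, b.

    Returns (bias, lower, upper): the mean difference and the 95% limits
    of agreement, bias +/- 1.96 * SD of the differences.
    """
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired body-fat % estimates (DXA vs. isotope dilution)
dxa     = [22.1, 30.4, 18.9, 27.5, 35.2, 25.0]
isotope = [29.8, 37.0, 26.1, 34.2, 43.5, 31.9]
bias, lo, hi = bland_altman(dxa, isotope)
```

A consistently negative bias with wide limits of agreement is exactly the pattern the study reports: DXA estimates of BF% sit systematically below the isotope dilution estimates.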

Relevance:

20.00%

Publisher:

Abstract:

The foliage of a plant performs vital functions. As such, leaf models need to be developed for modelling the plant architecture from a set of scattered data captured using a scanning device. The leaf model can be used for purely visual purposes or as part of a further model, such as a fluid movement model or a biological process. For these reasons, an accurate mathematical representation of the surface and boundary is required. This paper compares three approaches for fitting a continuously differentiable surface through a set of scanned data points from a leaf surface with a technique already used for reconstructing leaf surfaces. The techniques considered are discrete smoothing D2-splines [R. Arcangeli, M. C. Lopez de Silanes, and J. J. Torrens, Multidimensional Minimising Splines, Springer, 2004], the thin plate spline finite element smoother [S. Roberts, M. Hegland, and I. Altas, Approximation of a Thin Plate Spline Smoother using Continuous Piecewise Polynomial Functions, SIAM, 1 (2003), pp. 208–234] and the radial basis function Clough–Tocher method [M. Oqielat, I. Turner, and J. Belward, A hybrid Clough–Tocher method for surface fitting with application to leaf data, Appl. Math. Modelling, 33 (2009), pp. 2582–2595]. Numerical results show that discrete smoothing D2-splines produce reconstructed leaf surfaces that better represent the original physical leaf.
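Of the compared approaches, the thin plate spline admits a compact direct implementation. A Python/NumPy sketch that solves the standard TPS linear system (kernel U(r) = r² log r) for invented "leaf" heights; the actual finite element smoother of Roberts et al. approximates this spline rather than solving the dense system:

```python
import numpy as np

def tps_fit(x, y, z, smooth=0.0):
    """Fit a thin plate spline f(x, y) through scattered data points.

    Solves the standard TPS system with kernel U(r) = r^2 * log(r);
    smooth > 0 relaxes exact interpolation toward a smoother surface.
    """
    pts = np.column_stack([x, y])
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    K = np.where(d > 0, d**2 * np.log(np.where(d > 0, d, 1.0)), 0.0)
    K += smooth * np.eye(n)
    P = np.column_stack([np.ones(n), pts])            # affine part
    A = np.block([[K, P], [P.T, np.zeros((3, 3))]])
    rhs = np.concatenate([z, np.zeros(3)])
    coef = np.linalg.solve(A, rhs)
    w, a = coef[:n], coef[n:]

    def f(qx, qy):
        """Evaluate the fitted surface at a query point."""
        r = np.linalg.norm(pts - np.array([qx, qy]), axis=1)
        u = np.where(r > 0, r**2 * np.log(np.where(r > 0, r, 1.0)), 0.0)
        return a[0] + a[1] * qx + a[2] * qy + w @ u
    return f

# Toy "leaf" heights sampled at scattered (x, y) positions
x = np.array([0.0, 1.0, 0.0, 1.0, 0.5, 0.3, 0.8])
y = np.array([0.0, 0.0, 1.0, 1.0, 0.5, 0.9, 0.2])
z = x**2 + y          # assumed smooth test surface
f = tps_fit(x, y, z)
```

With `smooth=0` the spline interpolates the scanned points exactly; a positive smoothing parameter trades that exactness for noise robustness, which is the regime the smoothing comparisons in the paper operate in.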

Relevance:

20.00%

Publisher:

Abstract:

This study decomposed the determinants of environmental quality into scale, technique, and composition effects. We applied a semiparametric method of generalized additive models, which enabled us to use flexible functional forms and include several independent variables in the model. The differences in the technique effect were found to play a crucial role in reducing pollution. We found that the technique effect was sufficient to reduce sulfur dioxide emissions. On the other hand, its effect was not enough to reduce carbon dioxide (CO2) emissions and energy use, except for the case of CO2 emissions in high-income countries.
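The scale/technique/composition decomposition relies on fitting additive, per-variable effects. As a crude stand-in for the paper's semiparametric GAM (which uses penalized smoothers rather than plain polynomials), the sketch below fits an additive model with small polynomial bases; the data and functional forms are invented:

```python
import numpy as np

def fit_additive(X, y, degree=3):
    """Least-squares additive model y ~ sum_j f_j(x_j), with each f_j a
    polynomial of the given degree (a crude stand-in for GAM smooths)."""
    n, p = X.shape
    cols = [np.ones(n)]
    for j in range(p):
        for k in range(1, degree + 1):
            cols.append(X[:, j] ** k)
    B = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(B, y, rcond=None)

    def component(j, xg):
        """Evaluate the fitted partial effect f_j (up to a constant)."""
        c = beta[1 + j * degree : 1 + (j + 1) * degree]
        return sum(ck * xg ** (k + 1) for k, ck in enumerate(c))
    return beta, component

# Synthetic data with a known additive structure: a quadratic "scale"
# effect on x0 and a linear "technique" effect on x1 (assumed forms).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = X[:, 0] ** 2 + 2.0 * X[:, 1]
beta, component = fit_additive(X, y)
```

The point of the additive structure, here as in the paper, is that each covariate's estimated partial effect can be inspected separately, which is what allows the technique effect to be isolated from the scale and composition effects.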

Relevance:

20.00%

Publisher:

Abstract:

This research falls in the area of enhancing the quality of tag-based item recommendation systems. It aims to achieve this by employing a multi-dimensional user profile approach and by analyzing the semantic aspects of tags. Tag-based recommender systems have two characteristics that need to be carefully studied in order to build a reliable system. Firstly, the multi-dimensional correlation, called tag assignment, should be appropriately modelled in order to create the user profiles [1]. Secondly, the semantics behind the tags should be considered properly, as their flexible design can cause semantic problems such as synonymy and polysemy [2]. This research proposes to address these two challenges by employing tensor modelling as the multi-dimensional user profile approach and the topic model as the semantic analysis approach. The first objective is to optimize the tensor model reconstruction and to improve the model's performance in generating quality recommendations. A novel Tensor-based Recommendation using Probabilistic Ranking (TRPR) method [3] has been developed. Results show this method to be scalable for large datasets and to outperform the benchmark methods in terms of accuracy. A memory-efficient loop implements the n-mode block-striped (matrix) product for tensor reconstruction as an approximation of the initial tensor. The probabilistic ranking calculates the probability of users selecting candidate items using their tag preference lists, based on the entries generated from the reconstructed tensor. The second objective is to analyse the tag semantics and utilize the outcome in building the tensor model. This research proposes to investigate the problem using a topic model approach so as to preserve the nature of tags as a “social vocabulary” [4]. For the tag assignment data, topics can be generated from the occurrences of the tags given for an item.

However, only a limited number of tags is available to represent items as collections of topics, since an item might have been tagged with only a few tags. Consequently, the generated topics might not be able to represent the items appropriately. Furthermore, given that each tag can belong to any topic with various probability scores, the occurrences of tags cannot simply be mapped to topics to build the tensor model. A standard weighting technique will not appropriately calculate the value of the tagging activity, since it would define the context of an item using a tag instead of a topic.
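The probabilistic ranking stage described above can be sketched as follows. The tensor entries, tag-preference values, and dimensions are invented, and this covers only the scoring step, not the TRPR tensor reconstruction of [3]:

```python
def rank_items(tensor, user, tag_pref):
    """Score candidate items for a user from a reconstructed
    user x item x tag tensor: score(i) = sum_t P(t|user) * T[user][i][t].

    tag_pref maps tag index -> preference probability for this user.
    Returns the item indices sorted by descending score.
    """
    n_items = len(tensor[user])
    scores = []
    for i in range(n_items):
        s = sum(p * tensor[user][i][t] for t, p in tag_pref.items())
        scores.append((s, i))
    return [i for s, i in sorted(scores, key=lambda si: (-si[0], si[1]))]

# Tiny illustrative tensor: 1 user, 3 items, 2 tags (entries as if they
# came out of the tensor reconstruction step).
T = [[[0.9, 0.1],
      [0.2, 0.8],
      [0.5, 0.5]]]
ranking = rank_items(T, user=0, tag_pref={0: 0.75, 1: 0.25})   # -> [0, 2, 1]
```

The scoring makes the second challenge concrete: if the weights in `tag_pref` are computed per tag rather than per topic, synonymous tags dilute each other's weight, which is what the topic-based weighting is meant to repair.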

Relevance:

20.00%

Publisher:

Abstract:

In the Bayesian framework, a standard approach to model criticism is to compare some function of the observed data to a reference predictive distribution. The result of the comparison can be summarized in the form of a p-value, and it is well known that the computation of some kinds of Bayesian predictive p-values can be challenging. The use of regression adjustment approximate Bayesian computation (ABC) methods is explored for this task. Two problems are considered. The first is the calibration of posterior predictive p-values so that they are uniformly distributed under some reference distribution for the data. Computation is difficult because the calibration process requires repeated approximation of the posterior for different data sets under the reference distribution. The second problem considered is the approximation of distributions of prior predictive p-values for the purpose of choosing weakly informative priors in the case where the model-checking statistic is expensive to compute. Here the computation is difficult because of the need to sample repeatedly from a prior predictive distribution for different values of a prior hyperparameter. In both these problems we argue that high accuracy in the computations is not required, which makes fast approximations such as regression adjustment ABC very useful. We illustrate our methods with several examples.
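Regression adjustment ABC of the kind relied on here can be sketched on a toy normal-mean model; the model, prior, summary statistic, and acceptance fraction are illustrative choices, not the paper's examples:

```python
import numpy as np

def abc_regression_adjust(obs, n_obs=10, n_sims=20000, keep=500, seed=0):
    """Regression-adjustment ABC estimate of a posterior mean.

    Toy model: theta ~ N(0, 2^2), data y_1..y_n ~ N(theta, 1), with the
    sample mean as summary statistic s. The simulations whose summaries
    fall nearest the observed one are kept and then linearly adjusted:
    theta_adj = theta - b * (s - obs), b fitted by local regression.
    """
    rng = np.random.default_rng(seed)
    theta = rng.normal(0.0, 2.0, n_sims)                       # prior draws
    s = theta + rng.normal(0.0, 1.0 / np.sqrt(n_obs), n_sims)  # simulated summaries
    idx = np.argsort(np.abs(s - obs))[:keep]                   # ABC acceptance
    th, ss = theta[idx], s[idx]
    b = np.polyfit(ss, th, 1)[0]                               # local linear fit
    th_adj = th - b * (ss - obs)                               # regression adjustment
    return th_adj.mean()

post_mean = abc_regression_adjust(obs=1.0)
# Conjugate posterior mean for comparison: obs * 4 / (4 + 1/10)
```

The adjustment lets a fairly loose acceptance tolerance still deliver a usable posterior approximation, which is the "high accuracy is not required" point made in the abstract: repeated, cheap ABC fits replace repeated exact posterior computations.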