71 results for Reproducing Kernel


Relevance:

10.00%

Publisher:

Abstract:

Particulate systems are of interest in many disciplines. They are often investigated using the discrete element method (DEM) because of its capability to resolve particulate systems at the individual particle scale. To model the interaction between two particles, and between a particle and a boundary, conventional discrete element models use springs and dampers in both the normal and tangential directions. The significance of particle rotation has been highlighted in both numerical studies and physical experiments. Several researchers have attempted to incorporate a rotational torque to account for rolling resistance or rolling friction by developing different models. This paper presents a review of the commonly used models for rolling resistance and proposes a more general model. These models are classified into four categories according to their key characteristics. The robustness of these models in reproducing rolling resistance effects arising from different physical situations was assessed using several benchmark test cases. The proposed model proves more general and is suitable for modelling problems involving both dynamic and pseudo-static regimes. An example simulation of the formation of a 2D sandpile is also shown. For simplicity, all formulations and examples are presented in 2D form, though the general conclusions are also applicable to 3D systems.
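
As background to the class of models reviewed here, a common directional-constant scheme (one of the usual categories in such reviews, not the paper's proposed model) applies a resisting torque that opposes the relative rotation at a contact and is capped at mu_r * R * F_n. A minimal 2D sketch, with all parameter values hypothetical:

```python
import numpy as np

def rolling_resistance_torque(rel_omega, normal_force, radius, mu_r):
    """Directional-constant rolling resistance (a generic 2D scheme,
    not the specific model proposed in this paper).

    rel_omega    : relative angular velocity at the contact [rad/s]
    normal_force : contact normal force magnitude [N]
    radius       : effective rolling radius [m]
    mu_r         : dimensionless rolling friction coefficient
    """
    limit = mu_r * radius * normal_force   # maximum resisting torque
    # The torque always opposes the relative rotation; zero if none.
    return -np.sign(rel_omega) * limit if rel_omega != 0.0 else 0.0

# Hypothetical contact: 5 mm radius, 1 N normal force, mu_r = 0.1
print(rolling_resistance_torque(rel_omega=2.0, normal_force=1.0,
                                radius=5e-3, mu_r=0.1))  # -> -5e-4 N*m
```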

Relevance:

10.00%

Publisher:

Abstract:

Purpose: The purpose of this paper is to add to current discussions on the use of Lacanian psychoanalysis in organizational change. Specifically, it argues that critiques of Lacan's work must be acknowledged and incorporated into these discussions. To date, there remains a silence surrounding these critiques within organization studies.
Design/methodology/approach: The paper presents the existing studies that draw upon Lacan's work in the context of organizational change initiatives and highlights the value of this theory. Next, it outlines critiques of Lacan's concepts of the phallus and the incest taboo, and shows how these concepts can be exclusionary.
Findings: The paper finds that there remains little debate within organization studies around such critiques. Lacan tends to be employed in ways that risk reproducing particular, exclusionary aspects of his theory. A homophobic and patriarchal legacy persists in appropriations of his writing. The paper outlines alternative ways of reading Lacan, which aim to avoid such exclusions, and shows how introducing such alternatives is a difficult project, first because of the silence surrounding critiques of Lacan in the organizational change literature and, second, following Foucault, because language has power: a patriarchal schema is self-reinforcing in its persistence within a particular discipline, and thus difficult to dislodge.
Research limitations/implications: Given these findings, the paper concludes that organization theorists and practitioners ought to engage with critiques of Lacan's work when employing it in their own. The silence surrounding such legacies is dangerous. It argues that the first step in engaging with Lacan's work should be to give voice to such critiques, if his writing is to be employed in the practice and study of organizational change.
Originality/value: This paper provides a unique engagement with Lacan's work in the context of the study and practice of organizational change interventions. It presents an evaluation of well-known critiques and useful recommendations for theorists and practitioners considering a Lacanian approach to this area of management studies.

Relevance:

10.00%

Publisher:

Abstract:

This research uses the multivariate geochemical dataset generated by the Tellus project to investigate the appropriate use of transformation methods to maintain the integrity of geochemical data and the inherently constrained behaviour of multivariate relationships. The widely used normal score transform is compared with a stepwise conditional transform technique. The Tellus project, managed by GSNI and funded by the Department of Enterprise, Trade and Investment and the EU's Building Sustainable Prosperity Fund, is the most comprehensive geological mapping project ever undertaken in Northern Ireland. Previous studies have demonstrated spatial variability in the Tellus data, but geostatistical analysis and interpretation of the datasets require a methodology that reproduces the inherently complex multivariate relations. Previous investigation of the Tellus geochemical data has employed Gaussian-based techniques; however, earth science variables are rarely Gaussian, so transformation of the data is integral to the approach. In particular, the stepwise conditional transform is investigated and developed for the geochemical datasets obtained as part of the Tellus project. The transform is applied to four variables in a bivariate nested fashion because of the limited availability of data. Simulation of these transformed variables is then carried out, along with a corresponding back transformation to original units. Results show that the stepwise transform is successful in reproducing both the univariate statistics and the complex bivariate relations exhibited by the data. Greater fidelity to multivariate relationships will improve uncertainty models, which are required for consequent geological, environmental and economic inferences.
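
For context, the baseline normal score transform maps sample quantiles onto standard-normal quantiles; the stepwise conditional transform applies the same idea to each successive variable within classes of the previously transformed ones. A minimal sketch of the baseline transform (illustrative only, not the Tellus workflow):

```python
import numpy as np
from scipy import stats

def normal_score_transform(x):
    """Map sample quantiles of x onto standard-normal quantiles."""
    n = len(x)
    ranks = stats.rankdata(x)        # ranks 1..n, ties averaged
    p = ranks / (n + 1.0)            # plotting positions in (0, 1)
    return stats.norm.ppf(p)         # standard-normal scores

# The stepwise conditional transform extends this: the second variable
# is normal-score transformed within bins of the first, so bivariate
# relations are decorrelated, not just the marginals made Gaussian.
x = np.random.lognormal(size=1000)
y = normal_score_transform(x)        # y follows N(0, 1) in rank order
```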

Relevance:

10.00%

Publisher:

Abstract:

The non-thermal particle spectra responsible for the emission from many astrophysical systems are thought to originate at shocks via a first-order Fermi process, otherwise known as diffusive shock acceleration. The same mechanism is also widely believed to be responsible for the production of high-energy cosmic rays. With the growing interest in collisionless shock physics in laser-produced plasmas, the possibility of reproducing and detecting shock acceleration in controlled laboratory experiments should be considered. The various experimental constraints that must be satisfied are reviewed. It is demonstrated that several currently operating laser facilities may fulfil the necessary criteria to confirm the occurrence of diffusive shock acceleration of electrons at laser-produced shocks. Successful reproduction of Fermi acceleration in the laboratory could open a range of possibilities, providing insight into the complex plasma processes that occur near astrophysical sources of cosmic rays.
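
For background, the textbook prediction of diffusive shock acceleration (standard theory, not a result derived in this paper) is a power-law spectrum whose index is set by the shock compression ratio r:

```latex
N(E)\,\mathrm{d}E \propto E^{-s}\,\mathrm{d}E, \qquad s = \frac{r+2}{r-1}
```

For a strong non-relativistic shock, r = 4 and s = 2, the canonical signature that a laboratory detection would target.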

Relevance:

10.00%

Publisher:

Abstract:

Let X be a connected, noetherian scheme and A a sheaf of Azumaya algebras on X, which is a locally free O_X-module of rank a. We show that the kernel and cokernel of the natural map K_i(X) → K_i(A) are torsion groups with exponent a^m for some m and any i ≥ 0, when X is regular or X is of dimension d with an ample sheaf (in this case m = d + 1). As a consequence, K_i(X, Z/m) ≅ K_i(A, Z/m) for any m relatively prime to a.

Relevance:

10.00%

Publisher:

Abstract:

Microwave heating reduces the preparation time and improves the adsorption quality of activated carbon. In this study, activated carbon was prepared by impregnation of palm kernel fiber with phosphoric acid followed by microwave activation. Three different types of activated carbon were prepared, having high surface areas of 872 m2 g-1, 1256 m2 g-1, and 952 m2 g-1 and pore volumes of 0.598 cc g-1, 1.010 cc g-1, and 0.778 cc g-1, respectively. The combined effects of different process parameters, such as the initial adsorbate concentration, pH, and temperature, on adsorption efficiency were explored with the help of a Box-Behnken design for response surface methodology (RSM). The adsorption uptake could be expressed by a polynomial equation as a function of the independent variables. The hexavalent chromium uptake was found to be 19.1 mg g-1 at the optimized process conditions, i.e., an initial concentration of 60 mg L-1, pH of 3, and operating temperature of 50 °C. Adsorption of Cr(VI) by the prepared activated carbon was spontaneous and followed second-order kinetics. The adsorption mechanism can be described by the Freundlich isotherm model. The prepared activated carbon demonstrated performance comparable to other available activated carbons for the adsorption of Cr(VI).
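
The RSM step described above amounts to regressing the response on a full second-order polynomial in the coded factors of the Box-Behnken design. A generic sketch (the numerical values below are placeholders, not the paper's measurements):

```python
import numpy as np

# Coded factors: x1 = initial concentration, x2 = pH, x3 = temperature.
# 3-factor Box-Behnken design: 12 edge midpoints plus a centre point.
X = np.array([[ 0,  0,  0], [ 1,  1,  0], [-1,  1,  0], [ 1, -1,  0],
              [-1, -1,  0], [ 1,  0,  1], [-1,  0,  1], [ 1,  0, -1],
              [-1,  0, -1], [ 0,  1,  1], [ 0, -1,  1], [ 0,  1, -1],
              [ 0, -1, -1]])
# Placeholder responses [mg/g]; not data from the study.
y = np.array([19.1, 15.2, 12.8, 17.0, 13.5, 16.4, 12.1,
              14.9, 11.7, 18.0, 16.2, 15.5, 13.9])

def quadratic_design_matrix(X):
    """Second-order RSM model: intercept, linear, interaction, squares."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
# beta holds the polynomial coefficients; the fitted surface can then
# be maximized to locate the optimum process conditions.
```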

Relevance:

10.00%

Publisher:

Abstract:

Technical market indicators are tools used by technical analysts to understand trends in trading markets. Technical (market) indicators are often calculated in real time, as trading progresses. This paper presents a mathematically founded framework for calculating technical indicators. Our framework consists of a domain-specific language for the unambiguous specification of technical indicators, and a runtime system based on Click for computing the indicators. We argue that our solution enhances ease of programming, because our domain-specific language is aligned with the mathematical description of technical indicators, and that it enables executing programs in kernel space for decreased latency, without exposing the system to users' programming errors.
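
To illustrate the kind of incremental, real-time computation such indicators require (a generic sketch, not the paper's DSL or its Click-based runtime), an exponential moving average can be maintained in constant time per tick:

```python
class EMA:
    """Streaming exponential moving average: state updated per tick,
    matching the real-time, incremental style of technical indicators."""
    def __init__(self, period: int):
        self.alpha = 2.0 / (period + 1)  # conventional EMA smoothing factor
        self.value = None

    def update(self, price: float) -> float:
        if self.value is None:
            self.value = price           # seed with the first observation
        else:
            self.value += self.alpha * (price - self.value)
        return self.value

ema = EMA(period=10)
for tick in [100.0, 101.5, 99.8, 102.3]:
    print(ema.update(tick))
```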

Relevance:

10.00%

Publisher:

Abstract:

Oyster® is a surface-piercing flap-type device designed to harvest wave energy in the nearshore environment. Established mathematical theories of wave energy conversion, such as 3D point-absorber and 2D terminator theory, are inadequate to accurately describe the behaviour of Oyster, historically resulting in distorted conclusions regarding the potential of such a concept to harness the power of ocean waves. Accurately reproducing the dynamics of Oyster requires the introduction of a new reference mathematical model, the "flap-type absorber". A flap-type absorber is a large, thin device which extracts energy by pitching about a horizontal axis parallel to the ocean bottom. This paper unravels the mathematics of Oyster as a flap-type absorber. The main goals of this work are to provide a simple, yet accurate, physical interpretation of the laws governing the mechanism of wave power absorption by Oyster and to emphasise why some other, more established, mathematical theories cannot be expected to accurately describe its behaviour.
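
For background, in standard linear wave-energy theory (general theory, not the paper's own derivation) a flap pitching by an angle theta about a bottom hinge obeys an equation of motion of the Cummins form:

```latex
(I + A_\infty)\,\ddot{\theta}(t) + \int_0^t K(t-\tau)\,\dot{\theta}(\tau)\,\mathrm{d}\tau + C\,\theta(t) = M_\mathrm{exc}(t) - B_\mathrm{PTO}\,\dot{\theta}(t)
```

where I is the flap inertia, A_inf the added inertia at infinite frequency, K the radiation impulse response, C the hydrostatic restoring coefficient, M_exc the wave excitation torque and B_PTO the power take-off damping; the mean absorbed power is the time average of B_PTO times the square of the pitch velocity.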

Relevance:

10.00%

Publisher:

Abstract:

In a Bayesian learning setting, the posterior distribution of a predictive model arises from a trade-off between its prior distribution and the conditional likelihood of observed data. Such distribution functions usually rely on additional hyperparameters which need to be tuned in order to achieve optimum predictive performance; this operation can be efficiently performed in an Empirical Bayes fashion by maximizing the posterior marginal likelihood of the observed data. Since the score function of this optimization problem is in general characterized by the presence of local optima, it is necessary to resort to global optimization strategies, which require a large number of function evaluations. Given that the evaluation is usually computationally intensive and scales poorly with the dataset size, the maximum number of observations that can be treated simultaneously is quite limited. In this paper, we consider the case of hyperparameter tuning in Gaussian process regression. A straightforward implementation of the posterior log-likelihood for this model requires O(N^3) operations for every iteration of the optimization procedure, where N is the number of examples in the input dataset. We derive a novel set of identities that allow, after an initial overhead of O(N^3), the evaluation of the score function, as well as the Jacobian and Hessian matrices, in O(N) operations. We show that the proposed identities, which follow from the eigendecomposition of the kernel matrix, yield a reduction of several orders of magnitude in the computation time for the hyperparameter optimization problem. Notably, the proposed solution provides computational advantages even with respect to state-of-the-art approximations that rely on sparse kernel matrices.
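
To make the flavour of the result concrete, consider the special case in which only the noise variance sigma^2 is tuned: after one O(N^3) eigendecomposition of the kernel matrix, each likelihood evaluation costs O(N). The sketch below covers this special case only; the paper's identities extend the idea to the full score function, Jacobian and Hessian.

```python
import numpy as np

def precompute(K, y):
    """O(N^3) one-off overhead: eigendecomposition of the kernel matrix."""
    lam, V = np.linalg.eigh(K)     # K = V diag(lam) V^T
    return lam, V.T @ y            # eigenvalues and rotated targets

def neg_log_marginal_likelihood(sigma2, lam, Vty):
    """O(N) per evaluation once (lam, Vty) are cached:
    -log p(y) = 0.5*[y^T (K+s2 I)^{-1} y + log|K+s2 I| + N log 2pi]."""
    d = lam + sigma2
    quad = np.sum(Vty**2 / d)      # y^T (K + s2 I)^{-1} y
    logdet = np.sum(np.log(d))     # log|K + s2 I| = sum log(lam_i + s2)
    return 0.5 * (quad + logdet + len(lam) * np.log(2 * np.pi))

# Usage: optimize sigma2 cheaply after one O(N^3) factorization.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
K = np.exp(-0.5 * (X - X.T)**2)    # squared-exponential kernel, unit scale
y = rng.normal(size=200)
lam, Vty = precompute(K, y)
print(neg_log_marginal_likelihood(0.1, lam, Vty))
```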

Relevance:

10.00%

Publisher:

Abstract:

This study investigates the effects of ground heterogeneity, considering permeability as a random variable, on an intruding saltwater (SW) wedge using Monte Carlo simulations. Random permeability fields were generated using the method of Local Average Subdivision (LAS), based on a lognormal probability density function. The LAS method allows the creation of spatially correlated random fields, generated using coefficients of variation (COV) and horizontal and vertical scales of fluctuation (SOF). The numerical modelling code SUTRA was employed to solve the coupled flow and transport problem. The well-defined 2D dispersive Henry problem was used as the test case for the method. The intruding SW wedge is defined by two key parameters, the toe penetration length (TL) and the width of the mixing zone (WMZ). These parameters were compared to the results of a homogeneous case simulated using effective permeability values. The simulation results revealed that: (1) an increase in COV resulted in a seaward movement of TL; (2) the WMZ extended with increasing COV; (3) a general increase in horizontal and vertical SOF produced a seaward movement of TL, with the WMZ increasing slightly; (4) as the anisotropic ratio increased, the TL intruded further inland and the WMZ reduced in size. The results show that for large values of COV, effective permeability parameters are inadequate at reproducing the effects of heterogeneity on SW intrusion.
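
The essential ingredient, a spatially correlated lognormal permeability field with prescribed COV and SOF, can be illustrated with a simple Cholesky-based generator (a sketch of the general idea under an exponential correlation model; it is not the LAS algorithm used in the study):

```python
import numpy as np

def lognormal_field(coords, mean_k, cov, sof, seed=0):
    """Correlated lognormal field via Cholesky factorization (not LAS).

    coords : (n, 2) array of cell-centre coordinates
    mean_k : target mean permeability
    cov    : coefficient of variation of permeability
    sof    : scale of fluctuation of the exponential correlation model
    """
    sigma_ln2 = np.log(1.0 + cov**2)            # lognormal parameters
    mu_ln = np.log(mean_k) - 0.5 * sigma_ln2
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    C = sigma_ln2 * np.exp(-2.0 * d / sof)      # exponential (Markov) model
    L = np.linalg.cholesky(C + 1e-10 * np.eye(len(coords)))
    z = np.random.default_rng(seed).standard_normal(len(coords))
    return np.exp(mu_ln + L @ z)                # lognormal permeabilities

# 20 x 20 grid of unit cells; COV and SOF values here are illustrative.
xy = np.stack(np.meshgrid(np.arange(20), np.arange(20)), -1).reshape(-1, 2)
k = lognormal_field(xy, mean_k=1e-11, cov=0.5, sof=5.0)
```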

Relevance:

10.00%

Publisher:

Abstract:

Torrefaction-based co-firing in a pulverized coal boiler has been proposed to enable high biomass co-firing percentages. A 220 MWe pulverized coal power plant is simulated using Aspen Plus to fully understand the impact of an additional torrefaction unit on the efficiency of the whole power plant. The studied process includes biomass drying, biomass torrefaction, mill systems, biomass/coal devolatilization and combustion, heat exchange and power generation. Palm kernel shells (PKS) were torrefied at the same residence time but four different temperatures, to prepare four torrefied biomasses with different degrees of torrefaction. The mass loss and released gaseous components during torrefaction were studied. In addition, process simulations at varying torrefaction degrees and biomass co-firing ratios were carried out to characterize the CO2 emissions and electrical efficiency of the studied torrefaction-based co-firing power plant. According to the experimental results, CO2 and CO account for 69-91% and 4-27% (mole fraction) of the torrefied gases. The predicted results also showed that electrical efficiency falls when either the torrefaction temperature or the biomass substitution ratio increases. Deep torrefaction may not be recommended, because the power saved in biomass grinding is less than the heat consumed by the extra torrefaction process, depending on the heat sources.
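
The closing trade-off can be made concrete with a back-of-envelope balance; every number below is a hypothetical placeholder chosen only to illustrate the comparison, not a value from this study:

```python
# Grinding power saved vs torrefaction heat consumed, per kg of biomass.
# All values are hypothetical placeholders, not results from the paper.
grinding_raw_kwh = 0.10       # grinding energy, raw biomass [kWh/kg]
grinding_torr_kwh = 0.03      # grinding energy, torrefied biomass [kWh/kg]
torrefaction_heat_kwh = 0.25  # heat demand of the torrefaction step [kWh/kg]
plant_efficiency = 0.38       # electrical efficiency of the plant

saved_electric = grinding_raw_kwh - grinding_torr_kwh
# Heat drawn by torrefaction is forgone electricity at the plant efficiency
# (when it is supplied from the steam cycle rather than from waste heat).
lost_electric = torrefaction_heat_kwh * plant_efficiency

print(f"grinding power saved: {saved_electric:.3f} kWh_e/kg")
print(f"electricity forgone:  {lost_electric:.3f} kWh_e/kg")
# With these numbers the torrefaction heat outweighs the grinding saving,
# matching the paper's caution against deep torrefaction.
```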

Relevance:

10.00%

Publisher:

Abstract:

A conjugate heat transfer (CHT) method was used to perform the aerothermal analysis of an internally cooled turbine vane, and was validated against experimental and empirical data.
First, the method's treatment of internal cooling was validated by reproducing heat transfer test data in a channel with pin-fin heat augmenters, under a steady, constant wall temperature. The computed Nusselt numbers for the two tested configurations (full-length circular pin fins attached to both walls, and partial pin fins attached to one wall only) showed good agreement with the measurements. Sensitivity to mesh density was evaluated for this simplified case in order to establish mesh requirements for the analysis of the full component.
Second, the CHT method was applied to a turbine vane test case from an actual engine. The predicted vane airfoil metal temperature was compared to the measured thermal paint data and to in-house empirical predictions. The CHT results agreed well with the thermal paint data and gave better predictions than the current empirical modeling approach.

Relevance:

10.00%

Publisher:

Abstract:

Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with various SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM), two contrasting SD methods, in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics, as well as the cumulative frequencies of dry and wet spells for four different temporal resolutions, were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This implies that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate the effects. GPCC performs better than SDSM in reproducing wet- and dry-day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.
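
The wet- and dry-day frequency statistic mentioned above is straightforward to compute over a validation period; a minimal sketch (assuming the common 1 mm wet-day threshold, which is our assumption rather than necessarily the paper's choice):

```python
import numpy as np

def wet_dry_frequencies(precip_mm, wet_threshold=1.0):
    """Fraction of wet and dry days in a daily precipitation series.
    The 1 mm threshold is a common convention, assumed here."""
    wet = precip_mm >= wet_threshold
    return wet.mean(), (~wet).mean()

# Compare an observed and a downscaled series (placeholder values).
obs = np.array([0.0, 0.2, 5.1, 12.0, 0.0, 0.0, 3.3, 0.4])
sim = np.array([0.0, 1.5, 4.0,  9.8, 0.0, 0.1, 2.7, 0.0])
print(wet_dry_frequencies(obs))   # -> (0.375, 0.625)
print(wet_dry_frequencies(sim))   # -> (0.5, 0.5)
```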

Relevance:

10.00%

Publisher:

Abstract:

The design cycle for complex special-purpose computing systems is extremely costly and time-consuming. It involves a multiparametric design space exploration for optimization, followed by design verification. Designers of special-purpose VLSI implementations often need to explore parameters, such as optimal bitwidth and data representation, through time-consuming Monte Carlo simulations. A prominent example of this simulation-based exploration process is the design of decoders for error correcting systems, such as the Low-Density Parity-Check (LDPC) codes adopted by modern communication standards, which involves thousands of Monte Carlo runs for each design point. Currently, high-performance computing offers a wide set of acceleration options that range from multicore CPUs to Graphics Processing Units (GPUs) and Field Programmable Gate Arrays (FPGAs). The exploitation of diverse target architectures is typically associated with developing multiple code versions, often using distinct programming paradigms. In this context, we evaluate the concept of retargeting a single OpenCL program to multiple platforms, thereby significantly reducing design time. A single OpenCL-based parallel kernel is used without modification or code tuning on multicore CPUs, GPUs, and FPGAs. We use SOpenCL (Silicon to OpenCL), a tool that automatically converts OpenCL kernels to RTL, in order to introduce FPGAs as a potential platform for efficiently executing simulations coded in OpenCL. We use LDPC decoding simulations as a case study. Experimental results were obtained by testing a variety of regular and irregular LDPC codes, ranging from short/medium-length (e.g., 8,000-bit) to long (e.g., 64,800-bit) DVB-S2 codes. We observe that, depending on the design parameters to be simulated and on the dimension and phase of the design, the GPU or the FPGA may suit different purposes more conveniently, thus providing different acceleration factors over conventional multicore CPUs.
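
The single-source portability idea can be tried at small scale with PyOpenCL: the identical kernel source runs unchanged on whatever CPU and GPU devices the installed OpenCL platforms expose. A minimal vector-add sketch (not the paper's SOpenCL flow or its LDPC kernels):

```python
import numpy as np
import pyopencl as cl

SRC = """
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
"""

a = np.arange(16, dtype=np.float32)
b = np.ones(16, dtype=np.float32)

# Run the identical kernel source on every available device (CPU, GPU, ...).
for platform in cl.get_platforms():
    for dev in platform.get_devices():
        ctx = cl.Context([dev])
        queue = cl.CommandQueue(ctx)
        mf = cl.mem_flags
        a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
        b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
        out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)
        prg = cl.Program(ctx, SRC).build()  # compiled per device at runtime
        prg.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)
        out = np.empty_like(a)
        cl.enqueue_copy(queue, out, out_buf)
        print(dev.name, out[:4])
```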

Relevance:

10.00%

Publisher:

Abstract:

Many high-state non-magnetic cataclysmic variables (CVs) exhibit blueshifted absorption or P-Cygni profiles associated with ultraviolet (UV) resonance lines. These features imply the existence of powerful accretion disc winds in CVs. Here, we use our Monte Carlo ionization and radiative transfer code to investigate whether disc wind models that produce realistic UV line profiles are also likely to generate observationally significant recombination line and continuum emission in the optical waveband. We also test whether outflows may be responsible for the single-peaked emission line profiles often seen in high-state CVs and for the weakness of the Balmer absorption edge (relative to simple models of optically thick accretion discs). We find that a standard disc wind model that is successful in reproducing the UV spectra of CVs also leaves a noticeable imprint on the optical spectrum, particularly for systems viewed at high inclination. The strongest optical wind-formed recombination lines are Hα and He II λ4686. We demonstrate that a higher density outflow model produces all the expected H and He lines and produces a recombination continuum that can fill in the Balmer jump at high inclinations. This model displays reasonable verisimilitude with the optical spectrum of RW Trianguli. No single-peaked emission is seen, although we observe a narrowing of the double-peaked emission lines from the base of the wind. Finally, we show that even denser models can produce a single-peaked Hα line. On the basis of our results, we suggest that winds can modify, and perhaps even dominate, the line and continuum emission from CVs.