140 results for deterministic fractals


Relevance:

10.00%

Publisher:

Abstract:

Enterprise Resource Planning (ERP) software typically takes the form of a package that is licensed for use to those in a client organisation and is sold as being able to automate a wide range of processes within organisations. ERP packages have become an important feature of information and communications technology (ICT) infrastructures in organisations. However, a number of highly publicised failures have been associated with ERP packages too. For example: Hershey, Aero Group and Snap-On have blamed the implementation of ERP packages for negative impacts upon earnings (Scott and Vessey 2000); Cadbury Schweppes implemented plans to fulfil 250 orders where normally they would fulfil 1000, due to the increased complexity and the need to re-train staff post-implementation (August 1999); and FoxMeyer drug company's implementation of an ERP package has been argued to have led to bankruptcy proceedings, resulting in litigation against SAP, the software vendor in question (Bicknell 1998). Some have even rejected a single-vendor approach outright (Light et al. 2001). ERP packages appear to work for some and not for others; they contain contradictions. Indeed, if we start from the position that technologies do not provide their own explanation, then we have to consider the direction of a technological trajectory and why it moves in one way rather than another (Bijker and Law 1994). In other words, ERP appropriation cannot be predetermined as a success, despite the persuasive attempts of vendors via their websites and other marketing channels. Moreover, just because ERP exists, we cannot presume that all will appropriate it in the same fashion, if at all. There is more to the diffusion of innovations than stages of adoption and a simple demarcation between adoption and rejection. The processes that are enacted in appropriation need to be conceptualised as a site of struggle, political and imbued with power (Hislop et al. 2000; Howcroft and Light 2006). ERP appropriation and rejection can therefore be seen as a paradoxical phenomenon. In this paper we examine these contradictions as a way to shed light on the presence and role of inconsistencies in ERP appropriation and rejection. We argue that much of the reasoning associated with ERP adoption is pro-innovation biased and that deterministic models of the diffusion of innovations, such as Rogers (2003), do not adequately take account of contradictions in the process. Our argument is that a better theoretical understanding of these contradictions is necessary to underpin research and practice in this area. In the next section, we introduce our view of appropriation. Following this is an outline of the idea of contradiction and the strategies employed to 'cope' with it. Then, we introduce a number of reasons for ERP adoption and identify their inherent contradictions using these perspectives. From this discussion, we draw a framework that illustrates how the interpretive flexibility of reasons to adopt ERP packages leads to contradictions, which fuel the enactment of appropriation and rejection.

Relevance:

10.00%

Publisher:

Abstract:

Background: Discussion is currently taking place among international HIV/AIDS groups around increasing HIV testing and initiating earlier use of antiretroviral therapy (ART) among people diagnosed with HIV as a method to reduce the spread of HIV. In this study, we explore the expected epidemiological impact of this strategy in a small population in which HIV transmission is predominantly confined to men who have sex with men (MSM). Methods: A deterministic mathematical transmission model was constructed to investigate the impact of strategies that increase testing and treatment rates, and their potential to mitigate HIV epidemics among MSM. Our novel model distinguishes men in the population who are more easily accessible to prevention campaigns through engagement with the gay community from men who are not. This model is applied to the population of MSM in South Australia. Results: Our model-based findings suggest that increasing testing rates alone will have minimal impact on reducing the expected number of infections compared to current conditions. However, in combination with increases in treatment coverage, this strategy could lead to a 59–68% reduction in the number of HIV infections over the next 5 years. Targeting men who are socially engaged with the gay community would result in the majority of potential reductions in incidence, with only minor improvements possible by reaching all other MSM. Conclusions: Investing in strategies that will achieve higher coverage and earlier initiation of treatment to reduce the infectiousness of HIV-infected individuals could be an effective strategy for reducing incidence in a population of MSM.
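The model equations are not reproduced in the abstract. As a rough, hypothetical illustration of the deterministic compartmental approach it describes, the Python sketch below tracks susceptible, undiagnosed and diagnosed/treated men in two groups (community-engaged and other MSM); all parameter names and values are invented for illustration, not taken from the paper.

    # Minimal two-group deterministic HIV transmission sketch (illustrative only).
    # Compartments per group: S (susceptible), I (undiagnosed), D (diagnosed/treated).
    import numpy as np
    from scipy.integrate import solve_ivp

    beta = 0.6                                 # hypothetical transmission rate per year
    test = {"engaged": 1.0, "other": 0.3}      # hypothetical testing rates per year
    treat_eff = 0.9                            # assumed reduction in infectiousness on ART

    def rhs(t, y):
        S_e, I_e, D_e, S_o, I_o, D_o = y
        N = np.sum(y)
        # force of infection: treated men are (1 - treat_eff) times as infectious
        lam = beta * (I_e + I_o + (1 - treat_eff) * (D_e + D_o)) / N
        return [-lam * S_e,
                lam * S_e - test["engaged"] * I_e,
                test["engaged"] * I_e,
                -lam * S_o,
                lam * S_o - test["other"] * I_o,
                test["other"] * I_o]

    y0 = [4900, 100, 0, 4900, 100, 0]          # initial engaged/other populations
    sol = solve_ivp(rhs, (0, 5), y0)           # 5-year horizon, as in the abstract
    print(sol.y[:, -1])

Raising the testing rate alone mainly moves men from I to D; the incidence reduction comes from the lower infectiousness on treatment, which mirrors the abstract's finding that testing and treatment coverage must be increased together.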

Relevance:

10.00%

Publisher:

Abstract:

In the past few years, there has been a steady increase in the attention, importance and focus of green initiatives related to data centers. While various energy-aware measures have been developed for data centers, the accompanying requirement of improving the performance efficiency of application assignment has yet to be fulfilled. For instance, many energy-aware measures applied to data centers maintain a trade-off between energy consumption and Quality of Service (QoS). To address this problem, this paper presents a novel concept of profiling to facilitate offline optimization for a deterministic assignment of applications to virtual machines. A profile-based model is then established for obtaining near-optimal allocations of applications to virtual machines with consideration of three major objectives: energy cost, CPU utilization efficiency and application completion time. From this model, a profile-based and scalable matching algorithm is developed to solve the profile-based model. The assignment efficiency of our algorithm is then compared with that of the Hungarian algorithm, which gives the optimal solution but does not scale well.
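The profile-based algorithm itself is not given in the abstract, but the Hungarian baseline it is benchmarked against is standard and available in SciPy. A small sketch with a synthetic cost matrix (the cost values are made up):

    # Optimal application-to-VM assignment via the Hungarian algorithm,
    # the optimal-but-poorly-scaling baseline mentioned above.
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    rng = np.random.default_rng(0)
    cost = rng.random((50, 50))                 # synthetic: rows = applications, cols = VMs
    rows, cols = linear_sum_assignment(cost)    # exact, but roughly cubic time
    print("total assignment cost:", cost[rows, cols].sum())

A scalable matching heuristic, like the one the paper develops, trades a small loss in assignment quality for much better runtime growth as the number of applications and virtual machines increases.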

Relevance:

10.00%

Publisher:

Abstract:

This paper offers an uncertainty quantification (UQ) study applied to the performance analysis of the ERCOFTAC conical diffuser. A deterministic CFD solver is coupled with a non-statistical generalised Polynomial Chaos (gPC) representation based on a pseudo-spectral projection method. This approach has the advantage of not requiring any modification of the CFD code for the propagation of random disturbances in the aerodynamic field. The stochastic results highlight the importance of the inlet velocity uncertainties on the pressure recovery, both alone and when coupled with a second uncertain variable. From a theoretical point of view, we investigate the possibility of building our gPC representation on an arbitrary grid, thus increasing the flexibility of the stochastic framework.
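The abstract does not spell out the projection step. A minimal one-dimensional sketch of a pseudo-spectral gPC projection, assuming a standard-normal uncertain input and using a placeholder function where the CFD solver would be called:

    # Pseudo-spectral gPC projection onto probabilists' Hermite polynomials.
    # The 'model' below is a stand-in for the deterministic CFD solver.
    import numpy as np
    from numpy.polynomial import hermite_e as He
    from math import factorial, sqrt, pi

    def model(xi):                  # hypothetical response, e.g. pressure recovery
        return np.exp(0.3 * xi)

    P = 5                           # gPC order
    x, w = He.hermegauss(P + 1)     # Gauss-Hermite nodes/weights (weight e^{-x^2/2})
    f = model(x)                    # one deterministic solve per node

    coeffs = np.array([np.sum(w * f * He.hermeval(x, np.eye(P + 1)[n]))
                       / (factorial(n) * sqrt(2 * pi))
                       for n in range(P + 1)])
    mean = coeffs[0]
    var = sum(factorial(n) * coeffs[n] ** 2 for n in range(1, P + 1))
    print("mean:", mean, "variance:", var)

Because the solver is only evaluated at quadrature nodes, the CFD code is used unmodified, which is the non-intrusive advantage the abstract points to.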

Relevance:

10.00%

Publisher:

Abstract:

Due to its ability to represent intricate systems with material nonlinearities as well as irregular loading, boundary, geometrical and material domains, the finite element (FE) method has been recognized as an important computational tool in spinal biomechanics. Current FE models generally account for a single distinct spinal geometry with one set of material properties despite inherently large inter-subject variability. The uncertainty and high variability in tissue material properties, geometry, loading and boundary conditions have cast doubt on the reliability of their predictions and their comparability with reported in vitro and in vivo values. A multicenter study was undertaken to compare the results of eight well-established models of the lumbar spine that have been developed, validated and applied for many years. Models were subjected to pure and combined loading modes and their predictions were compared to in vitro and in vivo measurements for intervertebral rotations, disc pressures and facet joint forces. Under pure moment loading, the predicted L1-5 rotations of almost all models fell within the reported in vitro ranges; their median values differed on average by only 2° for flexion-extension, 1° for lateral bending and 5° for axial rotation. Predicted median facet joint forces and disc pressures were also in good agreement with previously published median in vitro values. However, the ranges of predictions were larger and exceeded the in vitro ranges, especially for facet joint forces. For all combined loading modes, except for flexion, predicted median segmental intervertebral rotations and disc pressures were in good agreement with in vivo values. The simulations yielded median facet joint forces of 0 N in flexion, 38 N in extension, 14 N in lateral bending and 60 N in axial rotation that could not be validated due to the paucity of in vivo facet joint force measurements. In light of high inter-subject variability, one must be cautious when generalizing predictions obtained from one deterministic model. This study demonstrates, however, that the predictive power increases when FE models are combined. The median of the individual numerical results can hence be used as an improved tool to estimate the response of the lumbar spine.

Relevance:

10.00%

Publisher:

Abstract:

We show the first deterministic construction of an unconditionally secure multiparty computation (MPC) protocol in the passive adversarial model over black-box non-Abelian groups which is both optimal (secure against an adversary who corrupts any t < n/2 of the n parties) and of subexponential complexity. In contrast to the deterministic constructions of Desmedt et al. (2012), our construction has subexponential complexity and is optimal at the same time, i.e., it is secure for any t < n/2.

Relevance:

10.00%

Publisher:

Abstract:

We consider online prediction problems where the loss between the prediction and the outcome is measured by the squared Euclidean distance and its generalization, the squared Mahalanobis distance. We derive the minimax solutions for the case where the prediction and action spaces are the simplex (this setup is sometimes called the Brier game) and the ℓ2 ball (this setup is related to Gaussian density estimation). We show that in both cases the value of each sub-game is a quadratic function of a simple statistic of the state, with coefficients that can be efficiently computed using an explicit recurrence relation. The resulting deterministic minimax strategy and randomized maximin strategy are linear functions of the statistic.
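For reference, the squared Mahalanobis distance that generalises the squared Euclidean loss is (p - y)^T W (p - y) for a positive-definite matrix W; a short Python illustration (the matrix here is arbitrary):

    # Squared Mahalanobis loss; W = identity recovers the squared Euclidean loss.
    import numpy as np

    def mahalanobis_sq(p, y, W):
        d = p - y
        return d @ W @ d

    W = np.array([[2.0, 0.5], [0.5, 1.0]])   # any positive-definite matrix
    print(mahalanobis_sq(np.array([0.3, 0.7]), np.array([1.0, 0.0]), W))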

Relevance:

10.00%

Publisher:

Abstract:

This paper presents an uncertainty quantification study of the performance analysis of the high pressure ratio single stage radial-inflow turbine used in the Sundstrand Power Systems T-100 Multi-purpose Small Power Unit. A deterministic 3D volume-averaged Computational Fluid Dynamics (CFD) solver is coupled with a non-statistical generalized Polynomial Chaos (gPC) representation based on a pseudo-spectral projection method. One of the advantages of this approach is that it does not require any modification of the CFD code for the propagation of random disturbances in the aerodynamic and geometric fields. The stochastic results highlight the importance of the blade thickness and trailing edge tip radius on the total-to-static efficiency of the turbine compared to the angular velocity and trailing edge tip length. From a theoretical point of view, the use of the gPC representation on an arbitrary grid also allows the investigation of the sensitivity of the blade thickness profiles on the turbine efficiency. The gPC approach is also applied to coupled random parameters. The results show that the most influential coupled random variables are trailing edge tip radius coupled with the angular velocity.
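For coupled random parameters, the pseudo-spectral projection is tensorised. A hypothetical two-parameter Legendre sketch, with uniform uncertainties on [-1, 1] and a toy response standing in for the turbine solver:

    # Tensor-grid pseudo-spectral gPC for two coupled uniform inputs.
    import numpy as np
    from numpy.polynomial import legendre as L

    def response(r, omega):         # toy stand-in for efficiency(radius, speed)
        return 0.8 - 0.05 * r**2 + 0.03 * r * omega

    P = 3
    x, w = L.leggauss(P + 1)        # Gauss-Legendre nodes/weights on [-1, 1]
    X1, X2 = np.meshgrid(x, x, indexing="ij")
    W2 = np.outer(w, w) / 4.0       # quadrature weights of the uniform density
    F = response(X1, X2)

    def leg(n, z):                  # evaluate the Legendre polynomial P_n
        return L.legval(z, np.eye(P + 1)[n])

    coeff = np.zeros((P + 1, P + 1))
    for i in range(P + 1):
        for j in range(P + 1):
            norm = 1.0 / ((2 * i + 1) * (2 * j + 1))   # E[P_i^2] * E[P_j^2]
            coeff[i, j] = np.sum(W2 * F * leg(i, X1) * leg(j, X2)) / norm

    # variance contribution of each mode; (i>0, j>0) modes are coupled effects
    d = 2 * np.arange(P + 1) + 1
    var_ij = coeff**2 / np.outer(d, d)
    var_ij[0, 0] = 0.0
    print("mean:", coeff[0, 0], "variance:", var_ij.sum())

Reading off the mixed-index contributions var_ij[i, j] with i, j > 0 is one simple way to see which pairs of parameters interact, in the spirit of the coupled-variable results reported above.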

Relevance:

10.00%

Publisher:

Abstract:

As connectivity analyses become more popular, claims are often made about how the brain's anatomical networks depend on age, sex, or disease. It is unclear how results depend on the tractography methods used to compute fiber networks. We applied 11 tractography methods to high angular resolution diffusion images of the brain (4-Tesla 105-gradient HARDI) from 536 healthy young adults. We parcellated 70 cortical regions, yielding 70×70 connectivity matrices encoding fiber density. We computed popular graph theory metrics, including network efficiency and characteristic path length. Both metrics were robust to the number of spherical harmonics used to model diffusion (4th-8th order). Age effects were detected only for networks computed with the probabilistic Hough transform method, which excludes smaller fibers. Sex and total brain volume affected networks measured with deterministic, tensor-based fiber tracking but not with the Hough method. Each tractography method includes different fibers, which affects inferences made about the reconstructed networks.
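As a hedged illustration of the two graph metrics named above, computed here on a random symmetric matrix rather than the study's data:

    # Global efficiency and characteristic path length of a thresholded network.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(1)
    A = rng.random((70, 70))
    A = (A + A.T) / 2                                  # symmetric 'fiber density'
    G = nx.from_numpy_array((A > 0.7).astype(int))     # binarise by threshold

    print("global efficiency:", nx.global_efficiency(G))
    # requires a connected graph; dense random thresholded graphs usually are
    print("characteristic path length:", nx.average_shortest_path_length(G))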

Relevance:

10.00%

Publisher:

Abstract:

Modelling fluvial processes is an effective way to reproduce basin evolution and to recreate riverbed morphology. However, due to the complexity of alluvial environments, deterministic modelling of fluvial processes is often impossible. To address the related uncertainties, we derive a stochastic fluvial process model on the basis of the convective Exner equation that uses the statistics (mean and variance) of river velocity as input parameters. These statistics allow for quantifying the uncertainty in riverbed topography, river discharge and position of the river channel. In order to couple the velocity statistics and the fluvial process model, the perturbation method is employed with a non-stationary spectral approach to develop the Exner equation as two separate equations: the first one is the mean equation, which yields the mean sediment thickness, and the second one is the perturbation equation, which yields the variance of sediment thickness. The resulting solutions offer an effective tool to characterize alluvial aquifers resulting from fluvial processes, which allows incorporating the stochasticity of the paleoflow velocity.
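The abstract does not reproduce the equations. Schematically, and with illustrative notation rather than the paper's, a one-dimensional convective Exner equation with a perturbed velocity takes a form like

    \frac{\partial \eta}{\partial t} + c(v)\,\frac{\partial \eta}{\partial x} = 0,
    \qquad v = \langle v \rangle + v', \qquad \eta = \langle \eta \rangle + \eta',

where \eta is the sediment thickness. Averaging gives a mean equation for \langle \eta \rangle, and subtracting it gives a perturbation equation whose second moment yields the variance of \eta, matching the two-equation split described above.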

Relevance:

10.00%

Publisher:

Abstract:

Background: Biochemical systems with relatively low numbers of components must be simulated stochastically in order to capture their inherent noise. Although there has recently been considerable work on discrete stochastic solvers, there is still a need for numerical methods that are both fast and accurate. The Bulirsch-Stoer method is an established method for solving ordinary differential equations that possesses both of these qualities. Results: In this paper, we present the Stochastic Bulirsch-Stoer method, a new numerical method for simulating discrete chemical reaction systems, inspired by its deterministic counterpart. It achieves excellent efficiency because it is based on an approach with high deterministic order, allowing for larger stepsizes and leading to fast simulations. We compare it to the Euler τ-leap, as well as two more recent τ-leap methods, on a number of example problems, and find that as well as being very accurate, our method is the most robust, in terms of efficiency, of all the methods considered in this paper. The problems it is most suited for are those with larger populations that would be too slow to simulate using Gillespie's stochastic simulation algorithm. For such problems, it is likely to achieve higher weak order in the moments. Conclusions: The Stochastic Bulirsch-Stoer method is a novel stochastic solver that can be used for fast and accurate simulations. Crucially, compared to other similar methods, it better retains its high accuracy when the timesteps are increased. Thus the Stochastic Bulirsch-Stoer method is both computationally efficient and robust. These are key properties for any stochastic numerical method, as such methods must typically run many thousands of simulations.
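The Stochastic Bulirsch-Stoer scheme itself is not listed in the abstract, but the Euler τ-leap baseline it is compared against is standard and fits in a few lines; the toy birth-death system below is invented for illustration:

    # Euler tau-leap for a toy birth-death system: 0 -> X (rate k1), X -> 0 (rate k2*X).
    # This is the baseline method named above, not the Stochastic Bulirsch-Stoer itself.
    import numpy as np

    rng = np.random.default_rng(2)
    k1, k2, tau = 10.0, 0.1, 0.05
    x, t = 0, 0.0
    while t < 100.0:
        a = np.array([k1, k2 * x])               # reaction propensities
        fires = rng.poisson(a * tau)             # Poisson count of firings per channel
        x = max(x + fires[0] - fires[1], 0)      # stoichiometry: +1 birth, -1 death
        t += tau
    print("X(100) ~", x)                         # fluctuates around k1/k2 = 100

Higher-order schemes such as the one proposed here aim to keep the per-step cost structure of τ-leaping while remaining accurate at the larger stepsizes that make big-population systems tractable.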

Relevance:

10.00%

Publisher:

Abstract:

Since we still know very little about stem cells in their natural environment, it is useful to explore their dynamics through modelling and simulation, as well as experimentally. Most models of stem cell systems are based on deterministic differential equations that ignore the natural heterogeneity of stem cell populations. This is not appropriate at the level of individual cells and niches, when randomness is more likely to affect dynamics. In this paper, we introduce a fast stochastic method for simulating, over time, a metapopulation of stem cell niche lineages, that is, many sub-populations that together form a heterogeneous metapopulation. By selecting the common limiting timestep, our method ensures that the entire metapopulation is simulated synchronously. This is important, as it allows us to introduce interactions between separate niche lineages, which would otherwise be impossible. We expand our method to enable the coupling of many lineages into niche groups, where differentiated cells are pooled within each niche group. Using this method, we explore the dynamics of the haematopoietic system from a demand control system perspective. We find that coupling together niche lineages allows the organism to regulate blood cell numbers as closely as possible to the homeostatic optimum. Furthermore, coupled lineages respond better than uncoupled ones to random perturbations, here the loss of some myeloid cells. This could imply that it is advantageous for an organism to connect its niche lineages into groups. Our results suggest that a potentially fruitful empirical direction will be to understand how stem cell descendants communicate with the niche and how cancer may arise as a result of a failure of such communication.
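A minimal sketch of the synchrony idea, with the common limiting timestep chosen as the minimum over lineages and differentiated cells pooled per niche group; the dynamics and rates are toy placeholders, not the paper's model:

    # Synchronous stochastic simulation of several niche lineages:
    # every lineage proposes a safe step, all advance together by the smallest one.
    import numpy as np

    rng = np.random.default_rng(3)
    cells = np.array([100.0, 120.0, 80.0])    # stem cells in three lineages
    pool = 0.0                                # pooled differentiated cells (niche group)
    t = 0.0
    while t < 50.0:
        r_diff = 0.1 * cells                  # hypothetical differentiation rates
        r_self = 0.1 * cells                  # hypothetical self-renewal rates
        total = r_diff + r_self
        dt = min(0.1, 1.0 / max(total.max(), 1e-9))   # common limiting timestep
        diff = rng.poisson(r_diff * dt)       # synchronized updates for all lineages
        born = rng.poisson(r_self * dt)
        cells = np.maximum(cells + born - diff, 0)
        pool += diff.sum()                    # pooling couples the lineages' output
        t += dt
    print(cells, pool)

Because every lineage advances by the same dt, cross-lineage interactions (for example, demand feedback from the shared pool) can be evaluated consistently at each step, which is the point made above.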

Relevance:

10.00%

Publisher:

Abstract:

We present an algorithm for multiarmed bandits that achieves almost optimal performance in both stochastic and adversarial regimes without prior knowledge about the nature of the environment. Our algorithm is based on augmentation of the EXP3 algorithm with a new control lever in the form of exploration parameters that are tailored individually for each arm. The algorithm simultaneously applies the “old” control lever, the learning rate, to control the regret in the adversarial regime and the new control lever to detect and exploit gaps between the arm losses. This secures problem-dependent “logarithmic” regret when gaps are present without compromising on the worst-case performance guarantee in the adversarial regime. We show that the algorithm can exploit both the usual expected gaps between the arm losses in the stochastic regime and deterministic gaps between the arm losses in the adversarial regime. The algorithm retains “logarithmic” regret guarantee in the stochastic regime even when some observations are contaminated by an adversary, as long as on average the contamination does not reduce the gap by more than a half. Our results for the stochastic regime are supported by experimental validation.
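The exact per-arm tuning rule is the paper's contribution and is not reproduced here; the sketch below shows an EXP3 skeleton with the two levers named above, a global learning rate and per-arm exploration parameters (the xi schedule is a placeholder):

    # EXP3-style bandit with a learning rate plus per-arm exploration parameters.
    import numpy as np

    rng = np.random.default_rng(4)
    K, T = 5, 10000
    L_hat = np.zeros(K)                 # importance-weighted cumulative loss estimates
    pulls = np.zeros(K, dtype=int)
    for t in range(1, T + 1):
        eta = np.sqrt(np.log(K) / (t * K))                  # "old" lever: learning rate
        xi = np.full(K, 0.5 * min(1.0 / K,
                                  np.sqrt(np.log(K) / (t * K))))   # "new" lever (placeholder)
        w = np.exp(-eta * (L_hat - L_hat.min()))
        p = (1 - xi.sum()) * w / w.sum() + xi               # weights + per-arm exploration
        a = rng.choice(K, p=p / p.sum())
        loss = rng.random() * (0.2 if a == 0 else 1.0)      # arm 0 has the smallest mean loss
        L_hat[a] += loss / p[a]                             # unbiased loss estimate
        pulls[a] += 1
    print(pulls)                                            # arm 0 should dominate

Tailoring xi per arm is what lets the algorithm detect stochastic-regime gaps without giving up the worst-case guarantee provided by the learning-rate lever.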

Relevance:

10.00%

Publisher:

Abstract:

We investigate the terminating concept of BKZ reduction first introduced by Hanrot et al. [Crypto'11] and conduct extensive experiments to predict the number of tours necessary to obtain the best possible trade-off between reduction time and quality. Then, we improve Buchmann and Lindner's result [Indocrypt'09] to find sub-lattice collisions in SWIFFT. We illustrate that further improvement in time is possible through a special setting of the SWIFFT parameters and through the adaptive combination of different reduction parameters. Our contributions also include a probabilistic simulation approach, built on top of the deterministic simulation described by Chen and Nguyen [Asiacrypt'11], that is able to predict the Gram-Schmidt norms more accurately for large block sizes.
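For orientation, the Gram-Schmidt norms that such simulators predict are, for a basis matrix B whose rows are the basis vectors, just the absolute diagonal of R in a QR factorisation; a small random example (not a SWIFFT lattice):

    # Gram-Schmidt norms ||b*_i|| of a lattice basis B (rows are basis vectors):
    # equal to |diag(R)| where B^T = QR, since QR orthogonalises columns in order.
    import numpy as np

    rng = np.random.default_rng(5)
    B = rng.integers(-50, 50, size=(10, 10)).astype(float)
    _, R = np.linalg.qr(B.T)
    print(np.abs(np.diag(R)))

BKZ simulators such as Chen and Nguyen's track how this profile of norms flattens tour by tour, which is what the probabilistic variant above aims to predict more accurately at large block sizes.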

Relevance:

10.00%

Publisher:

Abstract:

Large-scale integration of solar photovoltaic (PV) generation in distribution networks has resulted in over-voltage problems. Several control techniques have been developed to address the over-voltage problem using Deterministic Load Flow (DLF). However, the intermittent characteristics of PV generation require Probabilistic Load Flow (PLF) to introduce into the analysis the variability that DLF ignores. Traditional PLF techniques are not well suited to distribution systems and suffer from several drawbacks: computational burden (Monte Carlo, conventional convolution), accuracy that is sensitive to system complexity (point estimation method), the need for linearization (multi-linear simulation) and convergence problems (Gram–Charlier expansion, Cornish–Fisher expansion). In this research, Latin Hypercube Sampling with Cholesky Decomposition (LHS-CD) is used to quantify the over-voltage issues, with and without the voltage control algorithm, in a distribution network with active generation. The LHS technique is verified on a test network and on a real system from an Australian distribution network service provider. The accuracy and computational burden of the simulated results are also compared with Monte Carlo simulations.
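A minimal sketch of the LHS-CD sampling step, assuming two correlated PV sites with an invented correlation value; SciPy's qmc module supplies the Latin hypercube:

    # Latin Hypercube Sampling with Cholesky-based correlation induction (LHS-CD sketch).
    import numpy as np
    from scipy.stats import qmc, norm

    corr = np.array([[1.0, 0.8], [0.8, 1.0]])          # assumed PV-site correlation
    C = np.linalg.cholesky(corr)

    u = qmc.LatinHypercube(d=2, seed=0).random(1000)   # stratified U(0,1) samples
    z = norm.ppf(u)                                    # independent standard normals
    z_corr = z @ C.T                                   # impose the target correlation
    pv_u = norm.cdf(z_corr)                            # correlated uniforms for PV marginals
    print(np.corrcoef(pv_u, rowvar=False))

Each correlated sample would then drive one deterministic load-flow run, so the resulting voltage statistics reflect PV variability at a fraction of the Monte Carlo sample count.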