84 results for deterministic fractals


Relevance:

10.00%

Publisher:

Abstract:

Electronic document editor software is traditionally complex to design and difficult to implement. This research resulted in REDDL, a language for the specification of electronic document editors at a simplified declarative level, employing a deterministic storage model. This approach allows rapid and simplified development of this class of software.

Relevance:

10.00%

Publisher:

Abstract:

Constraint-based tools for architectural design exploration need to satisfy aesthetic and functional criteria and to combine discrete and continuous modes of exploration. In this paper, we examine the possibilities for stochastic processes in design space exploration.

Specifically, we address the application of a stochastic wind-motion model to the subdivision of an external building envelope into smaller discrete components. Instead of deterministic subdivision constraints, we introduce explicit uncertainty into the subdivision system. To address these aims, we develop a model of stochastic wind motion, create a subdivision scheme governed by the wind model, and explore the design space of a facade subdivision problem. We develop a discrete version of the facade, composed of light strips and panels, based on bamboo elements deformed by continuous wind motion. The results of the experiments are presented in the paper.
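As a toy illustration of replacing a deterministic subdivision rule with a stochastic one, the sketch below recursively subdivides a one-dimensional facade strip, displacing each split point by a random "wind" term. The uniform perturbation is a crude stand-in for a real wind-motion model; all names and parameters are illustrative, not the paper's scheme.

```python
import random

def subdivide(start, end, depth, wind, min_len=0.5):
    """Recursively split a facade strip; the split point is displaced
    by a stochastic 'wind' term instead of a fixed midpoint rule."""
    if depth == 0 or (end - start) < 2 * min_len:
        return [(start, end)]
    # wind() returns a perturbation of the split ratio around the midpoint
    mid = start + (end - start) * (0.5 + wind())
    mid = min(max(mid, start + min_len), end - min_len)  # keep panels buildable
    return (subdivide(start, mid, depth - 1, wind, min_len)
            + subdivide(mid, end, depth - 1, wind, min_len))

rng = random.Random(7)
wind = lambda: rng.uniform(-0.25, 0.25)       # stand-in for the wind model
panels = subdivide(0.0, 16.0, depth=3, wind=wind)
```

Each run of the generator yields a different, contiguous panel layout over the same envelope, which is exactly the kind of design space the paper explores.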

Relevance:

10.00%

Publisher:

Abstract:

In this study, we develop deterministic metamodels to quickly and precisely predict the future behaviour of a technically complex system. The underlying system is a stochastic, discrete-event simulation model of a large baggage handling system. This highly detailed simulation model is used to conduct experiments and log data, which are then used to train artificial neural network metamodels. Results show that the metamodels accurately predict different performance measures related to the travel time of bags within the system. In contrast to the simulation models, which are computationally expensive and require substantial expertise to develop, run, and maintain, the artificial neural network metamodels can serve as real-time decision-aiding tools that are considerably faster, precise, simple to use, and reliable.
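The metamodel idea can be sketched minimally as follows: run the expensive stochastic simulator a number of times, log (input, output) pairs, then fit a cheap deterministic surrogate on the log. Here an ordinary least-squares line stands in for the paper's neural network, and a noisy toy function stands in for the baggage-handling simulation; all functions and numbers are hypothetical.

```python
import random

def simulate(load, rng):
    """Stand-in for the expensive stochastic simulation: mean bag travel
    time (seconds) grows with system load, plus random noise."""
    return 120 + 35 * load + rng.gauss(0, 2)

rng = random.Random(1)
# Log experiment data from the "simulator" at 20 load levels
runs = [(load, simulate(load, rng)) for load in [0.1 * i for i in range(1, 21)]]

# Fit a cheap deterministic surrogate (linear least squares here;
# the paper trains a neural network) on the logged runs.
n = len(runs)
sx = sum(x for x, _ in runs); sy = sum(y for _, y in runs)
sxx = sum(x * x for x, _ in runs); sxy = sum(x * y for x, y in runs)
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope estimate
a = (sy - b * sx) / n                           # intercept estimate

predict = lambda load: a + b * load  # instant prediction, no simulation run
```

The surrogate answers in microseconds what the detailed simulation would take minutes to compute, which is the real-time decision-aiding role described above.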

Relevance:

10.00%

Publisher:

Abstract:

Successfully determining competitive optimal schedules for electricity generation hinges on load forecasts. The nonstationarity and high volatility of loads make their accurate prediction problematic. The presence of uncertainty in the data also significantly degrades the accuracy of point predictions produced by deterministic load forecasting models, so operation planning that relies on these predictions will be unreliable. This paper aims at developing prediction intervals rather than exact point predictions. Prediction intervals are theoretically more reliable and practical than predicted values. The delta and Bayesian techniques for constructing prediction intervals for forecasted loads are implemented here. To objectively and comprehensively assess the quality of constructed prediction intervals, a new index based on the length and coverage probability of the intervals is developed. In experiments with real data, and through the calculation of global statistics, it is shown that neural network point prediction performance is unreliable. In contrast, prediction intervals developed using the delta and Bayesian techniques are satisfactorily narrow, with a high coverage probability.
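The two quantities behind such an index can be sketched directly: coverage probability (the fraction of targets falling inside their intervals) and normalized mean width, combined into a single score that penalizes under-coverage. The combination below is one common form of such an index, not necessarily the paper's exact definition; `mu` and `eta` are illustrative parameters.

```python
import math

def interval_quality(y, lo, hi, mu=0.9, eta=50):
    """Score prediction intervals by coverage and width.
    picp: fraction of targets inside [lo, hi].
    pinaw: mean interval width, normalized by the target range.
    cwc: width-based score, penalized exponentially when coverage
    falls below the nominal level mu (one common formulation)."""
    n = len(y)
    picp = sum(l <= t <= h for t, l, h in zip(y, lo, hi)) / n
    r = max(y) - min(y)
    pinaw = sum(h - l for l, h in zip(lo, hi)) / (n * r)
    penalty = math.exp(-eta * (picp - mu)) if picp < mu else 0.0
    cwc = pinaw * (1 + penalty)
    return picp, pinaw, cwc

# Toy targets with their interval bounds
y  = [10, 12, 15, 11, 14]
lo = [ 9, 11, 13, 10, 12]
hi = [11, 13, 16, 12, 15]
picp, pinaw, cwc = interval_quality(y, lo, hi)
```

A good interval set drives `pinaw` down while keeping `picp` at or above the nominal coverage, so lower `cwc` is better.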

Relevance:

10.00%

Publisher:

Abstract:

Propagation of Peer-to-Peer (P2P) worms in the Internet poses a serious challenge to network security research because of P2P worms' increasing complexity and sophistication. Owing to the difficulty of the problem, no existing work has modeled the propagation of P2P worms, especially when quarantine of peers is enforced. This paper presents a study on modeling the propagation of P2P worms, together with our applications of the proposed approach in worm propagation research.

Motivated by the need for an easy-to-employ instrument for worm propagation research, the proposed approach models the propagation processes of P2P worms by difference equations of a logic matrix, which are essentially discrete-time deterministic propagation models of P2P worms. To the best of our knowledge, we are the first to use a logic matrix in network security research in general and in worm propagation modeling in particular.

Our major contributions in this paper are, firstly, a novel logic matrix approach to modeling the propagation of P2P worms under three different conditions; secondly, the impacts of two different topologies on a P2P worm's attack performance; thirdly, the impacts of network-related characteristics on a P2P worm's attack performance in structured P2P networks; and fourthly, the impacts of two different quarantine tactics on the propagation characteristics of P2P worms in unstructured P2P networks. The approach's ease of employment, demonstrated by its applications in our simulation experiments, makes it an attractive instrument for worm propagation research.
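A minimal discrete-time, logic-style propagation step might look like the following: a boolean susceptible-infected update over a peer adjacency matrix, with quarantined peers removed from both infecting and being infected. This illustrates the flavour of a logic-matrix model, not the paper's exact difference equations; topology and states are made up.

```python
def step(infected, adj, quarantined):
    """One discrete time step: a peer becomes (or stays) infected if it is
    infected or any neighbour is infected; quarantined peers neither
    infect nor get infected. A boolean analogue of a logic-matrix update."""
    n = len(adj)
    return [
        (infected[i] or any(adj[i][j] and infected[j] for j in range(n)))
        and not quarantined[i]
        for i in range(n)
    ]

# 5-peer line topology: 0 - 1 - 2 - 3 - 4
adj = [[0, 1, 0, 0, 0],
       [1, 0, 1, 0, 0],
       [0, 1, 0, 1, 0],
       [0, 0, 1, 0, 1],
       [0, 0, 0, 1, 0]]
state = [True, False, False, False, False]   # peer 0 initially infected
quar  = [False, False, False, True, False]   # peer 3 is quarantined

for _ in range(4):
    state = step(state, adj, quar)
```

Quarantining peer 3 cuts the only path to peer 4, so after any number of steps the infection reaches peers 0-2 but never peer 4; this is the kind of quarantine effect the model lets one study deterministically.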

Relevance:

10.00%

Publisher:

Abstract:

The Recursive Auto-Associative Memory (RAAM) has come to dominate connectionist investigations into representing compositional structure. Although an adequate model when dealing with limited data, the capacity of RAAM to scale up to real-world tasks has frequently been questioned. RAAM networks are difficult to train (due to the moving-target effect), so training times can be lengthy. Investigations into RAAM have produced many variants in an attempt to overcome such limitations. We outline how one such model, (S)RAAM, is able to quickly produce context-sensitive representations that may be used to aid a deterministic parsing process. By substituting it for a symbolic stack in an existing hybrid parser, we show that (S)RAAM is more than capable of encoding the real-world data sets employed. We conclude by suggesting that models such as (S)RAAM offer valuable insights into the features of connectionist compositional representations.

Relevance:

10.00%

Publisher:

Abstract:

This brief deals with the problem of minor component analysis (MCA). Artificial neural networks can be exploited to perform MCA. Recent research shows that convergence of neural-network-based MCA algorithms can be guaranteed if the learning rates are below certain thresholds. However, computing these thresholds requires information about the eigenvalues of the autocorrelation matrix of the data set, which is unavailable when the minor component is extracted online from an input data stream. In this correspondence, we introduce an adaptive learning rate into the OJAn MCA algorithm so that its convergence condition does not depend on any unobtainable information and can be easily satisfied in practical applications.
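To illustrate the setting, here is a sketch of online minor-component extraction with an adaptive, decaying learning rate driven by a running estimate of the input power, so no eigenvalue of the autocorrelation matrix has to be known in advance. The anti-Hebbian update and the adaptation scheme below are simple stand-ins, not the OJAn algorithm or the paper's rule.

```python
import random

def mca_online(stream, dim):
    """Online minor-component sketch: anti-Hebbian update with per-step
    normalisation. The step size adapts to a running estimate of the
    input power E[||x||^2] (an upper bound on the largest eigenvalue),
    so no eigenvalue needs to be known in advance."""
    w = [1.0 / dim ** 0.5] * dim
    power = 1.0                                   # running estimate of E[||x||^2]
    for t, x in enumerate(stream, 1):
        power += (sum(v * v for v in x) - power) / t
        eta = 1.0 / ((1.0 + power) * t ** 0.5)    # adaptive, decaying step size
        y = sum(wi * xi for wi, xi in zip(w, x))  # projection onto w
        w = [wi - eta * y * xi for wi, xi in zip(w, x)]
        norm = sum(wi * wi for wi in w) ** 0.5
        w = [wi / norm for wi in w]
    return w

rng = random.Random(0)
# 2-D stream with autocorrelation diag(4, 0.25): the minor direction is e2
stream = [(2.0 * rng.gauss(0, 1), 0.5 * rng.gauss(0, 1)) for _ in range(4000)]
w = mca_online(stream, 2)
```

Because the power estimate dominates the largest eigenvalue, the effective learning rate always satisfies the kind of threshold condition the convergence analyses require, without that threshold ever being computed explicitly.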

Relevance:

10.00%

Publisher:

Abstract:

Random fluctuations of the electrical quantities (electrode potential and cell current) in electrochemical systems are commonly referred to as electrochemical noise (ECN). The ECN signal for the corrosion of mild steel in a reinforced concrete specimen was analyzed with the Continuous Wavelet Transform (CWT). The original signal was transformed into a time-frequency phase plane, with colors representing the coefficients of the CWT. The signal shows a self-similar structure in the phase plane, through which the chaotic nature of the corrosion process is revealed.
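A naive CWT can be sketched as a direct correlation of the signal with shifted, scaled wavelets; the resulting scale-by-time coefficient matrix is what gets rendered as the colored time-frequency phase plane. The real-valued Morlet-style wavelet and the toy burst signal below are illustrative only, and normalisation is deliberately crude.

```python
import math

def morlet(t, scale, w0=6.0):
    """Real part of a Morlet-style wavelet at a given scale (crudely
    normalised sketch)."""
    u = t / scale
    return math.exp(-0.5 * u * u) * math.cos(w0 * u) / math.sqrt(scale)

def cwt(signal, scales):
    """Naive continuous wavelet transform by direct correlation.
    Returns a scale-by-time matrix of coefficients, the data behind
    the time-frequency phase plane."""
    n = len(signal)
    return [
        [sum(signal[k] * morlet(k - b, s) for k in range(n)) for b in range(n)]
        for s in scales
    ]

# Toy signal: a burst of oscillation in the middle of a flat record
sig = [math.sin(2 * math.pi * 0.1 * k) if 40 <= k < 60 else 0.0
       for k in range(100)]
coef = cwt(sig, scales=[4, 8, 16])
```

The coefficient energy concentrates at the times and scales where the signal actually oscillates, which is what makes localized or self-similar structure visible in the phase plane.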

Relevance:

10.00%

Publisher:

Abstract:

A grid computing system consists of a group of programs and resources spread across the machines in the grid. A grid system has a dynamic environment and decentralized, distributed resources, so it is important to provide efficient scheduling for applications. Task scheduling is an NP-hard problem for which deterministic algorithms are inadequate, and heuristic algorithms such as particle swarm optimization (PSO) are needed. PSO is a simple parallel algorithm that can be applied in different ways to solve optimization problems. PSO searches the problem space globally but needs to be combined with other methods to search locally as well. In this paper, we propose a hybrid scheduling algorithm to solve the independent task-scheduling problem in grid computing. We combine PSO with the gravitational emulation local search (GELS) algorithm to form a new method, PSO–GELS. Our experimental results demonstrate the effectiveness of PSO–GELS compared to other algorithms.
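The PSO half of such a scheduler can be sketched as follows: particles are continuous vectors, decoded into machine assignments by rounding, and scored by makespan. This is plain global-best PSO only; the GELS local search is omitted, and the swarm parameters, tasks, and machine speeds are all illustrative.

```python
import random

def makespan(assign, task_len, speed):
    """Completion time of the most loaded machine under an assignment."""
    load = [0.0] * len(speed)
    for t, m in enumerate(assign):
        load[m] += task_len[t] / speed[m]
    return max(load)

def pso_schedule(task_len, speed, n_part=20, iters=60, seed=3):
    """Global-best PSO for independent-task scheduling (PSO part only)."""
    rng = random.Random(seed)
    n, m = len(task_len), len(speed)
    decode = lambda x: [min(m - 1, max(0, int(round(v)))) for v in x]
    pos = [[rng.uniform(0, m - 1) for _ in range(n)] for _ in range(n_part)]
    vel = [[0.0] * n for _ in range(n_part)]
    pbest = [p[:] for p in pos]
    pcost = [makespan(decode(p), task_len, speed) for p in pos]
    g = pcost.index(min(pcost))
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(n_part):
            for d in range(n):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = makespan(decode(pos[i]), task_len, speed)
            if c < pcost[i]:
                pbest[i], pcost[i] = pos[i][:], c
                if c < gcost:
                    gbest, gcost = pos[i][:], c
    return decode(gbest), gcost

tasks = [4, 8, 2, 6, 3, 7, 5, 1]       # task lengths
speeds = [1.0, 2.0]                     # machine 1 is twice as fast
assign, cost = pso_schedule(tasks, speeds)
```

The point of hybridizing with a local search such as GELS is precisely that plain PSO like this explores globally but refines poorly near an optimum.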

Relevance:

10.00%

Publisher:

Abstract:

Aim  To examine the exploitation, recovery and current status of green turtles (Chelonia mydas) nesting at Ascension Island. Location  Ascension Island (UK) (7°57′ S, 14°22′ W), South Atlantic Ocean. Methods  We analysed records of the harvest of green turtles nesting at Ascension Island between 1822 and 1935, illustrating the decline in numbers over this period. Using a deterministic age-class structured model we predict the initial number of breeding females present in the population prior to the recorded harvest and compare this to our estimate of the current population based upon our recent annual surveys (1999–2004). Results  Prior to 1822 we estimate the nesting population of green turtles to have been at least 19,000–22,000 individuals in order for the population to have survived the level of harvest recorded. From recent data (1999–2004), we estimate the current breeding population of green turtles at this site to be 11,000–15,000 females. Our results illustrate a dramatic recovery of the population, which is still increasing exponentially and shows no evidence of slowing, suggesting it has not reached 50% of its carrying capacity. Main conclusions  We estimate that, since the 1970s, the Ascension Island population of green turtles has increased by 285% and question the recent listing of this species as endangered by the IUCN (World Conservation Union), in particular in the Atlantic Ocean, where 75% of the populations assessed by the IUCN are increasing. Indeed, we estimate the global population of this species to be in excess of 2.2 million individuals. We suggest that the IUCN's global listing process detracts attention from those populations that are truly threatened with extinction and should not, in its present form, be applied to globally distributed long-lived species such as marine turtles.
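A deterministic age-class structured projection of the kind used here can be sketched with a Leslie-style update; the version below runs forward with purely illustrative classes and rates, whereas the study effectively fits such a model against the historical harvest records to back-calculate the initial breeding population.

```python
def project(ages, survival, fecundity, years):
    """Project a female age-class vector forward with a deterministic
    Leslie-style model: new recruits come from fecundity of each class,
    survivors move up one class each year (illustrative rates only)."""
    for _ in range(years):
        births = sum(f * n for f, n in zip(fecundity, ages))
        ages = [births] + [s * n for s, n in zip(survival, ages[:-1])]
    return ages

# Three toy age classes: hatchling, juvenile, breeder
survival  = [0.4, 0.8]          # survival into the next class
fecundity = [0.0, 0.0, 5.0]     # only breeders reproduce
pop = project([1000.0, 400.0, 300.0], survival, fecundity, years=10)
```

With these toy rates the dominant growth factor exceeds one, so the projected population grows exponentially, mirroring the kind of sustained increase the surveys report.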

Relevance:

10.00%

Publisher:

Abstract:

In this article we describe how concepts of risk are both generated by and used to reinforce a neoliberal agenda in relation to the health and well-being of young people. We examine how risk may be used as a tool to advance ideals such as rational choice and individual responsibility, and how this can further disadvantage young people living within the contexts of structural disadvantage (such as geographic areas of long-term unemployment; communities that experience racial discrimination). We also identify the ways in which risk is applied in uneven ways within structurally disadvantaged contexts. To suggest a way forward, we articulate a set of principles and strategies that offer up a means of resisting neoliberal imperatives and suggest how these might play out at the micro-, meso- and macro-levels. To do this, we discuss examples from the UK, Canadian and Australian contexts to illustrate how young people resist being labelled as risky, and how it is possible to engage in health equity-enhancing actions, despite seemingly deterministic forces. The cases we describe reveal some of the vulnerabilities (and hence opportunities) within the seemingly impenetrable world view and powers of neoliberals, and point towards the potential to formulate an agenda of resistance and new directions for young people's health promotion.

Relevance:

10.00%

Publisher:

Abstract:

This paper applies dimensional analysis to propose an alternative model for estimating the effective density of flocs (Δρf). The model takes into account the effective density of the primary particles, in addition to the sizes of the floc and primary particles, and does not rely on the concept of self-similarity. The model contains three dimensionless products and two empirical parameters (αf and βf), which were calibrated using data available in the literature. Values of αf=0.7 and βf=0.8 were obtained. The average primary particle size (Dp) for the data used in the analysis, inferred from the new model, was found to vary from 0.05 μm to 100 μm, with a mean value of 2.5 μm. Good agreement was obtained between floc-settling velocities estimated from the proposed effective-density model and measured values.
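For the settling-velocity comparison mentioned, the standard route from an effective (excess) density to a settling velocity at low particle Reynolds number is Stokes' law, sketched below; the floc size and density values are illustrative, not the paper's data.

```python
def stokes_settling(delta_rho, d, mu=1.0e-3, g=9.81):
    """Stokes settling velocity (m/s) of a floc of diameter d (m) with
    effective (excess) density delta_rho (kg/m^3) in water of dynamic
    viscosity mu (Pa.s). Valid only at low particle Reynolds number."""
    return delta_rho * g * d * d / (18.0 * mu)

# A 200-micron floc with an effective density of 50 kg/m^3
ws = stokes_settling(50.0, 200e-6)    # about 1.1 mm/s
```

Because velocity scales with Δρf·D², even modest errors in the effective-density model propagate directly into the predicted settling velocity, which is why the comparison against measured velocities is the key validation.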

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a summary of the evidence review group (ERG) report into the clinical and cost-effectiveness of entecavir for the treatment of chronic hepatitis B (CHB) in adults based upon a review of the manufacturer's submission to the National Institute for Health and Clinical Excellence (NICE) as part of the single technology appraisal (STA) process. The submission's evidence came from five randomised controlled trials (RCTs), of good methodological quality and measuring a range of clinically relevant outcomes, comparing entecavir with lamivudine. After 1 year of treatment entecavir was statistically superior to lamivudine in terms of the proportion of patients achieving hepatitis B virus (HBV) DNA suppression, alanine aminotransferase (ALT) normalisation and histological improvement, but not in terms of the proportion of patients achieving hepatitis B e antigen (HBeAg) seroconversion. The incidence of adverse or serious adverse events was similar for both treatments. The results of the manufacturer's mixed treatment comparison (MTC) model to compare entecavir with the comparator drugs in nucleoside-naive patients were considered to be uncertain because of concerns over its conduct and reporting. For the economic evaluation the manufacturer constructed two Markov state transition models, one in HBeAg-positive and one in HBeAg-negative patients. The modelling approach was considered reasonable subject to some uncertainties and concerns over some of the structural assumptions. In HBeAg-positive patients the base-case incremental cost-effectiveness ratios (ICER) for entecavir compared with lamivudine and pegylated interferon alpha-2a were 14,329 pounds and 8,403 pounds per quality-adjusted life-year (QALY) respectively. Entecavir was dominated by telbivudine. In HBeAg-negative patients the base-case ICERs for entecavir compared with lamivudine, pegylated interferon alpha-2a and telbivudine were 13,208 pounds, 7,511 pounds and 6,907 pounds per QALY respectively.
In HBeAg-positive lamivudine-refractory patients entecavir dominated adefovir added to lamivudine. In one-way deterministic sensitivity analysis on all key input parameters for entecavir compared with lamivudine in nucleoside-naive patients, ICERs generally remained under 30,000 pounds per QALY. In probabilistic sensitivity analysis in nucleoside-naive HBeAg-positive patients the probability of the ICER for entecavir being below 20,000 pounds per QALY was 57%, 82% and 45% compared with lamivudine, pegylated interferon alpha-2a and telbivudine respectively. In nucleoside-naive HBeAg-negative patients the probabilities were 90%, 100% and 96% respectively. The manufacturer's lifetime treatment scenario for HBeAg-negative patients and the ERG's 20-year treatment scenario for HBeAg-positive patients increased the ICERs, particularly in the latter case. Amending the HBeAg-negative model so that patients with compensated cirrhosis would also receive lifetime treatment gave probabilities of entecavir being cost-effective at a willingness to pay of 20,000 pounds and 30,000 pounds of 4% and 40% respectively. The NICE guidance issued in August 2008 as a result of the STA states that entecavir is recommended as an option for the treatment of people with chronic HBeAg-positive or HBeAg-negative hepatitis B in whom antiviral treatment is indicated.
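The ICER arithmetic used throughout is simply incremental cost over incremental benefit; a minimal sketch with hypothetical numbers (not the submission's inputs):

```python
def icer(d_cost, d_qaly):
    """Incremental cost-effectiveness ratio: extra cost (pounds) per
    extra quality-adjusted life-year gained versus the comparator.
    'Dominated' means the option costs more and yields fewer QALYs."""
    return d_cost / d_qaly

# Hypothetical illustration: an extra 2,000 pounds buys an extra 0.14 QALYs
value = icer(2000.0, 0.14)    # about 14,286 pounds per QALY
```

An option is then deemed cost-effective when this ratio falls below the willingness-to-pay threshold (20,000-30,000 pounds per QALY in the analyses above).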

Relevance:

10.00%

Publisher:

Abstract:

Social scientists and Indigenous people have voiced concerns that media messages about genetics and race may increase the public's belief in genetic determinism and even increase levels of racism. The degree of genetic determinism in media messages has been examined as a determining factor. This study is the first to consider the implications of this area of scholarship for the Indigenous minority in Australia. A search of the last two decades of major Australian newspapers was undertaken for articles discussing Indigenous Australians and genetics. The review found 212 articles, of which 58 concerned traits or conditions presented in a genetically deterministic or anti-deterministic fashion. These 58 articles were analysed by topic, slant, and time period. Overall, 23 articles were anti-deterministic, 18 were deterministic, 14 presented both sides, and three were ambiguous. There was a spike in anti-deterministic articles in the years after the Human Genome Diversity Project, and a parallel increase in deterministic articles since the completion of the Human Genome Project in 2000. The potential implications of the nature of media coverage of genetics for Indigenous Australians are discussed. Further research is required to test directly the impact of these messages on Australians.