954 results for software-defined network
Abstract:
The two-node tandem Jackson network serves as a convenient reference model for the analysis and testing of different methodologies and techniques in rare event simulation. In this paper we consider a new approach to efficiently estimate the probability that the content of the second buffer exceeds some high level L before it becomes empty, starting from a given state. The approach is based on a Markov additive process representation of the buffer processes, leading to an exponential change of measure to be used in an importance sampling procedure. Unlike the changes of measure proposed and studied in recent literature, the one derived here is a function of the content of the first buffer. We prove that when the first buffer is finite, this method yields asymptotically efficient simulation for any set of arrival and service rates. In fact, the relative error is bounded independently of the level L, a new result that has not been established for any other known method. When the first buffer is infinite, we propose a natural extension of the exponential change of measure for the finite buffer case. In this case, the relative error is shown to be bounded (independent of L) only when the second server is the bottleneck, a result known to hold for some other methods derived through large deviations analysis. When the first server is the bottleneck, experimental results using our method suggest that the relative error grows at most linearly in L.
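The abstract describes an importance-sampling scheme built on an exponential change of measure. The sketch below is not the paper's state-dependent measure; it is a minimal Python illustration of the general mechanics, using the classic heuristic of exchanging the arrival rate with the second service rate and weighting each sampled path of the embedded jump chain by its likelihood ratio. All rates, the starting state and the tilting rule are assumptions chosen for illustration.

```python
import random

def simulate_once(lam, mu1, mu2, lam_t, mu1_t, mu2_t, x1, x2, L, max_steps=100_000):
    """One replication: simulate the embedded jump chain of the tandem queue
    under the tilted rates, accumulating the likelihood ratio, until the
    second buffer reaches L (overflow) or empties."""
    w = 1.0                       # likelihood ratio of the sampled path
    for _ in range(max_steps):
        if x2 >= L:
            return w              # overflow: contribute the path weight
        if x2 == 0:
            return 0.0            # second buffer emptied first
        # events enabled in the current state: (name, original rate, tilted rate)
        events = [('arrival', lam, lam_t)]
        if x1 > 0:
            events.append(('service1', mu1, mu1_t))
        events.append(('service2', mu2, mu2_t))   # x2 > 0 here
        tot_o = sum(r for _, r, _ in events)
        tot_t = sum(r for _, _, r in events)
        u = random.random() * tot_t               # sample the next event under the tilt
        acc = 0.0
        for name, r_o, r_t in events:
            acc += r_t
            if u <= acc:
                w *= (r_o / tot_o) / (r_t / tot_t)
                if name == 'arrival':
                    x1 += 1
                elif name == 'service1':
                    x1, x2 = x1 - 1, x2 + 1
                else:
                    x2 -= 1
                break
    return 0.0                    # safeguard: extremely long path, count as no overflow

def estimate_overflow(lam=1.0, mu1=3.0, mu2=2.0, L=20, x1=0, x2=1, n=20_000):
    # classic heuristic: exchange the arrival rate and the (bottleneck) second service rate
    lam_t, mu1_t, mu2_t = mu2, mu1, lam
    return sum(simulate_once(lam, mu1, mu2, lam_t, mu1_t, mu2_t, x1, x2, L)
               for _ in range(n)) / n

if __name__ == '__main__':
    print(estimate_overflow())
```

Under the tilted measure, overflow paths become typical and each is reweighted back to the original measure, which is why the variance is far lower than for crude Monte Carlo; the paper's contribution is a tilting that also depends on the first buffer's content and keeps the relative error bounded in L.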
Abstract:
Many business-oriented software applications are subject to frequent changes in requirements. This paper shows that, ceteris paribus, increases in the volatility of system requirements decrease the reliability of software. Further, systems that exhibit high volatility during the development phase are likely to have lower reliability during their operational phase. In addition to the typically higher volatility of requirements, end-users who specify the requirements of business-oriented systems are usually less technically oriented than people who specify the requirements of compilers, radar tracking systems or medical equipment. Hence, the characteristics of software reliability problems for business-oriented systems are likely to differ significantly from those of more technically oriented systems.
Abstract:
The contemporary Vampire Subculture can be defined as a multi-faceted, socio-religious movement with its own distinct collective community and network of participants who share a similar belief system and customary lifestyle that reflect their concept of the vampire. The Vampire Subculture consists of individuals who profess to be 'real vampires', vampire communities of like-minded persons, 'blood-donors' who willingly allow vampires to partake of them, occult-based and mystically oriented groups that appeal to their spirituality, blood fetishists, and live-action vampire role-players. In response, a Christian counter-movement of self-proclaimed 'vampire-slayers' has emerged that actively opposes the vampire subculture and its beliefs and practices. The socio-religious nature of the Vampire Subculture is best described as a Segmented, Polycentric and Integrated Network of participants.
Abstract:
Existing refinement calculi provide frameworks for the stepwise development of imperative programs from specifications. This paper presents a refinement calculus for deriving logic programs. The calculus contains a wide-spectrum logic programming language, including executable constructs such as sequential conjunction, disjunction, and existential quantification, as well as specification constructs such as general predicates, assumptions and universal quantification. A declarative semantics is defined for this wide-spectrum language based on executions. Executions are partial functions from states to states, where a state is represented as a set of bindings. The semantics is used to define the meaning of programs and specifications, including parameters and recursion. To complete the calculus, a notion of correctness-preserving refinement over programs in the wide-spectrum language is defined and refinement laws for developing programs are introduced. The refinement calculus is illustrated using example derivations and prototype tool support is discussed.
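As a toy illustration (not the paper's calculus), the following Python sketch models the declarative semantics described above: a state is a set of variable bindings, an execution is a partial function from states to states, and sequential conjunction is composition of those partial functions. The particular constructs `bind` and `assume` are hypothetical stand-ins chosen only to make the model concrete.

```python
from typing import Callable, FrozenSet, Optional, Tuple

# a state is a set of variable bindings; an execution is a partial function,
# with None standing for "undefined on this state"
State = FrozenSet[Tuple[str, object]]
Execution = Callable[[State], Optional[State]]

def seq(p: Execution, q: Execution) -> Execution:
    """Sequential conjunction as composition of partial functions."""
    def run(s: State) -> Optional[State]:
        mid = p(s)
        return None if mid is None else q(mid)
    return run

def bind(var: str, value: object) -> Execution:
    """Add a binding, provided it does not clash with an existing one."""
    def run(s: State) -> Optional[State]:
        for v, val in s:
            if v == var:
                return s if val == value else None
        return s | {(var, value)}
    return run

def assume(pred: Callable[[State], bool]) -> Execution:
    """An assumption: undefined (None) on states where it does not hold."""
    def run(s: State) -> Optional[State]:
        return s if pred(s) else None
    return run

# example: bind X to 1, assume X is bound, then bind Y to 2
prog = seq(bind("X", 1), seq(assume(lambda s: any(v == "X" for v, _ in s)),
                             bind("Y", 2)))
print(prog(frozenset()))   # both bindings present (set ordering may vary)
```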
Abstract:
Program compilation can be formally defined as a sequence of equivalence-preserving transformations, or refinements, from high-level language programs to assembler code. Recent models also incorporate timing properties, but the resulting formalisms are intimidatingly complex. Here we take advantage of a new, simple model of real-time refinement, based on predicate transformer semantics, to present a straightforward compilation formalism that incorporates real-time constraints. (C) 2002 Elsevier Science B.V. All rights reserved.
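To make the underlying idea concrete, here is a minimal, untimed Python sketch of predicate-transformer semantics and of checking one refinement step; the paper's model additionally carries real-time constraints, which are not represented here, and the example programs and the finite test-state check are invented for illustration.

```python
from typing import Callable, Dict, Iterable

State = Dict[str, int]
Pred = Callable[[State], bool]
# a command is modelled by its weakest-precondition transformer wp(cmd, Q)
Command = Callable[[Pred], Pred]

def assign(var: str, expr: Callable[[State], int]) -> Command:
    """wp(var := expr, Q) = Q with var replaced by the value of expr."""
    def wp(post: Pred) -> Pred:
        return lambda s: post({**s, var: expr(s)})
    return wp

def seq(c1: Command, c2: Command) -> Command:
    """wp(c1; c2, Q) = wp(c1, wp(c2, Q))."""
    return lambda post: c1(c2(post))

def refines(abstract: Command, concrete: Command, post: Pred,
            states: Iterable[State]) -> bool:
    """concrete refines abstract iff wp(abstract, Q) implies wp(concrete, Q);
    checked here only on a finite set of test states and one postcondition."""
    return all(concrete(post)(s) for s in states if abstract(post)(s))

# "x := x + 2" is refined by the compiled form "x := x + 1; x := x + 1"
spec = assign("x", lambda s: s["x"] + 2)
impl = seq(assign("x", lambda s: s["x"] + 1), assign("x", lambda s: s["x"] + 1))
post = lambda s: s["x"] >= 10
print(refines(spec, impl, post, [{"x": n} for n in range(20)]))   # True
```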
Abstract:
A central problem in visual perception concerns how humans perceive stable and uniform object colors despite variable lighting conditions (i.e. color constancy). One solution is to 'discount' variations in lighting across object surfaces by encoding color contrasts, and utilize this information to 'fill in' properties of the entire object surface. Implicit in this solution is the caveat that the color contrasts defining object boundaries must be distinguished from the spurious color fringes that occur naturally along luminance-defined edges in the retinal image (i.e. optical chromatic aberration). In the present paper, we propose that the neural machinery underlying color constancy is complemented by an 'error-correction' procedure which compensates for chromatic aberration, and suggest that error-correction may be linked functionally to the experimentally induced illusory colored aftereffects known as McCollough effects (MEs). To test these proposals, we develop a neural network model which incorporates many of the receptive-field (RF) profiles of neurons in primate color vision. The model is composed of two parallel processing streams which encode complementary sets of stimulus features: one stream encodes color contrasts to facilitate filling-in and color constancy; the other stream selectively encodes (spurious) color fringes at luminance boundaries, and learns to inhibit the filling-in of these colors within the first stream. Computer simulations of the model illustrate how complementary color-spatial interactions between error-correction and filling-in operations (a) facilitate color constancy, (b) reveal functional links between color constancy and the ME, and (c) reconcile previously reported anomalies in the local (edge) and global (spreading) properties of the ME. We discuss the broader implications of these findings by considering the complementary functional roles performed by RFs mediating color-spatial interactions in the primate visual system. (C) 2002 Elsevier Science Ltd. All rights reserved.
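A drastically simplified, one-dimensional toy of the two-stream idea is sketched below in Python (hypothetical stimulus, hand-wired rather than learned inhibition): one stream encodes colour contrasts for filling-in, the other flags colour contrasts that coincide with luminance edges and suppresses them before filling-in reconstructs the surface colour.

```python
import numpy as np

# toy 1-D scene: a colour boundary at position 50 and a luminance step at 70;
# chromatic aberration adds a spurious colour fringe at the luminance edge
n = 100
chroma = np.where(np.arange(n) < 50, 0.2, 0.6)       # true surface colour
luminance = np.where(np.arange(n) < 70, 1.0, 0.3)    # luminance step at 70
fringe = np.zeros(n)
fringe[69:71] = 0.4                                  # spurious colour fringe
retina_chroma = chroma + fringe                      # retinal chroma signal

# stream 1: colour-contrast code (spatial derivative of retinal chroma)
contrast = np.diff(retina_chroma, prepend=retina_chroma[0])

# stream 2: "error-correction" -- detect luminance edges and inhibit colour
# contrasts on or next to them (hand-wired here; learned in the model)
lum_edge = np.abs(np.diff(luminance, prepend=luminance[0])) > 0.1
near_edge = np.convolve(lum_edge.astype(float), np.ones(3), mode='same') > 0
corrected = np.where(near_edge, 0.0, contrast)

# filling-in: integrate the surviving contrasts to recover surface colour
filled = retina_chroma[0] + np.cumsum(corrected)
print(np.allclose(filled, chroma))                   # True: fringe discounted
```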
Abstract:
We have developed a software application to enable interactive rehabilitation via the Internet. The reliability of the telemedicine application was examined by comparing it with face-to-face assessment. The physical outcome measures assessed were knee range of motion, quadriceps muscle strength, limb girth and an assessment of gait. One therapist performed both in-person and Internet-based measurements of all outcome measures on 20 normal subjects. There was good agreement between the two techniques (the 95% limits of agreement included zero for all the variables studied). Internet assessments were conducted at two bandwidths: ISDN at 128 kbit/s and the telephone network at 17 kbit/s. Bandwidth had no significant influence on any of the measures. This study suggests that Internet-based physiotherapy interventions delivered to the home are suitable for further development.
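The agreement analysis referred to above is the Bland-Altman limits-of-agreement method. As a hedged sketch (the paired measurements below are invented, not the study's data), it can be computed as follows:

```python
import numpy as np

# hypothetical paired knee range-of-motion measurements (degrees) for 20
# subjects: the same therapist assessing in person and over the Internet link
in_person = np.array([130, 128, 135, 122, 140, 118, 125, 132, 127, 138,
                      121, 129, 134, 126, 131, 119, 136, 124, 133, 128], float)
internet = in_person + np.random.default_rng(0).normal(0, 2, size=20)

diff = internet - in_person
bias = diff.mean()                       # mean difference between methods
half_width = 1.96 * diff.std(ddof=1)     # Bland-Altman 95% limits of agreement
print(f"bias {bias:.1f} deg, limits of agreement "
      f"[{bias - half_width:.1f}, {bias + half_width:.1f}] deg")
```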
Abstract:
Background: There has been a proliferation of quality use of medicines activities in Australia since the 1990s. However, knowledge of the nature and extent of these activities was lacking. A mechanism was required to map the activities to enable their coordination. Aims: To develop a geographical mapping facility as an evaluative tool to assist the planning and implementation of Australia's policy on the quality use of medicines. Methods: A web-based database incorporating geographical mapping software was developed. Quality use of medicines projects implemented across the country were identified from project listings funded by the Quality Use of Medicines Evaluation Program, the National Health and Medical Research Council, the Mental Health Strategy, the Rural Health Support, Education and Training Program, the Healthy Seniors Initiative, the General Practice Evaluation Program and the Drug Utilisation Evaluation Network. In addition, projects were identified through direct mail to persons working in the field. Results: The Quality Use of Medicines Mapping Project (QUMMP) was developed, providing a web-based database that can be continuously updated. This database showed the distribution of quality use of medicines activities by: (i) geographical region, (ii) project type, (iii) target group, (iv) stakeholder involvement, (v) funding body and (vi) evaluation method. As of September 2001, the database included 901 projects. Sixty-two per cent of projects had been conducted in Australian capital cities, where approximately 63% of the population reside. Distribution of projects varied between States. In Western Australia and Queensland, 36 and 73 projects had been conducted, respectively, representing approximately two projects per 100 000 people. By comparison, in South Australia and Tasmania approximately seven projects per 100 000 people were recorded, with six per 100 000 people in Victoria and three per 100 000 people in New South Wales. Rural and remote areas of the country had more limited project activity. Conclusions: The mapping of projects by geographical location enabled easy identification of high and low activity areas. Analysis of the types of projects undertaken in each region enabled identification of target groups that had not been involved or services that had not yet been developed. This served as a powerful tool for policy planning and implementation and will be used to support the continued implementation of Australia's policy on the quality use of medicines.
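For illustration of the per-capita rates quoted above, the calculation is simply projects divided by population, scaled to 100 000 people. The project counts come from the abstract; the state populations below are rough year-2001 figures assumed only for this example.

```python
# illustrative only: project counts are from the abstract; the populations
# are approximate 2001 figures assumed here, not values from the study
projects = {"WA": 36, "Qld": 73}
population = {"WA": 1_900_000, "Qld": 3_600_000}   # assumed, approximate

for state, count in projects.items():
    rate = count / population[state] * 100_000
    print(f"{state}: {rate:.1f} projects per 100 000 people")
```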
Abstract:
A simple percolation theory-based method for determining pore network connectivity from liquid phase adsorption isotherm data, combined with a density functional theory (DFT)-based pore size distribution, is presented in this article. The liquid phase adsorption experiments were performed using eight different esters as adsorbates and the microporous-mesoporous activated carbons Filtrasorb-400, Norit ROW 0.8 and Norit ROX 0.8 as adsorbents. The DFT-based pore size distributions of the carbons were obtained from argon adsorption data. The mean micropore network coordination numbers, Z, of the carbons were determined from DR characteristic plots and fitted saturation capacities using percolation theory. Based on this method, the critical molecular sizes of the model compounds used in this study were also obtained. The incorporation of percolation concepts in the prediction of multicomponent adsorption equilibria is also investigated, and found to improve the performance of the ideal adsorbed solution theory (IAST) model for the large molecules utilized in this study. (C) 2002 Elsevier Science B.V. All rights reserved.
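Although the paper's full procedure (DR characteristic plots and fitted saturation capacities) is not reproduced here, the sketch below illustrates two of its ingredients under simplifying assumptions: computing the fraction of pore volume accessible to a molecule of a given critical size from a hypothetical DFT pore size distribution, and relating a percolation threshold to the mean coordination number Z via the Bethe-lattice approximation p_c = 1/(Z - 1).

```python
import numpy as np

def accessible_fraction(pore_widths, pore_volumes, critical_size):
    """Fraction of pore volume whose width exceeds the adsorbate's
    critical molecular size (taken from a DFT pore size distribution)."""
    pore_widths = np.asarray(pore_widths, float)
    pore_volumes = np.asarray(pore_volumes, float)
    mask = pore_widths >= critical_size
    return pore_volumes[mask].sum() / pore_volumes.sum()

def coordination_number_bethe(f_at_threshold):
    """Bethe-lattice approximation: bond percolation threshold p_c = 1/(Z - 1);
    if uptake of a probe molecule vanishes once the accessible bond fraction
    drops to f_at_threshold, invert the relation for Z."""
    return 1.0 / f_at_threshold + 1.0

# hypothetical PSD (widths in nm, incremental volumes in cm^3/g) and probe size
widths = [0.5, 0.7, 0.9, 1.2, 1.6, 2.0]
volumes = [0.05, 0.10, 0.12, 0.08, 0.04, 0.02]
f = accessible_fraction(widths, volumes, critical_size=0.9)
print(f"accessible fraction: {f:.2f}")
print(f"Z estimate if percolation ceases at f = 0.33: "
      f"{coordination_number_bethe(0.33):.1f}")
```

The Bethe-lattice relation is only one of several percolation models; it is used here because it gives a closed-form link between an observable threshold and Z.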
Abstract:
Seasonal climate forecasting offers potential for improving management of crop production risks in the cropping systems of NE Australia. But how is this capability best connected to management practice? Over the past decade, we have pursued participative systems approaches involving simulation-aided discussion with advisers and decision-makers. This has led to the development of discussion support software as a key vehicle for facilitating infusion of forecasting capability into practice. In this paper, we set out the basis of our approach, its implementation and preliminary evaluation. We outline the development of the discussion support software Whopper Cropper, which was designed for, and in close consultation with, public and private advisers. Whopper Cropper consists of a database of simulation output and a graphical user interface to generate analyses of risks associated with crop management options. The charts produced provide conversation pieces for advisers to use with their farmer clients in relation to the significant decisions they face. An example application, detail of the software development process and an initial survey of user needs are presented. We suggest that discussion support software is about moving beyond traditional notions of supply-driven decision support systems. Discussion support software is largely demand-driven and can complement participatory action research programs by providing cost-effective general delivery of simulation-aided discussions about relevant management actions. The critical role of farm management advisers and dialogue among key players is highlighted. We argue that the discussion support concept, as exemplified by the software tool Whopper Cropper and the group processes surrounding it, provides an effective means to infuse innovations, like seasonal climate forecasting, into farming practice. Crown Copyright (C) 2002 Published by Elsevier Science Ltd. All rights reserved.
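As a hedged sketch of the kind of analysis such a tool generates (this is not Whopper Cropper's actual database, options or forecast classes; all names and values are invented), a table of pre-computed simulation output can be queried and summarised as yield-risk percentiles per management option:

```python
# minimal sketch: query a pretend table of crop-simulation output and
# summarise yield risk for each management option under a forecast class
import random
random.seed(0)

options = ["sow early", "sow late"]
forecast_classes = ["SOI negative", "SOI positive"]

# pretend database: (option, forecast class) -> list of simulated yields (t/ha)
database = {(o, f): [random.gauss(2.5 + 0.5 * (f == "SOI positive")
                                  + 0.3 * (o == "sow early"), 0.8)
                     for _ in range(100)]
            for o in options for f in forecast_classes}

def risk_summary(option, forecast):
    yields = sorted(database[(option, forecast)])
    pick = lambda q: yields[int(q * (len(yields) - 1))]   # simple percentile
    return {"10%": pick(0.1), "median": pick(0.5), "90%": pick(0.9)}

for o in options:
    summary = {k: round(v, 2) for k, v in risk_summary(o, "SOI positive").items()}
    print(o, summary)
```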
Abstract:
The haploid NK model developed by Kauffman can be extended to diploid genomes and to incorporate gene-by-environment interaction effects in combination with epistasis. To provide the flexibility to include a wide range of forms of gene-by-environment interactions, a target population of environment types (TPE) is defined. The TPE consists of a set of E different environment types, each with its own frequency of occurrence. Each environment type conditions a different NK gene network structure, or a different series of gene effects for a given network structure, providing the framework for defining gene-by-environment interactions. Thus, different NK models can be partially or completely nested within the E environment types of a TPE, giving rise to the E(NK) model for a biological system. With this model it is possible to examine how populations of genotypes evolve in the context of properties of the environment that influence the contributions of genes to the fitness values of genotypes. We are using the E(NK) model to investigate how both epistasis and gene-by-environment interactions influence the genetic improvement of quantitative traits by plant breeding strategies applied to agricultural systems. © 2002 Wiley Periodicals, Inc.
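A minimal haploid sketch of the E(NK) idea follows: E environment types, each with its own frequency and its own NK landscape, with genotype fitness taken as the frequency-weighted expectation across the TPE. The diploid extension and partial nesting of landscapes described in the abstract are not shown, and all parameter values are illustrative.

```python
import itertools
import random

def make_nk_tables(N, K, rng):
    """One random NK landscape: for each gene, K epistatic neighbours and a
    lookup table mapping the gene's allele plus its neighbours' alleles to a
    fitness contribution in [0, 1)."""
    neighbours = [sorted(rng.sample([j for j in range(N) if j != i], K))
                  for i in range(N)]
    tables = [{bits: rng.random()
               for bits in itertools.product((0, 1), repeat=K + 1)}
              for _ in range(N)]
    return neighbours, tables

def nk_fitness(genotype, neighbours, tables):
    """Mean of per-gene contributions, the standard NK fitness."""
    total = 0.0
    for i in range(len(genotype)):
        key = (genotype[i],) + tuple(genotype[j] for j in neighbours[i])
        total += tables[i][key]
    return total / len(genotype)

def enk_fitness(genotype, environments):
    """E(NK): expected fitness across a target population of environment
    types, each carrying its own NK landscape and frequency of occurrence."""
    return sum(freq * nk_fitness(genotype, nb, tb)
               for freq, (nb, tb) in environments)

rng = random.Random(1)
N, K = 8, 2
freqs = [0.5, 0.3, 0.2]                      # frequencies of the E = 3 environment types
envs = [(f, make_nk_tables(N, K, rng)) for f in freqs]
g = tuple(rng.randint(0, 1) for _ in range(N))
print(round(enk_fitness(g, envs), 3))
```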
Abstract:
We used a network of 20 carbon dioxide- and octenol-supplemented light traps to sample adult mosquitoes throughout Russell Island in southern Moreton Bay, south-east Queensland. Between February and April 2001, an estimated 1 365 564 adult female mosquitoes were collected. In contrast to an average catch of 9754 female mosquitoes per trap night on Russell Island, reference traps set on Macleay Island and on the mainland returned average catches of 3172 and 222, respectively. On Russell Island, Ochlerotatus vigilax (Skuse), Coquillettidia linealis (Skuse), Culex annulirostris Skuse and Verrallina funerea (Theobald), known or suspected vectors of Ross River (RR) and/or Barmah Forest (BF) viruses, comprised 89.6% of the 25 taxa collected. When the spatial distributions of the above species were mapped and analysed using local spatial statistics, all were found to be present in highest numbers towards the southern end of the island during most of the 7 weeks. This indicated the presence of more suitable adult harbourage sites and/or suboptimal larval control efficacy. As immature stages and the breeding habitat of Cq. linealis are as yet undescribed, this species in particular presents a considerable impediment to proposed development scenarios. The method presented here of mapping the numbers of mosquitoes throughout a local government area allows specific areas that have high vector numbers to be defined.
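The abstract does not say which local spatial statistic was used. As one commonly used example, a Getis-Ord Gi* hot-spot analysis of trap counts can be sketched as follows (trap coordinates and counts below are hypothetical):

```python
import numpy as np

def getis_ord_gi_star(coords, values, radius):
    """Getis-Ord Gi* local statistic: a z-score per trap indicating whether
    counts around that location are higher (hot spot) or lower (cold spot)
    than expected under the global mean."""
    coords = np.asarray(coords, dtype=float)
    x = np.asarray(values, dtype=float)
    n = len(x)
    xbar = x.mean()
    s = np.sqrt((x ** 2).mean() - xbar ** 2)
    # binary spatial weights: neighbours within `radius` (self included)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    w = (d <= radius).astype(float)
    wx = w @ x
    sw = w.sum(axis=1)
    sw2 = (w ** 2).sum(axis=1)
    denom = s * np.sqrt((n * sw2 - sw ** 2) / (n - 1))
    return (wx - xbar * sw) / denom

# hypothetical trap locations (km) and weekly female mosquito counts
coords = [(0, 0), (0, 1), (1, 0), (1, 1), (5, 5), (5, 6), (6, 5), (6, 6)]
counts = [12000, 15000, 13000, 14000, 800, 900, 700, 850]
z = getis_ord_gi_star(coords, counts, radius=1.5)
print(np.round(z, 2))   # strongly positive z-scores flag the first trap cluster
```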
Abstract:
This study describes the categorical classification of 155 individuals living in an endemic village in Macanip, Leyte, Philippines as 'resistant' or 'susceptible' to Schistosoma japonicum infection using available exposure, infection and reinfection data collected from a 3-year water contact (WC) study. Epidemiological parameters including age, sex, and infection intensities in relation to observed reinfection patterns are also described. This classification was used in subsequent immunological studies, described in two accompanying papers, to identify protective immune mechanisms among resistant individuals induced by defined candidate vaccine molecules for S. japonicum. The study suggests that the individuals most vulnerable to rapid reinfection were children in the 5-14-year age group. A drop in incidence in the 15-19-year age group and a decreased intensity of infection from this age group onwards (15+) suggest the development of immunity. Controlling for the effect of the other variables, a multivariate analysis showed a significant association for sex, in that females were more likely to be resistant. This implies that, other than acquired immunity to infection, some age-dependent host factors may also play an important role in the overall changes in reinfection patterns seen in schistosomiasis japonica in this population. Crown Copyright (C) 2002 Published by Elsevier Science B.V. All rights reserved.
Abstract:
Measuring perceptions of customers can be a major problem for marketers of tourism and travel services. Much of the problem is to determine which attributes carry the most weight in the purchasing decision. Older travellers weigh many travel features before making their travel decisions. This paper presents a descriptive analysis of neural network methodology and provides a research technique that assesses the weighting of different attributes and uses an unsupervised neural network model to describe a consumer-product relationship. The development of this rich class of models was inspired by the neural architecture of the human brain. These models mathematically emulate the neurophysical structure and decision making of the human brain and, from a statistical perspective, are closely related to generalised linear models. Artificial neural networks are, however, nonlinear and do not require the same restrictive assumptions about the relationship between the independent and dependent variables. Using neural networks is one way to determine what trade-offs older travellers make as they decide their travel plans. The sample for this study is from a syndicated data source of 200 valid cases from Western Australia. Four senior traveller segments ('active learner', 'relaxed family body', 'careful participants' and 'elementary vacation') were identified and discussed. (C) 2003 Published by Elsevier Science Ltd.
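The abstract does not name the specific unsupervised network used. A self-organizing map (Kohonen network) is a typical choice for this kind of segmentation, and a minimal sketch with invented attribute-importance ratings is shown below.

```python
import numpy as np

def train_som(data, grid=(3, 3), epochs=50, lr0=0.5, sigma0=1.5, seed=0):
    """A minimal self-organizing map: each survey response is matched to its
    best-matching unit, and units near the winner on the map grid are pulled
    towards the response."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.random((rows * cols, data.shape[1]))
    locs = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                 # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 1e-3    # shrinking neighbourhood
        for x in rng.permutation(data):
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))
            dist2 = ((locs - locs[bmu]) ** 2).sum(axis=1)
            h = np.exp(-dist2 / (2 * sigma ** 2))       # neighbourhood function
            weights += lr * h[:, None] * (x - weights)
    return weights

def assign_segments(data, weights):
    """Map each respondent to the index of their best-matching unit."""
    return np.argmin(((data[:, None, :] - weights[None, :, :]) ** 2).sum(-1), axis=1)

# hypothetical attribute-importance ratings (rows: travellers, cols: attributes)
rng = np.random.default_rng(1)
ratings = np.clip(rng.normal(0.5, 0.25, size=(200, 6)), 0, 1)
w = train_som(ratings)
print(np.bincount(assign_segments(ratings, w), minlength=9))   # segment sizes
```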