917 results for MODEL SEARCH
Abstract:
In this paper a modified Heffron-Phillips (K-constant) model is derived for the design of power system stabilizers. Knowledge of external system parameters, such as the equivalent infinite bus voltage and external impedances or their estimated equivalents, is required for designing a conventional power system stabilizer. In the proposed method, information available at the secondary bus of the step-up transformer is used to set up a modified Heffron-Phillips (ModHP) model. The PSS design based on this model uses only signals available within the generating station. The efficacy of the proposed design technique and the performance of the stabilizer have been evaluated over a range of operating and system conditions. The simulation results show that the performance of the proposed stabilizer is comparable to that obtained by conventional design, but without the need to estimate or compute external system parameters. The proposed design is thus well suited for practical power system stabilization, possibly including multi-machine applications where accurate system information is not readily available.
Abstract:
A generalized Gierer-Meinhardt model has been used to account for the transplantation experiments in Hydra. In this model, cross inhibition between the two organizing centres (namely, the head and the foot) is assumed to be the only mode of interaction in setting up a stable morphogen distribution for pattern formation in Hydra.
Abstract:
Buffer zones are vegetated strips at the edges of agricultural fields along watercourses. As linear habitats in agricultural ecosystems, buffer strips play a leading ecological role in many areas. This thesis focuses on the plant species diversity of buffer zones in a Finnish agricultural landscape. The main objective of the present study is to identify the determinants of floral species diversity in arable buffer zones from the local to the regional level. The study was conducted in a watershed area of a farmland landscape in southern Finland. The study area, Lepsämänjoki, is situated in the Nurmijärvi commune, 30 km north of Helsinki, Finland. The biotope mosaics were mapped in GIS. A total of 59 buffer zones were surveyed, of which 29 buffer strips were also sampled by plot. Firstly, two diversity components (species richness and evenness) were investigated to determine whether the relationship between the two is equal and predictable. I found no correlation between species richness and evenness; the relationship between richness and evenness is unpredictable in a small-scale human-shaped ecosystem. Ordination and correlation analyses show that richness and evenness may result from different ecological processes and thus should be considered separately. Species richness correlated negatively with soil phosphorus content, and species evenness correlated negatively with the ratio of organic carbon to total nitrogen in the soil. The lack of a consistent pattern in the relationship between these two components may be due to site-specific variation in resource utilization by plant species. Within-habitat configuration variables (width, length, and area) were then investigated to determine which is most effective for predicting species richness. More species per unit area increment could be obtained by widening a buffer strip than by lengthening it; strip width is thus an effective determinant of plant species richness.
The increase in species diversity with increasing buffer strip width may be due to cross-sectional habitat gradients within the linear patches. This result can serve as a reference for policy makers and has practical value in agricultural management. In the framework of metacommunity theory, I found that both the mass effect (connectivity) and species sorting (resource heterogeneity) were likely to explain species composition and diversity on local and regional scales. The local and regional processes were interactively dominated by the degree to which dispersal perturbs local communities. In regions of low and intermediate connectivity, species sorting was of primary importance in explaining species diversity, while the mass effect surpassed species sorting in the highly connected region. Increasing connectivity in communities with high habitat heterogeneity can lead to the homogenization of local communities and, consequently, to lower regional diversity, while local species richness was unrelated to habitat connectivity. Of all species found, Anthriscus sylvestris, Phalaris arundinacea, and Phleum pratense responded significantly to connectivity and showed high abundance in the highly connected region. We suggest that these species may play a role in shifting the force shaping community structure from local resources to regional connectivity. At the level of landscape context, the different responses of local species richness and evenness to landscape context were investigated. Seven landscape structural parameters served to indicate landscape context on five scales. On all but the smallest scale, the Shannon-Wiener diversity of land covers (H') correlated positively with local richness, with the highest correlation coefficients for species richness on the second-largest scale.
The edge density of arable fields was the only predictor that correlated with species evenness on all scales, showing the highest predictive power on the second-smallest scale. The different predictive power of the factors on different scales indicates a scale-dependent relationship between landscape context and local plant species diversity, and suggests that different ecological processes determine species richness and evenness. Local species richness depends on a regional process acting at large scales, which may relate to the regional species pool, while species evenness depends on whether the farming system is fine- or coarse-grained, which may relate to the patch quality of the field-edge habitats near the buffer strips. My results suggest some guidelines for species diversity conservation in agricultural ecosystems. To maintain a high level of species diversity in the strips, a high level of phosphorus in strip soil should be avoided. Widening the strips is the most effective means of improving species richness. Habitat connectivity is not always favourable to species diversity, because increasing connectivity in communities with high habitat heterogeneity can lead to the homogenization of local communities (beta diversity) and, consequently, to lower regional diversity. Overall, a synthesis of local and regional factors emerged as the model that best explains variation in plant species diversity. The studies also suggest that the effects of these determinants on species diversity have a complex relationship with scale.
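The two diversity components discussed above have standard quantitative definitions; the sketch below shows how Shannon-Wiener diversity (H') and Pielou's evenness could be computed from per-species abundance counts. The counts are hypothetical and not data from the thesis.

```python
import math

def shannon_diversity(abundances):
    """Shannon-Wiener diversity H' = -sum(p_i * ln p_i)."""
    total = sum(abundances)
    props = [a / total for a in abundances if a > 0]
    return -sum(p * math.log(p) for p in props)

def pielou_evenness(abundances):
    """Pielou's evenness J' = H' / ln(S), where S is species richness."""
    s = sum(1 for a in abundances if a > 0)
    if s <= 1:
        return 0.0
    return shannon_diversity(abundances) / math.log(s)

# Hypothetical counts of individuals per plant species in one buffer strip
counts = [50, 30, 10, 5, 5]
print(round(shannon_diversity(counts), 3))  # H' ≈ 1.238
print(round(pielou_evenness(counts), 3))    # J' ≈ 0.769
```

Richness (S) and evenness (J') are deliberately separate quantities here, which is what allows them to respond to different ecological drivers, as found in the thesis.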
Abstract:
This research is a step forward in discovering knowledge from databases with complex structure, such as trees or graphs. Several data mining algorithms are developed based on a novel representation, called Balanced Optimal Search, for extracting implicit, unknown and potentially useful information such as patterns, similarities and various relationships from tree data; these algorithms also prove advantageous in analysing big data. This thesis focuses on analysing unordered tree data, a model that is robust to data inconsistency, irregularity and rapid information change, and has therefore become popular and widely used in the era of big data.
Abstract:
The neutron-antineutron transition amplitude caused by an effective six-fermion interaction with strength λeff is calculated within the context of the MIT Bag Model. The transition mass is found to have the value δm = λeff × 3 × 10⁻⁴ GeV⁶.
Abstract:
A kinetic model has been developed for the bulk polymerization of vinyl chloride using Talamini's hypothesis of two-phase polymerization and a new concept of kinetic solubility, which assumes that rapidly growing polymer chains have considerably greater solubility than the thermodynamic solubility of preformed polymer molecules of the same size, and so can remain in solution even under thermodynamically unfavourable conditions. It is further assumed that this kinetic solubility is a function of chain length. The model yields a rate expression consistent with the experimental data for vinyl chloride bulk polymerization and, moreover, is able to explain several characteristic kinetic features of this system. Application of the model rate expression to the available rate data has yielded 2.36 × 10⁸ l mol⁻¹ sec⁻¹ for the termination rate constant in the polymer-rich phase; as expected, this value is smaller than that reported for homogeneous polymerization by a factor of 10–30.
Abstract:
Self-tuning control is applied to nonlinear systems represented by the Hammerstein model, wherein the nonlinearity is any odd-order polynomial. Control costing, however, is not feasible in general. Initial relay control is employed to contain the deviations.
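A Hammerstein model is a static nonlinearity followed by linear dynamics. The sketch below simulates a minimal instance with a cubic odd-polynomial nonlinearity and a first-order linear block; all coefficients are illustrative and not taken from the paper.

```python
def hammerstein_step(y_prev, u, a=0.7, b=0.3, poly=(1.0, 0.5)):
    """One step of a Hammerstein model: a static odd polynomial
    f(u) = c1*u + c3*u**3 feeding a first-order linear system
    y[k] = a*y[k-1] + b*f(u[k-1]). All coefficients are illustrative."""
    c1, c3 = poly
    v = c1 * u + c3 * u ** 3   # intermediate (unmeasured) signal
    return a * y_prev + b * v

# Simulate the response to a unit step input
y, trajectory = 0.0, []
for _ in range(20):
    y = hammerstein_step(y, 1.0)
    trajectory.append(y)
print(round(trajectory[-1], 3))  # approaches b*f(1)/(1-a) = 0.45/0.3 = 1.5
```

Because the nonlinearity is odd, its sign symmetry simplifies identification; the self-tuning controller in the paper estimates such polynomial coefficients online.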
Abstract:
A finite element model for the analysis of laminated composite cylindrical shells with through cracks is presented. The analysis takes into account anisotropic elastic behaviour, bending-extensional coupling and transverse shear deformation effects. The proposed finite element model is based on the approach of dividing a cracked configuration into triangular singular elements around the crack tip with adjoining quadrilateral regular elements. The parabolic isoparametric cylindrical shell elements (both singular and regular) used in this model employ independent displacement and rotation interpolation in the shell middle surface. Numerical comparisons support the conclusion that the proposed model yields accurate stress intensity factors from a relatively coarse mesh. Through the analysis of a pressurised fibre-composite cylindrical shell with an axial crack, the effect of material orthotropy on the crack-tip stress intensity factors is shown to be quite significant.
Abstract:
Advancements in analysis techniques have led to a rapid accumulation of biological data in databases. Such data often take the form of sequences of observations, examples including DNA sequences and the amino acid sequences of proteins. The scale and quality of the data promise answers to various biologically relevant questions in more detail than has been possible before. For example, one may wish to identify areas in an amino acid sequence that are important for the function of the corresponding protein, or investigate how characteristics at the level of the DNA sequence affect the adaptation of a bacterial species to its environment. Many of the interesting questions are intimately associated with understanding the evolutionary relationships among the items under consideration. The aim of this work is to develop novel statistical models and computational techniques to meet the challenge of deriving meaning from the increasing amounts of data. Our main concern is modeling the evolutionary relationships based on the observed molecular data. We operate within a Bayesian statistical framework, which allows a probabilistic quantification of the uncertainties related to a particular solution. As the basis of our modeling approach we utilize a partition model, which describes the structure of the data by appropriately dividing the data items into clusters of related items. Generalizations and modifications of the partition model are developed and applied to various problems. Large-scale data sets also pose a computational challenge. The models used to describe the data must be realistic enough to capture the essential features of the current modeling task but, at the same time, simple enough to make it possible to carry out the inference in practice. The partition model fulfills these two requirements. Problem-specific features can be taken into account by modifying the prior probability distributions of the model parameters.
The computational efficiency stems from the ability to integrate out the parameters of the partition model analytically, which enables the use of efficient stochastic search algorithms.
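The analytic integration mentioned above can be illustrated with a small sketch: under a symmetric Dirichlet prior on each cluster's categorical parameters, those parameters integrate out in closed form (the Dirichlet-multinomial), so a candidate partition can be scored directly without sampling parameters. The data, prior, and scoring below are a generic illustration, not the specific models of the thesis.

```python
from math import lgamma

def log_marginal_cluster(counts, alpha=1.0):
    """Log marginal likelihood of one cluster's categorical counts
    under a symmetric Dirichlet(alpha) prior, with the category
    probabilities integrated out analytically (Dirichlet-multinomial)."""
    k = len(counts)
    n = sum(counts)
    out = lgamma(k * alpha) - lgamma(k * alpha + n)
    for c in counts:
        out += lgamma(alpha + c) - lgamma(alpha)
    return out

def log_marginal_partition(item_counts, partition):
    """Score a partition: clusters are scored independently, so a
    stochastic search only needs to rescore the clusters it changes."""
    score = 0.0
    for cluster in partition:
        merged = [sum(item_counts[i][k] for i in cluster)
                  for k in range(len(item_counts[0]))]
        score += log_marginal_cluster(merged)
    return score

# Two items with similar counts and one dissimilar item: grouping the
# similar pair should score higher than mixing similar with dissimilar.
data = [[9, 1], [8, 2], [1, 9]]
print(log_marginal_partition(data, [{0, 1}, {2}]))
print(log_marginal_partition(data, [{0, 2}, {1}]))
```

The per-cluster factorization is what makes stochastic search over partitions cheap: a move that reassigns one item only changes the scores of the two affected clusters.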
Abstract:
Segmentation is a data mining technique yielding simplified representations of sequences of ordered points. A sequence is divided into some number of homogeneous segments, and all points within a segment are described by a single value. The focus of this thesis is on piecewise-constant segments, where the most likely description for each segment and the most likely segmentation into a given number of segments can be computed efficiently. Representing sequences as segmentations is useful in, e.g., storage and indexing tasks in sequence databases, and segmentation can be used as a tool for learning about the structure of a given sequence. The discussion in this thesis begins with basic questions related to segmentation analysis, such as choosing the number of segments and evaluating the obtained segmentations. Standard model selection techniques are shown to perform well for the sequence segmentation task. Segmentation evaluation is proposed with respect to a known segmentation structure. Applying segmentation to certain features of a sequence is shown to yield segmentations that are significantly close to the known underlying structure. Two extensions to the basic segmentation framework are introduced: unimodal segmentation and basis segmentation. The former is concerned with segmentations where the segment descriptions first increase and then decrease, and the latter with the interplay between different dimensions and segments in the sequence. These problems are formally defined, and algorithms for solving them are provided and analyzed. Practical applications of segmentation techniques include time series and data stream analysis, text analysis, and biological sequence analysis. In this thesis, segmentation applications are demonstrated on genomic sequences.
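Piecewise-constant segmentation under squared error can be computed exactly with a classic dynamic program; the sketch below is the standard textbook O(n²k) algorithm, not necessarily the exact formulation used in the thesis.

```python
def segment(seq, k):
    """Optimal piecewise-constant segmentation of seq into k segments,
    minimizing squared error; each segment is described by its mean.
    Classic O(n^2 * k) dynamic program."""
    n = len(seq)
    # Prefix sums of values and squares give O(1) segment cost queries.
    s = [0.0] * (n + 1)
    s2 = [0.0] * (n + 1)
    for i, x in enumerate(seq):
        s[i + 1] = s[i] + x
        s2[i + 1] = s2[i] + x * x

    def cost(i, j):  # SSE of seq[i:j] around its mean
        total = s[j] - s[i]
        return (s2[j] - s2[i]) - total * total / (j - i)

    INF = float("inf")
    dp = [[INF] * (n + 1) for _ in range(k + 1)]
    back = [[0] * (n + 1) for _ in range(k + 1)]
    dp[0][0] = 0.0
    for seg in range(1, k + 1):
        for j in range(seg, n + 1):
            for i in range(seg - 1, j):
                c = dp[seg - 1][i] + cost(i, j)
                if c < dp[seg][j]:
                    dp[seg][j] = c
                    back[seg][j] = i
    # Backtrack to recover the segment boundaries.
    bounds, j = [], n
    for seg in range(k, 0, -1):
        i = back[seg][j]
        bounds.append((i, j))
        j = i
    return list(reversed(bounds)), dp[k][n]

bounds, err = segment([1, 1, 1, 5, 5, 5, 2, 2], 3)
print(bounds, err)  # three constant runs are recovered with zero error
```

Sweeping k and applying a model selection criterion (e.g., BIC) over `dp[k][n]` is one way to choose the number of segments, as discussed in the thesis.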
Abstract:
A generalized isothermal effectiveness factor correlation is proposed for catalytic reactions whose intrinsic kinetics are based on the redox model. In this correlation, which is exact for asymptotic values of the Thiele parameter, the effects of the parameters appearing in the model, the order of the reaction, and the particle geometry are incorporated into a modified form of the Thiele parameter. The relationship takes the usual form η = tanh(Φ)/Φ and predicts the effectiveness factor with an error of less than 2% over a range of the Thiele parameter that accommodates both the kinetic and diffusion control regimes.
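Assuming the "usual form" refers to the standard hyperbolic-tangent correlation (an assumption here; the paper's modified Thiele parameter Φ, which absorbs the redox-model parameters, is not reproduced), the correlation and its two asymptotes can be checked numerically:

```python
import math

def effectiveness_factor(phi):
    """Isothermal effectiveness factor in the usual form
    eta = tanh(phi)/phi for a (modified) Thiele parameter phi."""
    if phi == 0:
        return 1.0
    return math.tanh(phi) / phi

# Asymptotes: eta -> 1 in the kinetic regime (small phi) and
# eta -> 1/phi in the diffusion-controlled regime (large phi).
print(round(effectiveness_factor(0.1), 4))
print(round(effectiveness_factor(10.0), 4))
```

Exactness at both asymptotes is what allows a single correlation to span the kinetic and diffusion control regimes, as the abstract claims.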
Abstract:
A new automata model, Mr,k, with a conceptually significant innovation in the form of multi-state alternatives at each instance, is proposed in this study. Computer simulations of the Mr,k model in the context of feature selection in an unsupervised environment have demonstrated the superiority of the model over similar models without this multi-state-choice innovation.
Abstract:
The sugarcane transport system plays a critical role in the overall performance of Australia's sugarcane industry. An inefficient transport system interrupts the raw sugarcane harvesting process, delays the delivery of sugarcane to the mill, degrades sugar quality, increases the usage of empty bins, and adds to sugarcane production costs. Because of these negative effects, there is an urgent need for rail schedulers to develop an efficient sugarcane transport schedule. In this study, a multi-objective mixed integer programming (MIP) model is developed to produce an industry-oriented scheduling optimiser for the sugarcane rail transport system. The exact MIP solver (IBM ILOG CPLEX) is applied to minimise the makespan and the total operating time as the two objectives. Moreover, a Siding Neighbourhood Search (SNS) algorithm is developed and integrated with Sidings Satisfaction Priorities (SSP) and Rail Conflict Elimination (RCE) algorithms to solve the problem more efficiently. For the implementation, the sugarcane transport system of Kalamia Sugar Mill, in a coastal locality about 1050 km northwest of Brisbane, is investigated as a real case study. Computational experiments indicate that high-quality solutions are obtainable in industry-scale applications.
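The SNS, SSP, and RCE algorithms are not detailed in the abstract. As an illustration only, the sketch below runs a generic neighbourhood search on a toy single-objective makespan problem; the job durations, machine count, and move rule are invented and stand in for the far richer rail-scheduling constraints of the paper.

```python
import random

def makespan(assignment, durations, n_machines):
    """Makespan = load of the busiest machine."""
    loads = [0] * n_machines
    for job, m in enumerate(assignment):
        loads[m] += durations[job]
    return max(loads)

def neighbourhood_search(durations, n_machines, iters=2000, seed=0):
    """Toy stand-in for the neighbourhood-search idea: repeatedly move
    one job to another machine and keep the move if the makespan does
    not increase (equal moves allowed, to escape plateaus)."""
    rng = random.Random(seed)
    assign = [j % n_machines for j in range(len(durations))]
    best = makespan(assign, durations, n_machines)
    for _ in range(iters):
        j = rng.randrange(len(durations))
        old = assign[j]
        assign[j] = rng.randrange(n_machines)
        cur = makespan(assign, durations, n_machines)
        if cur <= best:
            best = cur
        else:
            assign[j] = old  # revert a worsening move
    return assign, best

durations = [7, 3, 5, 2, 6, 4, 8, 1]   # invented job lengths
assignment, best = neighbourhood_search(durations, n_machines=3)
print(best)  # lower bound is ceil(36/3) = 12
```

In the paper the neighbourhood is defined over sidings and train runs, and the exact CPLEX solver provides the benchmark against which the heuristic is compared.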
Abstract:
A model for heterogeneous acetalisation of poly(vinyl alcohol) with limited solution volume is proposed based on the grain model of Sohn and Szekely. Instead of treating the heterogeneous acetalisation as purely a diffusion process, as in the Matuzawa and Ogasawara model, the present model also takes into account the chemical reaction and the physical state of the solid polymer, such as degree of swelling and porosity, and assumes segregation of the polymer phase at higher conversion into an outer fully reacted zone and an inner zone where the reaction still proceeds. The solution of the model for limited solution volume, moreover, offers a simple method of determining the kinetic parameters and diffusivity for the solid-liquid system using the easily measurable bulk solution concentration of the liquid reactant instead of conversion-distance data for the solid phase, which are considerably more difficult to obtain.
Abstract:
In this paper, we first describe a framework to model the sponsored search auction on the web as a mechanism design problem. Using this framework, we describe two well-known mechanisms for sponsored search auctions: Generalized Second Price (GSP) and Vickrey-Clarke-Groves (VCG). We then derive a new mechanism for the sponsored search auction, which we call the optimal (OPT) mechanism. The OPT mechanism maximizes the search engine's expected revenue while achieving Bayesian incentive compatibility and individual rationality for the advertisers. We then undertake a detailed comparative study of the mechanisms GSP, VCG, and OPT. We compute and compare the expected revenue earned by the search engine under the three mechanisms when the advertisers are symmetric and certain special conditions are satisfied. We also compare the three mechanisms in terms of incentive compatibility, individual rationality, and computational complexity. Note to Practitioners: The advertiser-supported web site is one of the successful business models in the emerging web landscape. When an Internet user enters a keyword (i.e., a search phrase) into a search engine, the user gets back a page of results containing the links most relevant to the query, as well as sponsored links (also called paid advertisement links). When a sponsored link is clicked, the user is directed to the corresponding advertiser's web page. The advertiser pays the search engine in some appropriate manner for sending the user to its web page. For every search performed by any user on any keyword, the search engine faces the problem of matching a set of advertisers to the sponsored slots. In addition, the search engine also needs to decide on a price to be charged to each advertiser. Owing to increasing demand for Internet advertising space, most search engines currently use auction mechanisms for this purpose. These are called sponsored search auctions.
A significant percentage of the revenue of Internet giants such as Google, Yahoo!, and MSN comes from sponsored search auctions. In this paper, we study two auction mechanisms, GSP and VCG, which are quite popular in the sponsored search auction context, and pursue the objective of designing a mechanism superior to both. In particular, we propose a new mechanism, which we call the OPT mechanism. This mechanism maximizes the search engine's expected revenue subject to Bayesian incentive compatibility and individual rationality. Bayesian incentive compatibility guarantees that it is optimal for each advertiser to bid his or her true value provided that all other agents also bid their true values. Individual rationality ensures that agents participate voluntarily in the auction, since they are assured of a non-negative payoff by doing so.
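The revenue comparison between GSP and VCG can be illustrated on a toy instance. The bids and per-slot click-through rates below are invented; both payment rules assume truthful bids and bidders ranked by bid alone (no quality scores), which is a simplification of practice.

```python
def expected_payments_gsp(bids, ctrs):
    """GSP with truthful bids: the winner of slot i pays the (i+1)-th
    highest bid per click, so expected payment = ctr_i * next bid."""
    b = sorted(bids, reverse=True)
    pays = []
    for i, ctr in enumerate(ctrs):
        next_bid = b[i + 1] if i + 1 < len(b) else 0.0
        pays.append(ctr * next_bid)
    return pays

def expected_payments_vcg(bids, ctrs):
    """VCG: the winner of slot i pays the externality imposed on
    lower-ranked bidders: sum over j >= i of b_{j+1} * (ctr_j - ctr_{j+1})."""
    b = sorted(bids, reverse=True)
    x = list(ctrs) + [0.0]           # a virtual zero-CTR slot at the end
    pays = []
    for i in range(len(ctrs)):
        p = 0.0
        for j in range(i, len(ctrs)):
            next_bid = b[j + 1] if j + 1 < len(b) else 0.0
            p += next_bid * (x[j] - x[j + 1])
        pays.append(p)
    return pays

bids, ctrs = [10.0, 6.0, 4.0], [0.2, 0.1]   # three bidders, two slots
gsp = expected_payments_gsp(bids, ctrs)
vcg = expected_payments_vcg(bids, ctrs)
print(round(sum(gsp), 2), round(sum(vcg), 2))  # GSP revenue >= VCG here
```

On this instance GSP collects more revenue than VCG at truthful bids, which is the known general relationship and part of what motivates searching for a revenue-optimal mechanism such as OPT.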