56 results for Two Approaches


Relevance: 80.00%

Abstract:

The growing accessibility of genomic resources through next-generation sequencing (NGS) technologies has revolutionized the application of molecular genetic tools to ecology and evolutionary studies in non-model organisms. Here we present the case study of the European hake (Merluccius merluccius), one of the most important demersal resources of European fisheries. Two sequencing platforms, the Roche 454 FLX (454) and the Illumina Genome Analyzer (GAII), were used for Single Nucleotide Polymorphism (SNP) discovery in the hake muscle transcriptome. De novo transcriptome assembly into unique contigs, annotation, and in silico SNP detection were carried out in parallel for the 454 and GAII sequence data. High-throughput genotyping with the Illumina GoldenGate assay was performed to validate 1,536 putative SNPs. Validation results were analysed to compare the performance of the 454 and GAII methods and to evaluate the role of several variables (e.g. sequencing depth, intron-exon structure, sequence quality and annotation). Despite well-known differences in sequence length and throughput, the two approaches showed similar assay conversion rates (approximately 43%) and percentages of polymorphic loci (67.5% and 63.3% for GAII and 454, respectively). Both NGS platforms therefore proved suitable for large-scale identification of SNPs in transcribed regions of non-model species, although the lack of a reference genome profoundly affects the genotyping success rate. The overall efficiency, however, can be improved by applying strict quality and filtering criteria for SNP selection (sequence quality, intron-exon structure, target region score).
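
A minimal sketch of the kind of SNP selection step described above, assuming a list of putative SNP records carrying a per-site quality score, a distance to the nearest predicted intron-exon boundary, and a target region score; the field names and thresholds are illustrative assumptions, not the authors' actual criteria.

```python
# Illustrative filtering of putative SNPs before assay design.
# All thresholds and record fields are assumptions for this sketch.

def filter_snps(putative_snps, min_quality=20, min_exon_distance=60,
                min_region_score=0.6):
    """Keep SNPs passing all three selection criteria."""
    return [
        snp for snp in putative_snps
        if snp["phred_quality"] >= min_quality
        and snp["dist_to_exon_boundary"] >= min_exon_distance
        and snp["region_score"] >= min_region_score
    ]

snps = [
    {"id": "snp1", "phred_quality": 35, "dist_to_exon_boundary": 80, "region_score": 0.9},
    {"id": "snp2", "phred_quality": 15, "dist_to_exon_boundary": 120, "region_score": 0.8},
]
print([s["id"] for s in filter_snps(snps)])  # -> ['snp1']
```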

Relevance: 70.00%

Abstract:

In this paper, we consider the variable selection problem for a nonlinear non-parametric system. Two approaches are proposed: a top-down approach and a bottom-up approach. The top-down algorithm selects a variable by detecting whether the corresponding partial derivative is zero at the point of interest. The algorithm is shown to possess not only parameter convergence but also set convergence. This is critical because the variable selection problem is binary: a variable is either selected or not. The bottom-up approach is based on forward/backward stepwise selection and is designed to work when the data length is limited. Both approaches determine the most important variables locally and allow the unknown non-parametric nonlinear system to have different local dimensions at different points of interest. Further, two potential applications, along with numerical simulations, are provided to illustrate the usefulness of the proposed algorithms.
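
To make the top-down idea concrete, here is a toy illustration in the spirit of the approach (not the paper's estimator or its convergence machinery): fit a nonparametric regression, estimate each partial derivative at the point of interest by finite differences, and select a variable when the estimate is distinguishable from zero. The kernel smoother, bandwidth and tolerance are all assumptions of this sketch.

```python
import numpy as np

def nw_estimate(X, y, x0, h=0.3):
    """Nadaraya-Watson estimate of E[y | x = x0] with a Gaussian kernel."""
    w = np.exp(-np.sum((X - x0) ** 2, axis=1) / (2 * h ** 2))
    return np.dot(w, y) / np.sum(w)

def selected_variables(X, y, x0, delta=1e-2, tol=0.2):
    """Select variable j when a finite-difference estimate of the
    partial derivative of the smoother at x0 exceeds the tolerance."""
    active = []
    for j in range(X.shape[1]):
        e = np.zeros(X.shape[1])
        e[j] = delta
        deriv = (nw_estimate(X, y, x0 + e) - nw_estimate(X, y, x0 - e)) / (2 * delta)
        if abs(deriv) > tol:
            active.append(j)
    return active

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 3))
y = np.sin(2 * X[:, 0]) + X[:, 1] ** 2        # variable 2 is irrelevant
print(selected_variables(X, y, x0=np.array([0.5, 0.5, 0.0])))  # typically [0, 1]
```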

Relevance: 70.00%

Abstract:

Background: Large-scale biological jobs on high-performance computing systems require manual intervention if one or more of the computing cores on which they execute fail. This imposes not only the cost of maintaining the job, but also the cost of the time taken to reinstate it and the risk of losing the data and execution the job had accomplished before failing. Approaches that proactively detect computing-core failures and relocate a core's job onto reliable cores can be a significant step towards automating fault tolerance. Method: This paper describes an experimental investigation into the use of multi-agent approaches for fault tolerance. Two approaches are studied, the first at the job level and the second at the core level. The approaches are investigated for single-core failure scenarios that can occur in the execution of parallel reduction algorithms on computer clusters. A third approach is proposed that incorporates multi-agent technology at both the job and core levels. Experiments are pursued in the context of genome searching, a popular computational biology application. Result: The key conclusion is that the proposed approaches are feasible for automating fault tolerance in high-performance computing systems with minimal human intervention. In a typical fault tolerance experiment, centralised and decentralised checkpointing approaches add on average 90% to the actual time for executing the job, whereas the multi-agent approaches add only 10% to the overall execution time.
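
The reported overheads translate directly into wall-clock terms. In the back-of-envelope comparison below, the 90% and 10% figures are the abstract's; the baseline job time is an arbitrary illustration, not a measured value.

```python
# Wall-clock cost of each fault tolerance strategy for a baseline job time T.
T = 100.0                        # baseline execution time, minutes (arbitrary)
checkpointing = T * (1 + 0.90)   # centralised/decentralised checkpointing
multi_agent = T * (1 + 0.10)     # multi-agent approaches
print(checkpointing, multi_agent)  # -> 190.0 110.0
```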

Relevance: 70.00%

Abstract:

We formally compare fundamental factor and latent factor approaches to oil price modelling. Fundamental modelling has a long history of seeking to understand oil price movements, while latent factor modelling has a more recent and limited history but has gained popularity in other financial markets. The two approaches, though competing, have not previously been formally compared for effectiveness. For a range of short-, medium- and long-dated WTI oil futures we test a recently proposed five-factor fundamental model and a Principal Component Analysis latent factor model. Our findings demonstrate that there is no discernible difference between the two techniques in a dynamic setting. We conclude that this implies some advantage in adopting the latent factor approach, given the difficulty of specifying a fundamental model well.
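
A minimal sketch of the latent factor side of this comparison: extract principal components from a panel of futures returns. The paper applies this to WTI futures across maturities; the data and the choice of three factors below are placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)
returns = rng.normal(size=(250, 6))       # 250 days x 6 maturities (simulated)
demeaned = returns - returns.mean(axis=0)

# SVD of the demeaned panel: rows of Vt are the factor loadings,
# and projecting onto them gives the latent factor series.
U, s, Vt = np.linalg.svd(demeaned, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
factors = demeaned @ Vt.T[:, :3]          # first three latent factors

print("variance explained by first 3 PCs:", explained[:3].round(3))
```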

Relevance: 60.00%

Abstract:

We investigate the ability of the local density approximation (LDA) in density functional theory to predict the near-edge structure in electron energy-loss spectroscopy in the dipole approximation. We include screening of the core hole within the LDA using Slater's transition state theory. We find that anion K-edge threshold energies are systematically overestimated by 4.22 ± 0.44 eV in twelve transition metal carbides and nitrides in the rock-salt (B1) structure. When we apply this 'universal' many-electron correction to energy-loss spectra calculated within the transition state approximation to the LDA, we find quantitative agreement with experiment to within one or two eV for TiC, TiN and VN. We compare our calculations to a simpler approach that uses a projected Mulliken density honouring the dipole selection rule in place of the dipole matrix element itself, and find remarkably close agreement between the two approaches. Finally, we show that an anomaly in the near-edge structure of CrN is due to magnetic structure: in particular, the N K edge in fact probes the magnetic moments and alignments of the Cr sublattice.
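
Applying the 'universal' correction amounts to a one-line shift of each LDA transition-state threshold; the threshold values in the sketch below are placeholders rather than the paper's data.

```python
# Shift each LDA threshold down by the systematic overestimate of 4.22 eV.
CORRECTION_EV = 4.22
lda_thresholds_ev = {"TiC": 286.1, "TiN": 401.3, "VN": 401.9}  # illustrative
corrected = {k: round(v - CORRECTION_EV, 2) for k, v in lda_thresholds_ev.items()}
print(corrected)  # each edge moved 4.22 eV closer to experiment
```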

Relevance: 60.00%

Abstract:

Incidence calculus is a mechanism for probabilistic reasoning in which sets of possible worlds, called incidences, are associated with axioms, and probabilities are then associated with these sets. Inference rules are used to deduce bounds on the incidence of formulae which are not axioms, and bounds for the probability of such a formula can then be obtained. In practice an assignment of probabilities directly to axioms may be given, and it is then necessary to find an assignment of incidences that will reproduce these probabilities. We show that this task of assigning incidences can be viewed as a tree-searching problem, and two techniques for performing this search are discussed. One is a new proposal involving a depth-first search, while the other incorporates a random element. A Prolog implementation of these methods has been developed. The two approaches are compared for efficiency and the significance of their results is discussed. Finally we present a new proposal for applying techniques from linear programming to incidence calculus.
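
The paper's implementation is in Prolog; as a language-neutral illustration, the Python toy below performs the basic matching step as a depth-first search: given n equiprobable possible worlds, find for an axiom a set of worlds whose measure equals the axiom's target probability. Real incidence calculus also constrains the incidences of compound formulae, which this sketch ignores.

```python
from fractions import Fraction

def assign_incidence(target, worlds, chosen=None, start=0):
    """Depth-first search for a subset of `worlds` whose measure
    (under equiprobable worlds) equals the target probability."""
    if chosen is None:
        chosen = []
    measure = Fraction(len(chosen), len(worlds))
    if measure == target:
        return list(chosen)
    if measure > target or start == len(worlds):
        return None  # dead end: backtrack
    for i in range(start, len(worlds)):
        chosen.append(worlds[i])
        result = assign_incidence(target, worlds, chosen, i + 1)
        if result is not None:
            return result
        chosen.pop()
    return None

worlds = list(range(10))  # ten equiprobable possible worlds
print(assign_incidence(Fraction(3, 10), worlds))  # -> [0, 1, 2]
```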

Relevance: 60.00%

Abstract:

A 2D isothermal finite element simulation of the injection stretch-blow molding (ISBM) process for polyethylene terephthalate (PET) containers has been developed with the commercial finite element package ABAQUS/Standard. In this work, the blowing air that inflates the PET preform was modeled through two different approaches: a direct pressure input (as measured on the blowing machine) and a constant mass flow rate input (based on a pressure-volume-time relationship). The results from these two approaches were validated against free-blow and free stretch-blow experiments, which were instrumented and monitored through high-speed video. Results show that the constant mass flow rate approach gave a better prediction of the volume vs. time curve and of the preform shape evolution than the direct pressure approach, and hence is more appropriate for modeling the pre-blowing stage of the injection stretch-blow molding process.
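
A minimal sketch of a constant mass flow rate boundary condition under the isothermal setting described above, assuming the blowing air behaves as an ideal gas: the cavity pressure follows p(t) = m(t)·R·T / V(t) with m(t) = m0 + ṁ·t. All numbers are illustrative; in the simulation the volume history would come from the inflating preform.

```python
# Ideal-gas cavity pressure under constant mass inflow (illustrative values).
R_AIR = 287.0    # J/(kg K), specific gas constant of air
T = 293.0        # K, isothermal assumption
MDOT = 2.0e-4    # kg/s, assumed constant mass flow rate
M0 = 1.2e-5      # kg, assumed initial air mass in the preform

def cavity_pressure(t_s, volume_m3):
    """Pressure from p = m R T / V with m(t) = M0 + MDOT * t."""
    return (M0 + MDOT * t_s) * R_AIR * T / volume_m3

print(round(cavity_pressure(0.1, 5.0e-5)))  # Pa, placeholder volume value
```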

Relevance: 60.00%

Abstract:

Developing a desirable framework for handling inconsistencies in software requirements specifications is a challenging problem. It has been widely recognized that the relative priority of requirements can help developers make the trade-off decisions necessary for resolving conflicts. However, in most distributed development, such as viewpoints-based approaches, different stakeholders may assign different levels of priority to the same shared requirements statement from their own perspectives. This disagreement in the local priority levels assigned to the same shared requirements statement often puts developers in a dilemma during the inconsistency handling process. The main contribution of this paper is a prioritized merging-based framework for handling inconsistency in distributed software requirements specifications. Given a set of distributed inconsistent requirements collections with local prioritizations, we first construct a requirements specification with a prioritization from an overall perspective. We provide two approaches to constructing a requirements specification with a global prioritization: a merging-based construction and a priority vector-based construction. Following this, we derive proposals for handling inconsistencies from the globally prioritized requirements specification in terms of prioritized merging. Moreover, from the overall perspective, these proposals may be viewed as the most appropriate modifications of the given inconsistent requirements specification in the sense of the ordering relation over all its consistent subsets. Finally, we consider applying negotiation-based techniques to viewpoints so as to identify an acceptable common proposal among these proposals.
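
A hedged sketch of the priority-vector idea: each viewpoint assigns a local priority level (lower = more important) to the shared requirements, and one simple way to obtain a global prioritization is to rank by the mean local level. The averaging rule is an illustrative assumption, not necessarily the construction defined in the paper.

```python
# Local priority levels per viewpoint (lower number = higher priority).
local_priorities = {
    "viewpoint_A": {"R1": 1, "R2": 3, "R3": 2},
    "viewpoint_B": {"R1": 2, "R2": 1, "R3": 3},
    "viewpoint_C": {"R1": 1, "R2": 2, "R3": 3},
}

requirements = sorted({r for levels in local_priorities.values() for r in levels})

# Global prioritization: rank requirements by their mean local priority level.
global_order = sorted(
    requirements,
    key=lambda r: sum(levels[r] for levels in local_priorities.values())
    / len(local_priorities),
)
print(global_order)  # -> ['R1', 'R2', 'R3']
```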

Relevance: 60.00%

Abstract:

A research project in Web-enabled collaborative design and manufacture has been conducted. The major tasks of the project include the development of a Web-enabled environment for collaboration, online collaborative CAD/CAM, remote execution of large-size programs (RELSP), and distributed product design. The tasks and Web/Internet techniques involved are presented first, followed by a detailed description of the two approaches developed to implement the research: (1) a client-server approach for RELSP, utilizing the following Internet techniques: CORBA, Microsoft's Internet Information Server, the Tomcat server, JDBC and ODBC; (2) Web-Services-supported collaborative CAD, which enables geographically dispersed designers to conduct a design task jointly, speaking to and seeing each other while instantaneously modifying the CAD drawing online.
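
At its core, the RELSP exchange is a request-execute-reply loop between client and server. The bare-bones stand-in below uses plain sockets and JSON purely to keep the sketch self-contained; the project itself relies on CORBA, Internet Information Server, Tomcat, JDBC and ODBC rather than anything shown here.

```python
import json
import socket
import subprocess
import threading

# Server side: accept one job request, run it, return its stdout.
srv = socket.create_server(("127.0.0.1", 9900))

def serve_once():
    conn, _ = srv.accept()
    with conn:
        request = json.loads(conn.recv(4096).decode())
        result = subprocess.run(request["cmd"], capture_output=True, text=True)
        conn.sendall(json.dumps({"stdout": result.stdout}).encode())

threading.Thread(target=serve_once, daemon=True).start()

# Client side: submit a (here trivial) job and print the remote output.
with socket.create_connection(("127.0.0.1", 9900)) as client:
    client.sendall(json.dumps({"cmd": ["echo", "remote job finished"]}).encode())
    print(json.loads(client.recv(4096).decode())["stdout"])
srv.close()
```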

Relevance: 60.00%

Abstract:

The flowfield around a supersonic projectile with a pin actuator control method has been predicted using computational fluid dynamics, with both viscous and inviscid methods applied for a number of pin positions. Both methods showed that an optimal longitudinal position exists. However, the inviscid model over-predicted the lateral acceleration, owing to the difference in shock formation around the pin between the two approaches. The optimal location was predicted independently of the solver; the higher-fidelity solver, however, predicted lower achievable lateral accelerations, due to the viscous interactions caused by the pin. The effect of projectile orientation showed that shielding the pin reduces its effectiveness, because the wake of the fin envelops the pin, whereas exposing the pin to the onset flow increases the achievable forces. The achievable forces and moments also increase with Mach number.

Relevance: 60.00%

Abstract:

This paper aims to demonstrate how a derived approach to case file analysis, influenced by the work of Michel Foucault and Dorothy E. Smith, can offer innovative means by which to study the relations between discourse and practices in child welfare. The article explores text-based forms of organization in histories of child protection in Finland and in Northern Ireland, focusing on case file records in different organizational child protection contexts in the two jurisdictions. Building on a previous article (Author 1 & 2: 2011), we attempt to demonstrate how the relations between practices and discourses, a centrally important theme for understanding child welfare social work, can be effectively analysed using a combination of the two approaches. This article draws on three different empirical studies from our two jurisdictions, Northern Ireland (UK) and Finland: one study used Foucault, another used Smith, and the third sought to combine the two methods. The article reports on ongoing work in developing, for child welfare studies, 'a history that speaks back', as we have described it.

Relevance: 60.00%

Abstract:

A detailed investigation of planar two-dimensional metallodielectric dipole arrays with enhanced near-fields for sensing applications was carried out. Two approaches to enhancing the near-fields and increasing the quality factor were studied. The reactive power stored in the vicinity of the array at resonance increases rapidly with increasing periodicity, producing higher quality factors as a result. The excitation of the odd mode in the presence of a perturbation gives rise to a sharp resonance with the near-field enhanced by at least an order of magnitude compared to unperturbed arrays. The trade-off between near-field enhancement and thermal losses was also studied, and the effect of supporting dielectric layers on thermal losses and quality factors was examined. Secondary transmissions due to the dielectric alone were found to cyclically enhance and reduce the quality factor as a function of the thickness of the dielectric material. The performance of a perturbed frequency selective surface in sensing nearby materials was investigated. Finally, unperturbed and perturbed arrays working at infrared frequencies were demonstrated experimentally. (C) 2011 Society of Photo-Optical Instrumentation Engineers (SPIE). [DOI: 10.1117/1.3604785]
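
The quality factors discussed above can be read off a resonance curve as Q = f0 / FWHM; in the sketch below a synthetic Lorentzian stands in for a measured or simulated transmission spectrum, and the centre frequency and linewidth are assumed values.

```python
import numpy as np

f = np.linspace(0.8, 1.2, 2001)     # frequency axis (arbitrary units)
f0, fwhm = 1.0, 0.02                # resonance centre and width (assumed)
resonance = 1.0 / (1.0 + ((f - f0) / (fwhm / 2)) ** 2)  # Lorentzian line shape

# Estimate Q from the full width at half maximum of the curve.
above = f[resonance >= resonance.max() / 2]
q_factor = f0 / (above[-1] - above[0])
print(round(q_factor, 1))           # ~50
```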

Relevance: 60.00%

Abstract:

This study examines how the archaeology of historic Ireland has been interpreted. Two approaches to the history and archaeology of Ireland are identified. The first, the timeless past, has its roots in a neo-Lamarckian view of the past; this perspective was particularly developed in the work of the geographer and ethnographer Estyn Evans. The second view, associated in particular with a nationalist approach to Ireland's past, looked to the west of the country, where it was believed the culture had been preserved largely unchanged and in its purest form. The continuing impact of these frameworks upon the interpretation of rural settlement in the period 1200–1700 is examined. It is argued that historians and archaeologists alike have underestimated the quality of buildings.

Relevance: 60.00%

Abstract:

The eluent droplet size defines the number of sampling compartments in a continuously operated annular electrochromatograph (CAEC) and therefore influences separation efficiency. In this work, an assembly of two capillaries, a feeding capillary on top and a receiving capillary placed beneath it, was investigated as a means of controlling droplet size. The receiving capillary prevents droplet growth beyond a critical size, which reduces the volume of the sampling compartment compared with the case of electrolyte flow driven solely by gravity. With a receiving capillary, the electrolyte droplet size was reduced from 1.5 to 0.46 mm. A further decrease in droplet size was not possible owing to a so-called droplet jump-upwards effect, which was observed on a hydrophilic glass surface with water. A typical electrolyte used in CAEC has a high methanol content, so, in an attempt to improve the methanol-repellent properties of the glass surface, two approaches were implemented: (i) self-assembled chemisorbed monolayers of an alkylsiloxane and (ii) fabrication of a nano-pin film. The methanol-repellent surface of the feeding capillary suppressed the droplet jump-upwards effect, and the surface remained methanol-repellent in solutions of lower polarity than water.
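
As rough arithmetic on the figures above: if the droplets are taken to be spherical (an illustrative assumption), shrinking the diameter from 1.5 mm to 0.46 mm cuts the sampling-compartment volume by roughly a factor of 35.

```python
from math import pi

def sphere_volume_mm3(diameter_mm):
    """Volume of a sphere from its diameter, in cubic millimetres."""
    return pi * diameter_mm ** 3 / 6

ratio = sphere_volume_mm3(1.5) / sphere_volume_mm3(0.46)
print(round(ratio, 1))  # -> 34.7
```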