880 results for Weak Greedy Algorithms
Abstract:
SNP genotyping arrays have been developed to characterize single-nucleotide polymorphisms (SNPs) and DNA copy number variations (CNVs). The quality of inferences about copy number can be affected by many factors, including batch effects, DNA sample preparation, signal processing, and analytical approach. Nonparametric and model-based statistical algorithms have been developed to detect CNVs from SNP genotyping data. However, these algorithms lack specificity for small CNVs because of the high false positive rate when calling CNVs from intensity values. Association tests based on detected CNVs therefore lack power even when the CNVs affecting disease risk are common. In this research, a new genome-wide logistic regression algorithm for detecting CNV associations with diseases was developed by combining an existing Hidden Markov Model (HMM) with a logistic regression model. We show that the new algorithm is more sensitive and can be more powerful in detecting CNV associations with diseases than an existing popular algorithm, especially when the CNV association signal is weak and only a limited number of SNPs fall within the CNV.
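As a minimal illustration of the two-stage idea in this abstract (hypothetical data and names, not the authors' implementation): per-sample copy-number dosages, which in practice would come from an HMM over SNP intensities, are tested for disease association with a logistic regression.

```python
# Hypothetical two-stage CNV association sketch (not the paper's code).
# Stage 1 (assumed done by an HMM over SNP intensities): per-sample
# posterior mean copy number at a locus. Stage 2: logistic regression
# of disease status on that dosage.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
# Simulated HMM output: copy-number dosage around the diploid value 2.
dosage = rng.choice([1.0, 2.0, 3.0], size=n, p=[0.05, 0.9, 0.05])
dosage += rng.normal(0.0, 0.1, size=n)           # posterior uncertainty
# Simulated disease status with a weak CNV effect (log-OR = 0.4).
logit = -1.0 + 0.4 * (dosage - 2.0)
status = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(dosage)                       # intercept + dosage
fit = sm.Logit(status, X).fit(disp=0)
print(f"log-OR = {fit.params[1]:.3f}, p = {fit.pvalues[1]:.3g}")
```

Using the continuous dosage rather than hard CNV calls is what lets weak signals survive to the association stage.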
Abstract:
As teachers of English Phonetics and Phonology, we have observed that our students, native Spanish speakers, face a series of difficulties in perceiving and producing the weak forms of structure words. We therefore assessed our students' perception and production of weak and strong forms through a listening test and a speaking test, administered once the first-year students of the Facultad de Filosofía, Humanidades y Artes, Universidad Nacional de San Juan (UNSJ), had completed the training period on these forms. For data collection we used two types of tests: one of perception and one of production. Overall, the results corroborated our impression: even after explicit training, students continued to have more difficulty perceiving and producing the weak forms of structure words than the strong forms.
Abstract:
A preliminary composite depth section was generated for Site 704 by splicing Holes 704A and 704B together over the interval 0-350 mbsf (0-9 m.y.). High-resolution carbonate and opal data from the cores were correlated with the calcium and silicon signals from the GST logging run in Hole 704B to identify missing and disturbed intervals in the cores. Paleomagnetic and biostratigraphic age boundaries were then transferred to the composite depth records to obtain an age model, and sedimentation rates were calculated by linear interpolation between datums. Algorithms relating measured dry-bulk density to carbonate content and depth were generated to produce predicted values of density for every sample. Accumulation rates of bulk, carbonate, opal, and terrigenous sediment components were then computed to generate a record of sediment deposition on the Meteor Rise that has a resolution of better than 200,000 yr for the period from 8.6 to 1.0 m.y. From 8.6 to 2.5 m.y., bulk-accumulation rates on the Meteor Rise averaged less than 2 g/cm²/1000 yr and were dominated by carbonate deposition. The first significant opal deposition (6.0 m.y.) punctuated a brief (less than 0.6 m.y.) northward approach of the Polar Front Zone (PFZ) that heralded a period of increasing severity of periodic carbonate dissolution events (terrigenous maxima), which abruptly terminated at 4.8 m.y. (base of the Thvera Subchron), synchronous with the reflooding of the Mediterranean after the Messinian salinity crisis. From 4.8 to 2.5 m.y., carbonate again dominated deposition, and the PFZ was far south except during brief northward excursions bracketing 4.2-3.9, 3.3-2.9, and 2.8-2.7 m.y. At 2.5 m.y., all components of bulk-accumulation rates increased dramatically (up to 15 g/cm²/1000 yr), and by 2.4 m.y., a pattern of alternating, high-amplitude carbonate and opal cyclicity marked the initiation of rapid glacial to interglacial swings in the position of the PFZ, synchronous with the "onset" of major Northern Hemisphere glaciation. Both mass-accumulation rates and the amplitude of the cycles had decreased by about 2 m.y., but opal accumulation rates remained high up through the base of the Jaramillo (0.98 m.y.). From 1.9 to 1 m.y., the record is characterized by moderate-amplitude fluctuations in carbonate and opal. This record of opal accumulation rates is interpreted as a long-term "Polar Front Indicator" that monitors the advance and retreat of the opal-rich PFZ northward (southward) toward (away from) the Meteor Rise in the subantarctic sector of the South Atlantic Ocean. The timing of PFZ migrations in the subantarctic South Atlantic Ocean is remarkably similar to Pliocene-Pleistocene climate records deduced from benthic oxygen isotope records in the North Atlantic Ocean (Raymo et al., 1989, doi:10.1029/PA004i004p00413; Ruddiman et al., 1989, doi:10.1029/PA004i004p00353). These include northward migrations during "cold" intervals containing strong glacial isotope stages (2.4-2.3, 2.1-2.0, 1.95-1.55, 1.45-1.30 m.y. and at about 1.13 and 1.09 m.y.) and southward migrations during "warm" intervals containing weak glacial and/or strong interglacial stages (2.45-2.40, 2.30-2.10, 2.00-1.95, 1.52-1.45, 1.30-1.18, 1.11, and 1.06-0.93 m.y.). Although our preliminary composite record is not continuous (some stages are obviously missing), there is hope that future work will identify these missing intervals in the as yet incomplete Hole 704B and will extend this high-resolution Southern Hemisphere climate record back to 8.6 m.y.
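A hedged sketch of the age-model arithmetic described above (invented numbers, not Site 704 data): sedimentation rates follow from linear interpolation between age datums, and mass-accumulation rates from rate times dry-bulk density times component fraction.

```python
# Illustrative only: hypothetical datums, not the Site 704 age model.
import numpy as np

# Hypothetical datums: composite depth (m) vs. age (m.y.).
datum_depth = np.array([0.0, 120.0, 350.0])
datum_age   = np.array([0.0,   2.5,   8.6])

def age_at(depth_m):
    """Age model: linear interpolation between datums."""
    return np.interp(depth_m, datum_depth, datum_age)

# Sedimentation rate over each datum interval, converted to cm/1000 yr
# (1 m per m.y. equals 0.1 cm per 1000 yr).
lsr = np.diff(datum_depth) / np.diff(datum_age) * 0.1
print("LSR (cm/kyr):", lsr)

# Mass-accumulation rates for one sample (g/cm^2/1000 yr).
dbd = 0.8                # predicted dry-bulk density, g/cm^3 (invented)
carbonate_frac = 0.85    # measured carbonate fraction (invented)
bulk_mar = lsr[0] * dbd
print("bulk MAR:", bulk_mar, "carbonate MAR:", bulk_mar * carbonate_frac)
```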
Abstract:
The CoastColour Round Robin (CCRR) project (http://www.coastcolour.org), funded by the European Space Agency (ESA), was designed to bring together a variety of reference datasets and to use these to test algorithms and assess their accuracy for retrieving water quality parameters. This information was then developed to help end-users of remote sensing products select the most accurate algorithms for their coastal region. To facilitate this, an inter-comparison of the performance of algorithms for the retrieval of in-water properties over coastal waters was carried out. The comparison used three types of datasets on which ocean colour algorithms were tested. The description and comparison of these three datasets are the focus of this paper; they include Medium Resolution Imaging Spectrometer (MERIS) Level 2 match-ups, in situ reflectance measurements, and data generated by a radiative transfer model (HydroLight). The datasets mainly consist of 6,484 marine reflectance spectra associated with various geometrical (sensor viewing and solar angles) and sky conditions and water constituents: Total Suspended Matter (TSM) and Chlorophyll-a (CHL) concentrations, and the absorption of Coloured Dissolved Organic Matter (CDOM). Inherent optical properties were also provided in the simulated datasets (5,000 simulations) and from 3,054 match-up locations. The distributions of reflectance at selected MERIS bands and band ratios, and of CHL and TSM as a function of reflectance, from the three datasets are compared. Match-up and in situ sites where deviations occur are identified. The distributions of the three reflectance datasets are also compared to the simulated and in situ reflectances used previously by the International Ocean Colour Coordinating Group (IOCCG, 2006) for algorithm testing, showing a clear extension of the CCRR data, which covers more turbid waters.
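As an illustration of the kind of match-up comparison such a round robin performs (values invented, not CCRR data): retrieved and in situ chlorophyll are typically compared in log space, because CHL spans several orders of magnitude.

```python
# Hypothetical match-up statistics sketch (invented numbers, not CCRR data).
import numpy as np

chl_in_situ   = np.array([0.05, 0.2, 1.0, 5.0, 20.0])   # mg/m^3
chl_retrieved = np.array([0.07, 0.18, 1.3, 4.2, 26.0])  # mg/m^3

# Errors in log10 space, since CHL varies over decades.
err = np.log10(chl_retrieved) - np.log10(chl_in_situ)
print(f"log10 bias = {err.mean():+.3f}")
print(f"log10 RMSE = {np.sqrt((err**2).mean()):.3f}")
```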
Abstract:
The goal of the AEgIS experiment is to measure the gravitational acceleration of antihydrogen – the simplest atom consisting entirely of antimatter – with an ultimate precision of 1%. We plan to verify the Weak Equivalence Principle (WEP), one of the fundamental laws of nature, with an antimatter beam. The experiment consists of a positron accumulator, an antiproton trap, and a Stark accelerator in a solenoidal magnetic field to form and accelerate a pulsed beam of antihydrogen atoms towards a free-fall detector. The antihydrogen beam passes through a moiré deflectometer to measure the vertical displacement due to the gravitational force. A position- and time-sensitive hybrid detector registers the annihilation points of the antihydrogen atoms and their time of flight. The detection principle has been successfully tested with antiprotons and a miniature moiré deflectometer coupled to a nuclear emulsion detector.
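For orientation only, a naive free-fall estimate of the scale of the effect (the actual analysis uses the moiré pattern shift; the numbers below are invented): over a time of flight t, a beam falls by Δy = ½gt².

```python
# Toy free-fall estimate (illustrative only; not the moiré analysis,
# and the time of flight is an invented round number).
g = 9.81                   # m/s^2
t = 1.0e-3                 # time of flight, s (hypothetical)
dy = 0.5 * g * t**2        # expected vertical fall, m
print(f"free-fall displacement over {t*1e3:.1f} ms: {dy*1e6:.2f} um")
```

Micrometre-scale displacements over millisecond flight times are why a position-sensitive detector with high spatial resolution is needed.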
Abstract:
This paper analyzes the newly institutionalized political system in democratizing Indonesia, with particular reference to the presidential system. Scholars have not yet reached consensus on whether the Indonesian president is strong or weak. This paper addresses the question by analyzing the legislative and partisan powers of the Indonesian president. It must be acknowledged, however, that these two powers do not on their own explain the strengths and weaknesses of the president. The paper suggests that in order to fully understand the presidential system in Indonesia, we need to take into account not just the president's legislative and partisan powers, but also the legislative process and the characteristics of coalition government.
Abstract:
We present the data structures and algorithms used in an approach for building domain ontologies from folksonomies and linked data. In this approach, we extract domain terms from folksonomies and enrich them with semantic information from the Linked Open Data cloud. As a result, we obtain a domain ontology that combines the emergent knowledge of social tagging systems with formal knowledge from existing ontologies.
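A minimal sketch of what the enrichment step could look like (a hypothetical pipeline, not the paper's code; assumes network access, the SPARQLWrapper package, and DBpedia as the Linked Open Data endpoint):

```python
# Hypothetical enrichment step: look up a folksonomy tag on DBpedia
# (part of the Linked Open Data cloud) to retrieve candidate classes.
from SPARQLWrapper import SPARQLWrapper, JSON

def enrich_tag(tag: str):
    sparql = SPARQLWrapper("https://dbpedia.org/sparql")
    sparql.setQuery(f"""
        SELECT ?type WHERE {{
            <http://dbpedia.org/resource/{tag}> rdf:type ?type .
        }} LIMIT 5
    """)
    sparql.setReturnFormat(JSON)
    results = sparql.query().convert()
    return [b["type"]["value"] for b in results["results"]["bindings"]]

print(enrich_tag("Jazz"))   # candidate ontology classes for the tag "jazz"
```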
Abstract:
A multiplicative and a semi-mechanistic, BWB-type [Ball, J.T., Woodrow, I.E., Berry, J.A., 1987. A model predicting stomatal conductance and its contribution to the control of photosynthesis under different environmental conditions. In: Biggens, J. (Ed.), Progress in Photosynthesis Research, vol. IV. Martinus Nijhoff, Dordrecht, pp. 221–224.] algorithm for calculating stomatal conductance (gs) at the leaf level have been parameterised for two crop and two tree species to test their use in regional-scale ozone deposition modelling. The algorithms were tested against measured, site-specific data for durum wheat, grapevine, beech and birch of different European provenances. A direct comparison of both algorithms showed similar performance in predicting hourly means and daily time-courses of gs, whereas the multiplicative algorithm outperformed the BWB-type algorithm in modelling seasonal time-courses owing to its inclusion of a phenology function. The re-parameterisation of the algorithms for local conditions, required to validate ozone deposition modelling on a European scale, reveals the higher input requirements of the BWB-type algorithm compared with the multiplicative algorithm, because the former needs to model net photosynthesis (An).
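In generic textbook form (invented parameter values, not the paper's species-specific parameterisations), the two families of algorithms can be sketched as follows; the BWB sketch also makes the extra An input requirement visible.

```python
# Generic textbook forms of the two gs algorithm families
# (parameter values invented for illustration).

def gs_multiplicative(f_light, f_temp, f_vpd, f_phen, g_max=0.45):
    """Jarvis-type multiplicative model: g_max scaled by 0-1 response
    functions for light, temperature, VPD and phenology."""
    return g_max * f_light * f_temp * f_vpd * f_phen

def gs_bwb(a_n, rh_s, cs, g0=0.01, g1=9.0):
    """Ball-Woodrow-Berry model: gs = g0 + g1 * An * hs / Cs; note that
    net photosynthesis (An) is a required input."""
    return g0 + g1 * a_n * rh_s / cs

print(gs_multiplicative(0.9, 0.8, 0.7, 1.0))  # mol m-2 s-1
print(gs_bwb(a_n=15.0, rh_s=0.7, cs=360.0))   # An in umol m-2 s-1, Cs in umol mol-1
```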
Abstract:
A new method for detecting microcalcifications in regions of interest (ROIs) extracted from digitized mammograms is proposed. The top-hat transform is a technique based on mathematical morphology operations and, in this paper, is used to perform contrast enhancement of the microcalcifications. To improve microcalcification detection, a novel image sub-segmentation approach based on the possibilistic fuzzy c-means algorithm is used. From the original ROIs, window-based features, such as the mean and standard deviation, were extracted; these features were used as an input vector to a classifier. The classifier is based on an artificial neural network that identifies patterns belonging to microcalcifications and healthy tissue. Our results show that the proposed method is a good alternative for automatically detecting microcalcifications, because this stage is an important part of early breast cancer detection.
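A hedged sketch of the contrast-enhancement step (a generic white top-hat on a synthetic ROI, not the authors' full pipeline): the transform subtracts a morphological opening from the image, keeping bright structures smaller than the structuring element and flattening the background.

```python
# Generic white top-hat contrast enhancement; the synthetic "ROI" below
# stands in for a real mammogram region.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
roi = rng.normal(100.0, 5.0, size=(64, 64))   # background tissue (synthetic)
roi[30:33, 30:33] += 40.0                     # small bright spot (toy target)

# White top-hat = image minus its morphological opening.
enhanced = ndimage.white_tophat(roi, size=(9, 9))
print(enhanced.max(), enhanced[0, 0])         # spot enhanced, background near 0
```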
Application of the Extended Kalman filter to fuzzy modeling: Algorithms and practical implementation
Abstract:
The modeling phase is fundamental both in the analysis of a dynamic system and in the design of a control system. This phase is even more critical when it must be performed on-line and the only information about the system comes from input/output data. This paper presents adaptation algorithms for fuzzy systems based on the extended Kalman filter, which make it possible to obtain accurate models without giving up the computational efficiency that characterizes the Kalman filter, and which allow implementation on-line with the process.
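A minimal sketch of the idea (a generic zero-order Takagi-Sugeno system with Gaussian memberships; structure and values are illustrative, not the paper's algorithms): with fixed antecedents, the rule consequents form the EKF state and the normalised firing strengths give the Jacobian.

```python
# Illustrative EKF adaptation of fuzzy-model consequents (generic sketch).
import numpy as np

centers, sigma = np.array([-1.0, 0.0, 1.0]), 0.5   # fixed antecedents
theta = np.zeros(3)                                # rule consequents (EKF state)
P = np.eye(3) * 10.0                               # state covariance
Q, R = 1e-6 * np.eye(3), 0.01                      # process / measurement noise

def phi(x):
    """Normalised firing strengths; also the Jacobian dy/dtheta,
    since the consequents enter the output linearly."""
    w = np.exp(-0.5 * ((x - centers) / sigma) ** 2)
    return w / w.sum()

def ekf_step(x, y):
    global theta, P
    H = phi(x)                          # Jacobian row
    P[:] = P + Q                        # predict (parameters are static)
    S = H @ P @ H + R                   # innovation variance
    K = P @ H / S                       # Kalman gain
    theta += K * (y - H @ theta)        # correct the consequents
    P[:] = P - np.outer(K, H @ P)

for x in np.random.default_rng(2).uniform(-1.5, 1.5, 500):
    ekf_step(x, np.sin(x))              # learn y = sin(x) on-line
print(theta)                            # adapted consequents, ~sin(centers)
```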
Abstract:
In this paper we show how the efficiency of MBS simulations can be improved in two different ways, by considering both an explicit and an implicit semi-recursive formulation. The explicit method is based on a double velocity transformation that involves the solution of a redundant but compatible system of equations. The high computational cost of this operation has been drastically reduced by taking into account the sparsity pattern of the system. In this regard, the goal of this method is the introduction of MA48, a high-performance mathematical library provided by the Harwell Subroutine Library. The second method proposed in this paper has the particularity that, depending on the case, between 70 and 85% of the computation time is devoted to the evaluation of force derivatives with respect to the relative position and velocity vectors. Since evaluating these derivatives can be decomposed into concurrent tasks, the main goal of this paper lies in a successful and straightforward parallel implementation that has led to a substantial improvement, with a speedup of 3.2, by keeping all the cores of a quad-core processor busy and distributing the workload between them, thereby achieving a large time reduction through near-ideal CPU usage.
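A hedged sketch of the parallelisation idea (the "derivative block" below is a made-up stand-in for the real force-derivative evaluations, which are independent and therefore map naturally onto a process pool):

```python
# Illustrative only: a dummy workload stands in for evaluating one block
# of d(forces)/d(position, velocity); blocks are independent tasks.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def derivative_block(i):
    """Stand-in for one independent force-derivative block evaluation."""
    rng = np.random.default_rng(i)
    a = rng.standard_normal((200, 200))
    return (a @ a.T).trace()            # some heavy, independent computation

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=4) as pool:   # e.g. a quad-core CPU
        results = list(pool.map(derivative_block, range(16)))
    print(sum(results))
```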
Abstract:
It is known that the techniques grouped under the topic of Soft Computing have a strong capability for learning and cognition, as well as a good tolerance of uncertainty and imprecision. Due to these properties they can be applied successfully to Intelligent Vehicle Systems; ITS comprises a broad range of technologies and techniques that hold answers to many transportation problems. The unmanned control of the steering wheel of a vehicle is one of the most important challenges facing researchers in this area. This paper presents a method to automatically adjust a fuzzy controller to manage the steering wheel of a mass-produced vehicle; to achieve this, information about the car's state while a human driver is handling it is recorded and used to tune an appropriate fuzzy controller via iterative genetic algorithms. The resulting controllers are evaluated on their performance in the track-following task, as well as on the smoothness of the driving.
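In the spirit of the tuning procedure described here, a toy GA sketch (everything below is invented for illustration: a one-parameter proportional steering law stands in for the fuzzy controller, and synthetic pairs stand in for the recorded driver data):

```python
# Toy genetic algorithm fitting a controller parameter to human driving data.
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical recordings: (lateral error, human steering command) pairs.
errors = rng.uniform(-1.0, 1.0, 200)
human = np.tanh(2.5 * errors) + rng.normal(0.0, 0.05, 200)

def fitness(gain):
    """Negative mismatch between the toy controller and the human driver."""
    return -np.mean((np.tanh(gain * errors) - human) ** 2)

pop = rng.uniform(0.0, 5.0, 20)                      # population of gains
for _ in range(40):                                  # generations
    fit = np.array([fitness(g) for g in pop])
    parents = pop[np.argsort(fit)][-10:]             # selection: keep best half
    children = parents + rng.normal(0.0, 0.2, 10)    # mutation
    pop = np.concatenate([parents, children])
print(pop[np.argmax([fitness(g) for g in pop])])     # best gain, close to 2.5
```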
Abstract:
Evolutionary search algorithms have become an essential asset in the algorithmic toolbox for solving high-dimensional optimization problems across a broad range of bioinformatics applications. Genetic algorithms, the most well-known and representative evolutionary search technique, have been the subject of the majority of such applications. Estimation of distribution algorithms (EDAs) offer a novel evolutionary paradigm that constitutes a natural and attractive alternative to genetic algorithms. They make use of a probabilistic model, learnt from the promising solutions, to guide the search process. In this paper, we set out a basic taxonomy of EDA techniques, underlining the nature and complexity of the probabilistic model of each EDA variant. We review a set of innovative works that make use of EDA techniques to solve challenging bioinformatics problems, emphasizing the EDA paradigm's potential for further research in this domain.
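As a compact example of the EDA loop (using the simplest model class, UMDA with independent per-bit probabilities, on the classic OneMax toy problem; illustrative, not taken from the reviewed works):

```python
# UMDA, the simplest EDA: sample from the model, select promising
# solutions, re-estimate the model, repeat. Toy problem: OneMax.
import numpy as np

rng = np.random.default_rng(4)
n_bits, pop_size, n_sel = 40, 100, 30
p = np.full(n_bits, 0.5)                       # probabilistic model

for gen in range(50):
    pop = (rng.random((pop_size, n_bits)) < p).astype(int)  # sample model
    fitness = pop.sum(axis=1)                  # OneMax: count of ones
    elite = pop[np.argsort(fitness)[-n_sel:]]  # select promising solutions
    p = elite.mean(axis=0).clip(0.05, 0.95)    # re-estimate; keep diversity

print((p > 0.5).sum(), "/", n_bits)            # bits the model biases to 1
```

More sophisticated EDA variants differ mainly in the dependency structure of the model, from pairwise (e.g. tree-based) to full Bayesian networks, which is the axis along which the paper's taxonomy is organised.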