29 results for Fixed-time artificial insemination


Relevance: 40.00%

Abstract:

A deep theoretical analysis of the graph cut image segmentation framework presented in this paper simultaneously translates into important contributions in several directions. The most important practical contribution of this work is a full theoretical description, and implementation, of a novel powerful segmentation algorithm, GC_max. The output of GC_max coincides with a version of a segmentation algorithm known as Iterative Relative Fuzzy Connectedness, IRFC. However, GC_max is considerably faster than the classic IRFC algorithm, which we prove theoretically and show experimentally. Specifically, we prove that, in the worst case scenario, the GC_max algorithm runs in linear time with respect to the variable M = |C| + |Z|, where |C| is the image scene size and |Z| is the size of the allowable range, Z, of the associated weight/affinity function. For most implementations, Z is identical to the set of allowable image intensity values, and its size can be treated as small with respect to |C|, meaning that O(M) = O(|C|). In such a situation, GC_max runs in linear time with respect to the image size |C|. We show that the output of GC_max constitutes a solution of a graph cut energy minimization problem, in which the energy is defined as the ℓ∞ norm ‖F_P‖_∞ of the map F_P that associates, with every element e from the boundary of an object P, its weight w(e). This formulation brings IRFC algorithms into the realm of graph cut energy minimizers, with energy functions ‖F_P‖_q for q ∈ [1, ∞]. Of these, the best known minimization problem is for the energy ‖F_P‖_1, which is solved by the classic min-cut/max-flow algorithm, often referred to as the Graph Cut algorithm. We notice that a minimization problem for ‖F_P‖_q, q ∈ [1, ∞), is identical to that for ‖F_P‖_1 when the original weight function w is replaced by w^q. Thus, any algorithm GC_sum solving the ‖F_P‖_1 minimization problem also solves the one for ‖F_P‖_q with q ∈ [1, ∞), so just two algorithms, GC_sum and GC_max, are enough to solve all ‖F_P‖_q-minimization problems. We also show that, for any fixed weight assignment, the solutions of the ‖F_P‖_q-minimization problems converge to a solution of the ‖F_P‖_∞-minimization problem (the fact that ‖F_P‖_∞ = lim_{q→∞} ‖F_P‖_q is not enough to deduce that). An experimental comparison of the performance of the GC_max and GC_sum algorithms is included. It concentrates on comparing the actual (as opposed to provable worst-case) running times of the algorithms, as well as the influence of the choice of seeds on the output.
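
A minimal numerical sketch of the energies discussed above (not the GC_max implementation itself; the boundary weight multisets are invented for illustration): it evaluates ‖F_P‖_q for candidate object boundaries and shows both why the q-norm problem reduces to the 1-norm problem on w^q and how the ∞-norm can rank segmentations differently.

```python
# Toy illustration (not the paper's GC_max implementation): energies of a
# candidate object boundary under the norms discussed in the abstract.
import numpy as np

def boundary_energy(weights, q):
    """||F_P||_q for the multiset of boundary-edge weights of an object P."""
    w = np.asarray(weights, dtype=float)
    if np.isinf(q):
        return w.max()            # ||F_P||_inf -- the energy minimized by GC_max
    return (w ** q).sum() ** (1.0 / q)

# Two hypothetical segmentations, described only by their boundary weights.
P1 = [0.9, 0.1, 0.1, 0.1]
P2 = [0.5, 0.5, 0.5, 0.5]

for q in (1, 2, np.inf):
    print(q, boundary_energy(P1, q), boundary_energy(P2, q))

# The q-norm problem reduces to the 1-norm problem on w**q: ranking by
# sum(w**q) is the same as ranking by ||.||_q, since x -> x**(1/q) is monotone.
assert (boundary_energy(P1, 2) < boundary_energy(P2, 2)) == \
       (boundary_energy(np.array(P1) ** 2, 1) < boundary_energy(np.array(P2) ** 2, 1))
```

In this toy example the two boundaries swap order between q = 1, 2 and q = ∞, which is exactly the regime where GC_sum and GC_max may return different segmentations.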

Relevance: 40.00%

Abstract:

This article describes the development of a visual stimulus generator to be used in neuroscience experiments with invertebrates such as flies. The experiment consists of the visualization of a fixed image that is displaced horizontally according to the stimulus data. The system is capable of displaying 640 x 480 pixels with 256 intensity levels at 200 frames per second (FPS) on conventional raster monitors. To double the number of possible horizontal positions from 640 to 1280, a novel technique is presented that introduces artificial inter-pixel steps. The implementation uses two video frame buffers, each containing a distinct view of the desired image pattern. This generates a visual effect that doubles the horizontal positioning capability of the visual stimulus generator, allowing more precise and more continuous movements. (C) 2011 Elsevier Inc. All rights reserved.
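
The abstract does not spell out how the two frame buffers are laid out, so the sketch below only illustrates the bookkeeping assumed here: 1280 logical horizontal positions are mapped to a (buffer, whole-pixel shift) pair, with the second buffer standing in for the half-pixel view of the pattern.

```python
# Minimal sketch of the position-doubling idea, under the assumption that the
# two frame buffers hold the same pattern offset by half a pixel from each
# other (the exact buffer contents are not specified in the abstract).
WIDTH = 640                      # physical horizontal resolution
N_POSITIONS = 2 * WIDTH          # 1280 logical horizontal positions

def logical_to_physical(position):
    """Map a logical position in [0, 1280) to (frame buffer index, pixel shift)."""
    if not 0 <= position < N_POSITIONS:
        raise ValueError("position out of range")
    buffer_index = position % 2   # even -> buffer A, odd -> buffer B (half-pixel view)
    pixel_shift = position // 2   # whole-pixel displacement applied to that buffer
    return buffer_index, pixel_shift

print(logical_to_physical(0))    # (0, 0)
print(logical_to_physical(1))    # (1, 0)  -- half-pixel step via the second buffer
print(logical_to_physical(2))    # (0, 1)
```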

Relevance: 30.00%

Abstract:

Although praised for their rationality, humans often make poor decisions, even in simple situations. In the repeated binary choice experiment, an individual has to choose repeatedly between the same two alternatives, where a reward is assigned to one of them with fixed probability. The optimal strategy is to perseverate, always choosing the alternative with the highest expected return. Whereas many species perseverate, humans tend to match the frequencies of their choices to the frequencies of the alternatives, a sub-optimal strategy known as probability matching. Our goal was to find the primary cognitive constraints under which a set of simple evolutionary rules can lead to such contrasting behaviors. We simulated the evolution of artificial populations, wherein the fitness of each animat (artificial animal) depended on its ability to predict the next element of a sequence made up of a repeating binary string of varying size. When the string was short relative to the animats' neural capacity, they could learn it and correctly predict the next element of the sequence. When it was long, they could not learn it and turned to the next best option: to perseverate. Animats from the last generation then performed the task of predicting the next element of a non-periodic binary sequence. We found that, whereas animats with smaller neural capacity kept perseverating on the best alternative as before, animats with larger neural capacity, which had previously been able to learn the pattern of the repeating strings, adopted probability matching and were outperformed by the perseverating animats. Our results demonstrate how the ability to make predictions in an environment endowed with regular patterns may lead to probability matching under less structured conditions. They point to probability matching as a likely by-product of adaptive cognitive strategies that were crucial in human evolution but may lead to sub-optimal performance in other environments.
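
As a back-of-the-envelope check of why perseveration beats probability matching (a simulation of the two strategies only, not of the authors' evolutionary animat model), the sketch below compares them on a Bernoulli sequence with reward probability p: perseveration is correct with probability p, matching with probability p² + (1 − p)².

```python
# Expected prediction accuracy of perseveration vs. probability matching when
# one of two alternatives is rewarded with fixed probability p.
import numpy as np

rng = np.random.default_rng(0)
p = 0.7                                   # probability that alternative 1 is correct
outcomes = rng.random(100_000) < p        # True where alternative 1 is rewarded

# Perseveration: always pick the majority alternative.
persevere = np.ones_like(outcomes, dtype=bool)

# Probability matching: pick alternative 1 with probability p.
matching = rng.random(outcomes.size) < p

print("perseveration accuracy:", (persevere == outcomes).mean())   # ~p = 0.70
print("matching accuracy:     ", (matching == outcomes).mean())    # ~p^2 + (1-p)^2 = 0.58
```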

Relevance: 30.00%

Abstract:

The dynamics of a driven stadium-like billiard is considered using the formalism of discrete mappings. The model presents a resonant velocity that depends on the rotation number around fixed points and on the external boundary perturbation, and this resonance plays an important separating role in the model. We show that particles exhibiting Fermi acceleration (initial velocity above the resonant one) are scaling invariant with respect to the initial velocity and the external perturbation. However, initial velocities below the resonant one lead the particles to decelerate; therefore, unlimited energy growth is not observed. This phenomenon may be interpreted as a specific Maxwell's demon that separates fast and slow billiard particles. (C) 2012 Elsevier B.V. All rights reserved.
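
The paper's stadium mapping is not reproduced in the abstract, so the sketch below instead iterates the much simpler static-wall (simplified) Fermi-Ulam map, purely to illustrate what "the formalism of discrete mappings" looks like in code: the particle's velocity and collision phase are updated as a two-dimensional map.

```python
# Illustration of the discrete-mapping formalism in the simplest setting: the
# simplified (static-wall) Fermi-Ulam map. This is NOT the driven stadium
# billiard of the abstract.
import numpy as np

def simplified_fermi_ulam(v0, phi0, eps=1e-3, n_steps=10_000):
    """Iterate the simplified Fermi-Ulam map and return the velocity history."""
    v, phi = v0, phi0
    history = np.empty(n_steps)
    for n in range(n_steps):
        phi = (phi + 2.0 / v) % (2.0 * np.pi)   # phase at the next collision
        v = abs(v - 2.0 * eps * np.sin(phi))    # velocity after the moving-wall kick
        history[n] = v
    return history

hist = simplified_fermi_ulam(v0=0.05, phi0=1.0)
print(hist[:5], hist[-1])
```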

Relevance: 30.00%

Abstract:

Objective: The present study analyzed xylitol concentrations in artificial saliva over time after application of varnishes containing 10% and 20% xylitol. Material and Methods: Fifteen bovine enamel specimens (8x4 mm) were randomly allocated to 3 groups (n=5/group), according to the type of varnish used: 10% xylitol, 20% xylitol and no xylitol (control). After varnish application (4 mg), specimens were immersed in vials containing 500 µL of artificial saliva. Saliva samples were collected at different times (1, 8, 12, 16, 24, 48 and 72 h) and xylitol concentrations were analyzed. Data were assessed by two-way repeated-measures ANOVA (p<0.05). Results: Colorimetric analysis was not able to detect xylitol in saliva samples of the control group. Salivary xylitol concentrations were significantly higher up to 8 h after application of the 20% xylitol varnish. Thereafter, the 10% xylitol varnish released larger amounts of this polyol into the artificial saliva. Conclusions: Despite the short-term results, sustained xylitol release was obtained when the 10% xylitol varnish was used. These varnishes seem to be viable alternatives for increasing salivary xylitol levels and should therefore be clinically tested to confirm their effectiveness.

Relevance: 30.00%

Abstract:

This paper addresses the problem of water-demand forecasting for real-time operation of water supply systems. The present study was conducted to identify the best-fit model using hourly consumption data from the water supply system of Araraquara, São Paulo, Brazil. Artificial neural networks (ANNs) were used in view of their capability to match or even improve on regression model forecasts. The ANNs used were the multilayer perceptron with the back-propagation algorithm (MLP-BP), the dynamic neural network (DAN2), and two hybrid ANNs. The hybrid models used the error produced by the Fourier series forecasting as input to the MLP-BP and DAN2, called ANN-H and DAN2-H, respectively. The tested inputs for the neural networks were selected from the literature and by correlation analysis. The results from the hybrid models were promising, with DAN2 performing better than the tested MLP-BP models. DAN2-H, identified as the best model, produced a mean absolute error (MAE) of 3.3 L/s and 2.8 L/s for the training and test sets, respectively, for the prediction of the next hour, which represented about 12% of the average consumption. The best forecasting model for the next 24 hours was again DAN2-H, which outperformed the other compared models and produced an MAE of 3.1 L/s and 3.0 L/s for the training and test sets, respectively, which also represented about 12% of the average consumption. DOI: 10.1061/(ASCE)WR.1943-5452.0000177. (C) 2012 American Society of Civil Engineers.
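
A hedged sketch of the hybrid construction described above, on synthetic data: a least-squares Fourier fit captures the daily cycle of hourly demand, and its residual (the Fourier forecasting error) is fed, together with lagged demand, into an MLP trained with back-propagation. The data, lags, and network size are placeholders, not the configuration used for Araraquara, and scikit-learn's MLPRegressor stands in for the paper's MLP-BP.

```python
# Hybrid Fourier + MLP sketch on a synthetic hourly demand series.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
hours = np.arange(24 * 60)                                   # 60 days of hourly data
demand = 30 + 8 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 2, hours.size)

# Fourier-series component (fundamental daily harmonic, least-squares fit).
X_fourier = np.column_stack([np.sin(2 * np.pi * hours / 24),
                             np.cos(2 * np.pi * hours / 24),
                             np.ones_like(hours)])
coef, *_ = np.linalg.lstsq(X_fourier, demand, rcond=None)
residual = demand - X_fourier @ coef                         # Fourier forecasting error

# MLP inputs: previous hour's demand and previous hour's Fourier residual.
X = np.column_stack([demand[:-1], residual[:-1]])
y = demand[1:]
split = 24 * 45                                              # 45 days train, 15 days test
mlp = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
mlp.fit(X[:split], y[:split])
mae = np.abs(mlp.predict(X[split:]) - y[split:]).mean()
print("next-hour MAE on the synthetic test set:", round(mae, 2))
```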

Relevance: 30.00%

Abstract:

This study reports the performance of a combined anaerobic-aerobic packed-bed reactor that can be used to treat domestic sewage. Initially, a bench-scale reactor was operated in three experimental phases. In the first phase, the anaerobic reactor was operated with an average organic matter removal efficiency of 77% for a hydraulic retention time (HRT) of 10 h. In the second phase, the reactor was operated with an anaerobic stage followed by an aerobic zone, resulting in a mean value of 91% efficiency. In the third and final phase, the anaerobic-aerobic reactor was operated with recirculation of the effluent of the reactor through the anaerobic zone. The system yielded mean total nitrogen removal percentages of 65 and 75% for recycle ratios (r) of 0.5 and 1.5, respectively, and the chemical oxygen demand (COD) removal efficiencies were higher than 90%. When the pilot-scale reactor was operated with an HRT of 12 h and r values of 1.5 and 3.0, its performance was similar to that observed in the bench-scale unit (92% COD removal for r = 3.0). However, the nitrogen removal was lower (55% N removal for r = 3.0) due to problems with the hydrodynamics in the aerobic zone. The anaerobic-aerobic fixed-bed reactor with recirculation of the liquid phase allows for concomitant carbon and nitrogen removal without adding an exogenous source of electron donors and without requiring any additional alkalinity supplementation.

Relevance: 30.00%

Abstract:

The objective was to determine the effect of the sequence of insemination after simultaneous thawing of multiple 0.5 mL semen straws on conception rate in suckled multiparous Nelore cows. The effect of this thawing procedure on in vitro sperm characteristics was also evaluated. All cows (N = 944) received the same timed AI protocol. Ten straws (0.5 mL) of frozen semen from the same batch were simultaneously thawed at 36 degrees C for a minimum of 30 sec. One straw per cow was used for timed AI. Frozen semen from three Angus bulls was used. Timed AI records included the sequence of insemination (first to tenth) and the time of semen removal from the thawing bath. For the laboratory analyses, the same semen batches used in the field experiment were evaluated. Ten frozen straws from the same batch were thawed simultaneously in a thawing unit identical to that used in the field experiment. The following sperm characteristics were analyzed: sperm motility parameters, sperm thermal resistance, plasma and acrosomal membrane integrity, lipid peroxidation, chromatin structure, and sperm morphometry. Based on logistic regression, there were no significant effects of breeding group, body condition score, AI technician, or sire on conception rate, but there was an interaction between sire and straw group (P = 0.002). Semen from only one bull had decreased (P < 0.05) field fertility for the group of straws associated with the longest interval from thawing to AI. However, the results of the laboratory experiment were unable to explain the findings of the field experiment. The sperm width:length ratio from the morphometric analysis was the only sperm characteristic with a significant interaction between sire and straw group (P = 0.02). It was concluded that the sequence of insemination after simultaneous thawing of 10 semen straws can affect conception rates at timed AI differently depending on the sire used. Nevertheless, the effects of this thawing environment on in vitro sperm characteristics remain to be further investigated. (C) 2012 Elsevier Inc. All rights reserved.

Relevance: 30.00%

Abstract:

Background: Studies in South-East Asia have suggested that early diagnosis and treatment with artesunate (AS) and mefloquine (MQ) combination therapy may reduce the transmission of Plasmodium falciparum malaria and the progression of MQ resistance. Methods: The effectiveness of a fixed-dose combination of AS and MQ (ASMQ) in reducing malaria transmission was tested in isolated communities of the Juruá valley in the Amazon region. Priority municipalities within the Brazilian Legal Amazon area were selected according to pre-specified criteria. Routine national malaria control programmatic procedures were followed. Existing health structures were reinforced and health care workers were trained to treat with ASMQ all confirmed falciparum malaria cases that matched the inclusion criteria. A local pharmacovigilance structure was implemented. Incidence of malaria and hospitalizations were recorded two years before, during, and after the fixed-dose ASMQ intervention. In total, between July 2006 and December 2008, 23,845 patients received ASMQ. Two statistical modelling approaches were applied to monthly time series of P. falciparum malaria incidence rates, the P. falciparum/Plasmodium vivax infection ratio, and malaria hospital admission rates. All the time series ranged from January 2004 to December 2008, whilst the intervention period spanned from July 2006 to December 2008. Results: The ASMQ intervention had a highly significant impact on the mean level of each time series, adjusted for trend and season, with estimated ratios of 0.34 (95% CI 0.20-0.58) for the P. falciparum malaria incidence rates, 0.67 (95% CI 0.50-0.89) for the P. falciparum/P. vivax infection ratio, and 0.53 (95% CI 0.41-0.69) for the hospital admission rates. There was also a significant change in the seasonal (monthly) pattern of the time series before and after the intervention, with the elimination of the malaria seasonal peak in the rainy months of the years following the introduction of ASMQ. No serious adverse events relating to the use of fixed-dose ASMQ were reported. Conclusions: In the remote region of the Juruá valley, early detection of malaria by health care workers and treatment with fixed-dose ASMQ was feasible and efficacious, and significantly reduced the incidence and morbidity of P. falciparum malaria.
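
For readers unfamiliar with this kind of analysis, the sketch below fits a generic interrupted time-series Poisson regression on synthetic monthly counts, adjusted for trend and month of year; the exponentiated intervention coefficient plays the role of the adjusted ratios reported above. This is an illustration of the approach, not the authors' models or data.

```python
# Interrupted time-series Poisson regression on synthetic monthly case counts.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
dates = pd.date_range("2004-01-01", "2008-12-01", freq="MS")     # Jan 2004 - Dec 2008
t = np.arange(len(dates))
intervention = (dates >= "2006-07-01").astype(int)               # roll-out period
season = 1.0 + 0.5 * np.sin(2 * np.pi * dates.month / 12)        # rainy-season peak
expected = 200 * season * np.exp(-0.002 * t) * np.where(intervention, 0.4, 1.0)
cases = rng.poisson(expected)

df = pd.DataFrame({"cases": cases, "t": t,
                   "month": dates.month, "intervention": intervention})
model = smf.glm("cases ~ t + C(month) + intervention",
                data=df, family=sm.families.Poisson()).fit()
rate_ratio = np.exp(model.params["intervention"])                # adjusted for trend/season
print("adjusted incidence rate ratio for the intervention:", round(rate_ratio, 2))
```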

Relevance: 30.00%

Abstract:

Abstract Background The development of protocols for RNA extraction from paraffin-embedded samples facilitates gene expression studies on archival samples with known clinical outcome. Older samples are particularly valuable because they are associated with longer clinical follow-up. RNA extracted from formalin-fixed paraffin-embedded (FFPE) tissue is problematic due to chemical modifications and continued degradation over time. We compared the quantity and quality of RNA extracted by four different protocols from 14 ten-year-old and 14 recently archived (three- to ten-month-old) FFPE breast cancer tissues. Using three spin column purification-based protocols and one magnetic bead-based protocol, total RNA was extracted in triplicate, generating 336 RNA extraction experiments. RNA fragment size was assayed by reverse transcription-polymerase chain reaction (RT-PCR) for the housekeeping gene glucose-6-phosphate dehydrogenase (G6PD), testing primer sets designed to target RNA fragment sizes of 67 bp, 151 bp, and 242 bp. Results Biologically useful RNA (minimum RNA integrity number [RIN] of 1.4) was extracted in at least one of three attempts of each protocol in 86–100% of older and 100% of recently archived ("months-old") samples. Short RNA fragments up to 151 bp were assayable by RT-PCR for G6PD in all ten-year-old and months-old tissues tested, but none of the ten-year-old and only 43% of months-old samples showed amplification when the targeted fragment was 242 bp. Conclusion All protocols extracted RNA from ten-year-old FFPE samples with a minimum RIN of 1.4. Gene expression of G6PD could be measured in all samples, old and recent, using RT-PCR primers designed for RNA fragments up to 151 bp. RNA quality from ten-year-old FFPE samples was similar to that extracted from months-old samples, but the quantity and success rate were generally higher for the months-old group. We preferred the magnetic bead-based protocol because of its speed and the higher quantity of extracted RNA, although it produced RNA of similar quality to the other protocols. If a chosen protocol fails to extract biologically useful RNA from a given sample in a first attempt, another attempt and then another protocol should be tried before excluding the case from molecular analysis.

Relevance: 30.00%

Abstract:

Abstract Background A popular model for gene regulatory networks is the Boolean network model. In this paper, we propose an algorithm to analyze gene regulatory interactions using the Boolean network model and time-series data. The Boolean network considered is restricted in the sense that only a subset of all possible Boolean functions is allowed. We explore some mathematical properties of these restricted Boolean networks in order to avoid a full search approach. The problem is modeled as a Constraint Satisfaction Problem (CSP) and CSP techniques are used to solve it. Results We applied the proposed algorithm to two data sets. First, we used an artificial dataset obtained from a model of the budding yeast cell cycle. The second data set is derived from experiments performed using HeLa cells. The results show that some interactions can be fully or at least partially determined under the Boolean model considered. Conclusions The proposed algorithm can be used as a first step for the detection of gene/protein interactions. It is able to infer gene relationships from time-series data of gene expression, and this inference process can be aided by available a priori knowledge.
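
A brute-force toy version of the inference step (the paper formulates it as a CSP over a restricted function class; here that class is approximated by AND/OR functions of up to two, possibly negated, regulators, and the time series is invented): for each target gene, every candidate function consistent with all observed state transitions is retained.

```python
# Enumerate restricted Boolean functions consistent with a Boolean time series.
from itertools import combinations, product

# Observed Boolean time series: rows are consecutive time points, columns are genes.
series = [
    (1, 0, 0),
    (1, 1, 0),
    (1, 1, 1),
    (1, 1, 1),
]
n_genes = 3
transitions = list(zip(series[:-1], series[1:]))

def candidate_functions(regulators):
    """AND/OR functions over the given regulators, with per-input negation."""
    for op in (all, any):
        for signs in product((False, True), repeat=len(regulators)):
            yield op, signs

def consistent(target, regulators, op, signs):
    """Check the candidate function against every observed transition."""
    for state, nxt in transitions:
        inputs = [state[g] ^ neg for g, neg in zip(regulators, signs)]
        if int(op(inputs)) != nxt[target]:
            return False
    return True

for target in range(n_genes):
    found = [(regs, op.__name__, signs)
             for k in (1, 2)
             for regs in combinations(range(n_genes), k)
             for op, signs in candidate_functions(regs)
             if consistent(target, regs, op, signs)]
    print("gene", target, "consistent hypotheses:", found[:3],
          "..." if len(found) > 3 else "")
```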

Relevance: 30.00%

Abstract:

The role played by human activity in coastline changes indicates a general tendency of retreating coasts, especially in deltaic environments, as a result of the recent trend of sea level rise and the blockage of sediment transfer towards the coast, notably due to the construction of dams. This is particularly important in deltaic environments, which have suffered a dramatic loss of area in recent decades. In contrast, in this paper we report the origin and evolution of an anthropogenic delta, the Valo Grande delta, on the south-eastern Brazilian coast, whose origin is related to the opening of an artificial channel and the diversion of the main flow of the Ribeira de Iguape River. The methodology included the analysis of coastline changes, bathymetry and coring, which was used to determine the sedimentation rates and grain-size changes over time. The results allowed us to recognize the different facies of the anthropogenic delta and to establish its lateral and vertical depositional trends. Despite not being very frequent, anthropogenic deltas represent a favorable environment for the recording of natural and anthropogenic changes in historical times and thus deserve more attention from researchers in different fields.

Relevance: 30.00%

Abstract:

The ubiquity of time series data across almost all human endeavors has produced a great interest in time series data mining in the last decade. While dozens of classification algorithms have been applied to time series, recent empirical evidence strongly suggests that simple nearest neighbor classification is exceptionally difficult to beat. The choice of distance measure used by the nearest neighbor algorithm is important, and depends on the invariances required by the domain. For example, motion capture data typically requires invariance to warping, and cardiology data requires invariance to the baseline (the mean value). Similarly, recent work suggests that for time series clustering, the choice of clustering algorithm is much less important than the choice of distance measure. In this work we make a somewhat surprising claim: there is an invariance that the community seems to have missed, complexity invariance. Intuitively, the problem is that in many domains the different classes may have different complexities, and pairs of complex objects, even those which subjectively may seem very similar to the human eye, tend to be further apart under current distance measures than pairs of simple objects. This fact introduces errors in nearest neighbor classification, where some complex objects may be incorrectly assigned to a simpler class. Similarly, for clustering this effect can introduce errors by “suggesting” to the clustering algorithm that subjectively similar but complex objects belong in a sparser and larger-diameter cluster than is truly warranted. We introduce the first complexity-invariant distance measure for time series, and show that it generally produces significant improvements in classification and clustering accuracy. We further show that this improvement does not compromise efficiency, since we can lower bound the measure and use a modification of the triangular inequality, thus making use of most existing indexing and data mining algorithms. We evaluate our ideas with the largest and most comprehensive set of time series mining experiments ever attempted in a single work, and show that complexity-invariant distance measures can produce improvements in classification and clustering in the vast majority of cases.
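
A small sketch of a complexity-invariant distance in the spirit described above: the Euclidean distance is multiplied by the ratio of per-series complexity estimates, here taken as the root of the summed squared consecutive differences. The exact complexity estimate and correction used by the authors may differ.

```python
# Complexity-invariant distance: Euclidean distance rescaled by a complexity ratio.
import numpy as np

def complexity_estimate(x):
    """Complexity proxy: root of the summed squared differences of consecutive points."""
    x = np.asarray(x, dtype=float)
    return np.sqrt(np.sum(np.diff(x) ** 2))

def cid(x, y):
    """Complexity-invariant distance between two equal-length time series."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    ed = np.linalg.norm(x - y)
    ce_x, ce_y = complexity_estimate(x), complexity_estimate(y)
    correction = max(ce_x, ce_y) / max(min(ce_x, ce_y), 1e-12)   # penalize complexity mismatch
    return ed * correction

t = np.linspace(0, 2 * np.pi, 100)
smooth = np.sin(t)
wiggly = np.sin(t) + 0.4 * np.sin(15 * t)
print("plain Euclidean:", round(np.linalg.norm(smooth - wiggly), 3))
print("CID:            ", round(cid(smooth, wiggly), 3))
```

On this toy pair, the correction inflates the distance because one series is much more complex than the other, which is exactly the mismatch the abstract argues current measures handle poorly.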

Relevance: 30.00%

Abstract:

This work proposes a system for the classification of industrial steel pieces by means of a magnetic nondestructive device. The proposed classification system comprises two main stages: an online stage and an off-line optimization stage. In the online stage, the system classifies inputs and saves misclassification information in order to perform posterior analyses. In the off-line optimization stage, the topology of a Probabilistic Neural Network is optimized by a Feature Selection algorithm combined with the Probabilistic Neural Network to increase the classification rate. The proposed Feature Selection algorithm searches the signal spectrogram by combining three basic elements: a Sequential Forward Selection algorithm, a Feature Cluster Grow algorithm with classification rate gradient analysis, and a Sequential Backward Selection. In addition, a trash-data recycling algorithm is proposed to obtain optimal feedback samples selected from the misclassified ones.
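
A minimal sketch of the Sequential Forward Selection component only (the cluster-grow step, the backward step, the trash-data recycling, and the Probabilistic Neural Network itself are not reproduced; a k-nearest-neighbours classifier and a synthetic dataset stand in for them): features are added greedily while cross-validated accuracy keeps improving.

```python
# Greedy sequential forward selection around a stand-in classifier.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=300, n_features=12, n_informative=4,
                           random_state=0)
clf = KNeighborsClassifier(n_neighbors=5)

selected, remaining = [], list(range(X.shape[1]))
best_score = 0.0
while remaining:
    # Score every candidate feature added to the current subset.
    scores = {f: cross_val_score(clf, X[:, selected + [f]], y, cv=5).mean()
              for f in remaining}
    f_best, s_best = max(scores.items(), key=lambda kv: kv[1])
    if s_best <= best_score:          # stop when adding a feature no longer helps
        break
    selected.append(f_best)
    remaining.remove(f_best)
    best_score = s_best

print("selected features:", selected, "cv accuracy:", round(best_score, 3))
```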