993 results for sequential methods


Relevance: 20.00%

Abstract:

Background: The cooperative interaction between transcription factors has a decisive role in the control of the fate of the eukaryotic cell. Computational approaches for characterizing cooperative transcription factors in yeast, however, are based on different rationales and show low overlap between their results. Because the wealth of information contained in protein interaction networks and regulatory networks has proven highly effective in elucidating functional relationships between proteins, we compared different sets of cooperative transcription factor pairs (predicted by four different computational methods) within the framework of those networks. Results: Our results show that the overlap between the sets of cooperative transcription factors predicted by the different methods is low yet significant. Cooperative transcription factors predicted by all methods are closer and more clustered in the protein interaction network than expected by chance. On the other hand, members of a cooperative transcription factor pair neither seemed to regulate each other nor to share similar regulatory inputs, although they did regulate similar groups of target genes. Conclusion: Despite the different definitions of transcriptional cooperativity and the different computational approaches used to characterize cooperativity between transcription factors, the analysis of their roles in the framework of the protein interaction network and the regulatory network indicates a common denominator for the predictions under study. Knowledge of the shared topological properties of cooperative transcription factor pairs in both networks can be useful not only for designing better prediction methods but also for better understanding the complexities of transcriptional control in eukaryotes.
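
As an illustration of the kind of network-level comparison described above, the sketch below checks whether a set of predicted cooperative TF pairs is closer in a protein interaction network than randomly drawn pairs. It is a minimal sketch, not the authors' pipeline: the networkx-based code, the toy edge list, and the pair list are all assumptions.

```python
# Illustrative sketch only (not the paper's pipeline): test whether predicted
# cooperative TF pairs are closer in a protein interaction network than random
# pairs. The edge list and pair list below are hypothetical toy data.
import random
import networkx as nx

ppi_edges = [("GAL4", "GAL80"), ("GAL80", "GAL3"), ("MSN2", "MSN4"), ("GAL4", "MSN2")]
predicted_pairs = [("GAL4", "GAL80"), ("MSN2", "MSN4")]

G = nx.Graph(ppi_edges)
nodes = list(G.nodes)

def mean_pair_distance(pairs):
    """Average shortest-path distance over pairs that are connected in G."""
    dists = [nx.shortest_path_length(G, a, b)
             for a, b in pairs if a in G and b in G and nx.has_path(G, a, b)]
    return sum(dists) / len(dists) if dists else float("nan")

observed = mean_pair_distance(predicted_pairs)

# Null model: sets of random node pairs of the same size as the predicted set
null = [mean_pair_distance([tuple(random.sample(nodes, 2)) for _ in predicted_pairs])
        for _ in range(1000)]
p_value = sum(d <= observed for d in null) / len(null)
print(f"observed mean distance {observed:.2f}, empirical p = {p_value:.3f}")
```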

Relevance: 20.00%

Abstract:

Background: The aim of this report is to describe the main characteristics of the design, including response rates, of the Cornella Health Interview Survey Follow-up Study. Methods: The original cohort consisted of 2,500 subjects (1,263 women and 1,237 men) interviewed as part of the 1994 Cornella Health Interview Study. A record linkage to update the address and vital status of the cohort members was carried out using first a deterministic method and then a probabilistic one based on each subject's first name and surnames. Subsequently, we attempted to locate the cohort members to conduct the phone follow-up interviews. A pilot study was carried out to test the overall feasibility and to modify some procedures before the field work began. Results: After record linkage, 2,468 (98.7%) subjects were successfully traced. Of these, 91 (3.6%) were deceased, 259 (10.3%) had moved to other towns, and 50 (2.0%) had neither renewed their last municipal census documents nor declared having moved. After using different strategies to track and retain cohort members, we traced 92% of the CHIS participants. Of these, 1,605 subjects answered the follow-up questionnaire. Conclusion: The computerized record linkage maximized the success of the follow-up, which was carried out 7 years after the baseline interview. The pilot study helped increase the efficiency of tracing and interviewing the respondents.
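
A minimal sketch of the two-stage linkage idea follows (deterministic match first, then a probabilistic match on first name and surnames). The field names, the string-similarity measure, and the acceptance threshold are hypothetical; the survey's actual matching rules are not described in the abstract.

```python
# Hypothetical two-stage record linkage: exact match on an identifier when
# available, otherwise a fuzzy match on the full name above a threshold.
from difflib import SequenceMatcher

def name_similarity(a, b):
    """Crude string similarity between two full names, in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def link_record(subject, registry, threshold=0.9):
    # Stage 1: deterministic match on a unique identifier, if present
    for rec in registry:
        if subject.get("id") and subject["id"] == rec.get("id"):
            return rec
    # Stage 2: probabilistic match on first name and surnames
    full = f"{subject['first_name']} {subject['surnames']}"
    best = max(registry,
               key=lambda r: name_similarity(full, f"{r['first_name']} {r['surnames']}"),
               default=None)
    if best and name_similarity(full, f"{best['first_name']} {best['surnames']}") >= threshold:
        return best
    return None
```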

Relevance: 20.00%

Abstract:

The Pseudo-Spectral Time Domain (PSTD) method is an alternative time-marching method to classical leapfrog finite-difference schemes for the simulation of wave-like propagating phenomena. It relies on the Fourier transform to compute the spatial derivatives of hyperbolic differential equations, and it therefore yields an isotropic operator that can be implemented efficiently for room acoustics simulations. However, one of the first issues to be solved is the modeling of wall absorption; unfortunately, there are no references in the technical literature concerning this problem. In this paper, assuming real and constant locally reacting impedances, several proposals to overcome this problem are presented, validated, and compared to analytical solutions in different scenarios.
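
The core operation of a PSTD scheme is spectral differentiation: the spatial derivative of the field is obtained by multiplying its Fourier transform by ik. The snippet below is a minimal, self-contained illustration of that step on a toy periodic field; the grid and the field are assumptions, not taken from the paper.

```python
# Minimal sketch of the spectral differentiation at the heart of PSTD-type
# schemes: the derivative of a periodic field is computed via the FFT.
# The grid and sample field below are illustrative only.
import numpy as np

N, L = 256, 2.0 * np.pi          # grid points, domain length
x = np.arange(N) * L / N
p = np.exp(np.sin(x))            # sample pressure field, periodic in x

k = 2.0 * np.pi * np.fft.fftfreq(N, d=L / N)     # angular spatial wavenumbers
dp_dx = np.real(np.fft.ifft(1j * k * np.fft.fft(p)))

# Compare with the analytical derivative of exp(sin(x)); error is near machine precision
print(np.max(np.abs(dp_dx - np.cos(x) * np.exp(np.sin(x)))))
```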

Relevance: 20.00%

Abstract:

The problem of jointly estimating the number, the identities, and the data of active users in a time-varying multiuser environment was examined in a companion paper (IEEE Trans. Information Theory, vol. 53, no. 9, September 2007), at whose core was the use of the theory of finite random sets on countable spaces. Here we extend that theory to encompass the more general problem of estimating unknown continuous parameters of the active-user signals. This problem is solved by applying the theory of random finite sets constructed on hybrid spaces. We do so by deriving Bayesian recursions that describe the evolution with time of a posteriori densities of the unknown parameters and data. Unlike in the above-cited paper, wherein one could evaluate the exact multiuser set posterior density, here the continuous-parameter Bayesian recursions do not admit closed-form expressions. To circumvent this difficulty, we develop numerical approximations for the receivers based on Sequential Monte Carlo (SMC) methods (“particle filtering”). Simulation results, referring to a code-division multiple-access (CDMA) system, are presented to illustrate the theory.
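
For readers unfamiliar with SMC, the sketch below shows a generic bootstrap particle filter: predict, weight by the likelihood, and resample. It tracks a scalar random-walk state with Gaussian observations and only illustrates the machinery; it is not the set-valued multiuser receiver derived in the paper, and all model parameters are assumptions.

```python
# Generic bootstrap particle filter (illustrative only): scalar random-walk
# state, Gaussian observation noise, multinomial resampling each step.
import numpy as np

rng = np.random.default_rng(0)
N = 1000                        # number of particles
sigma_x, sigma_y = 0.1, 0.5     # process and observation noise std

particles = rng.normal(0.0, 1.0, N)    # initial particle cloud
weights = np.full(N, 1.0 / N)

def pf_step(particles, weights, y):
    """One predict/update/resample cycle for observation y."""
    # Predict: propagate particles through the random-walk dynamics
    particles = particles + rng.normal(0.0, sigma_x, N)
    # Update: weight each particle by the Gaussian likelihood of y
    weights = weights * np.exp(-0.5 * ((y - particles) / sigma_y) ** 2)
    weights /= weights.sum()
    # Resample (multinomial) to avoid weight degeneracy
    idx = rng.choice(N, size=N, p=weights)
    return particles[idx], np.full(N, 1.0 / N)

for y in [0.2, 0.3, 0.1, 0.5]:          # toy observation sequence
    particles, weights = pf_step(particles, weights, y)
    print("posterior mean estimate:", particles.mean())
```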

Relevance: 20.00%

Abstract:

Wireless “MIMO” systems, employing multiple transmit and receive antennas, promise a significant increase in channel capacity, while orthogonal frequency-division multiplexing (OFDM) is attracting a good deal of attention due to its robustness to multipath fading. Thus, the combination of both techniques is an attractive proposition for radio transmission. The goal of this paper is the description and analysis of a novel pilot-aided estimator of multipath block-fading channels. Typical models leading to estimation algorithms assume the number of multipath components and delays to be constant (and often known), while their amplitudes are allowed to vary with time. Our estimator is focused instead on the more realistic assumption that the number of channel taps is also unknown and varies with time following a known probabilistic model. The estimation problem arising from these assumptions is solved using Random-Set Theory (RST), whereby one regards the multipath-channel response as a single set-valued random entity. Within this framework, Bayesian recursive equations determine the evolution with time of the channel estimator. Due to the lack of a closed form for the solution of the Bayesian equations, a (Rao–Blackwellized) particle filter (RBPF) implementation of the channel estimator is advocated. Since the resulting estimator exhibits complexity that grows exponentially with the number of multipath components, a simplified version is also introduced. Simulation results describing the performance of our channel estimator demonstrate its effectiveness.

Relevance: 20.00%

Abstract:

The State of Iowa currently has approximately 69,000 miles of unpaved secondary roads. Due to the low traffic counts on these unpaved roads, paving with asphalt or Portland cement concrete is not economical; therefore, to reduce dust production, dust suppressants have been used for decades. This study was conducted to evaluate the effectiveness of several widely used dust suppressants through quantitative field testing on two of Iowa’s most widely used secondary road surface treatments: crushed limestone rock and alluvial sand/gravel. The commercially available dust suppressants included lignin sulfonate, calcium chloride, and soybean oil soapstock. These suppressants were applied to 1,000 ft test sections on four unpaved roads in Story County, Iowa. To duplicate field conditions, the suppressants were applied as a surface spray once in early June and again in late August or early September. The four unpaved roads included two with crushed limestone rock and two with alluvial sand/gravel surface treatments, as well as high and low traffic counts. The effectiveness of the dust suppressants was evaluated by comparing the dust produced on treated and untreated test sections. Dust collection was scheduled for 1, 2, 4, 6, and 8 weeks after each application, for a total testing period of 16 weeks. A cost analysis comparing annual dust suppressant application with biennial aggregate replacement indicated that the cost of the dust suppressant, its transportation, and its application was relatively high compared with that of the two aggregate types. Therefore, biennial aggregate replacement is considered more economical than annual dust suppressant application, although annual dust suppressant application reduced the cost of road maintenance by 75%. Results of the dust collection indicated that the lignin sulfonate suppressant outperformed calcium chloride and soybean oil soapstock on all four unpaved roads, that the effect of the suppressants on the alluvial sand/gravel surface treatment was smaller than that on the crushed limestone rock, that the residual effects of all the products remained reasonably good after blading, and that the combination of alluvial sand/gravel surface treatment and high traffic count caused dust reduction to decrease dramatically.

Relevance: 20.00%

Abstract:

The purpose of this research was to summarize existing nondestructive test methods that have the potential to be used to detect materials-related distress (MRD) in concrete pavements. The various nondestructive test methods were then subjected to selection criteria that helped to reduce the size of the list so that specific techniques could be investigated in more detail. The main test methods that were determined to be applicable to this study included two stress-wave propagation techniques (impact-echo and spectral analysis of surface waves), infrared thermography, ground penetrating radar (GPR), and visual inspection. The GPR technique was selected for a preliminary round of “proof of concept” trials. GPR surveys were carried out over a variety of portland cement concrete pavements for this study using two different systems. One was a state-of-the-art GPR system that allowed data to be collected at highway speeds; the other was a less sophisticated, commercially available system. Surveys conducted with both sets of equipment produced test results capable of identifying subsurface distress in two of the three sites that exhibited internal cracking due to MRD. Both systems failed to detect distress in a single pavement that exhibited extensive cracking, and both systems correctly indicated that the control pavement exhibited negligible evidence of distress. The initial positive results presented here indicate that a more thorough study (incorporating refinements to the system, data collection, and analysis) is needed. Improvements in the results will depend upon defining the optimum number and arrangement of GPR antennas to detect the most common problems in Iowa pavements. In addition, refining high-frequency antenna response characteristics will be a crucial step toward providing an optimum GPR system for detecting materials-related distress.

Relevance: 20.00%

Abstract:

Analytical results harmonisation is investigated in this study as an alternative to the restrictive approach of analytical methods harmonisation, which is currently recommended to make the exchange of information possible and thereby support the fight against illicit drug trafficking. The main goal of this study is to demonstrate that a common database can be fed by a range of different analytical methods, whatever the differences in analytical parameters between them. For this purpose, a methodology was developed for estimating, and even optimising, the similarity of results coming from different analytical methods. In particular, the possibility of introducing chemical profiles obtained with Fast GC-FID into a GC-MS database is studied in this paper. Using this methodology, the similarity of results coming from different analytical methods can be objectively assessed, and the practical utility of database sharing by these methods can be evaluated, depending on the profiling purpose (evidential vs. operational tool). This methodology can be regarded as a relevant approach for feeding a database with different analytical methods, and it calls into question the necessity of analysing all illicit drug seizures in a single laboratory or of implementing analytical methods harmonisation in each participating laboratory.
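
As a concrete, purely illustrative example of scoring the similarity of profiles produced by two different analytical methods, the sketch below correlates area-normalised peak profiles from a GC-MS and a Fast GC-FID analysis of the same specimen. The abstract does not specify the similarity measure actually used, so the metric, the compound list, and the peak areas are assumptions.

```python
# Illustrative only: score the similarity of two drug profiles as the Pearson
# correlation of their normalised peak areas. Compounds and areas are made up.
import numpy as np

target_compounds = ["A", "B", "C", "D", "E"]            # hypothetical target compounds
profile_gcms  = np.array([12.0, 3.5, 0.8, 7.2, 1.1])     # peak areas from GC-MS
profile_gcfid = np.array([11.4, 3.9, 0.7, 7.8, 1.0])     # peak areas from Fast GC-FID

def similarity(p, q):
    """Pearson correlation of area-normalised profiles, in [-1, 1]."""
    p, q = p / p.sum(), q / q.sum()
    return float(np.corrcoef(p, q)[0, 1])

print(f"profile similarity: {similarity(profile_gcms, profile_gcfid):.3f}")
```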

Relevance: 20.00%

Abstract:

BACKGROUND: The stimulation of efferent renal sympathetic nerve activity induces sequential changes in renin secretion, sodium excretion, and renal hemodynamics that are proportional to the magnitude of the stimulation of sympathetic nerves. This study in men investigated the sequence of the changes in proximal and distal renal sodium handling, renal and systemic hemodynamics, as well as the hormonal profile occurring during a sustained activation of the sympathetic nervous system induced by various levels of lower body negative pressure (LBNP). METHODS: Ten healthy subjects underwent three levels of LBNP ranging between 0 and -22.5 mm Hg for one hour according to a triple crossover design, with a minimum of five days between each level of LBNP. Systemic and renal hemodynamics, renal water and sodium handling (using the endogenous lithium clearance technique), and the neurohormonal profile were measured before, during, and after LBNP. RESULTS: LBNP (0 to -22.5 mm Hg) induced an important hormonal response characterized by a significant stimulation of the sympathetic nervous system and gradual activation of the vasopressin and renin-angiotensin systems. LBNP also gradually reduced water excretion and increased urinary osmolality. A significant decrease in sodium excretion was apparent only at -22.5 mm Hg. It was independent of any change in the glomerular filtration rate and was mediated essentially by an increased sodium reabsorption in the proximal tubule (a significant decrease in lithium clearance, P < 0.05). No significant change in renal hemodynamics was found at the tested levels of LBNP. As observed experimentally, there appeared to be a clear sequence of responses to LBNP, with the neurohormonal response occurring before the changes in water and sodium excretion, which in turn preceded any change in renal hemodynamics. CONCLUSIONS: These data show that the renal sodium retention developing during LBNP, and thus during sympathetic nervous stimulation, is due mainly to an increase in sodium reabsorption by the proximal segments of the nephron. Our results in humans also confirm that, depending on its magnitude, LBNP leads to a step-by-step activation of neurohormonal, renal tubular, and renal hemodynamic responses.

Relevance: 20.00%

Abstract:

In this paper, we introduce a pilot-aided multipath channel estimator for Multiple-Input Multiple-Output (MIMO) Orthogonal Frequency Division Multiplexing (OFDM) systems. Typical estimation algorithms assume the number of multipath components and delays to be known and constant, while their amplitudes may vary in time. In this work, we focus on the more realistic assumption that the number of channel taps is also unknown and time-varying. The estimation problem arising from this assumption is solved using Random Set Theory (RST), which is a probability theory of finite sets. Due to the lack of a closed form of the optimal filter, a Rao-Blackwellized Particle Filter (RBPF) implementation of the channel estimator is derived. Simulation results demonstrate the estimator's effectiveness.
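
For contrast with the RST/RBPF approach, the sketch below shows the conventional pilot-aided least-squares estimate that the "typical" algorithms mentioned above rely on, where the channel order is fixed and known. The FFT size, pilot spacing, tap count, and noise level are illustrative assumptions, not values from the paper.

```python
# Baseline pilot-aided least-squares (LS) channel estimation for one OFDM
# symbol, single antenna pair; all parameters below are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n_fft, n_taps = 64, 4
pilot_idx = np.arange(0, n_fft, 8)              # comb-type pilot subcarriers

h = (rng.normal(size=n_taps) + 1j * rng.normal(size=n_taps)) / np.sqrt(2 * n_taps)
H = np.fft.fft(h, n_fft)                         # true channel frequency response

pilots = np.ones(pilot_idx.size)                 # known BPSK pilot symbols
noise = 0.05 * (rng.normal(size=pilot_idx.size) + 1j * rng.normal(size=pilot_idx.size))
rx_pilots = H[pilot_idx] * pilots + noise        # received pilot subcarriers

H_ls = rx_pilots / pilots                        # LS estimate at pilot positions
# Interpolate to all subcarriers (linear, real and imaginary parts separately)
H_hat = (np.interp(np.arange(n_fft), pilot_idx, H_ls.real)
         + 1j * np.interp(np.arange(n_fft), pilot_idx, H_ls.imag))

print("mean squared estimation error:", np.mean(np.abs(H_hat - H) ** 2))
```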

Relevance: 20.00%

Abstract:

We evaluated 25 protocol variants of 14 independent computational methods for exon identification, transcript reconstruction and expression-level quantification from RNA-seq data. Our results show that most algorithms are able to identify discrete transcript components with high success rates but that assembly of complete isoform structures poses a major challenge even when all constituent elements are identified. Expression-level estimates also varied widely across methods, even when based on similar transcript models. Consequently, the complexity of higher eukaryotic genomes imposes severe limitations on transcript recall and splice product discrimination that are likely to remain limiting factors for the analysis of current-generation RNA-seq data.

Relevance: 20.00%

Abstract:

Aim: Recently developed parametric methods in historical biogeography allow researchers to integrate temporal and palaeogeographical information into the reconstruction of biogeographical scenarios, thus overcoming a known bias of parsimony-based approaches. Here, we compare a parametric method, dispersal-extinction-cladogenesis (DEC), against a parsimony-based method, dispersal-vicariance analysis (DIVA), which does not incorporate branch lengths but accounts for phylogenetic uncertainty through a Bayesian empirical approach (Bayes-DIVA). We analyse the benefits and limitations of each method using the cosmopolitan plant family Sapindaceae as a case study.

Location: World-wide.

Methods: Phylogenetic relationships were estimated by Bayesian inference on a large dataset representing generic diversity within Sapindaceae. Lineage divergence times were estimated by penalized likelihood over a sample of trees from the posterior distribution of the phylogeny to account for dating uncertainty in biogeographical reconstructions. We compared biogeographical scenarios between Bayes-DIVA and two different DEC models: one with no geological constraints and another that employed a stratified palaeogeographical model in which dispersal rates were scaled according to area connectivity across four time slices, reflecting the changing continental configuration over the last 110 million years.

Results: Despite differences in the underlying biogeographical model, Bayes-DIVA and DEC inferred similar biogeographical scenarios. The main differences were: (1) in the timing of dispersal events, which in Bayes-DIVA sometimes conflicts with palaeogeographical information, and (2) in the lower frequency of terminal dispersal events inferred by DEC. Uncertainty in divergence time estimations influenced both the inference of ancestral ranges and the decisiveness with which an area can be assigned to a node.

Main conclusions: By considering lineage divergence times, the DEC method gives more accurate reconstructions that are in agreement with palaeogeographical evidence. In contrast, Bayes-DIVA showed the highest decisiveness in unequivocally reconstructing ancestral ranges, probably reflecting its ability to integrate phylogenetic uncertainty. Care should be taken in defining the palaeogeographical model in DEC because of the possibility of overestimating the frequency of extinction events, or of inferring ancestral ranges that are outside the extant species ranges, owing to dispersal constraints enforced by the model. The wide-spanning spatial and temporal model proposed here could prove useful for testing large-scale biogeographical patterns in plants.

Relevance: 20.00%

Abstract:

Deterioration in portland cement concrete (PCC) pavements can occur due to distresses caused by a combination of traffic loads and weather conditions. A hot mix asphalt (HMA) overlay is the most commonly used rehabilitation technique for such deteriorated PCC pavements. However, the performance of these HMA-overlaid pavements is hindered by the occurrence of reflective cracking, which results in a significant reduction of pavement serviceability. Various fractured slab techniques, including rubblization, crack and seat, and break and seat, are used to minimize reflective cracking by reducing the slab action. However, designing the structural overlay thickness for cracked-and-seated and rubblized pavements is difficult because the resulting structure is neither a “true” rigid pavement nor a “true” flexible pavement. Existing design methodologies use empirical procedures based on the AASHO Road Test conducted in 1961. However, the AASHO Road Test did not employ any fractured slab technique, and there are numerous limitations associated with extrapolating its results to HMA overlay thickness design for fractured PCC pavements. The main objective of this project is to develop a mechanistic-empirical (ME) design approach for HMA overlay thickness design for fractured PCC pavements. In this design procedure, failure criteria such as the tensile strain at the bottom of the HMA layer and the vertical compressive strain on the surface of the subgrade are used to address HMA fatigue and subgrade rutting, respectively. The developed ME design system is also implemented in a Visual Basic computer program. A partial validation of the design method with reference to an instrumented trial project (IA-141, Polk County) in Iowa is provided in this report. Tensile strain values at the bottom of the HMA layer collected from FWD testing at this project site are in agreement with the results obtained from the developed computer program.
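
To make the two failure criteria concrete, the sketch below converts the tensile strain at the bottom of the HMA layer and the vertical compressive strain at the top of the subgrade into allowable load repetitions using power-law transfer functions. The coefficients are the commonly cited Asphalt Institute values, shown for illustration only; they are not necessarily the calibration used in this report's design program.

```python
# Hedged illustration of ME transfer functions (generic Asphalt Institute form,
# not the calibration used in this Iowa report).
def fatigue_repetitions(eps_t, E_hma_psi):
    """Allowable repetitions to HMA fatigue cracking from tensile strain eps_t."""
    return 0.0796 * (1.0 / eps_t) ** 3.291 * (1.0 / E_hma_psi) ** 0.854

def rutting_repetitions(eps_v):
    """Allowable repetitions to subgrade rutting from vertical compressive strain eps_v."""
    return 1.365e-9 * (1.0 / eps_v) ** 4.477

# Example strains (illustrative): 150 and 350 microstrain, HMA modulus 450,000 psi
print(f"fatigue life: {fatigue_repetitions(150e-6, 450_000):.3e} repetitions")
print(f"rutting life: {rutting_repetitions(350e-6):.3e} repetitions")
```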

Relevance: 20.00%

Abstract:

The members of the Iowa Concrete Paving Association, the National Concrete Pavement Technology Center Research Committee, and the Iowa Highway Research Board commissioned a study to examine alternative ways of developing transverse joints in portland cement concrete pavements. The present study investigated six separate variations of vertical metal strips placed above and below the dowels in conventional baskets. In addition, the study investigated existing patented assemblies and a new assembly developed in Spain and used in Australia. The metal assemblies were placed in a new pavement and allowed to stay in place for 30 days before the Iowa Department of Transportation staff terminated the test by directing the contractor to saw and seal the joints. This report describes the design, construction, testing, and conclusions of the project.

Relevance: 20.00%

Abstract:

In this work we study older workers' (aged 50-64) labor force transitions after a health/disability shock. We find that the probability of remaining in work decreases with both age and the severity of the shock. Moreover, we find strong interactions between age and severity in the 50-64 age range and none in the 30-49 age range. Regarding demographics, we find that being female and being married reduce the probability of remaining in work; on the contrary, being the main breadwinner and having higher education and skill levels increase it. Interestingly, the effect of some demographic characteristics changes sign when we look at transitions from inactivity back to work; this is the case for being married or having a working spouse, and leisure complementarities undoubtedly play a role here. Since the data we use contain very detailed information on disabilities, we are able to evaluate the marginal effect of each type of disability on the probability of either remaining in work or returning to work. Some of these results may have strong policy implications.