845 results for Sign Data LMS algorithm.


Relevance: 30.00%

Abstract:

This report describes the results of the research project investigating the use of advanced field data acquisition technologies for Iowa transportation agencies. The objectives of the research project were to (1) research and evaluate current data acquisition technologies for field data collection, manipulation, and reporting; (2) identify the current field data collection approach and the interest level in applying current technologies within Iowa transportation agencies; and (3) summarize findings, prioritize technology needs, and provide recommendations regarding suitable applications for future development. A steering committee consisting of state, city, and county transportation officials provided guidance during this project. Technologies considered in this study included (1) data storage (bar coding, radio frequency identification, touch buttons, magnetic stripes, and video logging); (2) data recognition (voice recognition and optical character recognition); (3) field referencing systems (global positioning systems [GPS] and geographic information systems [GIS]); (4) data transmission (radio frequency data communications and electronic data interchange); and (5) portable computers (pen-based computers). The literature review revealed that many of these technologies could have useful applications in the transportation industry. A survey was developed to document current data collection methods and identify the interest in using advanced field data collection technologies. Surveys were sent out to county and city engineers and state representatives responsible for certain programs (e.g., maintenance management and construction management). Results showed that almost all field data are collected using manual approaches and are hand-carried to the office, where they are either entered into a computer or manually stored. A lack of standardization was apparent in the type of software applications used by each agency--even the types of forms used to manually collect data differed by agency. Furthermore, interest in using advanced field data collection technologies depended upon the technology, program (e.g., pavement or sign management), and agency type (e.g., state, city, or county). The state and larger cities and counties seemed to be interested in using several of the technologies, whereas smaller agencies appeared to have very little interest in using advanced techniques to capture data. A more thorough analysis of the survey results is provided in the report. Recommendations are made to enhance the use of advanced field data acquisition technologies in Iowa transportation agencies: (1) Appoint a statewide task group to coordinate the effort to automate field data collection and reporting within the Iowa transportation agencies. Subgroups representing the cities, counties, and state should be formed with oversight provided by the statewide task group. (2) Educate employees so that they become familiar with the various field data acquisition technologies.

Relevance: 30.00%

Abstract:

For the last two decades, supertree reconstruction has been an active field of research and has seen the development of a large number of major algorithms. Because of the growing popularity of supertree methods, it has become necessary to evaluate the performance of these algorithms to determine which are the best options (especially with regard to the widely used supermatrix approach). In this study, seven of the most commonly used supertree methods are investigated by using a large empirical data set (in terms of number of taxa and molecular markers) from the worldwide flowering plant family Sapindaceae. Supertree methods were evaluated using several criteria: similarity of the supertrees with the input trees, similarity between the supertrees and the total evidence tree, level of resolution of the supertree, and computational time required by the algorithm. Additional analyses were also conducted on a reduced data set to test whether performance was affected by the heuristic searches rather than the algorithms themselves. Based on our results, two main groups of supertree methods were identified: on one hand, the matrix representation with parsimony (MRP), MinFlip, and MinCut methods performed well according to our criteria, whereas the average consensus, split fit, and most similar supertree methods showed a poorer performance or at least did not behave the same way as the total evidence tree. Results for the super distance matrix, that is, the most recent approach tested here, were promising, with at least one derived method performing as well as MRP, MinFlip, and MinCut. The output of each method was only slightly improved when applied to the reduced data set, suggesting a correct behavior of the heuristic searches and a relatively low sensitivity of the algorithms to data set size and missing data. Results also showed that the MRP analyses could reach a high level of quality even when using a simple heuristic search strategy, with the exception of MRP with Purvis coding scheme and reversible parsimony. The future of supertrees lies in the implementation of a standardized heuristic search for all methods and the increase in computing power to handle large data sets. The latter would prove particularly useful for promising approaches such as the maximum quartet fit method, which still requires substantial computing power.
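One of the evaluation criteria above, similarity between a supertree and its input trees, can be scored in many ways. The toy sketch below (plain Python, an illustration only and not the metric actually used in the study) counts the fraction of each input tree's informative clades that survive in the supertree once it is pruned to that tree's taxon set.

```python
# Toy clade-recovery score: trees are represented as sets of clades, each clade
# being a frozenset of taxon names. Illustrative only.
def restrict(clades, taxa):
    """Prune clades to a taxon subset, keeping only informative (non-trivial) ones."""
    return {c & taxa for c in clades if 1 < len(c & taxa) < len(taxa)}

def clade_recovery(supertree_clades, input_trees):
    """Average fraction of each input tree's clades recovered by the supertree."""
    scores = []
    for tree_clades, taxa in input_trees:
        pruned = restrict(supertree_clades, taxa)
        informative = {c for c in tree_clades if 1 < len(c) < len(taxa)}
        if informative:
            scores.append(len(informative & pruned) / len(informative))
    return sum(scores) / len(scores)

supertree = {frozenset("AB"), frozenset("ABC"), frozenset("DE"), frozenset("ABCDE")}
gene_tree_1 = ({frozenset("AB"), frozenset("ABC")}, frozenset("ABCD"))
gene_tree_2 = ({frozenset("DE")}, frozenset("CDE"))
print(clade_recovery(supertree, [gene_tree_1, gene_tree_2]))  # 1.0: all clades recovered
```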

Relevance: 30.00%

Abstract:

The primary goal of this project is to demonstrate the accuracy and utility of a freezing drizzle algorithm that can be implemented on roadway environmental sensing systems (ESSs). The types of problems related to the occurrence of freezing precipitation range from simple traffic delays to major accidents that involve fatalities. Freezing drizzle can also lead to economic impacts on communities through lost work hours, vehicular damage, and downed power lines. Transportation agencies have means to perform preventive and reactive treatments of roadways, but freezing drizzle can be difficult to forecast accurately or even detect, because weather radar and surface observation networks observe these conditions poorly. The detection of freezing precipitation is problematic and requires special instrumentation and analysis. Federal Aviation Administration (FAA) work on aircraft anti-icing and deicing technologies has led to a freezing drizzle algorithm that utilizes air temperature data and a specialized sensor capable of detecting ice accretion. However, at present, roadway ESSs are not capable of reporting freezing drizzle. This study investigates the use of the methods developed for the FAA and the National Weather Service (NWS) within a roadway environment to detect the occurrence of freezing drizzle using a combination of icing detection equipment and available ESS sensors. The work performed in this study incorporated the algorithm initially developed, and subsequently modified, for the FAA for aircraft icing. The freezing drizzle algorithm developed for the FAA was applied using data from standard roadway ESSs. The work performed in this study lays the foundation for addressing the central question of interest to winter maintenance professionals: whether roadside freezing precipitation detection (e.g., icing detection) sensors can be used to determine the occurrence of pavement icing during freezing precipitation events and the rates at which it occurs.
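The core of such an algorithm is pairing an air-temperature reading with an ice-accretion signal while exploiting the fact that drizzle is largely invisible to radar. The sketch below is a hypothetical illustration of that logic only; the function name, thresholds, and radar flag are assumptions and do not reproduce the FAA/NWS criteria.

```python
# Hypothetical rule-of-thumb check for freezing drizzle from ESS-style inputs.
# Thresholds are illustrative, not the FAA/NWS algorithm's values.
def freezing_drizzle_suspected(air_temp_c: float,
                               ice_accretion_mm_per_hr: float,
                               radar_precip_detected: bool) -> bool:
    """Return True when conditions are consistent with freezing drizzle."""
    subfreezing = air_temp_c <= 0.0               # liquid drops freeze on contact
    accreting = ice_accretion_mm_per_hr > 0.1     # icing sensor reports active accretion
    # Drizzle droplets are small, so ice accretion with no radar echo at
    # subfreezing temperatures is the telltale combination.
    return subfreezing and accreting and not radar_precip_detected

print(freezing_drizzle_suspected(-2.5, 0.3, radar_precip_detected=False))  # True
```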

Relevance: 30.00%

Abstract:

This User’s Guide serves as a reference for field personnel using the sign inventory data collection software tool. This tool was developed to simplify and standardize the collection and updating of sign inventory information. The software and collection methodology were developed by the Iowa DOT Sign Management Task Force and the Center for Transportation Research and Education at Iowa State University. Required Equipment - The data collection process requires both a portable computer and a global positioning system (GPS) device (connected via USB cable). Since computer battery performance varies, a DC power converter is recommended. A check-in/check-out process has also been established that allows sign information from the central database to be updated.
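As a generic illustration of the GPS-over-USB setup mentioned above (not the actual software described in this guide), the snippet below reads a position from a receiver that appears as a serial port and emits standard NMEA sentences; the port name and baud rate are assumptions.

```python
# Illustrative only: read one $GPGGA fix from a USB GPS receiver exposed as a
# serial port. Requires the pyserial package.
import serial

def nmea_to_decimal(value: str, hemisphere: str) -> float:
    """Convert NMEA ddmm.mmmm / dddmm.mmmm to signed decimal degrees."""
    raw = float(value)
    degrees = int(raw // 100)
    minutes = raw - degrees * 100
    decimal = degrees + minutes / 60.0
    return -decimal if hemisphere in ("S", "W") else decimal

def read_position(port: str = "COM3", baud: int = 4800):
    """Block until a $GPGGA sentence with a valid fix is seen; return (lat, lon)."""
    with serial.Serial(port, baud, timeout=1) as ser:
        while True:
            line = ser.readline().decode("ascii", errors="replace").strip()
            if line.startswith("$GPGGA"):
                f = line.split(",")
                if len(f) > 6 and f[6] != "0":   # fix quality 0 means no fix yet
                    return nmea_to_decimal(f[2], f[3]), nmea_to_decimal(f[4], f[5])
```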

Relevance: 30.00%

Abstract:

The temporal dynamics of species diversity are shaped by variations in the rates of speciation and extinction, and there is a long history of inferring these rates using first and last appearances of taxa in the fossil record. Understanding diversity dynamics critically depends on unbiased estimates of the unobserved times of speciation and extinction for all lineages, but the inference of these parameters is challenging due to the complex nature of the available data. Here, we present a new probabilistic framework to jointly estimate species-specific times of speciation and extinction and the rates of the underlying birth-death process based on the fossil record. The rates are allowed to vary through time independently of each other, and the probability of preservation and sampling is explicitly incorporated in the model to estimate the true lifespan of each lineage. We implement a Bayesian algorithm to assess the presence of rate shifts by exploring alternative diversification models. Tests on a range of simulated data sets reveal the accuracy and robustness of our approach against violations of the underlying assumptions and various degrees of data incompleteness. Finally, we demonstrate the application of our method with the diversification of the mammal family Rhinocerotidae and reveal a complex history of repeated and independent temporal shifts of both speciation and extinction rates, leading to the expansion and subsequent decline of the group. The estimated parameters of the birth-death process implemented here are directly comparable with those obtained from dated molecular phylogenies. Thus, our model represents a step towards integrating phylogenetic and fossil information to infer macroevolutionary processes.
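The preservation component of such a model can be made concrete with a small sketch. Assuming, for illustration only, constant rates and fully known lifespans, a lineage's lifespan is exponentially distributed with the extinction rate and its fossil count is Poisson with mean q times the lifespan, conditioned on the lineage being sampled at least once. The snippet below evaluates that simplified log-likelihood; it is not the authors' hierarchical Bayesian model, which also estimates the unobserved speciation and extinction times and allows the rates to shift through time.

```python
# Simplified constant-rate likelihood sketch; rate and data values are illustrative.
import math

def log_likelihood(lifespans, fossil_counts, mu, q):
    """mu: extinction rate, q: Poisson preservation rate (occurrences per lineage per Myr)."""
    ll = 0.0
    for s, k in zip(lifespans, fossil_counts):
        ll += math.log(mu) - mu * s                              # exponential lifespan density
        ll += k * math.log(q * s) - q * s - math.lgamma(k + 1)   # Poisson fossil count
        ll -= math.log(1.0 - math.exp(-q * s))                   # condition on >= 1 occurrence
    return ll

# toy data: three lineages, lifespans in Myr and numbers of fossil occurrences
print(log_likelihood([2.0, 5.5, 1.2], [1, 4, 2], mu=0.3, q=0.8))
```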

Relevance: 30.00%

Abstract:

This report is on state-of-the-art research efforts specific to infrastructure inventory/data collection, with sign inventory as a case study. The development of an agency-wide sign inventory is based on feature inventory and location information. Specific to location, a quick and simple location acquisition tool is critical to tying assets to an accurate location-referencing system. This research effort provides a contrast between legacy referencing systems (route and milepost) and global positioning system (GPS)-based techniques (latitude and longitude) integrated into a geographic information system (GIS) database. A summary comparison of field accuracies using a variety of consumer-grade devices is also provided. This research, and the data collection tools developed, are critical in supporting the Iowa Department of Transportation (DOT) Statewide Sign Management System development effort. For the last two years, a Task Force has embarked on a comprehensive effort to develop a sign management system to improve sign quality, to manage all aspects of signage, from request and ordering through fabrication, installation, and maintenance to ultimate removal, and to provide the ability to budget for these key assets on a statewide basis. This effort supported the development of a sign inventory tool and is the beginning of the development of a sign management system that supports the Iowa DOT in making consistent, cost-effective, and objective decisions about signs and their maintenance.
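The field-accuracy comparison mentioned above comes down to measuring how far each consumer-grade fix lands from a surveyed reference point. A simple way to express that error in metres is the haversine (great-circle) distance; the coordinates below are made up.

```python
# Horizontal error between a surveyed sign location and a GPS fix (illustrative).
import math

def haversine_m(lat1, lon1, lat2, lon2, radius_m=6_371_000.0):
    """Great-circle distance in metres between two WGS-84 points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * radius_m * math.asin(math.sqrt(a))

surveyed = (42.03080, -93.63190)     # hypothetical reference location
device_fix = (42.03085, -93.63198)   # hypothetical consumer-grade reading
print(f"horizontal error: {haversine_m(*surveyed, *device_fix):.1f} m")
```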

Relevance: 30.00%

Abstract:

This article reviews the history of sign language (SL) and the rationale for its use in children with profound auditory agnosia due to Landau-Kleffner syndrome (LKS), illustrated by studies of children and adults followed for many years and rare cases from the literature. The reasons that SL was successful and brought some children out of isolation while it could not be implemented in others are discussed. Nowadays, earlier recognition and treatment of LKS and better awareness of the crucial need to maintain communication have certainly improved the outcome of affected children. Alternatives to oral language, even for less severe cases, are increasingly accepted. SL can be learned at different ages with a clear benefit, but the ambivalence of patients and their families toward the world and culture of the deaf may sometimes explain its refusal or limited acceptance. There are no data to support the fear that SL learning may delay or prevent oral language recovery in children with LKS. On the contrary, SL may even facilitate this recovery by stimulating functionally connected core language networks and by helping speech therapy and auditory training.

Relevance: 30.00%

Abstract:

Cross-hole radar tomography is a useful tool for mapping shallow subsurface electrical properties, namely dielectric permittivity and electrical conductivity. Common practice is to invert cross-hole radar data with ray-based tomographic algorithms using first-arrival traveltimes and first-cycle amplitudes. However, the resolution of conventional ray-based inversion schemes for cross-hole ground-penetrating radar (GPR) is limited because only a fraction of the information contained in the radar data is used. The resolution can be improved significantly by using a full-waveform inversion that considers the entire waveform, or significant parts thereof. A recently developed 2D time-domain vectorial full-waveform cross-hole radar inversion code has been modified in the present study to allow optimized acquisition setups that reduce the acquisition time and computational costs significantly. This is achieved by minimizing the number of transmitter points and maximizing the number of receiver positions. The improved algorithm was employed to invert cross-hole GPR data acquired within a gravel aquifer (4-10 m depth) in the Thur valley, Switzerland. The simulated traces of the final model obtained by the full-waveform inversion fit the observed traces very well in the lower part of the section and reasonably well in the upper part of the section. Compared to the ray-based inversion, the results from the full-waveform inversion show significantly higher resolution images. Borehole logs were acquired on either side of the cross-hole plane, at a distance of 2.5 m. There is a good correspondence between the conductivity tomograms and the natural gamma logs at the boundary of the gravel layer and the underlying lacustrine clay deposits. Using existing petrophysical models, the inversion results and neutron-neutron logs are converted to porosity. Without any additional calibration, the values obtained from the converted neutron-neutron logs and the permittivity results are very close, and similar vertical variations can be observed. In both cases, the full-waveform inversion provides additional information about the subsurface. Due to the presence of the water table and associated refracted/reflected waves, the upper traces are not well fitted, and the upper 2 m of the permittivity and conductivity tomograms are not reliably reconstructed because the unsaturated zone is not incorporated into the inversion domain.
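The permittivity-to-porosity conversion mentioned above can be illustrated with one common petrophysical model, the CRIM (complex refractive index method) mixing formula for a fully water-saturated two-phase medium. The end-member permittivities below are generic textbook values, not those used in the study.

```python
# CRIM-style porosity estimate from bulk relative permittivity (illustrative values).
def crim_porosity(eps_bulk: float, eps_matrix: float = 5.0, eps_water: float = 81.0) -> float:
    """Porosity of a fully saturated matrix/water mix from bulk permittivity."""
    return (eps_bulk ** 0.5 - eps_matrix ** 0.5) / (eps_water ** 0.5 - eps_matrix ** 0.5)

print(f"porosity ~ {crim_porosity(eps_bulk=16.0):.2f}")  # ~0.26 for eps_r = 16
```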

Relevance: 30.00%

Abstract:

Fetal MRI reconstruction aims at finding a high-resolution image given a small set of low-resolution images. It is usually modeled as an inverse problem where the regularization term plays a central role in the reconstruction quality. The literature has considered several regularization terms, such as Dirichlet/Laplacian energy, Total Variation (TV)-based energies, and, more recently, non-local means. Although TV energies are quite attractive because of their ability in edge preservation, standard explicit steepest gradient techniques have been applied to optimize fetal-based TV energies. The main contribution of this work lies in the introduction of a well-posed TV algorithm from the point of view of convex optimization. Specifically, our proposed TV optimization algorithm for fetal reconstruction is optimal with respect to the asymptotic and iterative convergence speeds, O(1/n²) and O(1/√ε), while existing techniques are in O(1/n) and O(1/ε). We apply our algorithm to (1) clinical newborn data, considered as ground truth, and (2) clinical fetal acquisitions. Our algorithm compares favorably with the literature in terms of speed and accuracy.
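The O(1/n) versus O(1/n²) distinction can be seen on a toy smooth problem: the sketch below (not the paper's reconstruction code) compares plain gradient descent with Nesterov's accelerated variant on a least-squares objective, the same acceleration mechanism that gives fast first-order TV solvers their improved rate.

```python
# Toy illustration of O(1/n) vs O(1/n^2) first-order convergence on a smooth
# convex least-squares problem; data are random and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
b = rng.standard_normal(200)
L = np.linalg.norm(A, 2) ** 2                       # Lipschitz constant of the gradient
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad = lambda x: A.T @ (A @ x - b)
x_star = np.linalg.lstsq(A, b, rcond=None)[0]
f_star = f(x_star)

def gradient_descent(n_iter=300):
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x -= grad(x) / L
    return f(x) - f_star

def nesterov(n_iter=300):
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iter):
        x_new = y - grad(y) / L
        t_new = (1 + (1 + 4 * t ** 2) ** 0.5) / 2
        y = x_new + (t - 1) / t_new * (x_new - x)   # momentum step
        x, t = x_new, t_new
    return f(x) - f_star

print(f"plain GD suboptimality:    {gradient_descent():.3e}")
print(f"accelerated suboptimality: {nesterov():.3e}")  # markedly smaller
```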

Relevance: 30.00%

Abstract:

Pyogenic liver abscess is a severe condition and a therapeutic challenge. Treatment failure may be due to an unrecognized ingested foreign body that migrated from the gastrointestinal tract. There has recently been a marked increase in the number of reported cases of this condition, but initial misdiagnosis as cryptogenic liver abscess still occurs in the majority of cases. We conducted the current study to characterize this entity and provide a diagnostic strategy applicable worldwide. To this end, data were collected from our case and from a systematic review that identified 59 well-described cases. Another systematic review identified series of cryptogenic and Asian Klebsiella liver abscesses; these data were pooled and compared with the data from the cases of migrated foreign body liver abscess. The review points out the low diagnostic accuracy of history taking, modern imaging, and even surgical exploration. A fistula found through imaging procedures or endoscopy warrants surgical exploration. Findings suggestive of foreign body migration are symptoms of gastrointestinal perforation, computed tomography demonstration of a thickened gastrointestinal wall in continuity with the abscess, and adhesions seen during surgery. Treatment failure, left lobe location, unique location (that is, only one abscess location within the liver), and absence of underlying conditions also point to the diagnosis, as shown by comparison with the cryptogenic liver abscess series. This study demonstrates that migrated foreign body liver abscess is a specific entity that is increasingly reported. It usually is not cured when unrecognized, and diagnosis is usually delayed. This study provides what we consider the best available evidence for timely diagnosis with worldwide applicability. Increased awareness is required to treat this underestimated condition effectively, and further studies are needed.

Relevance: 30.00%

Abstract:

BACKGROUND: Surveillance of multiple congenital anomalies is considered to be more sensitive for the detection of new teratogens than surveillance of all or isolated congenital anomalies. Current literature proposes the manual review of all cases for classification into isolated or multiple congenital anomalies. METHODS: Multiple anomalies were defined as two or more major congenital anomalies, excluding sequences and syndromes. A computer algorithm for classification of major congenital anomaly cases in the EUROCAT database according to International Classification of Diseases, 10th revision (ICD-10) codes was programmed, further developed, and implemented for one year's data (2004) from 25 registries. The group of cases classified with potential multiple congenital anomalies was manually reviewed by three geneticists to reach a final agreement on classification as "multiple congenital anomaly" cases. RESULTS: A total of 17,733 cases with major congenital anomalies were reported, giving an overall prevalence of major congenital anomalies of 2.17%. The computer algorithm classified 10.5% of all cases as "potentially multiple congenital anomalies". After manual review of these cases, 7% were agreed to have true multiple congenital anomalies. Furthermore, the algorithm classified 15% of all cases as having chromosomal anomalies, 2% as monogenic syndromes, and 76% as isolated congenital anomalies. The proportion of multiple anomalies varies by congenital anomaly subgroup, reaching up to 35% for cases with bilateral renal agenesis. CONCLUSIONS: The implementation of the EUROCAT computer algorithm is a feasible, efficient, and transparent way to improve classification of congenital anomalies for surveillance and research.
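The flagging step such an algorithm performs can be sketched in a few lines. The version below is a deliberately crude illustration, not the EUROCAT algorithm itself: it only checks for chromosomal Q90-Q99 codes and then counts distinct three-character Q-code groups as a proxy for independent major anomalies, ignoring the minor-anomaly exclusions, sequences, and monogenic-syndrome rules the real classifier applies.

```python
# Crude ICD-10-based flagging sketch (illustrative only, not the EUROCAT rules).
def classify_case(icd10_codes):
    codes = {c.upper().strip() for c in icd10_codes}
    if any(c.startswith("Q9") for c in codes):           # Q90-Q99: chromosomal anomalies
        return "chromosomal"
    major_groups = {c[:3] for c in codes if c.startswith("Q")}
    if len(major_groups) >= 2:
        return "potential multiple congenital anomaly"   # flagged for manual review
    return "isolated" if major_groups else "no major congenital anomaly"

print(classify_case(["Q21.0", "Q35.5"]))  # two distinct anomaly groups -> flagged
print(classify_case(["Q90.9"]))           # Down syndrome code -> chromosomal
```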

Relevance: 30.00%

Abstract:

Geophysical techniques can help to bridge the inherent gap with regard to spatial resolution and range of coverage that plagues classical hydrological methods. This has led to the emergence of the new and rapidly growing field of hydrogeophysics. Given the differing sensitivities of various geophysical techniques to hydrologically relevant parameters and their inherent trade-off between resolution and range, the fundamental usefulness of multi-method hydrogeophysical surveys for reducing uncertainties in data analysis and interpretation is widely accepted. A major challenge arising from such endeavors is the quantitative integration of the resulting vast and diverse database in order to obtain a unified model of the probed subsurface region that is internally consistent with all available data. To address this problem, we have developed a strategy for hydrogeophysical data integration based on Monte-Carlo-type conditional stochastic simulation that we consider to be particularly suitable for local-scale studies characterized by high-resolution and high-quality datasets. Monte-Carlo-based optimization techniques are flexible and versatile, allow a wide variety of data and constraints of differing resolution and hardness to be accounted for, and thus have the potential of providing, in a geostatistical sense, highly detailed and realistic models of the pertinent target parameter distributions. Compared to more conventional approaches of this kind, our approach provides significant advancements in the way that the larger-scale deterministic information resolved by the hydrogeophysical data can be accounted for, which represents an inherently problematic, and as yet unresolved, aspect of Monte-Carlo-type conditional simulation techniques. We present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on pertinent synthetic data and then applied to corresponding field data collected at the Boise Hydrogeophysical Research Site near Boise, Idaho, USA.
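As a one-dimensional toy version of the conditional stochastic simulation idea (and only that: the authors' algorithm additionally honors the larger-scale structure resolved by the geophysical data), the sketch below performs a sequential Gaussian simulation along a borehole, drawing each node from its Gaussian distribution conditioned, via simple kriging, on the hard data and on previously simulated nodes.

```python
# Sequential Gaussian simulation in 1D with an exponential covariance; a toy
# stand-in for Monte-Carlo-type conditional stochastic simulation.
import numpy as np

def exp_cov(h, sill=1.0, corr_len=5.0):
    return sill * np.exp(-np.abs(h) / corr_len)

def sgs_1d(xs, data_x, data_z, seed=0):
    rng = np.random.default_rng(seed)
    known_x, known_z = list(data_x), list(data_z)
    sim = np.empty(len(xs))
    for i in rng.permutation(len(xs)):                         # visit nodes in random order
        kx = np.array(known_x)
        C = exp_cov(kx[:, None] - kx[None, :])
        c0 = exp_cov(kx - xs[i])
        w = np.linalg.solve(C + 1e-9 * np.eye(len(kx)), c0)    # simple-kriging weights
        mean = w @ np.array(known_z)
        var = max(exp_cov(0.0) - w @ c0, 0.0)
        sim[i] = rng.normal(mean, np.sqrt(var))
        known_x.append(xs[i]); known_z.append(sim[i])          # condition later nodes on it
    return sim

xs = np.linspace(0.0, 20.0, 41)                                # depths along a borehole (m)
# normal-score-transformed log values at three logged depths (illustrative)
realization = sgs_1d(xs, data_x=[2.0, 10.0, 17.0], data_z=[0.3, -0.5, 0.8])
print(realization[:5])
```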

Relevance: 30.00%

Abstract:

Background: Research in epistasis or gene-gene interaction detection for human complex traits has grown over the last few years. It has been marked by promising methodological developments, improved translation efforts of statistical epistasis to biological epistasis, and attempts to integrate different omics information sources into the epistasis screening to enhance power. The quest for gene-gene interactions poses severe multiple-testing problems. In this context, the maxT algorithm is one technique to control the false-positive rate. However, the memory needed by this algorithm rises linearly with the number of hypothesis tests. Gene-gene interaction studies will require memory proportional to the square of the number of SNPs. A genome-wide epistasis search would therefore require terabytes of memory. Hence, cache problems are likely to occur, increasing the computation time. In this work we present a new version of maxT that requires an amount of memory independent of the number of genetic effects to be investigated. This algorithm was implemented in C++ in our epistasis screening software MBMDR-3.0.3. We evaluate the new implementation in terms of memory efficiency and speed using simulated data. The software is illustrated on real-life data for Crohn’s disease. Results: In the case of a binary (affected/unaffected) trait, the parallel workflow of MBMDR-3.0.3 analyzes all gene-gene interactions in a dataset of 100,000 SNPs typed on 1,000 individuals within 4 days and 9 hours, using 999 permutations of the trait to assess statistical significance, on a cluster composed of 10 blades, each containing four quad-core AMD Opteron 2352 2.1 GHz processors. In the case of a continuous trait, a similar run takes 9 days. Our program found 14 SNP-SNP interactions with a multiple-testing corrected p-value of less than 0.05 on real-life Crohn’s disease (CD) data. Conclusions: Our software is the first implementation of the MB-MDR methodology able to solve large-scale SNP-SNP interaction problems within a few days, without using much memory, while adequately controlling the type I error rates. A new implementation to reach genome-wide epistasis screening is under construction. In the context of Crohn’s disease, MBMDR-3.0.3 could identify epistasis involving regions that are well known in the field and could be explained from a biological point of view. This demonstrates the power of our software to find relevant phenotype-genotype higher-order associations.
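The memory argument above comes down to what is stored across permutations. A single-step variant of the idea (a simplified sketch, not the MB-MDR test statistic or the step-down maxT used in MBMDR-3.0.3) keeps only the maximum statistic per permutation, so memory grows with the number of permutations rather than with the number of SNP pairs.

```python
# Single-step maxT sketch: per permutation we stream over all SNP pairs and
# retain only the running maximum, instead of a permutations-by-tests matrix.
import itertools
import numpy as np

def maxt_adjusted_pvalues(genotypes, trait, test_stat, n_perm=999, seed=1):
    """genotypes: (n_individuals, n_snps) array; test_stat: callable(g1, g2, y) -> float."""
    rng = np.random.default_rng(seed)
    pairs = list(itertools.combinations(range(genotypes.shape[1]), 2))
    observed = np.array([test_stat(genotypes[:, i], genotypes[:, j], trait)
                         for i, j in pairs])
    perm_max = np.empty(n_perm)
    for b in range(n_perm):
        y = rng.permutation(trait)
        perm_max[b] = max(test_stat(genotypes[:, i], genotypes[:, j], y)
                          for i, j in pairs)
    # single-step adjustment; the step-down refinement tightens this further
    return pairs, np.array([(1 + np.sum(perm_max >= t)) / (n_perm + 1) for t in observed])
```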

Relevance: 30.00%

Abstract:

A statewide study was performed to develop regional regression equations for estimating selected annual exceedance-probability statistics for ungaged stream sites in Iowa. The study area comprises streamgages located within Iowa and 50 miles beyond the State’s borders. Annual exceedance-probability estimates were computed for 518 streamgages by using the expected moments algorithm to fit a Pearson Type III distribution to the logarithms of annual peak discharges for each streamgage, using annual peak-discharge data through 2010. The estimation of the selected statistics included a Bayesian weighted least-squares/generalized least-squares regression analysis to update regional skew coefficients for the 518 streamgages. Low-outlier and historic information were incorporated into the annual exceedance-probability analyses, and a generalized Grubbs-Beck test was used to detect multiple potentially influential low flows. Also, geographic information system software was used to measure 59 selected basin characteristics for each streamgage. Regional regression analysis, using generalized least-squares regression, was used to develop a set of equations for each flood region in Iowa for estimating discharges for ungaged stream sites with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities, which are equivalent to annual flood-frequency recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, respectively. A total of 394 streamgages were included in the development of regional regression equations for three flood regions (regions 1, 2, and 3) that were defined for Iowa based on landform regions and soil regions. Average standard errors of prediction range from 31.8 to 45.2 percent for flood region 1, 19.4 to 46.8 percent for flood region 2, and 26.5 to 43.1 percent for flood region 3. The pseudo coefficients of determination for the generalized least-squares equations range from 90.8 to 96.2 percent for flood region 1, 91.5 to 97.9 percent for flood region 2, and 92.4 to 96.0 percent for flood region 3. The regression equations are applicable only to stream sites in Iowa with flows not significantly affected by regulation, diversion, channelization, backwater, or urbanization and with basin characteristics within the range of those used to develop the equations. These regression equations will be implemented within the U.S. Geological Survey StreamStats Web-based geographic information system tool. StreamStats allows users to click on any ungaged site on a river and compute estimates of the eight selected statistics; in addition, 90-percent prediction intervals and the measured basin characteristics for the ungaged sites are also provided by the Web-based tool. StreamStats also allows users to click on any streamgage in Iowa, and estimates computed for these eight selected statistics are provided for that streamgage.
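For context, fitting a Pearson Type III distribution to the logarithms of annual peaks reduces, in its simplest method-of-moments form, to estimating the mean, standard deviation, and skew of the log discharges and reading off quantiles. The sketch below shows that simplified version with made-up peak flows; it is not the expected moments algorithm, which additionally handles regional skew weighting, historical data, and low outliers.

```python
# Simplified log-Pearson Type III quantile estimate (method of moments);
# peak-flow values are made up, and the EMA refinements are omitted.
import numpy as np
from scipy import stats

peaks_cfs = np.array([3200, 5400, 2100, 8800, 4100, 6700, 2900, 12300, 3800, 5100])
logq = np.log10(peaks_cfs)
mean, std = logq.mean(), logq.std(ddof=1)
skew = stats.skew(logq, bias=False)

aep = 0.01                                   # 1% annual exceedance probability
q100 = 10 ** stats.pearson3.ppf(1 - aep, skew, loc=mean, scale=std)
print(f"estimated 1% AEP discharge: {q100:,.0f} cfs")
```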

Relevance: 30.00%

Abstract:

Introduction: In children with cystic fibrosis (CF), low immunoglobulin G (IgG) levels have been reported to be associated with significantly less severe lung disease. However, decreased IgG can be a sign of common variable immunodeficiency (CVID) and affect clinical outcome. The aim of this study was to analyze clinical and serological data of patients having low IgG levels in routine blood tests at annual assessment, particularly their antibody response to polysaccharide antigens. Method: Retrospective chart review of demographic data of CF patients followed at the pediatric CF clinic throughout 2009. Clinical parameters (genotype, pancreatic sufficiency, FEV1), presence of Pseudomonas aeruginosa (PA), and number of exacerbations per year were correlated with immunoglobulin and vaccination antibody levels (antibodies to pneumococcal serotypes 14, 19, 23, 1, 5, and 7F measured by enzyme-linked immunosorbent assay). Results: 4 out of 60 patients (6.7%) had low IgG levels for age. Ages ranged from 1 year 8 months to 11 years (2 boys, 2 girls). Three patients were delF508 homozygotes; one was a compound heterozygote delF508/G542X. All were pancreatic insufficient. FEV1 ranged from 74 to 108%. One patient had never been colonized by PA, two had intermittent PA colonization, and one was chronically infected. After conjugated vaccination, all patients had protective antibodies against serotypes 14, 19, and 23F. For serotypes not included in the vaccine, only one patient had protective titers, for 1 out of 3 serotypes. None of the patients had received unconjugated pneumococcal vaccine. There was no significant clinical difference in FEV1, PA colonization, or number of exacerbations according to IgG and vaccination antibody levels. Conclusion: Cystic fibrosis patients with low immunoglobulin levels have a normal antibody response to protein antigens. However, despite recurrent infections, there seems to be a delayed or deficient antibody response to polysaccharide antigens. Prospective studies are needed to evaluate the development of polysaccharide antibody responses in CF patients to monitor for CVID. With early detection of CF by newborn screening programs, long-term follow-up could be started early in childhood.