994 results for Filter methods
Abstract:
The relationship between electrophysiological and functional magnetic resonance imaging (fMRI) signals remains poorly understood. To date, studies have required invasive methods and have been limited to single functional regions and thus cannot account for possible variations across brain regions. Here we present a method that uses fMRI data and single-trial electroencephalography (EEG) analyses to assess the spatial and spectral dependencies between the blood-oxygenation-level-dependent (BOLD) responses and the noninvasively estimated local field potentials (eLFPs) over a wide range of frequencies (0-256 Hz) throughout the entire brain volume. This method was applied in a study where human subjects completed separate fMRI and EEG sessions while performing a passive visual task. Intracranial LFPs were estimated from the scalp-recorded data using the ELECTRA source model. We compared statistical images from BOLD signals with statistical images of each frequency of the eLFPs. In agreement with previous studies in animals, we found a significant correspondence between LFP and BOLD statistical images in the gamma band (44-78 Hz) within primary visual cortices. In addition, significant correspondence was observed at low frequencies (<14 Hz) and also at very high frequencies (>100 Hz). Effects within extrastriate visual areas showed a different correspondence that not only included those frequency ranges observed in primary cortices but also additional frequencies. Results therefore suggest that the relationship between electrophysiological and hemodynamic signals might vary both as a function of frequency and anatomical region.
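As a rough illustration of the kind of comparison described above (not the authors' actual pipeline), the sketch below correlates a hypothetical BOLD statistical map with per-frequency eLFP statistical maps over a shared set of voxels; the array names, voxel count, and 2 Hz frequency grid are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n_voxels = 5000                      # voxels in a shared brain mask (assumed)
freqs = np.arange(0, 258, 2)         # 0-256 Hz grid (assumed spacing)

# Hypothetical statistical images: one BOLD map, one eLFP map per frequency.
bold_map = rng.standard_normal(n_voxels)
elfp_maps = rng.standard_normal((freqs.size, n_voxels))

def spatial_corr(a, b):
    """Pearson correlation between two statistical maps over the same voxels."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

# Correspondence between the BOLD map and each frequency's eLFP map.
correspondence = np.array([spatial_corr(bold_map, elfp_maps[i]) for i in range(freqs.size)])

# Report the frequencies with the strongest spatial correspondence.
top = freqs[np.argsort(np.abs(correspondence))[-5:]]
print("Frequencies with strongest BOLD-eLFP correspondence (Hz):", sorted(top))
```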
Abstract:
Avalanche forecasting is a complex process involving the assimilation of multiple data sources to make predictions over varying spatial and temporal resolutions. Numerically assisted forecasting often uses nearest neighbour (NN) methods, which are known to have limitations when dealing with high-dimensional data. Support Vector Machines (SVMs) belong to a family of theoretically grounded techniques from machine learning and are designed to deal with high-dimensional data. We apply SVMs to a dataset from Lochaber, Scotland to assess their applicability in avalanche forecasting. Initial experiments showed that SVMs gave results which were comparable with NN for categorical and probabilistic forecasts. Experiments utilising the ability of SVMs to deal with high dimensionality in producing a spatial forecast show promise, but require further work.
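A minimal sketch of the kind of NN-versus-SVM comparison reported above, using scikit-learn classifiers on synthetic data; the feature matrix and the avalanche/no-avalanche labels are invented for illustration and do not reproduce the Lochaber dataset.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, brier_score_loss

rng = np.random.default_rng(1)

# Synthetic stand-in for daily snowpack/weather observations (high-dimensional).
n_days, n_features = 600, 30
X = rng.standard_normal((n_days, n_features))
y = (X[:, :5].sum(axis=1) + rng.standard_normal(n_days) > 1.5).astype(int)  # avalanche day?

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "NN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=10)),
    "SVM": make_pipeline(StandardScaler(), SVC(probability=True, C=1.0, gamma="scale")),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    p = model.predict_proba(X_te)[:, 1]                  # probabilistic forecast
    print(name,
          "accuracy:", round(accuracy_score(y_te, (p > 0.5).astype(int)), 3),
          "Brier score:", round(brier_score_loss(y_te, p), 3))
```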
Abstract:
The present research deals with an important public health threat, namely the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that should be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor. In particular, it was important to define the influence of geology as being closely associated with indoor radon. This association was indeed observed for the Swiss data but not proved to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both the univariate and multivariate level, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving windows methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase comes along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, the data partition was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset. A bottom-to-top method complexity approach was adopted and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests for data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multigaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for extreme values modeling through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for data classification hardening. Within the classification methods, probabilistic neural networks (PNN) proved to be better adapted for the modeling of high-threshold categorization and for automation. Support vector machines (SVM), on the contrary, performed well under balanced category conditions.
In general, it was concluded that no particular prediction or estimation method is better under all conditions of scale and neighborhood definitions. Simulations should be the basis, while other methods can provide complementary information to support efficient decision making on indoor radon.
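To make the cross-validation-based filtering idea mentioned above more concrete, here is a minimal sketch of a leave-one-out filter in the spirit of the CVMF; the neighbour count, the z-score threshold, and the synthetic radon-like data are assumptions, not the thesis implementation.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(2)

# Synthetic stand-in for indoor radon measurements at scattered locations.
coords = rng.uniform(0, 100, size=(400, 2))          # x, y coordinates in km (assumed)
values = np.exp(rng.normal(4.0, 0.8, size=400))      # lognormal-like radon levels (Bq/m3)

def cross_validation_filter(coords, values, n_neighbors=8, z_thresh=2.5):
    """Flag points whose leave-one-out KNN prediction error is unusually large."""
    errors = np.empty(len(values))
    for i in range(len(values)):
        mask = np.ones(len(values), dtype=bool)
        mask[i] = False                               # leave sample i out
        knn = KNeighborsRegressor(n_neighbors=n_neighbors, weights="distance")
        knn.fit(coords[mask], values[mask])
        errors[i] = values[i] - knn.predict(coords[i:i + 1])[0]
    z = (errors - errors.mean()) / errors.std()
    return np.abs(z) < z_thresh                       # True = keep the sample

keep = cross_validation_filter(coords, values)
print(f"kept {keep.sum()} of {len(values)} samples after the cross-validation filter")
```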
Abstract:
Hematocrit (Hct) is one of the most critical issues associated with the bioanalytical methods used for dried blood spot (DBS) sample analysis. Because Hct determines the viscosity of blood, it may affect the spreading of blood onto the filter paper. Hence, accurate quantitative data can only be obtained if the extracted area of the filter paper contains a fixed blood volume. We describe for the first time a microfluidic-based sampling procedure that enables accurate blood volume collection on commercially available DBS cards. The system allows the collection of a controlled volume of blood (e.g., 5 or 10 μL) within several seconds. Reproducibility of the sampling volume was examined in vivo on capillary blood by quantifying caffeine and paraxanthine on 5 different extracted DBS spots at two different time points, and in vitro with a test compound, Mavoglurant, on 10 different spots at two Hct levels. Entire spots were extracted. In addition, the accuracy and precision (n = 3) data for the Mavoglurant quantitation in blood with Hct levels between 26% and 62% were evaluated. The interspot precision data were below 9.0%, which was equivalent to that of a volume spotted manually with a pipet. No Hct effect was observed in the quantitative results obtained for Hct levels from 26% to 62%. These data indicate that our microfluidic-based sampling procedure is accurate and precise and that the analysis of Mavoglurant is not affected by the Hct values. This provides a simple procedure for DBS sampling with a fixed volume of capillary blood, which could eliminate the recurrent Hct issue linked to DBS sample analysis.
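For readers unfamiliar with the precision figure quoted above, the short sketch below computes an interspot coefficient of variation (%CV) for a set of replicate spot measurements; the concentration values are invented for illustration and do not come from the study.

```python
import statistics

# Hypothetical replicate concentrations (ng/mL) measured on separately extracted spots.
spot_concentrations = [101.2, 97.8, 104.5, 99.1, 102.6]

mean = statistics.mean(spot_concentrations)
sd = statistics.stdev(spot_concentrations)          # sample standard deviation
cv_percent = 100 * sd / mean                        # interspot precision (%CV)

print(f"mean = {mean:.1f} ng/mL, SD = {sd:.2f}, interspot precision = {cv_percent:.1f}%")
```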
Abstract:
The need for upgrading a large number of understrength and obsolete bridges in the United States has been well documented in the literature. Through several Iowa DOT projects, the concept of strengthening bridges (simple and continuous spans) by post-tensioning has been developed. The purpose of this project was to investigate two additional strengthening alternatives that may be more efficient than post-tensioning in certain situations. The research program for each strengthening scheme included a literature review, laboratory testing of the strengthening scheme, and a finite-element analysis of the scheme. For clarity, the two strengthening schemes are presented separately. In Part 1 of this report, the strengthening of existing steel stringers in composite steel beam concrete-deck bridges by providing partial end restraint was shown to be feasible. Part 2 of this report summarizes the research that was undertaken to strengthen the negative moment regions of continuous, composite bridges. Two schemes were investigated: post-compression of stringers and superimposed trusses within the stringers.
Abstract:
As a combination of probability and graph theory, Bayesian networks currently enjoy widespread interest as a means for studying factors that affect the coherent evaluation of scientific evidence in forensic science. Paper I of this series intends to contribute to the discussion of Bayesian networks as a framework that is helpful for both illustrating and implementing statistical procedures that are commonly employed for the study of uncertainties (e.g. the estimation of unknown quantities). While the respective statistical procedures are widely described in the literature, the primary aim of this paper is to offer an essentially non-technical introduction on how interested readers may use these analytical approaches - with the help of Bayesian networks - for processing their own forensic science data. Attention is mainly drawn to the structure and underlying rationale of a series of basic and context-independent network fragments that users may incorporate as building blocks while constructing larger inference models. As an example of how this may be done, the proposed concepts will be used in a second paper (Part II) for specifying graphical probability networks whose purpose is to assist forensic scientists in the evaluation of scientific evidence encountered in the context of forensic document examination (i.e. results of the analysis of black toners present on printed or copied documents).
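The simplest kind of context-independent network fragment described above is a two-node hypothesis-evidence pair combined via Bayes' theorem; the sketch below works through one such fragment. All probability values are illustrative placeholders, not data from the paper.

```python
# A minimal two-node fragment: hypothesis node H and findings node E.
# All probabilities here are hypothetical, chosen only to show the mechanics.

prior_H = {"H_true": 0.5, "H_false": 0.5}            # P(H)
likelihood_E = {"H_true": 0.85, "H_false": 0.02}     # P(E | H)

def posterior(prior, likelihood):
    """Return P(H | E) for each hypothesis state, given that E was observed."""
    joint = {h: prior[h] * likelihood[h] for h in prior}
    p_evidence = sum(joint.values())                 # P(E), the normalising constant
    return {h: joint[h] / p_evidence for h in joint}

post = posterior(prior_H, likelihood_E)
likelihood_ratio = likelihood_E["H_true"] / likelihood_E["H_false"]

print("posterior:", {h: round(p, 3) for h, p in post.items()})
print("likelihood ratio:", likelihood_ratio)
```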
Abstract:
Summary: A catchment-scale model application of the use of buffer strips for erosion control
Abstract:
AASHTO has a standard test method for determining the specific gravity of aggregates. The staff of the Aggregate Section of the Central Materials Laboratory perform the AASHTO T-85 test for AMRL inspections and reference samples. Iowa's test method 201B, for specific gravity determinations, requires more time and more care to perform than the AASHTO procedure. The major difference between the two procedures is that T-85 requires the sample to be weighed in water, whereas 201B requires the use of a 2-quart pycnometer jar. Efficiency in the Central Laboratory would be increased if the AASHTO procedure for coarse aggregate specific gravity determinations were adopted. The questions to be answered were: (1) Do the two procedures yield the same test results? (2) Do the two procedures yield the same precision? An experiment was conducted to study the different test methods. From the experimental results, specific gravity determinations by the AASHTO T-85 method were found to correlate with those obtained by the Iowa 201B method with an R-squared value of 0.99. The absorption values correlated with an R-squared value of 0.98. The single-operator precision was equivalent for the two methods. Hence, the AASHTO procedure was recommended for adoption in the Central Laboratory.
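The correlation reported above comes down to fitting a line between paired determinations from the two procedures and computing R-squared; the sketch below shows that calculation on invented paired measurements, which are not the study's data and do not reproduce its 0.99 and 0.98 values.

```python
import numpy as np

# Hypothetical paired specific-gravity determinations on the same aggregate samples.
iowa_201b = np.array([2.612, 2.587, 2.640, 2.601, 2.655, 2.620])
aashto_t85 = np.array([2.610, 2.590, 2.638, 2.603, 2.652, 2.622])

# Least-squares fit of T-85 results against 201B results, and the R-squared value.
slope, intercept = np.polyfit(iowa_201b, aashto_t85, 1)
predicted = slope * iowa_201b + intercept
ss_res = np.sum((aashto_t85 - predicted) ** 2)
ss_tot = np.sum((aashto_t85 - aashto_t85.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

print(f"slope = {slope:.3f}, intercept = {intercept:.3f}, R^2 = {r_squared:.4f}")
```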
Abstract:
Most states, including Iowa, have a significant number of substandard bridges. This number will increase significantly unless some type of preventative maintenance is employed. Both the Iowa Department of Transportation and Iowa counties have successfully employed numerous maintenance, repair and rehabilitation (MR&R) strategies for correcting various types of deficiencies. However, successfully employed MR&R procedures are often not systematically documented or defined for those involved in bridge maintenance. This study addressed the need for a standard bridge MR&R manual for Iowa with emphasis on secondary road applications. As part of the study, bridge MR&R activities that are relevant to the state of Iowa have been systematically categorized into a manual, in a standardized format. Where pertinent, design guidelines have been presented. Material presented in this manual is divided into two major categories: 1) Repair and Rehabilitation of Bridge Superstructure Components, and 2) Repair and Rehabilitation of Bridge Substructure Components. There are multiple subcategories within both major categories that provide detailed information. Some of the detailed information includes step-by-step procedures for accomplishing MR&R activities, material specifications and detailed drawings where available. The source of information contained in the manual is public domain technical literature and information provided by Iowa County Engineers. A questionnaire was sent to all 99 counties in Iowa to solicit information, and the research team personally solicited input from many Iowa counties as a follow-up to the questionnaire.