970 results for STATIONARY SECTORIAL SAMPLER
Abstract:
We present MBIS (Multivariate Bayesian Image Segmentation tool), a clustering tool based on the mixture of multivariate normal distributions model. MBIS supports multichannel bias field correction based on a B-spline model. A second methodological novelty is the inclusion of graph-cuts optimization for the stationary anisotropic hidden Markov random field model. Along with MBIS, we release an evaluation framework that contains three different experiments on multi-site data. We first validate the accuracy of segmentation and the estimated bias field for each channel. MBIS outperforms a widely used segmentation tool in a cross-comparison evaluation. The second experiment demonstrates the robustness of results on atlas-free segmentation of two image sets from scan-rescan protocols on 21 healthy subjects. Multivariate segmentation is more replicable than the monospectral counterpart on T1-weighted images. Finally, we provide a third experiment to illustrate how MBIS can be used in a large-scale study of tissue volume change with increasing age in 584 healthy subjects. This last result is meaningful as multivariate segmentation performs robustly without the need for prior knowledge.
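As an illustration of the mixture-of-multivariate-normals model underlying this kind of tool, the following is a minimal sketch using scikit-learn's GaussianMixture on two synthetic co-registered channels; it is not the MBIS implementation, and the volumes and class count are placeholders.

```python
# Minimal sketch of multivariate normal mixture clustering, as in the model
# underlying MBIS (not the MBIS code itself; hypothetical variable names).
import numpy as np
from sklearn.mixture import GaussianMixture

# Stack co-registered channels (e.g. T1w, T2w) into one feature vector per voxel.
t1 = np.random.rand(64, 64, 64)   # placeholder volumes
t2 = np.random.rand(64, 64, 64)
features = np.column_stack([t1.ravel(), t2.ravel()])

# Three tissue classes (e.g. CSF, GM, WM) modelled as multivariate normals.
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
labels = gmm.fit_predict(features).reshape(t1.shape)
```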
Abstract:
This case study deals with rock face monitoring in urban areas using a Terrestrial Laser Scanner (TLS). The pilot study area is an almost vertical, fifty-meter-high cliff, on top of which the village of Castellfollit de la Roca is located. Rockfall activity is currently causing a retreat of the rock face, which may endanger the houses located at its edge. The TLS datasets consist of high-density 3-D point clouds acquired from five stations, nine times over a span of 22 months (from March 2006 to January 2008). Change detection, i.e. rockfalls, was performed through a sequential comparison of datasets. Two types of mass movement were detected in the monitoring period: (a) detachment of single basaltic columns, with magnitudes below 1.5 m³, and (b) detachment of groups of columns, with magnitudes of 1.5 to 150 m³. Furthermore, the historical record revealed (c) the occurrence of slab failures with magnitudes higher than 150 m³. Displacements of a likely slab failure were measured, suggesting an apparent stationary stage. Even though failures are clearly episodic, our results, together with the study of the historical record, enabled us to estimate a mean detachment of material of 46 to 91.5 m³ year⁻¹. The application of TLS considerably improved our understanding of rockfall phenomena in the study area.
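For readers unfamiliar with how such monitoring results translate into an annual figure, the short sketch below shows the straightforward conversion from a cumulative detached volume over a 22-month window to a mean yearly detachment rate; the volume used is a placeholder, not the study's data.

```python
# Illustrative conversion of a cumulative detached rock volume into an annual
# detachment rate; the volume below is a placeholder, not the study's data.
monitoring_months = 22            # March 2006 to January 2008
detached_volume_m3 = 120.0        # hypothetical total volume detected by TLS

annual_rate = detached_volume_m3 / (monitoring_months / 12.0)
print(f"mean detachment rate: {annual_rate:.1f} m3/year")
```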
Abstract:
The measurement of multi-segment foot and ankle complex kinematics is usually done with stationary motion capture devices, which are limited to use in a gait laboratory. This study aimed to propose and validate a wearable system to measure the foot and ankle complex joint angles during gait in daily conditions, and then to investigate its suitability for clinical evaluations. The foot and ankle complex consisted of four segments (shank, hindfoot, forefoot, and toes), with an inertial measurement unit (3D gyroscopes and 3D accelerometers) attached to each segment. The angles between the four segments were calculated in the sagittal, coronal, and transverse planes using a new algorithm combining strap-down integration and detection of low-acceleration instants. To validate the joint angles measured by the wearable system, three subjects walked on a treadmill for five minutes at three different speeds. A camera-based stationary system that used a cluster of markers on each segment was used as a reference. To test the suitability of the system for clinical evaluation, the joint angle ranges were compared between a group of 10 healthy subjects and a group of 12 patients with ankle osteoarthritis, during two 50-m walking trials with the wearable system attached to each subject. On average, over all joints and walking speeds, the RMS differences and correlation coefficients between the angular curves obtained using the wearable system and the stationary system were 1 deg and 0.93, respectively. Moreover, this system was able to detect significant alteration of foot and ankle function between the group of patients with ankle osteoarthritis and the group of healthy subjects. In conclusion, this wearable system was accurate and suitable for clinical evaluation when used to measure multi-segment foot and ankle complex kinematics during long-distance walks in daily life conditions.
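A highly simplified, one-axis sketch of the general idea behind strap-down integration with drift correction at low-acceleration instants is given below; it is not the authors' algorithm, and the sampling rate and stillness tolerance are arbitrary assumptions.

```python
# Simplified 1-D sketch of strap-down integration with drift correction at
# low-acceleration instants (illustrative only, not the authors' algorithm).
import numpy as np

def segment_angle(gyro_rad_s, accel_norm_g, fs=200.0, still_tol=0.05):
    """Integrate angular rate; re-zero the angle whenever the acceleration
    norm is close to 1 g, i.e. the segment is assumed quasi-static."""
    dt = 1.0 / fs
    angle = np.zeros_like(gyro_rad_s, dtype=float)
    for i in range(1, len(gyro_rad_s)):
        angle[i] = angle[i - 1] + gyro_rad_s[i] * dt
        if abs(accel_norm_g[i] - 1.0) < still_tol:   # low-acceleration instant
            angle[i] = 0.0                            # reset integration drift
    return np.degrees(angle)
```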
Abstract:
The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that should be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor. In particular, it was important to define the influence of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data, but it was not proved to be the sole determinant for the spatial modeling. The statistical analysis of data, at both the univariate and multivariate levels, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase came along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, the data partition was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset. A bottom-up method-complexity approach was adopted and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests for data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multigaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Among the classification methods, probabilistic neural networks (PNN) proved to be better adapted for modeling high-threshold categorization and for automation. Support vector machines (SVM), on the contrary, performed well under balanced category conditions.
In general, it was concluded that no single prediction or estimation method performs best under all conditions of scale and neighborhood definition. Simulations should form the basis, while other methods can provide complementary information for efficient indoor radon decision making.
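As a concrete example of the exploratory interpolation step mentioned above, the following sketch applies distance-weighted k-nearest-neighbour regression to synthetic indoor radon data; the coordinates, values, and neighbourhood size are assumptions for illustration only.

```python
# Exploratory k-nearest-neighbour spatial interpolation of indoor radon values
# (synthetic coordinates and values, illustrative neighbourhood size).
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

coords = np.random.rand(500, 2) * 100.0                      # x, y in km (synthetic)
radon = np.random.lognormal(mean=4.0, sigma=0.6, size=500)   # Bq/m3 (synthetic)

knn = KNeighborsRegressor(n_neighbors=10, weights="distance")
knn.fit(coords, radon)

# Estimate radon on a regular 5-km grid covering the synthetic domain.
grid = np.array([[x, y] for x in range(0, 100, 5) for y in range(0, 100, 5)])
estimates = knn.predict(grid)
```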
The relationship between Lamb weather types and long-term changes in flood frequency, River Eden, UK
Abstract:
Research has found that both flood magnitude and frequency in the UK may have increased over the last five decades. However, evaluating whether or not this is a systematic trend is difficult because of the lack of longer records. Here we compile and consider an extreme flood record that extends back to 1770. Since 1770, there have been 137 recorded extreme floods. However, over this period, there is not a unidirectional trend of rising extreme flood risk over time. Instead, there are clear flood-rich and flood-poor periods. Three main flood-rich periods were identified: 1873–1904, 1923–1933, and 1994 onwards. To provide a first analysis of what is driving these periods, and given the paucity of more sophisticated datasets that extend back to the 18th century, objective Lamb weather types were used. Of the 27 objective Lamb weather types, only 11 could be associated with the extreme floods during the gauged period, and only 5 of these accounted for >80% of recorded extreme floods. The importance of these five weather types over a longer timescale for flood risk in Carlisle was assessed by calculating the proportion of each hydrological year classified as being associated with these flood-generating weather types. Two periods, 1900–1940 and 1983–2007, clearly had more than the average proportion of the year classified as one of the flood-causing weather types, and both contained flood-rich hydrological records. Thus, the analysis suggests that systematic organisation of the North Atlantic climate system may be manifest as periods of elevated and reduced flood risk, an observation that has major implications for analyses that assume that climatic drivers of flood risk are either statistically stationary or follow a simple trend. Copyright (c) 2011 Royal Meteorological Society
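The yearly-proportion calculation described above can be sketched as follows; the weather-type codes and the daily series are synthetic placeholders, not the five flood-generating types identified in the study.

```python
# Sketch of the yearly-proportion calculation: fraction of days in each
# hydrological year (Oct-Sep) assigned to flood-generating Lamb weather types.
# The type codes and the daily series are placeholders, not the study's data.
import pandas as pd

flood_types = {"C", "W", "SW", "CSW", "CW"}          # hypothetical codes
days = pd.date_range("1900-01-01", "1940-12-31", freq="D")
series = pd.Series(["C" if d.dayofyear % 7 == 0 else "A" for d in days], index=days)

# Hydrological year: October onwards counts toward the following year.
hydro_year = series.index.year + (series.index.month >= 10).astype(int)
proportion = series.isin(flood_types).groupby(hydro_year).mean()
```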
Abstract:
The Iowa Department of Transportation Materials Laboratory personnel announced in early 1982 a process to produce a road deicer consisting of sand grains coated with calcium magnesium acetate (CMA). From that point forward, the Iowa DOT began searching for a means of economically producing CMA according to this concept. During 1983 and 1984 the first attempts at commercially producing CMA were made by the W.G. Block Company, Davenport, Iowa, under Iowa Highway Research Board Project HR-253. This first attempt at commercially producing CMA used concrete transit mixer equipment. Although this procedure proved successful for batch mixing of CMA, the need for higher production rates to reduce the cost per ton remained. During the fall of 1984, Cedarapids Inc, Cedar Rapids, Iowa, proposed to Iowa DOT personnel the application of their technology to a continuous mixing concept for CMA. Arrangements were made for the continuous test mixing of 60 to 100 tons of CMA/sand deicer. This report covers the production effort, description and results of procedures outlined in Cedarapids Inc's proposal of September 19, 1984. The objectives of this research were: 1. To produce the CMA/sand deicer on a continuous mixing basis at Iowa DOT CMA concentration levels. 2. To evaluate the results of preheating the carrying vehicle (sand) prior to introduction of the CMA ingredients. 3. To analyze the feasibility of production equipment and procedures necessary for portable and/or stationary applications of continuous mixing concepts.
Abstract:
BACKGROUND: The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC) algorithms, which allow one to obtain parameter posterior distributions based on simulations, without requiring likelihood computations. RESULTS: Here we present ABCtoolbox, a series of open-source programs to perform Approximate Bayesian Computation (ABC). It implements various ABC algorithms, including rejection sampling, MCMC without likelihood, a particle-based sampler, and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can interact with most simulation and summary-statistics computation programs. The usability of ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates and to find that males show smaller population sizes but much higher levels of migration than females. CONCLUSION: ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis, from sampling parameters from prior distributions, simulating data, and computing summary statistics, to estimating posterior distributions, choosing among models, validating the estimation procedure, and visualizing the results.
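The core idea of ABC rejection sampling, one of the algorithms implemented in ABCtoolbox, can be sketched on a toy model as follows; the prior, model, and tolerance are illustrative assumptions, not a population-genetics setup.

```python
# Minimal ABC rejection sampler illustrating the principle (toy model).
import numpy as np

rng = np.random.default_rng(0)
observed = 5.0                        # observed summary statistic

def simulate(theta):                  # toy stochastic model
    return rng.normal(theta, 1.0, size=100).mean()

accepted = []
while len(accepted) < 1000:
    theta = rng.uniform(0.0, 10.0)                # draw from the prior
    if abs(simulate(theta) - observed) < 0.1:     # keep if summary is close
        accepted.append(theta)

posterior_sample = np.array(accepted)             # approximate posterior draws
```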
Abstract:
The AASHO specifications for highway bridges require that in designing a bridge, the live load must be multiplied by an impact factor for which a formula is given, dependent only upon the length of the bridge. This formula is a result of August Wöhler's tests on fatigue in metals, in which he determined that metals which are subjected to large alternating loads will ultimately fail at lower stresses than those which are subjected only to continuous static loads. It is felt by some investigators that the present impact factor is not realistic, and it is suggested that a consideration of the increased stress due to vibrations caused by vehicles traversing the span would result in a more realistic impact factor than now exists. Since the current highway program requires a large number of bridges to be built, the need for data on the dynamic behavior of bridges is apparent. Much excellent material has already been gathered on the subject, but many questions remain unanswered. This work is designed to investigate further a specific corner of that subject, in the hope that some useful light may be shed on it. Specifically, this study aims to determine, by experiment on a small-scale test bridge, the upper limits of impact, utilizing a stationary, oscillating load to represent axle loads moving past a given point. The experiments were performed on a small-scale bridge located in the basement of the Iowa Engineering Experiment Station. The bridge is a 25-foot simply supported span, 10 feet wide, supported by four beams with a composite concrete slab. It is assumed that the magnitude of the predominant forcing function is the same as the magnitude of the dynamic force produced by a smoothly rolling load, and that it has a frequency determined by the passage of axles. The frequency of passage of axles is defined as the speed of the vehicle divided by the axle spacing. Factors affecting the response of the bridge to this forcing function are the bridge stiffness and mass, which determine the natural frequency, and the effects of solid damping due to internal structural energy dissipation.
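The axle-passage forcing frequency defined above (vehicle speed divided by axle spacing) can be computed directly; the speed and spacing below are illustrative values, not those of the test bridge.

```python
# Forcing frequency from axle passage: vehicle speed divided by axle spacing
# (illustrative values, not the test-bridge parameters).
speed_mph = 45.0
speed_fps = speed_mph * 5280.0 / 3600.0   # convert mph to ft/s
axle_spacing_ft = 14.0

forcing_freq_hz = speed_fps / axle_spacing_ft
print(f"axle-passage forcing frequency: {forcing_freq_hz:.2f} Hz")
```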
Abstract:
Quality granular materials suitable for building all-weather roads are not uniformly distributed throughout the state of Iowa. For this reason the Iowa Highway Research Board has sponsored a number of research programs for the purpose of developing new and effective methods for making use of whatever materials are locally available. This need is ever more pressing today due to the decreasing availability of road funds and quality materials, and the increasing costs of energy and all types of binder materials. In the 1950s, Professor L. H. Csanyi of Iowa State University had demonstrated both in the laboratory and in the field, in Iowa and in a number of foreign countries, the effectiveness of preparing low cost mixes by stabilizing ungraded local aggregates such as gravel, sand and loess with asphalt cements using the foamed asphalt process. In this process controlled foam was produced by introducing saturated steam at about 40 psi into heated asphalt cement at about 25 psi through a specially designed and properly adjusted nozzle. The reduced viscosity and the increased volume and surface energy in the foamed asphalt allowed intimate coating and mixing of cold, wet aggregates or soils. Through the use of asphalt cements in a foamed state, materials normally considered unsuitable could be used in the preparation of mixes for stabilized bases and surfaces for low traffic road construction. By attaching the desired number of foam nozzles, the foamed asphalt can be used in conjunction with any type of mixing plant, either stationary or mobile, batch or continuous, central plant or in-place soil stabilization.
Abstract:
Limited antimicrobial agents are available for the treatment of implant-associated infections caused by fluoroquinolone-resistant Gram-negative bacilli. We compared the activities of fosfomycin, tigecycline, colistin, and gentamicin (alone and in combination) against a CTX-M15-producing strain of Escherichia coli (Bj HDE-1) in vitro and in a foreign-body infection model. The MIC and the minimal bactericidal concentration in logarithmic phase (MBC(log)) and stationary phase (MBC(stat)) were 0.12, 0.12, and 8 μg/ml for fosfomycin, 0.25, 32, and 32 μg/ml for tigecycline, 0.25, 0.5, and 2 μg/ml for colistin, and 2, 8, and 16 μg/ml for gentamicin, respectively. In time-kill studies, colistin showed concentration-dependent activity, but regrowth occurred after 24 h. Fosfomycin demonstrated rapid bactericidal activity at the MIC, and no regrowth occurred. Synergistic activity between fosfomycin and colistin in vitro was observed, with no detectable bacterial counts after 6 h. In animal studies, fosfomycin reduced planktonic counts by 4 log(10) CFU/ml, whereas in combination with colistin, tigecycline, or gentamicin, it reduced counts by >6 log(10) CFU/ml. Fosfomycin was the only single agent which was able to eradicate E. coli biofilms (cure rate, 17% of implanted, infected cages). In combination, colistin plus tigecycline (50%) and fosfomycin plus gentamicin (42%) cured significantly more infected cages than colistin plus gentamicin (33%) or fosfomycin plus tigecycline (25%) (P < 0.05). The combination of fosfomycin plus colistin showed the highest cure rate (67%), which was significantly better than that of fosfomycin alone (P < 0.05). In conclusion, the combination of fosfomycin plus colistin is a promising treatment option for implant-associated infections caused by fluoroquinolone-resistant Gram-negative bacilli.
Abstract:
Precession electron diffraction (PED) is a hollow-cone, non-stationary illumination technique for electron diffraction pattern collection under quasi-kinematical conditions (as in X-ray diffraction), which enables "ab initio" solving of the crystalline structures of nanocrystals. The PED technique is now used in TEM instruments operating at voltages of 100 to 300 kV to turn them into true electron diffractometers, thus enabling electron crystallography. The PED technique, when combined with fast electron diffraction acquisition and pattern-matching software techniques, may also be used for high-magnification, ultra-fast mapping of variable crystal orientations and phases, similarly to what is achieved with the Electron Backscatter Diffraction (EBSD) technique in Scanning Electron Microscopes (SEM) at lower magnifications and longer acquisition times.
Abstract:
Solid-phase microextraction (SPME) has been widely used for many years in various applications, such as environmental and water samples, food and fragrance analysis, or biological fluids. The aim of this study was to propose the SPME method as an alternative to conventional techniques used in the evaluation of worker exposure to benzene, toluene, ethylbenzene, and xylene (BTEX). Polydimethylsiloxane/Carboxen (PDMS/CAR) proved to be the most effective stationary-phase material for sorbing BTEX among the materials tested (polyacrylate, PDMS, PDMS/divinylbenzene, Carbowax/divinylbenzene). Various experimental conditions were studied to apply SPME to BTEX quantitation in field situations. The uptake rate of the selected fiber (75 μm PDMS/CAR) was determined for each analyte at various concentrations, relative humidities, and airflow velocities from static (calm air) to dynamic (>200 cm/s) conditions. The SPME method was also compared with the National Institute for Occupational Safety and Health method 1501. Unlike the latter, the SPME approach fulfills the new requirement for the threshold limit value-short-term exposure limit (TLV-STEL) of 2.5 ppm for benzene (8 mg/m3).
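When an SPME fiber is used as a passive sampler, the airborne concentration is commonly back-calculated from the mass sorbed, the fiber uptake rate, and the exposure time; the sketch below uses hypothetical values rather than those determined for the 75 μm PDMS/CAR fiber.

```python
# Sketch of the usual uptake-rate calculation for a passively exposed fiber:
# concentration = mass sorbed / (uptake rate * exposure time).
# All values are hypothetical, not those reported for the PDMS/CAR fiber.
mass_sorbed_ng = 150.0      # mass of benzene on the fiber (hypothetical)
uptake_rate_ml_min = 0.8    # fiber uptake rate for benzene (hypothetical)
exposure_min = 15.0

conc_ng_per_ml = mass_sorbed_ng / (uptake_rate_ml_min * exposure_min)
conc_mg_per_m3 = conc_ng_per_ml            # 1 ng/mL equals 1 mg/m3
print(f"time-weighted average: {conc_mg_per_m3:.2f} mg/m3")
```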
Abstract:
A headspace solid-phase microextraction (HS-SPME) procedure was developed for the profiling of traces present in 3,4-methylenedioxymethamphetamine (MDMA). Traces were first extracted using HS-SPME and then analyzed by gas chromatography-mass spectrometry (GC-MS). The HS-SPME conditions were optimized by varying the experimental conditions. Optimal results were obtained when 40 mg of crushed MDMA sample was heated at 80 °C for 15 min, followed by extraction at 80 °C for 15 min with a polydimethylsiloxane/divinylbenzene-coated fibre. A total of 31 compounds were identified as traces related to MDMA synthesis, namely precursors, intermediates or by-products. In addition, some fatty acids used as tabletting materials, as well as caffeine used as an adulterant, were also detected. The use of a restricted set of 10 target compounds was also proposed for developing a screening tool for clustering samples with close profiles. A total of 114 seizures were analyzed using an SPME auto-sampler (MultiPurpose Sampler MPS2), purchased from Gerstel GmbH & Co. (Germany) and coupled to GC-MS. The data were handled using various pre-treatment methods, followed by the study of similarities between sample pairs based on the Pearson correlation. The results show that HS-SPME, coupled with a suitable statistical method, is a powerful tool for distinguishing specimens coming from the same seizure from specimens coming from different seizures. This information can be used by law enforcement personnel to visualize the ecstasy distribution network as well as the clandestine tablet manufacturing.
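The pairwise-similarity step based on the Pearson correlation can be sketched as follows; the two ten-component profiles are synthetic stand-ins for the target-compound peak areas, and the linkage threshold is a hypothetical choice.

```python
# Sketch of the pairwise-similarity step: Pearson correlation between the
# target-compound profiles of two seizures (synthetic peak-area vectors).
import numpy as np

profile_a = np.array([4.1, 0.3, 2.2, 0.0, 1.5, 0.8, 0.1, 3.0, 0.6, 0.2])
profile_b = np.array([3.9, 0.4, 2.0, 0.1, 1.7, 0.7, 0.2, 2.8, 0.5, 0.3])

r = np.corrcoef(profile_a, profile_b)[0, 1]    # Pearson correlation coefficient
linked = r > 0.99                              # hypothetical linkage threshold
```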
Abstract:
PURPOSE: Congenital stationary night blindness (CSNB) is a clinically and genetically heterogeneous retinal disease. Although electroretinographic (ERG) measurements can discriminate clinical subgroups, the identification of the underlying genetic defects has been complicated for CSNB because of genetic heterogeneity, the uncertainty about the mode of inheritance, and time-consuming and costly mutation scanning and direct sequencing approaches. METHODS: To overcome these challenges and to generate a time- and cost-efficient mutation screening tool, the authors developed a CSNB genotyping microarray with arrayed primer extension (APEX) technology. To cover as many mutations as possible, a comprehensive literature search was performed, and DNA samples from a cohort of patients with CSNB were first sequenced directly in known CSNB genes. Subsequently, oligonucleotides were designed representing 126 sequence variations in RHO, CABP4, CACNA1F, CACNA2D4, GNAT1, GRM6, NYX, PDE6B, and SAG and spotted on the chip. RESULTS: Direct sequencing of genes known to be associated with CSNB in the study cohort revealed 21 mutations (12 novel and 9 previously reported). The resultant microarray, containing oligonucleotides that allow the detection of 126 known and novel mutations, was 100% effective in determining the expected sequence changes in all known samples assessed. In addition, investigation of 34 patients with CSNB who were previously not genotyped revealed sequence variants in 18%, of which 15% are thought to be disease-causing mutations. CONCLUSIONS: This relatively inexpensive first-pass genetic testing device for patients with a diagnosis of CSNB will improve molecular diagnostics and genetic counseling of patients and their families, and gives the opportunity to analyze whether the same gene defects underlie, for example, more progressive disorders such as cone or cone-rod dystrophies.
Abstract:
A simple method for determining airborne monoethanolamine has been developed. Monoethanolamine determination has traditionally been difficult due to analytical separation problems. Even in recent sophisticated methods, this difficulty remains the major issue, often resulting in time-consuming sample preparation. Impregnated glass fiber filters were used for sampling. Desorption of monoethanolamine was followed by capillary GC analysis and nitrogen-phosphorus selective detection. Separation was achieved using a column specific for monoethanolamine (35% diphenyl and 65% dimethyl polysiloxane). The internal standard was quinoline. Derivatization steps were not needed. The calibration range was 0.5-80 μg/mL with good correlation (R(2) = 0.996). Averaged overall precisions and accuracies were 4.8% and -7.8% for intraday (n = 30), and 10.5% and -5.9% for interday (n = 72) determinations. Mean recovery from spiked filters was 92.8% for the intraday variation and 94.1% for the interday variation. Monoethanolamine on stored spiked filters was stable for at least 4 weeks at 5°C. This newly developed method was applied among professional cleaners, and air concentrations (n = 4) were 0.42 and 0.17 mg/m(3) for personal measurements and 0.23 and 0.43 mg/m(3) for stationary measurements. The method described here for airborne monoethanolamine was simple, sensitive, and convenient in terms of both sampling and analysis.
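The calibration step can be sketched as an ordinary least-squares fit over the reported 0.5-80 μg/mL range; the standard points and the sample peak-area ratio below are synthetic, not the study's data.

```python
# Sketch of the calibration step: least-squares fit over the 0.5-80 ug/mL
# range and back-calculation of an unknown (synthetic standards and ratios).
import numpy as np

conc_ug_ml = np.array([0.5, 1, 5, 10, 20, 40, 80])                # standards
peak_ratio = np.array([0.04, 0.08, 0.41, 0.80, 1.62, 3.2, 6.5])   # analyte/IS area

slope, intercept = np.polyfit(conc_ug_ml, peak_ratio, 1)
r2 = np.corrcoef(conc_ug_ml, peak_ratio)[0, 1] ** 2        # ~0.999 here
unknown_conc = (2.10 - intercept) / slope                  # from a sample ratio
```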