932 results for Mini-scale method


Relevance: 30.00%

Abstract:

The classical binary classification problem is investigated when it is known in advance that the posterior probability function (or regression function) belongs to some class of functions. We introduce and analyze a method which effectively exploits this knowledge. The method is based on minimizing the empirical risk over a carefully selected "skeleton" of the class of regression functions. The skeleton is a covering of the class based on a data-dependent metric, especially fitted for classification. A new scale-sensitive dimension is introduced which is more useful for the studied classification problem than other, previously defined, dimension measures. This fact is demonstrated by performance bounds for the skeleton estimate in terms of the new dimension.
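
A minimal sketch of the central step, empirical risk minimization over a finite skeleton, assuming the skeleton is already given as a list of candidate regression functions; all names and the toy data are illustrative, not taken from the paper.

```python
import numpy as np

def skeleton_erm(skeleton, X, y):
    """Pick the skeleton element with the smallest empirical risk.

    skeleton : list of callables mapping inputs to estimated posterior
               probabilities in [0, 1] (the covering of the class)
    X, y     : training inputs and binary labels in {0, 1}
    """
    best_f, best_risk = None, np.inf
    for f in skeleton:
        # Plug-in classifier: predict 1 where the estimated posterior exceeds 1/2.
        predictions = (f(X) >= 0.5).astype(int)
        empirical_risk = np.mean(predictions != y)  # empirical 0-1 loss
        if empirical_risk < best_risk:
            best_f, best_risk = f, empirical_risk
    return best_f, best_risk

# Illustrative use with a toy skeleton of threshold rules on 1-D inputs.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=200)
y = (X > 0.6).astype(int)
skeleton = [lambda x, t=t: (x > t).astype(float) for t in np.linspace(0, 1, 21)]
f_hat, risk = skeleton_erm(skeleton, X, y)
```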

Relevance: 30.00%

Abstract:

In this paper, we present a method to deal with the constraints of the underwater medium when finding changes between sequences of underwater images. One of the main problems the underwater medium poses for automatically detecting changes is the low altitude of the camera when taking pictures. This emphasizes the parallax effect between the images, as they are not taken at exactly the same position. To address this problem, we geometrically register the images with one another, taking into account the relief of the scene.
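
As a point of reference only, the sketch below shows a common feature-based registration of an image pair with OpenCV; it assumes a roughly planar scene (a single homography), which is precisely the simplification that the relief-aware registration described in the abstract goes beyond. Function and parameter choices are illustrative.

```python
import cv2
import numpy as np

def register_pair(img_ref, img_new):
    """Warp img_new onto img_ref using ORB features and a RANSAC homography."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img_ref, None)
    kp2, des2 = orb.detectAndCompute(img_new, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des2, des1), key=lambda m: m.distance)

    src = np.float32([kp2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC rejects matches inconsistent with a single plane-projective model.
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    h, w = img_ref.shape[:2]
    return cv2.warpPerspective(img_new, H, (w, h))
```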

Relevance: 30.00%

Abstract:

Diffuse flow velocimetry (DFV) is introduced as a new, noninvasive, optical technique for measuring the velocity of diffuse hydrothermal flow. The technique uses images of a motionless, random medium (e.g., rocks) obtained through the lens of a moving refraction index anomaly (e.g., a hot upwelling). The method works in two stages. First, the changes in apparent background deformation are calculated using particle image velocimetry (PIV). The deformation vectors are determined by a cross correlation of pixel intensities across consecutive images. Second, the 2-D velocity field is calculated by cross correlating the deformation vectors between consecutive PIV calculations. The accuracy of the method is tested with laboratory and numerical experiments of a laminar, axisymmetric plume in fluids with both constant and temperature-dependent viscosity. Results show that average RMS errors are ∼5%–7% and are most accurate in regions of pervasive apparent background deformation, which is commonly encountered in regions of diffuse hydrothermal flow. The method is applied to a 25 s video sequence of diffuse flow from a small fracture captured during the Bathyluck’09 cruise to the Lucky Strike hydrothermal field (September 2009). The velocities of the ∼10°C–15°C effluent reach ∼5.5 cm/s, in strong agreement with previous measurements of diffuse flow. DFV is found to be most accurate for approximately 2-D flows where background objects have a small spatial scale, such as sand or gravel.
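
A minimal sketch of the first stage, assuming two consecutive grayscale frames and non-overlapping interrogation windows: each window's apparent displacement is taken from the peak of an FFT-based cross correlation of pixel intensities. The second DFV stage (cross correlating the resulting deformation fields themselves) would repeat the same operation on the outputs. Window size and function names are illustrative.

```python
import numpy as np

def window_shift(win_a, win_b):
    """Displacement of win_b relative to win_a from the cross-correlation peak."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    # Cross-correlation via FFT (circular, adequate for small shifts).
    corr = np.fft.irfft2(np.fft.rfft2(a).conj() * np.fft.rfft2(b), s=a.shape)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peak indices to signed shifts.
    shift = [p if p <= s // 2 else p - s for p, s in zip(peak, a.shape)]
    return np.array(shift)  # (dy, dx) in pixels

def piv_field(frame_a, frame_b, win=32):
    """Coarse deformation field on a grid of non-overlapping interrogation windows."""
    ny, nx = frame_a.shape[0] // win, frame_a.shape[1] // win
    field = np.zeros((ny, nx, 2))
    for i in range(ny):
        for j in range(nx):
            sl = (slice(i * win, (i + 1) * win), slice(j * win, (j + 1) * win))
            field[i, j] = window_shift(frame_a[sl], frame_b[sl])
    return field
```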

Relevance: 30.00%

Abstract:

Eukaryotic transcription is tightly regulated by transcriptional regulatory elements, even though these elements may be located far away from their target genes. It is now widely recognized that these regulatory elements can be brought into close proximity through the formation of chromatin loops, and that these loops are crucial for transcriptional regulation of their target genes. The chromosome conformation capture (3C) technique presents a snapshot of long-range interactions by fixing physically interacting elements with formaldehyde, digesting the DNA, and ligating the fragments to obtain a library of unique ligation products. Recently, several large-scale modifications to the 3C technique have been presented. Here, we describe chromosome conformation capture sequencing (4C-seq), a high-throughput version of the 3C technique that combines the 3C-on-chip (4C) protocol with next-generation Illumina sequencing. The method is presented for use in mammalian cell lines, but can be adapted to mammalian tissues and any other eukaryotic genome.

Relevance: 30.00%

Abstract:

A novel laboratory technique is proposed to investigate wave-induced fluid flow on the mesoscopic scale as a mechanism for seismic attenuation in partially saturated rocks. This technique combines measurements of seismic attenuation in the frequency range from 1 to 100 Hz with measurements of transient fluid pressure as a response to a step stress applied on top of the sample. We used a Berea sandstone sample partially saturated with water. The laboratory results suggest that wave-induced fluid flow on the mesoscopic scale is dominant in partially saturated samples. A 3-D numerical model representing the sample was used to verify the experimental results. Biot's equations of consolidation were solved with the finite-element method. Wave-induced fluid flow on the mesoscopic scale was the only attenuation mechanism accounted for in the numerical solution. The numerically calculated transient fluid pressure reproduced the laboratory data. Moreover, the numerically calculated attenuation, superposed on the frequency-independent matrix anelasticity, reproduced the attenuation measured in the laboratory in the partially saturated sample. This experimental-numerical fit demonstrates that wave-induced fluid flow on the mesoscopic scale and matrix anelasticity are the dominant mechanisms for seismic attenuation in partially saturated Berea sandstone.
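
For orientation, the sketch below shows one standard way an inverse quality factor can be extracted from forced-oscillation records in this frequency range, using the phase lag between applied stress and measured strain (1/Q = tan φ). This is a generic illustration under that assumption, not the authors' specific processing chain; the signal names are placeholders.

```python
import numpy as np

def attenuation_from_phase(time, stress, strain, freq):
    """Estimate 1/Q as the tangent of the stress-strain phase lag at frequency `freq`."""
    # Project both signals onto a complex exponential at the drive frequency
    # (single-bin Fourier coefficient over an integer number of periods).
    w = 2.0 * np.pi * freq
    basis = np.exp(-1j * w * time)
    phi_stress = np.angle(np.trapz(stress * basis, time))
    phi_strain = np.angle(np.trapz(strain * basis, time))
    phase_lag = phi_stress - phi_strain   # strain lags stress in an attenuating medium
    return np.tan(phase_lag)              # inverse quality factor, 1/Q
```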

Relevance: 30.00%

Abstract:

Agricultural potential is generally assessed and managed based on a one-dimensional view of the soil profile; however, the increased appreciation of sustainable production has stimulated studies on faster and more accurate techniques and methods for evaluating agricultural potential at detailed scales. The objective of this study was to investigate the possibility of using soil magnetic susceptibility to identify landscape segments at a detailed scale in the region of Jaboticabal, São Paulo State. The studied area has two slope curvatures, linear and concave, subdivided into three landscape segments: upper slope (US, concave), middle slope (MS, linear) and lower slope (LS, linear). In each of these segments, 20 points were randomly sampled from a database of 207 samples forming a regular grid installed in each landscape segment. The soil physical and chemical properties, CO2 emissions (FCO2) and magnetic susceptibility of the samples were evaluated, the latter represented by the magnetic susceptibility of air-dried fine earth (MS ADFE), of the total sand fraction (MS TS) and of the clay fraction (MS Cl) in the 0.00-0.15 m layer. The principal component analysis showed that magnetic susceptibility is an important property that can be used to identify landscape segments, because its correlation within the first principal component was high. The hierarchical cluster analysis identified two groups based on the variables selected by the principal component analysis; of the six selected variables, three were related to magnetic susceptibility. The landscape segments were differentiated similarly by the principal component analysis and by the cluster analysis using only the properties with higher discriminatory power. The cluster analysis of MS ADFE, MS TS and MS Cl allowed the formation of three groups that agree with the segment division established in the field. The grouping by cluster analysis indicated magnetic susceptibility as a tool that could facilitate the identification of landscape segments and enable the mapping of more homogeneous areas at similar locations.
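
A compact sketch of the analysis pipeline described (principal component analysis followed by hierarchical clustering), assuming the sampled properties are arranged as a samples-by-variables matrix; the library choices, number of components and Ward linkage are illustrative, not taken from the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

def segment_groups(X, n_groups=3):
    """Group sampled points using PCA scores and hierarchical (Ward) clustering.

    X : samples-by-properties matrix (physical/chemical properties, FCO2 and the
        magnetic susceptibility variables); names here are illustrative.
    """
    Z = StandardScaler().fit_transform(X)            # standardize the properties
    scores = PCA(n_components=2).fit_transform(Z)    # retain the leading components
    tree = linkage(scores, method="ward")            # hierarchical clustering
    return fcluster(tree, t=n_groups, criterion="maxclust")
```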

Relevance: 30.00%

Abstract:

Accurate modeling of flow instabilities requires computational tools able to deal with several interacting scales, from the scale at which fingers are triggered up to the scale at which their effects need to be described. The Multiscale Finite Volume (MsFV) method offers a framework to couple fine- and coarse-scale features by solving a set of localized problems which are used both to define a coarse-scale problem and to reconstruct the fine-scale details of the flow. The MsFV method can be seen as an upscaling-downscaling technique, which is computationally more efficient than standard discretization schemes and more accurate than traditional upscaling techniques. We show that, although the method has proven accurate in modeling density-driven flow under stable conditions, the accuracy of the MsFV method deteriorates in the case of unstable flow and an iterative scheme is required to control the localization error. To avoid large computational overhead due to the iterative scheme, we suggest several adaptive strategies for both flow and transport. In particular, the concentration gradient is used to identify a front region where instabilities are triggered and an accurate (iteratively improved) solution is required. Outside the front region the problem is upscaled and both flow and transport are solved only at the coarse scale. This adaptive strategy leads to very accurate solutions at roughly the same computational cost as the non-iterative MsFV method. In many circumstances, however, an accurate description of flow instabilities requires a refinement of the computational grid rather than a coarsening. For these problems, we propose a modified iterative MsFV, which can be used as a downscaling method (DMsFV). Compared to other grid refinement techniques, the DMsFV clearly separates the computational domain into refined and non-refined regions, which can be treated separately and matched later. This gives great flexibility to employ different physical descriptions in different regions, where different equations could be solved, offering an excellent framework to construct hybrid methods.
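
A minimal sketch of the adaptive criterion described, assuming a 2-D concentration field on a regular grid: cells where the concentration gradient exceeds a threshold are flagged as the front region to be solved with the iteratively improved scheme, while the rest stays at the coarse scale. The threshold and grid handling are illustrative.

```python
import numpy as np

def front_mask(conc, dx, dy, threshold):
    """Flag cells where the concentration gradient exceeds a threshold.

    Cells inside the mask would receive the accurate (iteratively improved)
    solution; the remaining cells are upscaled and solved at the coarse scale.
    """
    gy, gx = np.gradient(conc, dy, dx)   # gradients along rows (dy) and columns (dx)
    grad_norm = np.hypot(gx, gy)
    return grad_norm > threshold
```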

Relevance: 30.00%

Abstract:

The ability to determine the location and relative strength of all transcription-factor binding sites in a genome is important both for a comprehensive understanding of gene regulation and for effective promoter engineering in biotechnological applications. Here we present a bioinformatically driven experimental method to accurately define the DNA-binding sequence specificity of transcription factors. A generalized profile was used as a predictive quantitative model for binding sites, and its parameters were estimated from in vitro-selected ligands using standard hidden Markov model training algorithms. Computer simulations showed that several thousand low- to medium-affinity sequences are required to generate a profile of desired accuracy. To produce data on this scale, we applied high-throughput genomics methods to the biochemical problem addressed here. A method combining systematic evolution of ligands by exponential enrichment (SELEX) and serial analysis of gene expression (SAGE) protocols was coupled to an automated quality-controlled sequence extraction procedure based on Phred quality scores. This allowed the sequencing of a database of more than 10,000 potential DNA ligands for the CTF/NFI transcription factor. The resulting binding-site model defines the sequence specificity of this protein with a high degree of accuracy not achieved earlier and thereby makes it possible to identify previously unknown regulatory sequences in genomic DNA. A covariance analysis of the selected sites revealed non-independent base preferences at different nucleotide positions, providing insight into the binding mechanism.
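
For illustration only, the sketch below builds a simple position-independent log-odds profile from aligned selected sites and scores a candidate sequence additively. The paper's generalized profile is trained with hidden Markov model algorithms, and its covariance analysis shows the base preferences are not independent across positions, so this position-independent matrix is a deliberately simplified stand-in; the names and pseudocount are assumptions.

```python
import numpy as np

BASES = "ACGT"

def position_weight_matrix(sites, pseudocount=1.0, background=0.25):
    """Log-odds profile estimated from a set of aligned, equal-length binding sites."""
    counts = np.full((len(sites[0]), 4), pseudocount)
    for site in sites:
        for pos, base in enumerate(site.upper()):
            counts[pos, BASES.index(base)] += 1
    freqs = counts / counts.sum(axis=1, keepdims=True)
    return np.log2(freqs / background)

def score(pwm, sequence):
    """Additive log-odds score of a candidate site under the profile."""
    return sum(pwm[pos, BASES.index(base)] for pos, base in enumerate(sequence.upper()))
```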

Relevance: 30.00%

Abstract:

The Multiscale Finite Volume (MsFV) method has been developed to efficiently solve reservoir-scale problems while conserving fine-scale details. The method employs two grid levels: a fine grid and a coarse grid. The latter is used to calculate a coarse solution to the original problem, which is interpolated to the fine mesh. The coarse system is constructed from the fine-scale problem using restriction and prolongation operators that are obtained by introducing appropriate localization assumptions. Through a successive reconstruction step, the MsFV method is able to provide an approximate, but fully conservative, fine-scale velocity field. For very large problems (e.g., a one-billion-cell model), a two-level algorithm can remain computationally expensive. Depending on the upscaling factor, the computational expense comes either from the costs associated with the solution of the coarse problem or from the construction of the local interpolators (basis functions). To ensure numerical efficiency in the former case, the MsFV concept can be reapplied to the coarse problem, leading to a new, coarser level of discretization. One challenge in the use of a multilevel MsFV technique is to find an efficient reconstruction step to obtain a conservative fine-scale velocity field. In this work, we introduce a three-level Multiscale Finite Volume method (MlMsFV) and give a detailed description of the reconstruction step. Complexity analyses of the original MsFV method and the new MlMsFV method are discussed, and their performances in terms of accuracy and efficiency are compared.
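
A schematic two-grid step under the restriction/prolongation structure described, shown with dense matrices for brevity; the actual MsFV/MlMsFV operators, localization assumptions and the conservative flux reconstruction are not reproduced here, and the operator names are illustrative.

```python
import numpy as np

def coarse_solve(A, b, R, P):
    """One coarse-level solve built from restriction (R) and prolongation (P).

    A : fine-scale system matrix
    b : fine-scale right-hand side
    R : restriction operator (coarse x fine)
    P : prolongation operator (fine x coarse), e.g. basis functions as columns
    """
    A_c = R @ A @ P                   # coarse-scale operator
    x_c = np.linalg.solve(A_c, R @ b) # coarse solution
    return P @ x_c                    # interpolate back to the fine grid

# Reapplying the same construction to A_c and the coarse unknowns yields the
# next, coarser level of a multilevel hierarchy.
```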

Relevance: 30.00%

Abstract:

OBJECTIVE: The Beck Cognitive Insight Scale (BCIS) evaluates patients' self-report of their ability to detect and correct misinterpretations. Our study aims to confirm the factor structure and the convergent validity of the original scale in a French-speaking environment. METHOD: Outpatients (n = 158) suffering from schizophrenia or schizoaffective disorders completed the BCIS. The 51 patients in Montpellier were also assessed with the Positive and Negative Syndrome Scale (PANSS) by a psychiatrist who was blind to the BCIS scores. RESULTS: The fit indices of the confirmatory factor analysis validated the 2-factor solution reported by the developers of the scale with inpatients, and in another study with middle-aged and older outpatients. The BCIS composite index was significantly negatively correlated with the clinical insight item of the PANSS. CONCLUSIONS: The French translation of the BCIS appears to have acceptable psychometric properties, which gives additional support to the scale as well as cross-cultural validity for its use with outpatients suffering from schizophrenia or schizoaffective disorders. The correlation between clinical insight and the composite index of cognitive insight underlines the multidimensional nature of clinical insight. Cognitive insight does not overlap with clinical insight but is a potential target for developing psychological treatments that will improve clinical insight.

Relevance: 30.00%

Abstract:

This work deals with the elaboration of flood hazard maps. These maps reflect the areas prone to floods based on the effects of Hurricane Mitch in the Municipality of Jucuarán, El Salvador. Stream channels located in the coastal range in the SE of El Salvador flow into the Pacific Ocean and generate alluvial fans. The communities that often inhabit these fans can be affected by floods. The geomorphology of these stream basins is associated with small areas, steep slopes, well-developed regolith and extensive deforestation. These features play a key role in the generation of flash floods. This zone lacks comprehensive rainfall data and gauging stations. The most detailed topographic maps are on a scale of 1:25 000. Given that this scale was not sufficiently detailed, we used aerial photographs enlarged to a scale of 1:8000. The effects of Hurricane Mitch mapped on these photographs were regarded as the reference event. Flood maps have a dual purpose: (1) community emergency plans, and (2) regional land use planning carried out by local authorities. The geomorphological method is based on mapping the geomorphological evidence (alluvial fans, preferential stream channels, erosion and sedimentation, man-made terraces). Following the interpretation of the photographs, this information was validated in the field and complemented by eyewitness reports such as the height of water and flow typology. In addition, community workshops were organized to obtain information about the evolution and the impact of the phenomena. The superimposition of this information enabled us to obtain a comprehensive geomorphological map. Another aim of the study was the calculation of the peak discharge using the Manning and paleohydraulic methods and estimates based on geomorphological criteria. The results were compared with those obtained using the rational method. Significant differences in the order of magnitude of the calculated discharges were noted. The rational method underestimated the results owing to short and discontinuous periods of rainfall data, with the result that probabilistic equations cannot be applied. The Manning method yields a wide range of results because of its dependence on the roughness coefficient. The paleohydraulic method yielded higher values than the rational and Manning methods; however, it should be pointed out that bigger boulders could have been moved had they existed. These discharge values are lower than those obtained by the geomorphological estimates, i.e. much closer to reality. The flood hazard maps were derived from the comprehensive geomorphological map. Three categories of hazard were established (very high, high and moderate) using flood energy, water height and flow velocity deduced from geomorphological evidence and eyewitness reports.
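
As a worked illustration of the discharge estimate the text attributes to the Manning method, the sketch below evaluates Q = (1/n) A R^(2/3) S^(1/2) for an assumed cross-section; the numbers are invented for illustration and show how strongly the result depends on the roughness coefficient n, which is the sensitivity noted in the abstract.

```python
def manning_peak_discharge(area_m2, hydraulic_radius_m, slope, roughness_n):
    """Peak discharge (m^3/s) from the Manning equation, Q = (1/n) A R^(2/3) S^(1/2)."""
    return (1.0 / roughness_n) * area_m2 * hydraulic_radius_m ** (2.0 / 3.0) * slope ** 0.5

# Illustrative numbers only: a 12 m^2 cross-section, R = 0.8 m, 4% slope,
# and a plausible range of roughness coefficients for a boulder-strewn channel.
q_low = manning_peak_discharge(12.0, 0.8, 0.04, roughness_n=0.07)
q_high = manning_peak_discharge(12.0, 0.8, 0.04, roughness_n=0.035)
```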

Relevance: 30.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the regional scale represents a major, and as-of-yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The basic objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity, which is available throughout the model space. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. This method is then applied to the stochastic integration of low-resolution, regional-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities. Finally, the overall viability of this downscaling approach is tested and verified by performing and comparing flow and transport simulation through the original and the downscaled hydraulic conductivity fields. Our results indicate that the proposed procedure does indeed allow for obtaining remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of the transport characteristics over relatively long distances.
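
A simplified sketch of the core relation used in the downscaling, assuming colocated borehole measurements of (log) electrical and hydraulic conductivity: a non-parametric kernel density estimate of their joint distribution is evaluated to obtain, for each regional-scale electrical-conductivity value, a conditional estimate of hydraulic conductivity. The full non-linear Bayesian sequential simulation additionally honours spatial correlation and draws stochastic realizations, which this sketch does not attempt; the variable names are illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde

def expected_log_k(log_sigma_grid, colocated_log_sigma, colocated_log_k, n_k=200):
    """Conditional expectation of log hydraulic conductivity given log electrical conductivity."""
    kde = gaussian_kde(np.vstack([colocated_log_sigma, colocated_log_k]))
    k_grid = np.linspace(colocated_log_k.min(), colocated_log_k.max(), n_k)
    estimates = []
    for s in np.ravel(log_sigma_grid):
        density = kde(np.vstack([np.full(n_k, s), k_grid]))  # joint density along the slice
        weights = density / density.sum()                    # normalize to a conditional pdf
        estimates.append(np.sum(weights * k_grid))
    return np.array(estimates).reshape(np.shape(log_sigma_grid))
```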

Relevance: 30.00%

Abstract:

Aims: Several studies have questioned the validity of separating the diagnosis of alcohol abuse from that of alcohol dependence, and the DSM-5 task force has proposed combining the criteria from these two diagnoses to assess a single category of alcohol use disorders (AUD). Furthermore, the DSM-5 task force has proposed including a new 2-symptom threshold and a severity scale based on symptom counts for the AUD diagnosis. The current study aimed to examine these modifications in a large population-based sample. Method: Data stemmed from an adult sample (N = 2588; mean age 51.3 years (s.d. 0.2), 44.9% female) of current and lifetime drinkers from the PsyCoLaus study, conducted in the Lausanne area in Switzerland. AUDs and validating variables were assessed using a semi-structured diagnostic interview for the assessment of alcohol and other major psychiatric disorders. First, the adequacy of the proposed 2-symptom threshold was tested by comparing threshold models at each possible cutoff and a linear model, in relation to different validating variables. The model with the smallest Akaike Information Criterion (AIC) value was established as the best model for each validating variable. Second, models with varying subsets of individual AUD symptoms were created to assess the associations between each symptom and the validating variables. The subset of symptoms with the smallest AIC value was established as the best subset for each validator. Results: (1) For the majority of validating variables, the linear model was found to be the best fitting model. (2) Among the various subsets of symptoms, the symptoms most frequently associated with the validating variables were: (a) drinking despite having knowledge of a physical or psychological problem, (b) having had a persistent desire or unsuccessful efforts to cut down or control drinking, and (c) craving. The least frequent symptoms were: (d) drinking in larger amounts or over a longer period than was intended, (e) spending a great deal of time obtaining, using or recovering from alcohol use, and (f) failing to fulfill major role obligations. Conclusions: The proposed DSM-5 2-symptom threshold did not receive support in our data. Instead, a linear AUD diagnosis was supported, with severity increasing with the number of symptoms endorsed. Moreover, certain symptoms were more frequently associated with the validating variables, which suggests that these symptoms should be considered as more severe.
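
A minimal sketch of the kind of AIC-based comparison described, treating the validator as a continuous outcome in ordinary least squares regressions; the variable names, cutoff range and modelling choices are illustrative assumptions rather than the authors' exact analysis.

```python
import numpy as np
import statsmodels.api as sm

def compare_threshold_and_linear(symptom_count, validator, max_symptoms=11):
    """AIC of every possible diagnostic cutoff versus a linear symptom-count model.

    symptom_count : array of AUD criteria endorsed per respondent
    validator     : array with an external validating variable
    """
    results = {}
    for cutoff in range(1, max_symptoms + 1):
        diagnosis = (symptom_count >= cutoff).astype(float)   # threshold model
        X = sm.add_constant(diagnosis)
        results[f"cutoff>={cutoff}"] = sm.OLS(validator, X).fit().aic
    X = sm.add_constant(symptom_count.astype(float))          # linear model
    results["linear"] = sm.OLS(validator, X).fit().aic
    best = min(results, key=results.get)                      # smallest AIC wins
    return best, results
```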

Relevance: 30.00%

Abstract:

The multiscale finite-volume (MSFV) method has been derived to efficiently solve large problems with spatially varying coefficients. The fine-scale problem is subdivided into local problems that can be solved separately and are coupled by a global problem. This algorithm, in consequence, shares some characteristics with two-level domain decomposition (DD) methods. However, the MSFV algorithm is different in that it incorporates a flux reconstruction step, which delivers a fine-scale mass conservative flux field without the need for iterating. This is achieved by the use of two overlapping coarse grids. The recently introduced correction function allows for a consistent handling of source terms, which makes the MSFV method a flexible algorithm that is applicable to a wide spectrum of problems. It is demonstrated that the MSFV operator, used to compute an approximate pressure solution, can be equivalently constructed by writing the Schur complement with a tangential approximation of a single-cell overlapping grid and incorporation of appropriate coarse-scale mass-balance equations.
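
For reference, the sketch below shows a plain Schur complement reduction onto a retained set of unknowns, written with dense matrices for brevity; the tangential approximation and the coarse-scale mass-balance equations that turn this construction into the MSFV operator are not reproduced, and the index-set names are illustrative.

```python
import numpy as np

def schur_complement(A, eliminated, retained):
    """Schur complement of the eliminated unknowns onto the retained unknowns.

    A          : full system matrix
    eliminated : index array of unknowns removed by local elimination
    retained   : index array of unknowns kept in the reduced system
    """
    A_ee = A[np.ix_(eliminated, eliminated)]
    A_er = A[np.ix_(eliminated, retained)]
    A_re = A[np.ix_(retained, eliminated)]
    A_rr = A[np.ix_(retained, retained)]
    return A_rr - A_re @ np.linalg.solve(A_ee, A_er)
```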

Relevance: 30.00%

Abstract:

Purpose: Despite the fundamental role of ecosystem goods and services in sustaining human activities, there is no harmonized and internationally agreed method for including them in life cycle assessment (LCA). The main goal of this study was to develop a globally applicable and spatially resolved method for assessing land-use impacts on the erosion regulation ecosystem service. Methods: Soil erosion depends strongly on location. Thus, unlike conventional LCA, the endpoint method was regionalized at the grid-cell level (5 arc-minutes, approximately 10 × 10 km²) to reflect the spatial conditions of the site. Spatially explicit characterization factors were not further aggregated at broader spatial scales. Results and discussion: Life cycle inventory data of topsoil and topsoil organic carbon (SOC) losses were interpreted at the endpoint level in terms of the ultimate damage to soil resources and ecosystem quality. Human health damages were excluded from the assessment. The method was tested on a case study of five three-year agricultural rotations, two of them with energy crops, grown in several locations in Spain. A large variation in soil and SOC losses was recorded in the inventory step, depending on climatic and edaphic conditions. The importance of using a spatially explicit model and characterization factors is shown in the case study. Conclusions and outlook: The regionalized assessment takes into account the differences in soil erosion-related environmental impacts caused by the great variability of soils. Taking this regionalized framework as the starting point, further research should focus on testing the applicability of the method through the complete life cycle of a product and on determining an appropriate spatial scale at which to aggregate characterization factors, in order to deal with data gaps on the location of processes, especially in the background system. Additional research should also focus on improving the reliability of the method by quantifying and, insofar as possible, reducing uncertainty.
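
A tiny sketch of the regionalized accounting the abstract argues for, under the assumption that both the topsoil-loss inventory and the characterization factors are available on the same grid: each cell's loss is multiplied by that cell's own characterization factor before summing, instead of applying a single factor aggregated over a broader scale. The array names are illustrative.

```python
import numpy as np

def regionalized_impact(soil_loss_per_cell, cf_per_cell):
    """Endpoint impact summed over grid cells with cell-specific characterization factors.

    soil_loss_per_cell : inventory of topsoil (or SOC) loss in each grid cell
    cf_per_cell        : spatially explicit characterization factor of the same cell
    """
    return float(np.nansum(np.asarray(soil_loss_per_cell) * np.asarray(cf_per_cell)))
```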