958 results for Probabilistic generalization


Relevance: 10.00%

Abstract:

Debris flows and related landslide processes occur in many regions all over Norway and pose a significant hazard to inhabited areas. Within the framework of developing a national debris-flow susceptibility map, we are working on a modeling approach suitable for Norway with nationwide coverage. The discrimination of source areas is based on an index approach that includes topographic parameters and hydrological settings. For the runout modeling, we use the Flow-R model (IGAR, University of Lausanne), which combines probabilistic and energetic algorithms to assess the spreading of the flow and maximum runout distances. First results for different test areas show that runout distances can be modeled reliably. For the selection of source areas, however, additional factors have to be considered, such as the lithological and Quaternary-geological setting, in order to accommodate the strong variation in debris-flow activity across the different geological, geomorphological and climatic regions of Norway.
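Probabilistic spreading of this kind is typically built on a flow-direction weighting of the Holmgren (1994) type, in which the probability of propagating to a lower neighbour grows with the downslope gradient raised to an exponent x. A minimal Python sketch of one spreading step (the grid, exponent and cell size are invented for illustration):

```python
import numpy as np

def holmgren_spread(dem, row, col, x=4.0, cellsize=10.0):
    """One step of probabilistic flow spreading: the probability of
    moving to a lower neighbour i is tan(beta_i)**x, normalized over
    all downslope neighbours (Holmgren-type weighting)."""
    z0 = dem[row, col]
    weights = {}
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            r, c = row + dr, col + dc
            if (dr, dc) == (0, 0) or not (0 <= r < dem.shape[0]
                                          and 0 <= c < dem.shape[1]):
                continue
            tan_beta = (z0 - dem[r, c]) / (cellsize * np.hypot(dr, dc))
            if tan_beta > 0:                       # spread downslope only
                weights[(r, c)] = tan_beta ** x
    total = sum(weights.values())
    return {cell: w / total for cell, w in weights.items()} if total else {}

dem = np.array([[10.0, 9.0, 8.0],
                [ 9.0, 7.0, 5.0],
                [ 8.0, 5.0, 2.0]])
print(holmgren_spread(dem, 1, 1))   # transition probabilities from the centre
```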

Relevance: 10.00%

Abstract:

The temporal dynamics of species diversity are shaped by variations in the rates of speciation and extinction, and there is a long history of inferring these rates using first and last appearances of taxa in the fossil record. Understanding diversity dynamics critically depends on unbiased estimates of the unobserved times of speciation and extinction for all lineages, but the inference of these parameters is challenging due to the complex nature of the available data. Here, we present a new probabilistic framework to jointly estimate species-specific times of speciation and extinction and the rates of the underlying birth-death process based on the fossil record. The rates are allowed to vary through time independently of each other, and the probability of preservation and sampling is explicitly incorporated in the model to estimate the true lifespan of each lineage. We implement a Bayesian algorithm to assess the presence of rate shifts by exploring alternative diversification models. Tests on a range of simulated data sets reveal the accuracy and robustness of our approach against violations of the underlying assumptions and various degrees of data incompleteness. Finally, we demonstrate the application of our method with the diversification of the mammal family Rhinocerotidae and reveal a complex history of repeated and independent temporal shifts of both speciation and extinction rates, leading to the expansion and subsequent decline of the group. The estimated parameters of the birth-death process implemented here are directly comparable with those obtained from dated molecular phylogenies. Thus, our model represents a step towards integrating phylogenetic and fossil information to infer macroevolutionary processes.
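A common formulation of the preservation component described here treats fossilization as a homogeneous Poisson process with rate q, conditioned on the lineage being sampled at least once (otherwise it would never appear in the record). A hedged Python sketch of the resulting per-lineage log-likelihood under that simplification (all values invented):

```python
import numpy as np

def log_lik_preservation(n_fossils, ts, te, q):
    """Log-likelihood of n fossil occurrences for one lineage under a
    homogeneous Poisson preservation process with rate q (per Myr),
    conditioned on the lineage being sampled at least once.
    ts, te: speciation and extinction times (ts > te)."""
    d = ts - te                       # true (unobserved) lifespan
    # density of n occurrences within [te, ts]: q**n * exp(-q*d);
    # conditioning on >=1 occurrence divides by (1 - exp(-q*d))
    return n_fossils * np.log(q) - q * d - np.log1p(-np.exp(-q * d))

# e.g. a lineage with 4 occurrences, candidate ts=23.0 Ma, te=11.5 Ma, q=0.5
print(log_lik_preservation(4, 23.0, 11.5, 0.5))
```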

Relevance: 10.00%

Abstract:

This study proposes a new concept for upscaling local information on failure surfaces derived from geophysical data, in order to develop the spatial information and quickly estimate the magnitude and intensity of a landslide. A new vision of seismic interpretation on landslides is also demonstrated by taking into account basic geomorphic information with a numerical method based on the Sloping Local Base Level (SLBL). The SLBL is a generalization of the base level defined in geomorphology applied to landslides, and allows the calculation of the potential geometry of the landslide failure surface. This approach was applied to a large-scale landslide formed mainly in gypsum and situated in a former glacial valley along the Rhone within the Western European Alps. Previous studies identified the existence of two sliding surfaces that may continue below the level of the valley. In this study, seismic refraction-reflection surveys were carried out to verify the existence of these failure surfaces. The analysis of the seismic data provides a four-layer model in which three velocity layers (<1000 m/s, 1500 m/s and 3000 m/s) are interpreted as the mobilized mass at different levels of weathering and compaction. The highest-velocity layer (>4000 m/s), with a maximum depth of ~58 m, is interpreted as the stable anhydrite bedrock. Two failure surfaces were interpreted from the seismic surveys: an upper failure surface and a much deeper one (25 and 50 m deep, respectively). The upper failure surface depth deduced from geophysics differs slightly from the results obtained using the SLBL, and the deeper failure surface depth calculated with the SLBL method is underestimated in comparison with the geophysical interpretations. Optimal results were therefore obtained by including the seismic data in the SLBL calculations according to the geomorphic limits of the landslide (maximal volume of mobilized mass = 7.5 × 10⁶ m³).
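The SLBL calculation itself is a simple iterative lowering scheme. A minimal one-dimensional (profile) sketch in Python, with elevations and tolerance invented; grid versions work analogously on pairs of opposite neighbours:

```python
import numpy as np

def slbl_profile(surface, tol=0.05, max_iter=100_000, eps=1e-6):
    """Iterative Sloping Local Base Level along a profile: each interior
    node is lowered to the mean of its two neighbours minus a tolerance
    `tol` whenever that target lies below it; the endpoints (the landslide
    limits) stay fixed. Returns the converged potential failure surface."""
    z = np.asarray(surface, dtype=float).copy()
    for _ in range(max_iter):
        target = 0.5 * (z[:-2] + z[2:]) - tol
        drop = np.maximum(z[1:-1] - target, 0.0)   # only deepen, never raise
        if drop.max() < eps:
            break
        z[1:-1] -= drop
    return z

surface = np.array([100.0, 98.0, 99.0, 95.0, 96.0, 90.0, 88.0, 85.0])
failure = slbl_profile(surface)
print(surface - failure)    # thickness of the potentially mobilized mass
```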

Relevance: 10.00%

Abstract:

Radioactive soil-contamination mapping and risk assessment are vital issues for decision makers. Traditional approaches to mapping the spatial concentration of radionuclides employ various regression-based models, which usually provide a single-value prediction accompanied (in some cases) by an estimation error. Such approaches do not provide the capability for rigorous uncertainty quantification or probabilistic mapping. Machine learning is a recent and fast-developing approach based on learning patterns and information from data. Artificial neural networks for prediction mapping have been especially powerful in combination with spatial statistics. A data-driven approach provides the opportunity to integrate additional relevant information about spatial phenomena into a prediction model, yielding more accurate spatial estimates and associated uncertainty. Machine-learning algorithms can also be used for a wider spectrum of problems than before: classification, probability density estimation, and so forth. Stochastic simulations are used to model spatial variability and uncertainty; unlike regression models, they provide multiple realizations of a particular spatial pattern, which allow uncertainty and risk quantification. This paper reviews the most recent methods of spatial data analysis, prediction, and risk mapping based on machine learning and stochastic simulations, in comparison with more traditional regression models. The radioactive fallout from the Chernobyl Nuclear Power Plant accident is used to illustrate the application of the models to prediction and classification problems. This fallout is a unique case study that provides the challenging task of analyzing huge amounts of data ('hard' direct measurements as well as supplementary information and expert estimates) and solving particular decision-oriented problems.
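One concrete instance of combining machine learning with spatial statistics in the spirit described above is neural-network residual kriging: a network models the large-scale trend, and a geostatistical model of the residuals supplies spatially correlated corrections with uncertainty. A minimal Python sketch on synthetic data (a scikit-learn Gaussian process stands in for kriging; all model sizes and kernel parameters are arbitrary):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 100, size=(300, 2))                  # sampling coordinates (km)
z = np.sin(X[:, 0] / 15) + 0.05 * X[:, 1] + rng.normal(0, 0.1, 300)  # synthetic fallout

# 1) data-driven trend model (here a small neural network)
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                   random_state=0).fit(X, z)
residuals = z - net.predict(X)

# 2) geostatistical model of the residuals (GP regression ~ simple kriging)
gp = GaussianProcessRegressor(kernel=RBF(10.0) + WhiteKernel(0.01),
                              normalize_y=True).fit(X, residuals)

# prediction with uncertainty at new locations
X_new = np.array([[50.0, 50.0], [10.0, 90.0]])
res_mean, res_std = gp.predict(X_new, return_std=True)
print(net.predict(X_new) + res_mean, res_std)
```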

Relevance: 10.00%

Abstract:

This work deals with the elaboration of flood hazard maps. These maps reflect the areas prone to floods based on the effects of Hurricane Mitch in the Municipality of Jucuarán, El Salvador. Stream channels located in the coastal range in the SE of El Salvador flow into the Pacific Ocean and generate alluvial fans. Communities often inhabit these fans and can be affected by floods. The geomorphology of these stream basins is characterized by small areas, steep slopes, well-developed regolith and extensive deforestation. These features play a key role in the generation of flash floods. The zone lacks comprehensive rainfall data and gauging stations, and the most detailed topographic maps are on a scale of 1:25,000. Given that this scale was not sufficiently detailed, we used aerial photographs enlarged to a scale of 1:8,000. The effects of Hurricane Mitch mapped on these photographs were regarded as the reference event. The flood maps have a dual purpose: (1) community emergency plans and (2) regional land-use planning carried out by local authorities. The geomorphological method is based on mapping the geomorphological evidence (alluvial fans, preferential stream channels, erosion and sedimentation, man-made terraces). Following the interpretation of the photographs, this information was validated in the field and complemented by eyewitness reports on, for example, water height and flow typology. In addition, community workshops were organized to obtain information about the evolution and the impact of the phenomena. Superimposing this information enabled us to obtain a comprehensive geomorphological map. Another aim of the study was the calculation of peak discharge using the Manning and paleohydraulic methods and estimates based on geomorphological criteria; the results were compared with those obtained using the rational method. Significant differences in the order of magnitude of the calculated discharges were noted. The rational method underestimated the discharge owing to the short and discontinuous rainfall records, which also prevent the application of probabilistic equations. The Manning method yields a wide range of results because of its dependence on the roughness coefficient. The paleohydraulic method yielded higher values than the rational and Manning methods, although it is possible that bigger boulders, had they existed, could have been moved. These discharge values are lower than those obtained by the geomorphological estimates, i.e., much closer to reality. The flood hazard maps were derived from the comprehensive geomorphological map. Three categories of hazard (very high, high and moderate) were established using flood energy, water height and flow velocity deduced from geomorphological evidence and eyewitness reports.
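For reference, Manning's equation estimates peak discharge from channel geometry, slope and the roughness coefficient n, and the abstract's point about sensitivity to n is easy to reproduce numerically. A short Python sketch (channel dimensions invented):

```python
def manning_discharge(n, area, wetted_perimeter, slope):
    """Peak discharge Q (m^3/s) from Manning's equation:
    Q = (1/n) * A * R**(2/3) * S**(1/2), with hydraulic radius R = A / P."""
    R = area / wetted_perimeter          # hydraulic radius (m)
    return (1.0 / n) * area * R ** (2.0 / 3.0) * slope ** 0.5

# e.g. a 12 m^2 cross-section, P = 10 m, channel slope 4%;
# n = 0.05 (rough mountain stream) vs n = 0.10 (very rough channel)
for n in (0.05, 0.10):
    print(n, round(manning_discharge(n, 12.0, 10.0, 0.04), 1), "m3/s")
```

Doubling the roughness coefficient halves the estimated discharge, which is exactly the wide spread of results the study reports for this method.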

Relevance: 10.00%

Abstract:

This article extends the existing discussion in the literature on probabilistic inference and decision making with respect to continuous hypotheses, which are prevalent in forensic toxicology. As its main aim, this research investigates the properties of a widely followed approach for quantifying the level of toxic substances in blood samples and compares this procedure with a Bayesian probabilistic approach. As an example, attention is confined to the presence of toxic substances, such as THC, in the blood of car drivers. In this context, the interpretation of results from laboratory analyses needs to take into account legal requirements for establishing the 'presence' of target substances in blood. In the first part, the performance of the proposed Bayesian model for the estimation of an unknown parameter (here, the amount of a toxic substance) is illustrated and compared with the currently used method. In the second part, the model is used to approach, in a rational way, the decision component of the problem, that is, judicial questions of the kind 'Is the quantity of THC measured in the blood over the legal threshold of 1.5 μg/l?'. This is illustrated through a practical example.
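A minimal version of the Bayesian reasoning described: if repeated measurements are assumed normally distributed around the true concentration θ with known analytical standard deviation, a flat prior gives a normal posterior, and the judicial question reduces to the posterior probability that θ exceeds the 1.5 μg/l threshold. A Python sketch (measurement values and standard deviation invented):

```python
import numpy as np
from scipy import stats

# Replicate THC measurements in a blood sample (ug/L), assumed i.i.d.
# normal around the true concentration theta with known analytical sd.
y = np.array([1.62, 1.58, 1.71])
sigma = 0.10
threshold = 1.5

# Flat prior on theta -> posterior is Normal(mean(y), sigma/sqrt(n))
post = stats.norm(loc=y.mean(), scale=sigma / np.sqrt(len(y)))

# Decision-relevant quantity: posterior probability of exceeding the limit
print("P(theta > 1.5 | y) =", round(1 - post.cdf(threshold), 4))
```

The single-value report of the mean hides exactly this probability, which is what a court ultimately needs to weigh.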

Relevance: 10.00%

Abstract:

Avalanche forecasting is a complex process involving the assimilation of multiple data sources to make predictions over varying spatial and temporal resolutions. Numerically assisted forecasting often uses nearest-neighbour methods (NN), which are known to have limitations when dealing with high-dimensional data. We apply Support Vector Machines (SVMs), a family of theoretically grounded machine-learning techniques designed to deal with high-dimensional data, to a dataset from Lochaber, Scotland, to assess their applicability to avalanche forecasting. Initial experiments showed that SVMs gave results comparable with NN for categorical and probabilistic forecasts. Experiments utilising the ability of SVMs to deal with high dimensionality in producing a spatial forecast show promise, but require further work.
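A schematic of the kind of comparison described, using scikit-learn on synthetic data; the ten features are invented stand-ins for the Lochaber forecasting variables, and nothing here reproduces the actual dataset:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
# Synthetic stand-ins for daily forecasting variables
# (snow depth, wind speed, air temperature, ...): 10-D feature vectors
X = rng.normal(size=(400, 10))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 400) > 0).astype(int)  # 1 = avalanche day

svm = SVC(kernel="rbf", C=1.0, probability=True)   # probabilistic output via Platt scaling
knn = KNeighborsClassifier(n_neighbors=10)         # nearest-neighbour baseline

for name, clf in [("SVM", svm), ("NN", knn)]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```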

Relevance: 10.00%

Abstract:

We present here an unbiased probabilistic method that allows us to consistently analyze the knottedness of linear random walks with up to several hundred uncorrelated steps. The method consists of analyzing the spectrum of knots formed by multiple closures of the same open walk through random points on a sphere enclosing the walk. Knottedness of individual "frozen" configurations of linear chains is therefore defined by a characteristic spectrum of realizable knots. We show that in the great majority of cases this method clearly defines the dominant knot type of a walk, i.e., the strongest component of the spectrum. In such cases, direct end-to-end closure creates a knot that usually coincides with the knot type that dominates the random-closure spectrum. Interestingly, in a very small proportion of linear random walks, the knot type is not clearly defined; such walks can be considered as residing in a border zone of the configuration space of two or more knot types. We also characterize the scaling behavior of linear random knots.
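The geometric core of the method — closing the same open walk many times through random points on an enclosing sphere and tallying the resulting knot types — is easy to sketch; the knot classification itself (an invariant computation such as an Alexander-polynomial evaluation) is deliberately left as a stub here. A Python sketch:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(2)

def random_walk(n_steps):
    """Open linear walk of n uncorrelated unit steps in 3-D."""
    steps = rng.normal(size=(n_steps, 3))
    steps /= np.linalg.norm(steps, axis=1, keepdims=True)
    return np.vstack([np.zeros(3), np.cumsum(steps, axis=0)])

def random_closure(walk):
    """Close the walk through one uniformly random point on a sphere
    that safely encloses it, giving a closed polygon."""
    center = walk.mean(axis=0)
    radius = 10 * np.linalg.norm(walk - center, axis=1).max()
    v = rng.normal(size=3)
    point = center + radius * v / np.linalg.norm(v)
    return np.vstack([walk, [point], [walk[0]]])

def knot_type(polygon):
    """Stub: a real analysis computes a knot invariant of the closed
    polygon here (e.g. an Alexander-polynomial-based classifier)."""
    return "unclassified (stub)"

walk = random_walk(300)
spectrum = Counter(knot_type(random_closure(walk)) for _ in range(200))
print(spectrum)   # the dominant knot type = strongest spectral component
```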

Relevance: 10.00%

Abstract:

The purpose of this article is to treat a currently much-debated issue: the effects of age on second language learning. To do so, we contrast data collected by our research team from over one thousand seven hundred young and adult learners with four popular beliefs or generalizations which, while deeply rooted in this society, are not always corroborated by our data.

Two of these generalizations about Second Language Acquisition (languages spoken in the social context) seem to be widely accepted: a) older children, adolescents and adults are quicker and more efficient in the first stages of learning than are younger learners; b) in a natural context, children with an early start are more likely to attain higher levels of proficiency. However, in the context of Foreign Language Acquisition, the context in which we collected our data, this second generalization is difficult to verify due to the low number of instructional hours (a maximum of some 800 hours) and the lower levels of language exposure provided. The design of our research project has allowed us to study differences observed with respect to the age of onset (ranging from 2 to 18+), but in this article we focus on students who began English instruction at the age of 8 (LOGSE educational system) and those who began at the age of 11 (EGB). We have collected data from both groups after a period of 200 instructional hours (Time 1) and 416 instructional hours (Time 2), and we are currently collecting data after a period of 726 instructional hours (Time 3). We have designed and administered a variety of tests: tests of English production and reception, both oral and written, within both academic and communicatively oriented approaches; tests of the learners' L1 (Spanish and Catalan); and a questionnaire eliciting personal and sociolinguistic information. The questions we address and the relevant empirical evidence are as follows:

1. "For young children, learning languages is a game. They enjoy it more than adults." Our data demonstrate that the situation is not quite so. Firstly, at the levels of both primary and secondary education, students have a positive attitude towards learning English (ranging from 70.5% in 11-year-olds to 89% in 14-year-olds). Secondly, there is a difference between the two groups with respect to the factors they cite as responsible for their motivation to learn English: the younger students cite intrinsic factors, such as the games they play, the methodology used and the teacher, whereas the older students cite extrinsic factors, such as the role of their knowledge of English in the achievement of their future professional goals.

2. "Young children have more resources to learn languages." Here our data suggest just the opposite. The ability to employ learning strategies (actions or steps used) increases with age. Older learners' strategies are more varied and cognitively more complex. In contrast, younger learners depend more on their interlocutor and on external resources, and therefore show a lower level of autonomy in their learning.

3. "Young children don't talk much but understand a lot." This third generalization does seem to be confirmed, at least to a certain extent, by our data on the differences due to the age factor in productive use of the target language. As seen above, the comparatively slower progress of the younger learners is confirmed. Our analysis of interpersonal receptive abilities also demonstrates the advantage of the older learners. Nevertheless, with respect to passive receptive activities (for example, simple recognition of words or sentences), no great differences are observed. Statistical analyses suggest that in this test, in contrast to the others analyzed, the dominance of the subjects' L1s (reflecting a cognitive capacity that grows with age) has no significant influence on the learning process.

4. "The sooner they begin, the better their results will be in written language." Neither is this completely confirmed by our research. First of all, we observe that certain compensatory strategies disappear only with age, not with the number of instructional hours. Secondly, given an identical number of instructional hours, the older subjects obtain better results. With respect to our analysis of data from subjects of the same age (12 years old) but with a different number of instructional hours (200 and 416 respectively, as they began at the ages of 11 and 8), we observe that those who began earlier excel only in the area of lexical fluency.

In conclusion, the superior rate of progress of the older learners appears to be due to their higher level of cognitive development, a factor which allows them to benefit more from formal or explicit instruction in the school context. Younger learners, however, do not benefit from the quantity and quality of linguistic exposure typical of a natural acquisition context, in which they would be able to make use of their implicit learning abilities. It seems clear, then, that the initiative in this country to begin foreign language instruction earlier will have positive effects only if it is combined either with higher levels of exposure to the foreign language or with its use as the language of instruction in other areas of the curriculum.

Relevance: 10.00%

Abstract:

Transfer of tumor antigen-specific T-cell receptors (TCRs) into human T cells aims at redirecting their cytotoxicity toward tumors. Efficacy and safety may be affected by the pairing of natural and introduced TCRα/β chains, potentially leading to autoimmunity. We hypothesized that a novel single-chain (sc)TCR framework relying on the coexpression of the TCRα constant (Cα) domain would prevent undesired pairing while preserving structural and functional similarity to a fully assembled double-chain (dc)TCR/CD3 complex. We confirmed this hypothesis for a murine p53-specific scTCR. Substantial effector function was observed only in the presence of a murine Cα domain preceded by a TCRα signal peptide for shuttling to the cell membrane. Generalization to a human gp100-specific TCR required the murinization of both C domains. Structural and functional T-cell avidities of an accessory disulfide-linked scTCR gp100/Cα were higher than those of a dcTCR. Antigen-dependent phosphorylation of the proximal effector ζ-chain-associated protein kinase 70 at tyrosine 319 was not impaired, reflecting its molecular integrity in signaling. In melanoma-engrafted nonobese diabetic/severe combined immunodeficient mice, adoptive transfer of scTCR gp100/Cα-transduced T cells conferred a superior delay in tumor growth in both primary and long-term secondary tumor challenges. We conclude that the novel scTCR constitutes a reliable means to immunotherapeutically target hematologic malignancies.

Relevance: 10.00%

Abstract:

Soil slope instability affecting highway infrastructure is an ongoing problem in Iowa, as slope failures endanger public safety and continue to result in costly repair work. Characterization of slope failures is complicated, because the factors affecting slope stability can be difficult to discern and measure, particularly soil shear-strength parameters. While extensive research has been conducted in the past on slope stability investigation and analysis, this research consists of field investigations addressing both the characterization and the reinforcement of such slope failures. The current research focuses on applying an infrequently used testing technique, the Borehole Shear Test (BST). This in-situ test rapidly provides effective (i.e., drained) shear-strength parameter values for soil. Using the BST device, fifteen Iowa slopes (fourteen failures and one proposed slope) were investigated and documented. Particular attention was paid to highly weathered shale and glacial till deposits, which have both been associated with slope failures in the southern Iowa drift region. Conventional laboratory tests, including direct shear tests, triaxial compression tests, and ring shear tests, were also performed on undisturbed and reconstituted soil samples to supplement the BST results. The shear-strength measurements were incorporated into complete evaluations of slope stability using both limit-equilibrium and probabilistic analyses. The research methods and findings of these investigations are summarized in Volume 1 of this report; research details of the independent characterization and reinforcement investigations are provided in Volumes 2 and 3, respectively. Combined, the field investigations offer guidance on identifying the factors that affect slope stability at a particular location and on designing slope reinforcement using pile elements where remedial measures are necessary. The research findings are expected to benefit civil and geotechnical engineers of government transportation agencies, consultants, and contractors dealing with slope stability, slope remediation, and geotechnical testing in Iowa.
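To illustrate how in-situ shear-strength measurements feed a probabilistic stability evaluation, here is a Python sketch using the infinite-slope limit-equilibrium model — chosen for brevity, not because the study used it — with all parameter values invented:

```python
import numpy as np

def infinite_slope_fs(c, phi_deg, gamma=19.0, depth=4.0, beta_deg=20.0, u=0.0):
    """Factor of safety of an infinite slope (limit equilibrium):
    FS = [c' + (gamma*z*cos^2(beta) - u) * tan(phi')]
         / [gamma * z * sin(beta) * cos(beta)]"""
    beta, phi = np.radians(beta_deg), np.radians(phi_deg)
    resisting = c + (gamma * depth * np.cos(beta) ** 2 - u) * np.tan(phi)
    driving = gamma * depth * np.sin(beta) * np.cos(beta)
    return resisting / driving

# Probabilistic analysis: sample shear-strength parameters, e.g. from
# BST-derived means and standard deviations (values here are invented)
rng = np.random.default_rng(3)
c_samples = rng.normal(8.0, 2.0, 100_000)       # cohesion c' (kPa)
phi_samples = rng.normal(24.0, 3.0, 100_000)    # friction angle phi' (deg)
fs = infinite_slope_fs(c_samples, phi_samples)
print("mean FS:", fs.mean().round(2), " P(failure):", (fs < 1).mean().round(3))
```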

Relevance: 10.00%

Abstract:

Soil slope instability affecting highway infrastructure is an ongoing problem in Iowa, as slope failures endanger public safety and continue to result in costly repair work. While extensive research has been conducted in the past on slope stability investigation and analysis, this current study consists of field investigations addressing both the characterization and the reinforcement of such slope failures. Volume I summarizes the research methods and findings of this study; Volume II provides procedural details for incorporating an infrequently used testing technique, the borehole shear test, into practice. Fifteen slopes along Iowa highways were investigated, including thirteen slides (failed slopes), one unfailed slope, and one proposed embankment slope (the Sugar Creek Project). The slopes are mainly composed of either clay shale or glacial till and are generally gentle and of small scale, with slope angles ranging from 11° to 23° and heights ranging from 6 to 23 m. Extensive field investigations and laboratory tests were performed for each slope. Field investigations included surveys of slope geometry, borehole drilling, soil sampling, in-situ Borehole Shear Testing (BST), and groundwater table measurement. Laboratory investigations mainly comprised ring shear tests, basic soil property tests (grain-size analysis and Atterberg limits), mineralogy analyses, soil classifications, and natural water content and density measurements on representative soil samples from each slope. Extensive direct shear tests and a few triaxial compression and unconfined compression tests were also performed on undisturbed soil samples for the Sugar Creek Project. Based on the results of the field and laboratory investigations, slope stability analysis was performed on each slope using limit-equilibrium methods, to determine the possible factors behind the slope failures or to evaluate potential instabilities. Deterministic analyses were performed for all the slopes; a probabilistic analysis and a sensitivity study were also performed for the Sugar Creek Project slope. Results indicate that while the in-situ test rapidly provides effective shear-strength parameters of soils, some training may be required for effective and appropriate use of the BST. The device is primarily intended for cohesive soils and can produce erroneous results in gravelly soils. Additionally, the quality of the boreholes affects the test results, and disturbance to the borehole walls should be minimized before testing. A final limitation to widespread borehole shear testing may be its limited availability, as only about four to six test devices are currently in use in Iowa. Based on the data gathered in the field testing, the reinforcement investigations are continued in Volume III.

Relevance: 10.00%

Abstract:

This article presents some of the results of a research project whose main objective was to analyse the relationship between resilience and the school success of students of foreign origin at a time of special academic vulnerability: the transition from compulsory secondary education (ESO) to post-compulsory education (PO). Using non-probabilistic incidental sampling, the study was conducted in four schools in Barcelona and its province, with 94 participants aged 15 to 18. This longitudinal study had five phases, combining strategies for collecting qualitative and quantitative data. The results presented here come from the adapted SV-RES scale and two other self-produced tools: a center scale and a general questionnaire. The results obtained confirm the hypothesis: immigrant students who manage to persevere in the post-compulsory stages (Spanish Baccalaureate and/or Vocational Training) show higher levels of resilience.

Relevance: 10.00%

Abstract:

A new aggregation method for decision making is presented, using induced aggregation operators and the index of maximum and minimum level. Its main advantage is that it can assess complex reordering processes in the aggregation, representing complex attitudinal characters of the decision maker such as psychological or personal factors. A wide range of properties and particular cases of this new approach are studied. A further generalization using hybrid averages and immediate weights is also presented. The key advantage of this approach over the previous model is that the weighted average and the ordered weighted average can be used in the same formulation, so both the subjective attitude and the degree of optimism of the decision maker can be considered in the decision process. The paper ends with an application to a decision-making problem based on assignment theory.
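The induced reordering is the heart of the method: unlike the ordinary OWA operator, the arguments are sorted not by their own magnitude but by a separate order-inducing variable, which can encode attitudinal or psychological factors. A minimal Python sketch with invented numbers:

```python
import numpy as np

def iowa(values, inducers, weights):
    """Induced ordered weighted average (IOWA): the arguments are
    reordered by an order-inducing variable (not by their own size,
    as in the plain OWA), then combined with the OWA weights."""
    order = np.argsort(inducers)[::-1]          # decreasing inducer values
    return float(np.dot(weights, np.asarray(values)[order]))

# e.g. payoffs of an alternative under 4 states of nature, reordered by
# an attitudinal score supplied by the decision maker
payoffs  = [60, 40, 80, 50]
inducers = [3, 9, 5, 7]                          # complex reordering process
weights  = [0.1, 0.2, 0.3, 0.4]                  # sums to 1
print(iowa(payoffs, inducers, weights))          # 0.1*40 + 0.2*50 + 0.3*80 + 0.4*60
```

A hybrid of the kind the abstract mentions can then be built, for instance as a convex combination of this operator with a plain weighted average, so that both orderings act in one formulation.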

Relevance: 10.00%

Abstract:

The success of combination antiretroviral therapy is limited by the evolutionary escape dynamics of HIV-1. We used Isotonic Conjunctive Bayesian Networks (I-CBNs), a class of probabilistic graphical models, to describe this process. We employed partial order constraints among viral resistance mutations, which give rise to a limited set of mutational pathways, and we modeled phenotypic drug resistance as monotonically increasing along any escape pathway. Using this model, the individualized genetic barrier (IGB) to each drug is derived as the probability of the virus not acquiring additional mutations that confer resistance. Drug-specific IGBs were combined to obtain the IGB to an entire regimen, which quantifies the virus's genetic potential for developing drug resistance under combination therapy. The IGB was tested as a predictor of therapeutic outcome using between 2,185 and 2,631 treatment change episodes of subtype-B-infected patients from the Swiss HIV Cohort Study Database, a large observational cohort. In logistic regression analyses, significant univariate predictors included most of the 18 drugs and single-drug IGBs, the IGB to the entire regimen, the expert-rules-based genotypic susceptibility score (GSS), several individual mutations, and the peak viral load before treatment change. In the multivariate analysis, the only genotype-derived variables that remained significantly associated with virological success were the GSS and, with a 10-fold stronger association, the IGB to regimen. When predicting suppression of viral load below 400 cps/ml, the IGB outperformed the GSS and also significantly improved GSS-containing predictors, although the difference was not significant for suppression below 50 cps/ml. Thus, the IGB to regimen is a novel data-derived predictor of treatment outcome with the potential to improve the interpretation of genotypic drug resistance tests.
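The predictor-comparison step is straightforward to sketch. Below, synthetic stand-ins for the GSS and the IGB to regimen both track a latent susceptibility, with the IGB made the less noisy proxy purely for illustration; nothing here reproduces the cohort data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n = 2500                                # ~ number of treatment change episodes

susceptibility = rng.normal(size=n)     # latent: how well the regimen works
gss = susceptibility + rng.normal(0, 1.0, n)   # noisier proxy (illustrative)
igb = susceptibility + rng.normal(0, 0.5, n)   # stronger proxy (illustrative)
success = (susceptibility + rng.normal(0, 0.8, n) > 0).astype(int)  # virological success

for name, feats in [("GSS only", gss[:, None]),
                    ("IGB only", igb[:, None]),
                    ("GSS + IGB", np.column_stack([gss, igb]))]:
    auc = cross_val_score(LogisticRegression(), feats, success,
                          cv=5, scoring="roc_auc").mean()
    print(f"{name}: AUC = {auc:.3f}")
```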