119 results for Data Driven Modeling


Relevance: 80.00%

Abstract:

The investigation of perceptual and cognitive functions with non-invasive brain imaging methods critically depends on the careful selection of stimuli for use in experiments. For example, it must be verified that any observed effects follow from the parameter of interest (e.g. semantic category) rather than from other low-level physical features (e.g. luminance or spectral properties); otherwise, interpretation of the results is confounded. Researchers often circumvent this issue by including additional control conditions or tasks, both of which are flawed solutions that also prolong experiments. Here, we present new approaches for controlling classes of stimuli intended for use in cognitive neuroscience; these methods can, however, be readily extrapolated to other applications and stimulus modalities. Our approach comprises two levels. The first level equalizes individual stimuli in terms of their mean luminance: each data point in a stimulus is rescaled toward a standard value defined across the whole stimulus battery. The second level compares two populations of stimuli along their spectral properties (i.e. spatial frequency), using a dissimilarity metric defined as the root mean square of the distance between the two populations as a function of spatial frequency along the x- and y-dimensions of the image. Randomized permutations are then used to find the assignment that minimizes, in a completely data-driven manner, the spectral differences between the image sets. While another paper in this issue applies these methods to acoustic stimuli (Aeschlimann et al., Brain Topogr 2008), we illustrate the approach here in detail for complex visual stimuli.
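
A minimal sketch of the two control levels described above, assuming same-sized grayscale images held as NumPy arrays; the function names, the FFT-based spectral summary, and the permutation scheme are illustrative assumptions, not the exact procedure of the paper.

```python
import numpy as np

def equalize_mean_luminance(images, standard=None):
    """Level 1: rescale each image so its mean luminance equals a standard value
    defined across the whole stimulus battery."""
    if standard is None:
        standard = np.mean([img.mean() for img in images])
    return [img * (standard / img.mean()) for img in images]

def spectral_dissimilarity(set_a, set_b):
    """Level 2: RMS distance between the mean 2-D amplitude spectra of two image sets
    (all images are assumed to share the same pixel dimensions)."""
    spec_a = np.mean([np.abs(np.fft.fft2(img)) for img in set_a], axis=0)
    spec_b = np.mean([np.abs(np.fft.fft2(img)) for img in set_b], axis=0)
    return np.sqrt(np.mean((spec_a - spec_b) ** 2))

def best_subset_split(images, size, n_perm=1000, seed=0):
    """Randomly draw candidate splits of the battery and keep the one with the
    smallest spectral dissimilarity between the two resulting sets."""
    rng = np.random.default_rng(seed)
    best, best_d = None, np.inf
    for _ in range(n_perm):
        idx = rng.permutation(len(images))
        a = [images[i] for i in idx[:size]]
        b = [images[i] for i in idx[size:2 * size]]
        d = spectral_dissimilarity(a, b)
        if d < best_d:
            best, best_d = (idx[:size], idx[size:2 * size]), d
    return best, best_d
```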

Relevance: 80.00%

Abstract:

Self-consciousness has mostly been approached by philosophical enquiry rather than by empirical neuroscientific study, leading to an overabundance of diverging theories and an absence of data-driven ones. Using robotic technology, we created specific bodily conflicts and induced predictable changes in a fundamental aspect of self-consciousness by altering where healthy subjects experienced themselves to be (self-location). Functional magnetic resonance imaging revealed that temporo-parietal junction (TPJ) activity reflected experimental changes in self-location that also depended on the first-person perspective due to visuo-tactile and visuo-vestibular conflicts. Moreover, in a large lesion analysis study of neurological patients with a well-defined state of abnormal self-location, brain damage was also localized at the TPJ, providing causal evidence that the TPJ encodes self-location. Our findings reveal that multisensory integration at the TPJ reflects one of the most fundamental subjective feelings of humans: the feeling of being an entity localized at a position in space and perceiving the world from this position and perspective.

Relevance: 80.00%

Abstract:

The subthalamic nucleus (STN) is a small, glutamatergic nucleus situated in the diencephalon. A critical component of normal motor function, it has become a key target for deep brain stimulation in the treatment of Parkinson's disease. Animal studies have demonstrated the existence of three functional sub-zones, but these have never been shown conclusively in humans. In this work, a data-driven method applied to diffusion-weighted imaging demonstrated that three distinct clusters exist within the human STN on the basis of brain connectivity profiles. The STN was successfully sub-parcellated into these regions, which correspond well with those described in the animal literature. The local connectivity of each sub-region supported the hypothesis of bilateral limbic, associative and motor regions occupying the anterior, mid and posterior portions of the nucleus, respectively. This study is the first to achieve in-vivo, non-invasive anatomical parcellation of the human STN into three anatomical zones within normal diagnostic scan times, which has important future implications for deep brain stimulation surgery.
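
The abstract does not name the clustering algorithm, so the sketch below illustrates connectivity-based parcellation generically, using k-means with three clusters on voxel-wise connectivity profiles; the file name and the shape of the connectivity matrix are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical input: one row per STN voxel, one column per target region,
# holding probabilistic tractography connection strengths.
connectivity = np.load("stn_connectivity_profiles.npy")

# Normalize each profile so clustering reflects the pattern of connections
# rather than the overall streamline count.
profiles = connectivity / connectivity.sum(axis=1, keepdims=True)

# Three clusters, matching the limbic, associative and motor sub-zones
# reported in the animal literature.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(profiles)
```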

Relevance: 80.00%

Abstract:

A methodology of exploratory data analysis for investigating the phenomenon of orographic precipitation enhancement is proposed. Precipitation observations from three Swiss Doppler weather radars are analysed for the major precipitation event of August 2005 in the Alps. Image processing techniques are used to detect significant precipitation cells/pixels in the radar images while filtering out spurious effects due to ground clutter. The contribution of topography to precipitation patterns is described by an extensive set of topographical descriptors computed from the digital elevation model at multiple spatial scales. Additionally, the motion vector field is derived from successive radar images and combined with the topographic features to highlight the slopes exposed to the main flows. Exploratory data analysis with a recent spectral clustering algorithm shows that orographic precipitation cells are generated under specific flow and topographic conditions. The repeatability of precipitation patterns at particular spatial locations is found to be linked to specific local terrain shapes, e.g. hilltops and the upwind side of mountains. This methodology and our empirical findings for the Alpine region provide a basis for building computational data-driven models of orographic enhancement and triggering of precipitation.
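
The spectral clustering algorithm and its parameters are not specified in the abstract; the sketch below only shows a generic way such a clustering of precipitation cells could be set up on standardized topographic and flow features, with the file name, cluster count and neighbourhood size as assumptions.

```python
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.preprocessing import StandardScaler

# Hypothetical input: one row per detected precipitation cell, with multi-scale
# terrain descriptors (slope, convexity, ...) and the local flow vector components.
features = np.load("cell_topo_flow_features.npy")

X = StandardScaler().fit_transform(features)
clusters = SpectralClustering(n_clusters=5, affinity="nearest_neighbors",
                              n_neighbors=10, random_state=0).fit_predict(X)
```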

Relevance: 80.00%

Abstract:

Uncertainty quantification of petroleum reservoir models is one of the present challenges, which is usually approached with a wide range of geostatistical tools linked with statistical optimisation and/or inference algorithms. Recent advances in machine learning offer a novel approach, alternative to geostatistics, to model the spatial distribution of petrophysical properties in complex reservoirs. The approach is based on semi-supervised learning, which handles both "labelled" observed data and "unlabelled" data that have no measured value but describe prior knowledge and other relevant information in the form of manifolds in the input space along which the modelled property is continuous. The proposed semi-supervised Support Vector Regression (SVR) model has demonstrated its capability to represent realistic geological features and to describe the stochastic variability and non-uniqueness of spatial properties. At the same time, it is able to capture and preserve key spatial dependencies such as the connectivity of high-permeability geo-bodies, which is often difficult in contemporary petroleum reservoir studies. As a data-driven algorithm, semi-supervised SVR is designed to integrate various kinds of conditioning information and learn dependencies from them. The semi-supervised SVR model is able to balance signal/noise levels and to control the prior belief in the available data. In this work, the stochastic semi-supervised SVR geomodel is integrated into a Bayesian framework to quantify the uncertainty of reservoir production with multiple models fitted to past dynamic observations (production history). Multiple history-matched models are obtained using stochastic sampling and/or MCMC-based inference algorithms, which evaluate the posterior probability distribution. The uncertainty of the model is described by the posterior probability of the model parameters that represent key geological properties: spatial correlation size, continuity strength, and smoothness/variability of the spatial property distribution. The developed approach is illustrated with a fluvial reservoir case. The resulting probabilistic production forecasts are described by uncertainty envelopes. The paper compares the performance of models with different combinations of unknown parameters and discusses sensitivity issues.
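
Semi-supervised SVR is not available in standard libraries, so the sketch below only shows the supervised core of the idea (an epsilon-SVR interpolating a petrophysical property from well data); the semi-supervised manifold constraints and the MCMC history matching are not reproduced, and all file names and hyperparameters are hypothetical.

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical labelled data: well locations and measured permeability.
X_wells = np.load("well_coords.npy")         # (n_wells, 3): x, y, depth
y_perm = np.load("well_permeability.npy")    # (n_wells,)

# Plain epsilon-SVR fitted to the labelled wells; the semi-supervised variant
# would additionally constrain the solution with unlabelled points describing
# prior geological manifolds, which standard libraries do not provide.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1, gamma="scale").fit(X_wells, y_perm)

# Predict the property over a dense reservoir grid (hypothetical coordinates).
grid = np.load("grid_coords.npy")            # (n_cells, 3)
perm_field = model.predict(grid)
```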

Relevance: 80.00%

Abstract:

A ubiquitous assessment of swimming velocity (the main performance metric) is essential for the coach to provide tailored feedback to the trainee. We present a probabilistic framework for the data-driven estimation of swimming velocity at every cycle using a low-cost wearable inertial measurement unit (IMU). Statistical validation of the method on 15 swimmers shows that an average relative error of 0.1 ± 9.6% and a high correlation with the tethered reference system (r = 0.91) are achievable. In addition, a simple tool to analyze the influence of sacrum kinematics on performance is provided.
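
A small sketch of the validation statistics quoted above (cycle-wise relative error and correlation with the tethered reference), assuming the two per-cycle velocity series are already available; the file names are placeholders.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-cycle velocities from the IMU pipeline and the tethered reference.
v_imu = np.load("velocity_imu.npy")
v_ref = np.load("velocity_reference.npy")

rel_err = 100.0 * (v_imu - v_ref) / v_ref     # cycle-wise relative error in %
r, _ = pearsonr(v_imu, v_ref)
print(f"relative error: {rel_err.mean():.1f} ± {rel_err.std():.1f} %")
print(f"correlation with reference: r = {r:.2f}")
```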

Relevance: 80.00%

Abstract:

Functional connectivity (FC) as measured by correlation between fMRI BOLD time courses of distinct brain regions has revealed meaningful organization of spontaneous fluctuations in the resting brain. However, an increasing amount of evidence points to non-stationarity of FC; i.e., FC dynamically changes over time, reflecting additional and rich information about brain organization but also representing new challenges for analysis and interpretation. Here, we propose a data-driven approach based on principal component analysis (PCA) to reveal hidden patterns of coherent FC dynamics across multiple subjects. We demonstrate the feasibility and relevance of this new approach by examining the differences in dynamic FC between 13 healthy control subjects and 15 minimally disabled relapsing-remitting multiple sclerosis patients. We estimated whole-brain dynamic FC of regionally averaged BOLD activity using sliding time windows. We then used PCA to identify FC patterns, termed "eigenconnectivities", that reflect meaningful patterns in FC fluctuations. We then assessed the contributions of these patterns to the dynamic FC at any given time point and identified a network of connections, centered on the default-mode network, with altered contributions in patients. Our results complement traditional stationary analyses and reveal novel insights into brain connectivity dynamics and their modulation in a neurodegenerative disease.
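
A minimal sketch of the sliding-window FC plus PCA pipeline described above; the window length, step size, number of components, subject count and file names are illustrative assumptions, not the parameters used in the study.

```python
import numpy as np
from sklearn.decomposition import PCA

def sliding_window_fc(ts, win=30, step=2):
    """ts: (n_timepoints, n_regions) regionally averaged BOLD time courses.
    Returns (n_windows, n_pairs) vectorized upper-triangular correlation matrices."""
    n_t, n_r = ts.shape
    iu = np.triu_indices(n_r, k=1)
    return np.array([np.corrcoef(ts[s:s + win].T)[iu]
                     for s in range(0, n_t - win + 1, step)])

# Hypothetical per-subject BOLD arrays, each (n_timepoints, n_regions).
subject_timeseries = [np.load(f"subject_{i:02d}_bold.npy") for i in range(28)]

# Stack windowed FC vectors from all subjects and extract the dominant
# patterns of coherent FC fluctuations ("eigenconnectivities").
all_fc = np.vstack([sliding_window_fc(ts) for ts in subject_timeseries])
pca = PCA(n_components=10).fit(all_fc)
eigenconnectivities = pca.components_      # dominant FC fluctuation patterns
contributions = pca.transform(all_fc)      # contribution of each pattern per window
```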

Relevance: 80.00%

Abstract:

OBJECTIVE: To develop a provisional definition for the evaluation of response to therapy in juvenile dermatomyositis (DM) based on the Paediatric Rheumatology International Trials Organisation juvenile DM core set of variables. METHODS: Thirty-seven experienced pediatric rheumatologists from 27 countries achieved consensus on rating 128 difficult patient profiles as clinically improved or not improved, using a stepwise approach (patient rating, statistical analysis, definition selection). Using the physicians' consensus ratings as the "gold standard measure," chi-square, sensitivity, specificity, false-positive and false-negative rates, area under the receiver operating characteristic curve, and kappa agreement were calculated for candidate definitions of improvement. For definitions with kappa values >0.8, the kappa was multiplied by the face validity score to select the top definitions. RESULTS: The top definition of improvement was at least 20% improvement from baseline in 3 of 6 core set variables, with no more than 1 of the remaining variables worsening by more than 30%, which cannot be muscle strength. The second-highest scoring definition was at least 20% improvement from baseline in 3 of 6 core set variables, with no more than 2 of the remaining variables worsening by more than 25%, which cannot be muscle strength (definition P1 selected by the International Myositis Assessment and Clinical Studies group). The third is similar to the second, with the maximum amount of worsening set to 30%. This indicates convergent validity of the process. CONCLUSION: We propose a provisional data-driven definition of improvement that reflects well the consensus rating of experienced clinicians and incorporates clinically meaningful change in core set variables in a composite end point for the evaluation of global response to therapy in juvenile DM.
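
A toy encoding of the top definition as a function, assuming every core set variable is scored so that lower values mean improvement; the real core set mixes directions (e.g. higher muscle strength is better), so the signs would have to be adapted, and the variable-name key is a placeholder.

```python
def improved(baseline, followup, strength_key="muscle_strength"):
    """Toy check of the top definition: >=20% improvement from baseline in at
    least 3 of the 6 core set variables, with no more than 1 of the remaining
    variables worsening by more than 30%, and that one cannot be muscle strength.
    Assumes every variable is scored so that lower values mean improvement."""
    change = {k: 100.0 * (followup[k] - baseline[k]) / baseline[k] for k in baseline}
    n_improved = sum(c <= -20.0 for c in change.values())
    worsened = [k for k, c in change.items() if c > 30.0]
    return n_improved >= 3 and len(worsened) <= 1 and strength_key not in worsened
```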

Relevance: 80.00%

Abstract:

This thesis focuses on the influence of the dynamic thermal structure of the upper crust (<10 km) on the thermochronologic record of the exhumational and topographic history of continental magmatic arcs. Two mountain belts from the American Cordillera are studied: the North Cascades (USA) and the Motagua fault zone (Guatemala). I use a combined approach coupling apatite and zircon (U-Th-Sm)/He thermochronology with thermo-kinematic numerical modelling. This study highlights the temporal and spatial variability of the geothermal gradient and the importance of taking into account the different geological processes that perturb the thermal structure of Cordilleran-type mountain belts (i.e. mountain belts related to oceanic subduction underneath a continent).

We integrate apatite and zircon (U-Th)/He data with numerical thermo-kinematic models to study the relative effects of magmatic and surface processes on the thermal evolution of the crust and cooling patterns in the Cenozoic North Cascades arc (Washington State, USA). Two age-elevation profiles located 7 km south of the well-studied Chilliwack intrusions show that spatial and temporal variability in geothermal gradients linked to magma emplacement can be constrained and separated from exhumation processes. During Chilliwack batholith emplacement at ~35-20 Ma, the geothermal gradient of the country rocks increased to a very high steady-state value (80-100 °C/km), which is likely a function of magma flux and the distance from the magma source area. Including temporally varying geothermal gradients in the analysis allows quantifying the thermal perturbation around magmatic intrusions and retrieving a relatively simple denudation history from the data.

The synthesis of new and previously published (U-Th)/He data reveals that denudation of the North Cascades is spatially and temporally constant at ~100 m/Ma between ~32 and ~2 Ma, which likely reflects uplift due to magmatic crustal thickening since the initiation of the Cenozoic stage of the continental magmatic arc. In contrast, the humid flank of the North Cascades is affected by a ten-fold acceleration in exhumation rate at ~2 Ma, which we interpret as forced by the initiation of glaciations; around 3 km of crust have been eroded since that time. Glaciations have three distinct effects on the dynamics of this mountain range: (1) they increase erosion, exhumation and uplift rates where precipitation rates are sufficient to drive efficient glacial erosion; (2) they efficiently limit the elevation of the range; (3) they lead to widening of the humid flank and contraction of the arid flank of the belt.

Drainage reorganizations constitute an important agent of landscape evolution that is often underestimated in favour of tectonic or climatic events. We propose a new method that integrates geomorphology, low-temperature thermochronometry (apatite and zircon (U-Th-Sm)/He), and 3D numerical thermo-kinematic modelling to detect and date drainage instability producing recent gorge incision, and we apply this approach to the Skagit River Gorge, North Cascades.

Two zircon (U-Th)/He age-elevation profiles sampled on both sides of the Motagua Fault Zone (MFZ), the boundary between the North American and Caribbean plates, combined with published thermochronological data show that strike-slip displacement has juxtaposed the cold Maya block (s.s.) against the hot, arc-derived Chortis block (s.s.). This produced different age patterns on the two sides of the fault and short-wavelength lateral thermal exchange, resulting in recent heating of the cool side and cooling of the hot side of the MFZ.

Finally, an apatite (U-Th-Sm)/He age-elevation profile records rapid cooling at ~35 Ma localized only in the upper crust along the northern side of the Motagua fault zone. We will attempt to reproduce these data by modeling the thermal perturbation resulting from the formation of a transtensional basin and from fluid flow activity along a crustal-scale strike-slip fault.

Relevance: 80.00%

Abstract:

Detecting local differences between groups of connectomes is a great challenge in neuroimaging, because of the large number of tests that have to be performed and the resulting burden of multiplicity correction. Any available information should be exploited to increase the power of detecting true between-group effects. We present an adaptive strategy that exploits the data structure and prior information concerning positive dependence between nodes and connections, without relying on strong assumptions. As a first step, we decompose the brain network, i.e., the connectome, into subnetworks and apply a screening at the subnetwork level. The subnetworks are defined either according to prior knowledge or by applying a data-driven algorithm. Given the results of the screening step, a filtering is performed to seek real differences at the node/connection level. The proposed strategy can be used to strongly control either the family-wise error rate or the false discovery rate. We show by means of different simulations the benefit of the proposed strategy, and we present a real application comparing the connectomes of preschool children and adolescents.
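
A simplified sketch of the two-step screening-and-filtering idea, using a two-sample t-test as a placeholder statistic and Benjamini-Hochberg correction inside the surviving subnetworks; the actual procedure controls the family-wise error rate or FDR more carefully than this illustration, and the data layout is assumed.

```python
import numpy as np
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

def screen_then_filter(fc_a, fc_b, subnet, alpha=0.05):
    """fc_a, fc_b: (n_subjects, n_connections) FC values for the two groups;
    subnet: (n_connections,) label assigning each connection to a subnetwork."""
    # Step 1: screening at the subnetwork level on the mean FC of each subnetwork.
    kept = [s for s in np.unique(subnet)
            if ttest_ind(fc_a[:, subnet == s].mean(axis=1),
                         fc_b[:, subnet == s].mean(axis=1)).pvalue < alpha]
    # Step 2: filtering at the connection level, corrected only within the
    # subnetworks that survived screening.
    mask = np.isin(subnet, kept)
    pvals = ttest_ind(fc_a[:, mask], fc_b[:, mask], axis=0).pvalue
    reject, _, _, _ = multipletests(pvals, alpha=alpha, method="fdr_bh")
    return kept, np.flatnonzero(mask)[reject]
```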

Relevance: 80.00%

Abstract:

Uncertainty quantification of petroleum reservoir models is one of the present challenges, which is usually approached with a wide range of geostatistical tools linked with statistical optimisation and/or inference algorithms. This paper considers a data-driven approach to modelling uncertainty in spatial predictions. The proposed semi-supervised Support Vector Regression (SVR) model has demonstrated its capability to represent realistic features and to describe the stochastic variability and non-uniqueness of spatial properties. It is able to capture and preserve key spatial dependencies such as connectivity, which is often difficult to achieve with two-point geostatistical models. Semi-supervised SVR is designed to integrate various kinds of conditioning data and learn dependencies from them. A stochastic semi-supervised SVR model is integrated into a Bayesian framework to quantify uncertainty with multiple models fitted to dynamic observations. The developed approach is illustrated with a reservoir case study. The resulting probabilistic production forecasts are described by uncertainty envelopes.
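
A short sketch of how the uncertainty envelopes could be summarized from an ensemble of history-matched forecasts, assuming the posterior forecasts are stacked in a single array; the quantile levels and the file name are assumptions.

```python
import numpy as np

# Hypothetical ensemble of production forecasts from the history-matched models:
# one row per posterior model, one column per forecast time step.
forecasts = np.load("posterior_forecasts.npy")

# Uncertainty envelope as P10/P50/P90 quantiles of the ensemble at each time step.
p10, p50, p90 = np.percentile(forecasts, [10, 50, 90], axis=0)
```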

Relevance: 80.00%

Abstract:

This study investigated the spatial, spectral, temporal and functional properties of functional brain connections involved in the concurrent execution of unrelated visual perception and working memory tasks. Electroencephalography data were analysed using a novel data-driven approach assessing source coherence at the whole-brain level. Three connections in the beta-band (18-24 Hz) and one in the gamma-band (30-40 Hz) were modulated by dual-task performance. Beta-coherence increased within two dorsofrontal-occipital connections in dual-task conditions compared to the single-task condition, with the highest coherence seen during low working memory load trials. In contrast, beta-coherence in a prefrontal-occipital functional connection and gamma-coherence in an inferior frontal-occipitoparietal connection were not affected by the addition of the second task and only showed elevated coherence under high working memory load. Analysis of coherence as a function of time suggested that the dorsofrontal-occipital beta-connections were relevant to working memory maintenance, while the prefrontal-occipital beta-connection and the inferior frontal-occipitoparietal gamma-connection were involved in top-down control of concurrent visual processing. The fact that increased coherence in the gamma-connection from low to high working memory load was negatively correlated with reaction time on the perception task (i.e., associated with faster responses) supports this interpretation. Together, these results demonstrate that dual-task demands trigger non-linear changes in functional interactions between frontal-executive and occipitoparietal-perceptual cortices.
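
A minimal illustration of band-averaged coherence between two reconstructed source time courses, here for the 18-24 Hz beta band; the signal names, sampling rate and Welch segment length are assumptions, and the study's whole-brain source-coherence pipeline is considerably more involved.

```python
import numpy as np
from scipy.signal import coherence

def band_coherence(x, y, fs, band=(18.0, 24.0)):
    """Mean magnitude-squared coherence between two source time courses in a band."""
    f, cxy = coherence(x, y, fs=fs, nperseg=int(2 * fs))
    sel = (f >= band[0]) & (f <= band[1])
    return cxy[sel].mean()

# Hypothetical reconstructed sources, sampled at 250 Hz.
fs = 250.0
dorsofrontal = np.load("source_dorsofrontal.npy")
occipital = np.load("source_occipital.npy")
beta_coh = band_coherence(dorsofrontal, occipital, fs)             # 18-24 Hz
gamma_coh = band_coherence(dorsofrontal, occipital, fs, (30, 40))  # 30-40 Hz
```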

Relevance: 80.00%

Abstract:

Among the types of remote sensing acquisitions, optical images are certainly one of the most widely relied upon data sources for Earth observation. They provide detailed measurements of the electromagnetic radiation reflected or emitted by each pixel in the scene. Through a process termed supervised land-cover classification, this makes it possible to automatically yet accurately distinguish objects at the surface of our planet. In this respect, when producing a land-cover map of the surveyed area, the availability of training examples representative of each thematic class is crucial for the success of the classification procedure. However, in real applications, due to several constraints on the sample collection process, labeled pixels are usually scarce. When analyzing an image for which those key samples are unavailable, a viable solution consists in resorting to the ground truth data of other previously acquired images. This option is attractive, but several factors such as atmospheric, ground and acquisition conditions can cause radiometric differences between the images, hindering the transfer of knowledge from one image to another. The goal of this thesis is to supply remote sensing image analysts with suitable processing techniques to ensure a robust portability of classification models across different images. The ultimate purpose is to map the land-cover classes over large spatial and temporal extents with minimal ground information. To overcome, or simply quantify, the observed shifts in the statistical distribution of the spectra of the materials, we study four approaches drawn from the field of machine learning. First, we propose a strategy to intelligently sample the image of interest so as to collect labels only for the most useful pixels. This iterative routine is based on continually evaluating how pertinent the initial training data, which actually belong to a different image, are to the new image. Second, an approach to reduce the radiometric differences among the images by projecting the respective pixels into a common new data space is presented. We analyze a kernel-based feature extraction framework suited for such problems, showing that, after this relative normalization, the cross-image generalization abilities of a classifier are highly increased. Third, we test a new data-driven measure of distance between probability distributions to assess the distortions caused by differences in the acquisition geometry affecting series of multi-angle images. We also gauge the portability of classification models across the sequences. In both exercises, the efficacy of classic physically- and statistically-based normalization methods is discussed. Finally, we explore a new family of approaches based on sparse representations of the samples to reciprocally convert the data spaces of two images. The projection function bridging the images allows the synthesis of new pixels with more similar characteristics, ultimately facilitating land-cover mapping across images.
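
The abstract does not name the data-driven distance between distributions; the sketch below uses the (biased) squared maximum mean discrepancy with an RBF kernel as one plausible stand-in for quantifying the spectral shift between two images, with file names and the kernel bandwidth as assumptions.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def mmd2(X, Y, gamma=1.0):
    """Biased estimate of the squared maximum mean discrepancy between two samples
    (e.g. pixel spectra drawn from two images) under an RBF kernel."""
    return (rbf_kernel(X, X, gamma=gamma).mean()
            + rbf_kernel(Y, Y, gamma=gamma).mean()
            - 2.0 * rbf_kernel(X, Y, gamma=gamma).mean())

# Hypothetical samples of pixel spectra from a source and a target image.
X_src = np.load("pixels_source.npy")   # (n_src, n_bands)
X_tgt = np.load("pixels_target.npy")   # (n_tgt, n_bands)
shift = mmd2(X_src, X_tgt, gamma=0.5)
```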

Relevance: 80.00%

Abstract:

With the advancement of high-throughput sequencing and the dramatic increase of available genetic data, statistical modeling has become an essential part of the field of molecular evolution. Statistical modeling has led to many interesting discoveries in the field, from the detection of highly conserved or highly variable regions in a genome to the phylogenetic inference of species' evolutionary history. Among the different types of genome sequences, protein-coding regions are particularly interesting due to their impact on proteins. The building blocks of proteins, i.e. amino acids, are coded by triplets of nucleotides, known as codons. Accordingly, studying the evolution of codons leads to a fundamental understanding of how proteins function and evolve. Current codon models can be classified into three principal groups: mechanistic codon models, empirical codon models and hybrid ones. The mechanistic models attract particular attention due to the clarity of their underlying biological assumptions and parameters. However, they suffer from simplifying assumptions that are required to overcome the burden of computational complexity. The main assumptions applied to current mechanistic codon models are that (a) double and triple substitutions of nucleotides within codons are negligible, (b) there is no mutation variation among the nucleotides of a single codon, and (c) the HKY nucleotide model is sufficient to capture the essence of transition-transversion rates at the nucleotide level. In this thesis, I pursue two main objectives. The first is to develop a framework of mechanistic codon models, named the KCM-based model family framework, based on holding or relaxing the assumptions above. Accordingly, eight different models are proposed from the eight combinations of holding or relaxing the assumptions, from the simplest one that holds all of them to the most general one that relaxes all of them. The models derived from the proposed framework allow me to investigate the biological plausibility of the three simplifying assumptions on real data sets, as well as to find the best model aligned with the underlying characteristics of each data set. Our experiments show that holding all three assumptions is not realistic for any of the real data sets, meaning that simple models relying on these assumptions can be misleading and can yield inaccurate parameter estimates. The second objective is to develop a generalized mechanistic codon model that relaxes the three simplifying assumptions while remaining computationally efficient, using a matrix operation called the Kronecker product. Our experiments show that, on randomly selected data sets, the proposed generalized mechanistic codon model outperforms other codon models with respect to the AICc metric in about half of the data sets. Moreover, several experiments show that the proposed general model is biologically plausible.
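
A rough sketch of how a Kronecker-product construction can turn per-position nucleotide rate matrices into a 64x64 codon generator that admits single, double and triple substitutions; the weights, the HKY parameterization and the omission of selection (omega, stop codons) are simplifying assumptions, and this is not the exact KCM formulation.

```python
import numpy as np

def hky_rate_matrix(kappa, pi):
    """4x4 HKY-like nucleotide rate matrix; nucleotide order T, C, A, G."""
    Q = np.zeros((4, 4))
    for i in range(4):
        for j in range(4):
            if i != j:
                is_transition = {i, j} in ({0, 1}, {2, 3})   # T<->C or A<->G
                Q[i, j] = (kappa if is_transition else 1.0) * pi[j]
        Q[i, i] = -Q[i].sum()
    return Q

# Per-position nucleotide matrices (relaxing assumption (b): each codon position
# may follow its own mutation process).
Q1 = hky_rate_matrix(kappa=2.0, pi=[0.25] * 4)
Q2 = hky_rate_matrix(kappa=3.0, pi=[0.25] * 4)
Q3 = hky_rate_matrix(kappa=2.5, pi=[0.25] * 4)
I = np.eye(4)

# Off-diagonal parts only, used for the multi-substitution terms.
R1, R2, R3 = (Q - np.diag(np.diag(Q)) for Q in (Q1, Q2, Q3))

# 64x64 codon generator: the first three terms give single-nucleotide changes,
# the weighted cross terms admit double and triple substitutions (relaxing
# assumption (a)). Weights w2, w3 are placeholders.
w2, w3 = 0.05, 0.005
Q_codon = (np.kron(np.kron(Q1, I), I) + np.kron(np.kron(I, Q2), I) + np.kron(np.kron(I, I), Q3)
           + w2 * (np.kron(np.kron(R1, R2), I) + np.kron(np.kron(R1, I), R3)
                   + np.kron(np.kron(I, R2), R3))
           + w3 * np.kron(np.kron(R1, R2), R3))

# Restore zero row sums after adding the multi-substitution terms.
np.fill_diagonal(Q_codon, 0.0)
np.fill_diagonal(Q_codon, -Q_codon.sum(axis=1))
```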

Relevance: 80.00%

Abstract:

In this paper, we develop a data-driven methodology to characterize the likelihood of orographic precipitation enhancement using sequences of weather radar images and a digital elevation model (DEM). Geographical locations with topographic characteristics that favor repeatable and persistent orographic precipitation, such as stationary cells, upslope rainfall enhancement, and repeated convective initiation, are detected by analyzing the spatial distribution of a set of precipitation cells extracted from radar imagery. Topographic features such as terrain convexity and gradients computed from the DEM at multiple spatial scales, as well as velocity fields estimated from sequences of weather radar images, are used as explanatory factors to describe the occurrence of localized precipitation enhancement. The latter is represented as a binary process by defining a threshold on the number of cell occurrences at particular locations. Both two-class and one-class support vector machine classifiers are tested to separate the presumed orographic cells from the non-orographic ones in the space of contributing topographic and flow features. Site-based validation is carried out to estimate realistic generalization skills of the obtained spatial prediction models. Due to the high class separability, the decision function of the classifiers can be interpreted as a likelihood or susceptibility of orographic precipitation enhancement. The developed approach can serve as a basis for refining radar-based quantitative precipitation estimates and short-term forecasts, or for generating stochastic precipitation ensembles conditioned on the local topography.
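
A schematic of the two classification set-ups mentioned above, where the SVM decision function is read as a susceptibility score; the feature construction, labels, file names and hyperparameters are placeholders rather than the values used in the paper.

```python
import numpy as np
from sklearn.svm import SVC, OneClassSVM
from sklearn.preprocessing import StandardScaler

# Hypothetical per-location features (multi-scale convexity, gradients, flow
# components) and binary labels from thresholding the number of cell occurrences.
X = np.load("topo_flow_features.npy")
y = np.load("orographic_labels.npy")

Xs = StandardScaler().fit_transform(X)

# Two-class SVM: the decision function is read as a susceptibility score for
# orographic precipitation enhancement.
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(Xs, y)
susceptibility = clf.decision_function(Xs)

# One-class SVM trained on the presumed orographic cells only.
ocsvm = OneClassSVM(kernel="rbf", nu=0.1, gamma="scale").fit(Xs[y == 1])
susceptibility_oc = ocsvm.decision_function(Xs)
```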