978 results for Advanced Transaction Models
Abstract:
ABSTRACT (FRENCH, translated) This thesis on the visual system in healthy subjects and in schizophrenic patients is organized around three scientific articles, published or in press. The first article presents a new method for controlling the physical features of stimuli (luminance and spatial frequency). The second article shows, through analyses of EEG data, a deficit of the magnocellular pathway in the visual processing of illusions in schizophrenic patients. This is demonstrated by the absence of modulation of the P1 component in patients, in contrast to healthy subjects, elicited by Kanizsa-type illusory stimuli at different eccentricities. Finally, the third article, also using electrical neuroimaging methods (EEG), shows that the processing of illusory contours takes place in the lateral occipital complex (LOC), using "misaligned gratings" illusions, and further reveals that the activity previously reported in primary visual areas is due to "top-down" inferences. To make these three articles accessible, the Introduction of this manuscript presents the essential concepts, together with time-frequency analysis methods. The Introduction is divided into four parts: the first presents the visual system, from retino-cortical cells through the regions composing the visual system to the two information-processing pathways; the second presents schizophrenia, its diagnosis, its low-level visual processing deficits, and its cognitive deficits; the third presents illusory contour processing and the three models used in the last article.
Finally, the EEG data analysis methods, including the time-frequency methods, are explained. The results of the three articles are presented in the chapter of the same name, which also includes the results obtained with the time-frequency methods. Finally, the Discussion is organized along three axes: the time-frequency methods, together with a proposal for a reference-independent statistical treatment of these data; the first article and the quality of its stimulus processing; and the two neurophysiological articles, for which new experiments are proposed to refine the current results on schizophrenic deficits. This could help establish a reliable biological marker of schizophrenia. ABSTRACT (ENGLISH) This thesis focuses on the visual system in healthy subjects and schizophrenic patients. To address this research, advanced methods of analysis of electroencephalographic (EEG) data were used and developed. This manuscript comprises three scientific articles. The first article presents a novel method to control the physical features of visual stimuli (luminance and spatial frequencies). The second article shows, using electrical neuroimaging of EEG, a deficit in spatial processing associated with the dorsal pathway in chronic schizophrenic patients. This deficit was revealed by an absent modulation of the P1 component in terms of response strength and topography as well as source estimations, and was orthogonal to the preserved ability to process Kanizsa-type illusory contours. Finally, the third article resolves ongoing debates concerning the neural mechanism mediating illusory contour sensitivity by using electrical neuroimaging to show that the first differentiation of illusory contour presence vs. absence is localized within the lateral occipital complex.
This effect was subsequent to modulations due to the orientation of misaligned grating stimuli. Collectively, these results support a model where effects in V1/V2 are mediated by "top-down" modulation from the LOC. To understand these three articles, the Introduction of this thesis presents the major concepts used in them. Additionally, a section is devoted to time-frequency analysis methods not presented in the articles themselves. The Introduction is divided into four parts. The first part presents three aspects of the visual system: cellular, regional, and its functional interactions. The second part presents an overview of schizophrenia and its sensory-cognitive deficits. The third part presents an overview of illusory contour processing and the three models examined in the third article. Finally, advanced analysis methods for EEG are presented, including time-frequency methodology. The Introduction is followed by a synopsis of the main results in the articles as well as those obtained from the time-frequency analyses. Finally, the Discussion chapter is divided along three axes. The first axis discusses the time-frequency analysis and proposes a novel statistical approach that is independent of the reference. The second axis contextualizes the first article and discusses the quality of the stimulus control and directions for further improvement. Finally, both neurophysiological articles are contextualized by proposing future experiments and hypotheses that may serve to improve our understanding of schizophrenia on the one hand and of visual functions more generally on the other.
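Time-frequency decompositions like those discussed in the thesis can be obtained, in the simplest case, from a short-time Fourier transform. A minimal NumPy sketch (the synthetic 10 Hz signal and the window/hop sizes are illustrative assumptions, not the thesis's actual EEG data or parameters):

```python
import numpy as np

fs = 256                            # sampling rate (Hz), assumed EEG-like
t = np.arange(0, 2, 1 / fs)         # 2 s of data
sig = np.sin(2 * np.pi * 10 * t)    # pure 10 Hz oscillation (alpha-band-like)

win, hop = 256, 64                  # 1 s window -> 1 Hz frequency resolution
frames = []
for start in range(0, len(sig) - win + 1, hop):
    seg = sig[start:start + win] * np.hanning(win)   # taper each frame
    frames.append(np.abs(np.fft.rfft(seg)) ** 2)     # power spectrum of the frame
tf = np.array(frames)               # (n_frames, n_freqs): a time-frequency map

freqs = np.fft.rfftfreq(win, 1 / fs)
peak = freqs[np.argmax(tf.mean(axis=0))]             # dominant frequency over time
```

With these settings the map correctly localizes the power at 10 Hz in every frame; wavelet-based methods trade this fixed resolution for a frequency-dependent one.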
Abstract:
An important statistical development of the last 30 years has been the advance in regression analysis provided by generalized linear models (GLMs) and generalized additive models (GAMs). Here we introduce a series of papers prepared within the framework of an international workshop entitled "Advances in GLMs/GAMs modeling: from species distribution to environmental management", held in Riederalp, Switzerland, 6-11 August 2001. We first discuss some general uses of statistical models in ecology, as well as provide a short review of several key examples of the use of GLMs and GAMs in ecological modeling efforts. We next present an overview of GLMs and GAMs, and discuss some of their related statistics used for predictor selection, model diagnostics, and evaluation. Included is a discussion of several new approaches applicable to GLMs and GAMs, such as ridge regression (an alternative to stepwise selection of predictors) and methods for the identification of interactions by a combined use of regression trees and several other approaches. We close with an overview of the papers and how we feel they advance the application of GLMs and GAMs to ecological modeling.
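As a concrete illustration of the GLM machinery these papers build on, here is a minimal sketch of fitting a Poisson GLM (log link) by iteratively reweighted least squares; the simulated "species counts" and the single environmental predictor are assumptions made for the example, not data from the workshop papers:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(-1, 1, n)                  # e.g. a standardized environmental gradient
X = np.column_stack([np.ones(n), x])       # design matrix with intercept
beta_true = np.array([1.0, 2.0])
y = rng.poisson(np.exp(X @ beta_true))     # simulated species counts

beta = np.zeros(2)
for _ in range(25):                        # IRLS iterations
    mu = np.exp(X @ beta)                  # mean under the log link
    W = mu                                 # Poisson working weights
    z = X @ beta + (y - mu) / mu           # working response
    XtW = X.T * W                          # X' W (row-wise scaling)
    beta = np.linalg.solve(XtW @ X, XtW @ z)
```

The recovered coefficients land close to the true (1.0, 2.0); a GAM replaces the linear term in x with a smooth function estimated the same way inside each IRLS step.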
Abstract:
OBJECTIVE: To better understand the structure of the Patient Assessment of Chronic Illness Care (PACIC) instrument; more specifically, to test all published validation models, using one single data set and appropriate statistical tools. DESIGN: Validation study using data from a cross-sectional survey. PARTICIPANTS: A population-based sample of non-institutionalized adults with diabetes residing in Switzerland (canton of Vaud). MAIN OUTCOME MEASURE: French version of the 20-item PACIC instrument (5-point response scale). We conducted validation analyses using confirmatory factor analysis (CFA). The original five-dimension model and other published models were tested with three types of CFA, based on (i) a Pearson estimator of the variance-covariance matrix, (ii) a polychoric correlation matrix and (iii) a likelihood estimation with a multinomial distribution for the manifest variables. All models were assessed using loadings and goodness-of-fit measures. RESULTS: The analytical sample included 406 patients. Mean age was 64.4 years and 59% were men. Medians of item responses varied between 1 and 4 (range 1-5), and the proportion of missing values ranged between 5.7% and 12.3%. Strong floor and ceiling effects were present. Even though loadings of the tested models were relatively high, the only model showing acceptable fit was the 11-item single-dimension model. PACIC was associated with the expected variables of the field. CONCLUSIONS: Our results showed that the model considering 11 items in a single dimension exhibited the best fit for our data. A single score, in complement to the consideration of single-item results, might be used instead of the five dimensions usually described.
Abstract:
Abstract "Sitting between your past and your future doesn't mean you are in the present." (Dakota Skye) Complex systems science is an interdisciplinary field grouping under the same umbrella dynamical phenomena from the social, natural and mathematical sciences. The emergence of a higher-order organization or behavior, transcending that expected of the linear addition of the parts, is a key factor shared by all these systems. Most complex systems can be modeled as networks that represent the interactions amongst the system's components. In addition to the actual nature of the parts' interactions, the intrinsic topological structure of the underlying network is believed to play a crucial role in the remarkable emergent behaviors exhibited by these systems. Moreover, the topology is also a key factor in explaining the extraordinary flexibility and resilience to perturbations when applied to transmission and diffusion phenomena. In this work, we study the effect of different network structures on the performance and on the fault tolerance of systems in two different contexts. In the first part, we study cellular automata, which are a simple paradigm for distributed computation. Cellular automata are made of basic Boolean computational units, the cells, relying on simple rules and information from the surrounding cells to perform a global task. The limited visibility of the cells can be modeled as a network, where interactions amongst cells are governed by an underlying structure, usually a regular one. In order to increase the performance of cellular automata, we chose to change their topology. We applied computational principles inspired by Darwinian evolution, called evolutionary algorithms, to alter the system's topological structure starting from either a regular or a random one.
The outcome is remarkable, as the resulting topologies share properties of both regular and random networks, and display similarities with the Watts-Strogatz small-world networks found in social systems. Moreover, the performance and tolerance to probabilistic faults of our small-world-like cellular automata surpass those of regular ones. In the second part, we use the context of biological genetic regulatory networks and, in particular, Kauffman's random Boolean networks model. In some ways, this model is close to cellular automata, although it is not expected to perform any task. Instead, it simulates the time-evolution of genetic regulation within living organisms under strict conditions. The original model, though very attractive in its simplicity, suffered from important shortcomings unveiled by recent advances in genetics and biology. We propose to use these new discoveries to improve the original model. Firstly, we have used artificial topologies believed to be closer to those of gene regulatory networks. We have also studied actual biological organisms, and used parts of their genetic regulatory networks in our models. Secondly, we have addressed the improbable full synchronicity of the events taking place in Boolean networks and proposed a more biologically plausible cascading update scheme. Finally, we tackled the actual Boolean functions of the model, i.e. the specifics of how genes activate according to the activity of upstream genes, and presented a new update function that takes into account the actual promoting and repressing effects of one gene on another. Our improved models demonstrate the expected, biologically sound behavior of previous GRN models, yet with superior resistance to perturbations. We believe they are one step closer to the biological reality.
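A minimal sketch of the classical synchronous Kauffman model that the second part starts from; the network size, connectivity K, and random seed are arbitrary choices for illustration, and the thesis's asynchronous and biologically derived variants are not shown:

```python
import numpy as np

rng = np.random.default_rng(1)
N, K = 12, 2                                   # N genes, K regulators each

# Each gene reads K randomly chosen regulators through a random Boolean rule.
inputs = np.array([rng.choice(N, K, replace=False) for _ in range(N)])
tables = rng.integers(0, 2, size=(N, 2 ** K))  # truth table per gene

def step(state):
    # Encode each gene's regulator pair as an index 0..3 into its truth table.
    idx = state[inputs[:, 0]] * 2 + state[inputs[:, 1]]
    return tables[np.arange(N), idx]

state = rng.integers(0, 2, N)
seen, t = {}, 0
while tuple(state) not in seen:                # deterministic dynamics must revisit a state
    seen[tuple(state)] = t
    state = step(state)
    t += 1
cycle_len = t - seen[tuple(state)]             # length of the attractor (limit cycle)
```

Because the synchronous dynamics are deterministic on a finite state space (2^N states), every trajectory falls onto an attractor; the thesis's cascading update scheme relaxes exactly this synchronicity assumption.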
Abstract:
PURPOSE: Aerodynamic drag plays an important role in performance for athletes practicing sports that involve high-velocity motions. In giant slalom, the skier is continuously changing his/her body posture, and this affects the energy dissipated in aerodynamic drag. It is therefore important to quantify this energy to understand the dynamic behavior of the skier. The aims of this study were to model the aerodynamic drag of alpine skiers in giant slalom simulated conditions and to apply these models in a field experiment to estimate energy dissipated through aerodynamic drag. METHODS: The aerodynamic characteristics of 15 recreational male and female skiers were measured in a wind tunnel while holding nine different skiing-specific postures. The drag and the frontal area were recorded simultaneously for each posture. Four generalized and two individualized models of the drag coefficient were built, using different sets of parameters. These models were subsequently applied in a field study designed to compare the aerodynamic energy losses between a dynamic and a compact skiing technique. RESULTS: The generalized models estimated aerodynamic drag with an accuracy of between 11.00% and 14.28%, and the individualized models estimated aerodynamic drag with an accuracy of between 4.52% and 5.30%. The individualized model used for the field study showed that using a dynamic technique led to 10% more aerodynamic drag energy loss than using a compact technique. DISCUSSION: The individualized models were capable of discriminating different techniques performed by advanced skiers and seemed more accurate than the generalized models. The models presented here offer a simple yet accurate method to estimate the aerodynamic drag acting upon alpine skiers while rapidly moving through the range of positions typical of turning techniques.
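The energy dissipated through drag follows directly from the drag equation, F_d = ½ ρ C_d A v², integrated along the path. A small sketch with hypothetical drag-area and speed values (not the study's measured data or its actual drag models):

```python
RHO = 1.2          # air density (kg/m^3), assumed constant

def drag_energy(cd_a, speeds, ds):
    """Energy (J) dissipated by drag over path segments of length ds (m),
    given the drag area Cd*A (m^2) and the speed (m/s) on each segment."""
    # F_d = 0.5 * rho * CdA * v^2 ;  E = sum of F_d * ds over segments
    return sum(0.5 * RHO * cd_a * v ** 2 * ds for v in speeds)

compact = drag_energy(0.30, [15, 18, 20], 10.0)   # tucked posture: smaller CdA
dynamic = drag_energy(0.45, [15, 18, 20], 10.0)   # upright posture: larger CdA
```

The point of the study's models is to supply the posture-dependent CdA term continuously through a turn rather than the fixed values assumed here.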
Abstract:
We show that a simple mixing idea allows one to establish a number of explicit formulas for ruin probabilities and related quantities in collective risk models with dependence among claim sizes and among claim inter-occurrence times. Examples include compound Poisson risk models with completely monotone marginal claim size distributions that are dependent according to Archimedean survival copulas as well as renewal risk models with dependent inter-occurrence times.
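For orientation, the classical independent-claims baseline that such mixing constructions generalize is the Cramér-Lundberg model: with i.i.d. exponential claims of mean μ and safety loading θ > 0, the ruin probability has the well-known closed form ψ(u) = (1+θ)⁻¹ exp(−θu / ((1+θ)μ)). A sketch (the parameter values are arbitrary):

```python
import math

def ruin_prob(u, theta, mu):
    """Classical compound Poisson (Cramér-Lundberg) ruin probability for
    initial capital u, i.i.d. exponential claims with mean mu, and
    safety loading theta > 0 in the premium rate."""
    return math.exp(-theta * u / ((1 + theta) * mu)) / (1 + theta)

psi0 = ruin_prob(0.0, 0.25, 1.0)   # psi(0) = 1/(1 + theta)
```

The paper's contribution is precisely that comparable explicit formulas survive when the independence assumptions are dropped in favor of Archimedean-copula dependence.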
Abstract:
Two different approaches currently prevail for predicting spatial patterns of species assemblages. The first approach (macroecological modelling, MEM) focuses directly on realised properties of species assemblages, whereas the second approach (stacked species distribution modelling, S-SDM) starts with constituent species to approximate assemblage properties. Here, we propose to unify the two approaches in a single 'spatially-explicit species assemblage modelling' (SESAM) framework. This framework uses relevant species source pool designations, macroecological factors, and ecological assembly rules to constrain predictions of the richness and composition of species assemblages obtained by stacking predictions of individual species distributions. We believe that such a framework could prove useful in many theoretical and applied disciplines of ecology and evolution, both for improving our basic understanding of species assembly across spatio-temporal scales and for anticipating expected consequences of local, regional or global environmental changes. In this paper, we propose such a framework and call for further developments and testing across a broad range of community types in a variety of environments.
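One simple way to realize the SESAM idea of constraining stacked SDM predictions with a macroecological richness estimate is a probability-ranking assembly rule; the occurrence probabilities and richness caps below are made-up illustrative numbers, not output of any fitted model:

```python
import numpy as np

# Hypothetical S-SDM output: occurrence probabilities for 5 species at 3 sites.
probs = np.array([
    [0.9, 0.2, 0.7],
    [0.8, 0.1, 0.6],
    [0.4, 0.3, 0.9],
    [0.2, 0.6, 0.8],
    [0.1, 0.5, 0.3],
])                                    # shape: (species, sites)

richness_cap = np.array([3, 2, 4])    # MEM-predicted richness per site

# At each site, keep only the top-ranked species up to the MEM richness,
# instead of thresholding each species' SDM independently.
assemblage = np.zeros_like(probs, dtype=bool)
for j in range(probs.shape[1]):
    top = np.argsort(-probs[:, j])[: richness_cap[j]]
    assemblage[top, j] = True
```

The stacked richness now matches the macroecological prediction by construction; the full framework additionally filters the candidate pool with source-pool designations and ecological assembly rules.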
Abstract:
We propose a method to evaluate cyclical models which does not require knowledge of the DGP and the exact empirical specification of the aggregate decision rules. We derive robust restrictions in a class of models; use some to identify structural shocks and others to evaluate the model or contrast sub-models. The approach has good size and excellent power properties, even in small samples. We show how to examine the validity of a class of models, sort out the relevance of certain frictions, evaluate the importance of an added feature, and indirectly estimate structural parameters.
Abstract:
PURPOSE: From February 2001 to February 2002, 946 patients with advanced GI stromal tumors (GISTs) treated with imatinib were included in a controlled EORTC/ISG/AGITG (European Organisation for Research and Treatment of Cancer/Italian Sarcoma Group/Australasian Gastro-Intestinal Trials Group) trial. This analysis investigates whether the response classification assessed by RECIST (Response Evaluation Criteria in Solid Tumors) predicts for time to progression (TTP) and overall survival (OS). PATIENTS AND METHODS: Per protocol, the first three disease assessments were done at 2, 4, and 6 months. For the purpose of the analysis (landmark method), disease response was subclassified in six categories: partial response (PR; > 30% size reduction), minor response (MR; 10% to 30% reduction), no change (NC) as either NC- (0% to 10% reduction) or NC+ (0% to 20% size increase), progressive disease (PD; > 20% increase/new lesions), and subjective PD (clinical progression). RESULTS: A total of 906 patients had measurable disease at entry. At all measurement time points, complete response (CR), PR, and MR resulted in similar TTP and OS; this was also true for NC- and NC+, and for PD and subjective PD. Patients were subsequently classified as responders (CR/PR/MR), NC (NC+/NC-), or PD. This three-class response categorization was found to be highly predictive of further progression or survival for the first two measurement points. After 6 months of imatinib, responders (CR/PR/MR) had the same survival prognosis as patients classified as NC. CONCLUSION: RECIST perfectly enables early discrimination between patients who benefited long term from imatinib and those who did not. After 6 months of imatinib, if the patient is not experiencing PD, the pattern of radiologic response by tumor size criteria has no prognostic value for further outcome. Imatinib needs to be continued as long as there is no progression according to RECIST.
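The size-based subcategories used in the analysis can be written as a small decision rule; the handling of exact boundary values below is an assumption, as the abstract only gives the interval endpoints, and subjective PD (clinical progression) falls outside this size-based rule:

```python
def classify(change_pct, new_lesions=False):
    """Map percent change in tumor size (negative = reduction) to the
    size-based response subcategories described in the abstract."""
    if new_lesions or change_pct > 20:
        return "PD"        # > 20% increase or new lesions
    if change_pct > 0:
        return "NC+"       # 0% to 20% size increase
    if change_pct >= -10:
        return "NC-"       # 0% to 10% reduction
    if change_pct >= -30:
        return "MR"        # 10% to 30% reduction
    return "PR"            # > 30% size reduction
```

The study's finding is that collapsing these into responder (CR/PR/MR), NC, and PD loses no prognostic information at the early landmarks.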
Abstract:
The paper proposes an approach aimed at detecting optimal model parameter combinations to achieve the most representative description of uncertainty in the model performance. A classification problem is posed to find the regions of good fitting models according to the values of a cost function. Support Vector Machine (SVM) classification in the parameter space is applied to decide if a forward model simulation is to be computed for a particular generated model. SVM is particularly designed to tackle classification problems in high-dimensional space in a non-parametric and non-linear way. SVM decision boundaries determine the regions that are subject to the largest uncertainty in the cost function classification, and, therefore, provide guidelines for further iterative exploration of the model space. The proposed approach is illustrated by a synthetic example of fluid flow through porous media, which features highly variable response due to the combination of parameter values.
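The idea of classifying "good" vs. "bad" regions of parameter space can be sketched with a linear soft-margin SVM trained by subgradient descent on the hinge loss; note the paper's setting calls for a non-linear (kernel) SVM, and the 2-D parameter space, cost threshold, and training details here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical 2-D model-parameter samples; "good" models (+1) are those whose
# cost function falls below a threshold -- here induced by a linear rule.
params = rng.uniform(-1, 1, size=(400, 2))
labels = np.where(params[:, 0] + 0.5 * params[:, 1] > 0.1, 1.0, -1.0)

w, b = np.zeros(2), 0.0
lam, lr = 0.001, 0.1
for _ in range(2000):                       # full-batch subgradient on the hinge loss
    margins = labels * (params @ w + b)
    viol = margins < 1                      # margin violators drive the update
    grad_w = lam * w - (labels[viol, None] * params[viol]).sum(axis=0) / len(params)
    grad_b = -labels[viol].sum() / len(params)
    w -= lr * grad_w
    b -= lr * grad_b

acc = float((np.sign(params @ w + b) == labels).mean())
```

Points with small |w·x + b| sit near the decision boundary, i.e. in the region of largest classification uncertainty, which is exactly where the approach directs the next forward simulations.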
Abstract:
This paper breaks new ground toward contractual and institutional innovation in models of homeownership, equity building, and mortgage enforcement. Inspired by recent developments in the affordable housing sector and in other types of public financing schemes, this paper suggests extending institutional and financial strategies such as time- and place-based division of property rights, conditional subsidies, and credit mediation to alleviate the systemic risks of mortgage foreclosure. Alongside a for-profit shared equity scheme that would be led by local governments, we also outline a private market shared equity model, one of bootstrapping home buying with purchase options.
Abstract:
Since ethical concerns are calling for more attention within Operational Research, we present three approaches to combine Operational Research models with ethics. Our intention is to clarify the trade-offs faced by the OR community, in particular the tension between the scientific legitimacy of OR models (ethics outside OR models) and the integration of ethics within models (ethics within OR models). Presenting and discussing an approach that combines OR models with the process of OR (ethics beyond OR models), we suggest rigorous ways to express the relation between ethics and OR models. As our work is exploratory, we are trying to avoid a dogmatic attitude and call for further research. We argue that there are interesting avenues for research at the theoretical, methodological and applied levels and that the OR community can contribute to an innovative, constructive and responsible social dialogue about its ethics.
Abstract:
Diagnosis Related Groups (DRG) are frequently used to standardize the comparison of consumption variables, such as length of stay (LOS). In order to be reliable, this comparison must control for the presence of outliers, i.e. values far removed from the pattern set by the majority of the data. Indeed, outliers can distort the usual statistical summaries, such as means and variances. A common practice is to trim LOS values according to various empirical rules, but there is little theoretical support for choosing between alternative procedures. This pilot study explores the possibility of describing LOS distributions with parametric models which provide the necessary framework for the use of robust methods.
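A toy illustration of why robust summaries matter for LOS data: a single long-stay outlier drags the mean far from the bulk of the distribution, while the median and a trimmed mean (one simple robust alternative, not necessarily a method the study evaluates) barely move. The LOS values are invented for the example:

```python
import statistics

los = [2, 3, 3, 4, 4, 5, 5, 6, 7, 60]    # hypothetical LOS (days) with one outlier

mean_all = statistics.mean(los)          # dragged upward by the 60-day stay
median_all = statistics.median(los)      # nearly unaffected

def trimmed_mean(values, prop=0.1):
    """Drop the lowest and highest `prop` fraction of values before averaging."""
    s = sorted(values)
    k = int(len(s) * prop)
    return statistics.mean(s[k:len(s) - k] if k else s)

tm = trimmed_mean(los)                   # 10% trimming removes the extreme stay
```

Fitting a parametric model to the LOS distribution, as the pilot study proposes, gives a principled basis for choosing such trimming rules instead of ad hoc empirical cutoffs.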
Abstract:
This paper proposes a method to conduct inference in panel VAR models with cross-unit interdependencies and time variations in the coefficients. The approach can be used to obtain multi-unit forecasts and leading indicators and to conduct policy analysis in multi-unit setups. The framework of analysis is Bayesian and MCMC methods are used to estimate the posterior distribution of the features of interest. The model is reparametrized to resemble an observable index model and specification searches are discussed. As an example, we construct leading indicators for inflation and GDP growth in the Euro area using G-7 information.
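The paper's Bayesian panel VAR with time-varying coefficients goes well beyond it, but the basic building block is an ordinary VAR estimated equation by equation. A minimal classical VAR(1) sketch with simulated data (the coefficient matrix, noise scale, and sample size are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
A_true = np.array([[0.5, 0.1],
                   [0.0, 0.6]])               # stable VAR(1) coefficient matrix
T = 400
y = np.zeros((T, 2))
for t in range(1, T):                          # y_t = A y_{t-1} + noise
    y[t] = y[t - 1] @ A_true.T + 0.1 * rng.standard_normal(2)

Y, X = y[1:], y[:-1]                           # regress each y_t on y_{t-1}
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T # OLS estimate of A
forecast = y[-1] @ A_hat.T                     # one-step-ahead forecast
```

The Bayesian machinery in the paper replaces this fixed A with unit-specific, time-varying coefficients whose posterior is simulated by MCMC, which is what allows pooling G-7 information into Euro-area indicators.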