913 results for structure, analysis, modeling


Relevance: 40.00%

Abstract:

P450 oxidoreductase (POR) is the obligate electron donor for microsomal cytochrome P450s, and mutations in POR cause several metabolic disorders. We have modeled the structure of human P450 oxidoreductase by in silico amino acid replacements in the rat POR crystal structure. Rat POR has 94% homology with human POR, and 38 amino acids were replaced to make its sequence identical to human POR. Several rounds of molecular dynamics simulations refined the model and removed structural clashes arising from the altered side chains of the replaced amino acids. This approach has the advantage of keeping the cofactor contacts and structural features of the core enzyme intact, which could not be achieved by homology-based approaches. The final model was of high quality and compared well with experimentally determined structures of other PORs. It will be used for analyzing the structural implications of mutations and polymorphisms in human POR.
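
As a rough illustration of the first step of this approach, the sketch below aligns two sequences to locate the positions that would be replaced in silico. It uses Biopython's PairwiseAligner; the sequence fragments are invented placeholders rather than the real rat and human POR sequences, and the actual residue replacement and molecular dynamics refinement happen in external structure-modeling and simulation software not shown here.

```python
from Bio import Align

# A minimal sketch, assuming Biopython is available. The sequences are
# placeholder fragments, not the real rat/human POR sequences.
aligner = Align.PairwiseAligner()
aligner.mode = "global"
aligner.match_score = 2
aligner.mismatch_score = -1
aligner.open_gap_score = -10
aligner.extend_gap_score = -0.5

rat_por = "MGDSHEDTSATMPEAVAEEV"    # placeholder fragment
human_por = "MGDSHVDTSSTVSEAVAEEV"  # placeholder fragment

alignment = aligner.align(rat_por, human_por)[0]
# List the substitutions that an in silico replacement step would model.
for pos, (a, b) in enumerate(zip(alignment[0], alignment[1]), start=1):
    if a != b and "-" not in (a, b):
        print(f"replace {a}{pos} -> {b}")
```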

Relevance: 40.00%

Abstract:

In order to analyze software systems, it is necessary to model them. Static software models are commonly imported by parsing source code and related data. Unfortunately, building custom parsers for most programming languages is a non-trivial endeavour. This poses a major bottleneck for analyzing software systems programmed in languages for which importers do not already exist. Luckily, initial software models do not require detailed parsers, so it is possible to start analysis with a coarse-grained importer, which is then gradually refined. In this paper we propose an approach to "agile modeling" that exploits island grammars to extract initial coarse-grained models, parser combinators to enable gradual refinement of model importers, and various heuristics to recognize language structure, keywords and other language artifacts.
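
To make the idea concrete, here is a minimal, self-contained sketch of island parsing with parser combinators in Python. It is an invented toy (the combinators, the class-header rule, and the Java-like input are all assumptions for illustration), not the toolchain used in the paper: an island rule skips the surrounding "water" until it can match, and repeated islands yield a coarse-grained model.

```python
import re

def token(pattern):
    """Combinator: match a regex at the current position."""
    rx = re.compile(pattern)
    def parse(text, pos):
        m = rx.match(text, pos)
        return (m.group(), m.end()) if m else None
    return parse

def island(parser):
    """Skip 'water' character by character until `parser` matches."""
    def parse(text, pos):
        while pos < len(text):
            result = parser(text, pos)
            if result:
                return result
            pos += 1
        return None
    return parse

def many(parser):
    """Apply `parser` repeatedly, collecting all matches."""
    def parse(text, pos):
        values = []
        while True:
            result = parser(text, pos)
            if not result:
                return values, pos
            value, pos = result
            values.append(value)
    return parse

class_header = token(r"class\s+\w+")      # the only "island" we care about
coarse_model = many(island(class_header))

source = """
import java.util.List;                    // water: ignored by the grammar
class Parser { int pos; }
class Model { List nodes; }
"""
found, _ = coarse_model(source, 0)
print(found)  # ['class Parser', 'class Model']
```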

Relevance: 40.00%

Abstract:

Glucocorticoids (GC) are successfully applied in neonatology to improve lung maturation in preterm infants. Animal studies show that GC can also impair lung development. In this investigation, we used a new approach based on digital image analysis. Microscopic images of lung parenchyma were skeletonised, and the geometrical properties of the septal network were characterised by analysing the 'skeletal' parameters. Inhibition of the process of alveolarisation after extensive administration of small doses of GC in newborn rats was confirmed by significant changes in the 'skeletal' parameters. The induced structural changes in the lung parenchyma were still present after 60 days in adult rats, clearly indicating a long-lasting or even permanent impairment of lung development and maturation caused by GC. Conclusion: Digital image analysis and skeletonisation proved to be a well-suited approach for assessing structural changes in lung parenchyma.
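
The skeletonisation step can be sketched with standard tools. The following is a minimal sketch using scikit-image and SciPy, under the assumption of a grayscale parenchyma image in which septa are dark; the file name and threshold choice are illustrative, and the study's actual 'skeletal' parameters are only approximated by the simple counts below.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage import filters, io, morphology

# Hypothetical input image; septa assumed darker than airspaces.
img = io.imread("lung_parenchyma.png", as_gray=True)
septa = img < filters.threshold_otsu(img)
skeleton = morphology.skeletonize(septa)

# Classify skeleton pixels by their number of 8-connected neighbors:
# 1 neighbor = end point, more than 2 = branch (node) point.
conv = ndi.convolve(skeleton.astype(int), np.ones((3, 3)), mode="constant")
neighbors = (conv - 1) * skeleton        # subtract the pixel itself
end_points = int(np.sum(skeleton & (neighbors == 1)))
branch_points = int(np.sum(skeleton & (neighbors > 2)))
total_length = int(skeleton.sum())       # skeleton length in pixels

print(end_points, branch_points, total_length)
```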

Relevance: 40.00%

Abstract:

The factorial validity of the SF-36 was evaluated using confirmatory factor analysis (CFA), structural equation modeling (SEM), and multigroup structural equation modeling (MSEM). First, the measurement and structural model of the hypothesized SF-36 was explicated. Second, the model was tested for the validity of a second-order factorial structure; upon evidence of model misfit, the best-fitting model was determined and its validity tested on a second random sample from the same population. Third, the best-fitting model was tested for invariance of the factorial structure across race, age, and educational subgroups using MSEM. The findings support the second-order factorial structure of the SF-36 as proposed by Ware and Sherbourne (1992). However, the results suggest that: (a) Mental Health and Physical Health covary; (b) general mental health cross-loads onto Physical Health; (c) general health perception loads onto Mental Health instead of Physical Health; (d) many of the error terms are correlated; and (e) the physical function scale is not reliable across these two samples. This hierarchical factor pattern was replicated across both samples of health care workers, suggesting that the post hoc model fitting was not data specific. Subgroup analysis suggests that the physical function scale is not reliable across the "age" or "education" subgroups and that the general mental health scale path from Mental Health is not reliable across the "white/nonwhite" or "education" subgroups. The importance of this study lies in the use of SEM and MSEM in evaluating sample data from the SF-36. These methods are uniquely suited to the analysis of latent variable structures and are widely used in other fields. The use of latent variable models for self-reported outcome measures has become widespread and should now be applied to medical outcomes research. Invariance testing is superior to mean scores or summary scores when evaluating differences between groups. From a practical as well as psychometric perspective, it seems imperative that construct validity research on the SF-36 establish whether this same hierarchical structure and invariance hold for other populations. This project is presented as three articles to be submitted for publication.
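
As an illustration of how such a factorial structure can be specified and tested in code, here is a minimal sketch using the Python package semopy. The SF-36 scale abbreviations (PF, RP, BP, GH, VT, SF, RE, MH) are the standard ones, but the data file and the simplified two-correlated-factor specification are assumptions for illustration, not the study's exact model.

```python
import pandas as pd
import semopy

# Simplified specification: two correlated health factors measured by
# the eight SF-36 scale scores (finding (a): the factors covary).
desc = """
Physical =~ PF + RP + BP + GH
Mental =~ VT + SF + RE + MH
Physical ~~ Mental
"""

data = pd.read_csv("sf36_scores.csv")  # hypothetical scale-score file
model = semopy.Model(desc)
model.fit(data)
print(semopy.calc_stats(model).T)      # chi-square, CFI, RMSEA, ...
```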

Relevance: 40.00%

Abstract:

The article offers a systematic analysis of the comparative trajectory of international democratic change. In particular, it focuses on the resulting convergence or divergence of political systems, borrowing from the literatures on institutional change and policy convergence. To this end, political-institutional data in line with Arend Lijphart’s (1999, 2012) empirical theory of democracy are analyzed for 24 developed democracies between 1945 and 2010. Heteroscedastic multilevel models allow for directly modeling the development of the variance of types of democracy over time, revealing information about convergence and adding substantive explanations. The findings indicate that there has been a trend away from extreme types of democracy in individual cases, but no unconditional trend of convergence can be observed. There are, however, conditional processes of convergence. In particular, economic globalization and the domestic veto structure interactively influence democratic convergence.
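
The core idea of a heteroscedastic model, namely that the variance itself is given a time trend, can be sketched compactly. The following minimal sketch (single-level rather than multilevel, with synthetic data) fits a normal model whose log standard deviation is linear in time by maximum likelihood; a negative time coefficient would indicate convergence.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic panel: 24 countries observed 1945-2010, variance shrinking.
rng = np.random.default_rng(0)
t = np.tile(np.arange(1945, 2011) - 1945, 24)
y = 5.0 + rng.normal(0.0, np.exp(0.5 - 0.01 * t))

def neg_log_lik(params):
    mu, c, d = params
    sigma = np.exp(c + d * t)           # log-linear variance model
    return np.sum(np.log(sigma) + 0.5 * ((y - mu) / sigma) ** 2)

fit = minimize(neg_log_lik, x0=[0.0, 0.0, 0.0])
mu, c, d = fit.x
print(d)  # d < 0: the spread of democracy types narrows over time
```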

Relevance: 40.00%

Abstract:

Objective: Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary greatly between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective through its traditional focus on static phenomena, which were mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focusing on the quantification of temporal, session-to-session aspects of change in psychotherapy. TSPA models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. Method: TSPA is based on vector autoregression (VAR), an extension of univariate autoregression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by postsession questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of the psychotherapy process. In a second step, the associations between mechanisms of change (TSPA) and pre- to post-treatment symptom change were explored. Results: TSPA allowed a prototypical process pattern to be identified, in which patients' alliance and self-efficacy were linked by a temporal feedback loop. Furthermore, therapists' stability over time in both mastery and clarification interventions was positively associated with better outcomes. Conclusions: TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy.
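
The two TSPA steps, person-level VAR models followed by group-level aggregation, can be sketched with statsmodels. The variable names, the number of sessions, and the synthetic data below are assumptions for illustration; only the 87-patient sample size is taken from the abstract.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)

# Step 1: fit one VAR(1) per patient on session-by-session ratings.
coef_matrices = []
for _ in range(87):
    sessions = pd.DataFrame(rng.normal(size=(20, 2)),
                            columns=["alliance", "self_efficacy"])
    results = VAR(sessions).fit(1)
    coef_matrices.append(results.coefs[0])   # lag-1 coefficient matrix

# Step 2: aggregate the individual models into a prototypical pattern.
prototype = np.mean(coef_matrices, axis=0)
print(prototype)  # off-diagonal entries = cross-lagged (feedback) effects
```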

Relevance: 40.00%

Abstract:

One of the earliest accounts of duration perception, by Karl von Vierordt, implied a common process underlying the timing of intervals in the sub-second and the second range. To date, there are two major explanatory approaches for the timing of brief intervals: the common timing hypothesis and the distinct timing hypothesis. While the common timing hypothesis likewise proceeds from a unitary timing process, the distinct timing hypothesis suggests two dissociable, independent mechanisms for the timing of intervals in the sub-second and the second range, respectively. In the present paper, we introduce confirmatory factor analysis (CFA) to elucidate the internal structure of interval timing in the sub-second and the second range. Our results indicate that the assumption of two mechanisms underlying the processing of intervals in the second and the sub-second range might be more appropriate than the assumption of a unitary timing mechanism. In contrast to the basic assumption of the distinct timing hypothesis, however, these two timing mechanisms are closely associated with each other and share 77% of common variance (a latent correlation of roughly .88, since √0.77 ≈ 0.88). This finding suggests either a strong functional relationship between the two timing mechanisms or a hierarchically organized internal structure. Findings are discussed in the light of existing psychophysical and neurophysiological data.

Relevance: 40.00%

Abstract:

The most influential theoretical account in time psychophysics assumes the existence of a unitary internal clock based on neural counting. The distinct timing hypothesis, on the other hand, suggests an automatic timing mechanism for the processing of durations in the sub-second range and a cognitively controlled timing mechanism for the processing of durations in the range of seconds. Although several psychophysical approaches can be applied to identify the internal structure of interval timing in the second and sub-second range, the existing data provide a puzzling picture of rather inconsistent results. In the present chapter, we introduce confirmatory factor analysis (CFA) to further elucidate the internal structure of interval timing performance in the sub-second and second range. More specifically, we investigated whether CFA would support the notion of a unitary timing mechanism or of distinct timing mechanisms underlying interval timing in the sub-second and second range, respectively. The assumption of two distinct timing mechanisms that are completely independent of each other was not supported by our data. The model assuming a unitary timing mechanism underlying interval timing in both the sub-second and second range fitted the empirical data much better. Finally, we also tested a third model assuming two distinct but functionally related mechanisms. The correlation between the two latent variables representing the hypothesized timing mechanisms was rather high, and a comparison of fit indices indicated that the assumption of two associated timing mechanisms described the observed data better than a single latent variable. The models are discussed in the light of the existing psychophysical and neurophysiological data.
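
The three-model comparison described here can be expressed concisely with the Python package semopy. The indicator names (three timing tasks per duration range) and the data file are invented for illustration; the point is the shape of the comparison, one latent factor versus two correlated factors, decided by fit indices.

```python
import pandas as pd
import semopy

one_factor = """
Timing =~ sub1 + sub2 + sub3 + sec1 + sec2 + sec3
"""

two_related_factors = """
SubSecond =~ sub1 + sub2 + sub3
Second =~ sec1 + sec2 + sec3
SubSecond ~~ Second
"""

data = pd.read_csv("timing_tasks.csv")  # hypothetical task scores
for desc in (one_factor, two_related_factors):
    model = semopy.Model(desc)
    model.fit(data)
    print(semopy.calc_stats(model).T)    # compare chi-square, CFI, RMSEA, AIC
```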

Relevance: 40.00%

Abstract:

The results of Eurosceptic parties in the recent European Parliament election provide further evidence that the “permissive consensus” on European integration has eroded. This paper focuses on the structure of the debate on EU integration issues. Which EU integration issues and positions do parties put forward? Can the debate on EU integration issues be subsumed in one or several dimensions? Do these dimensions reflect national political conflicts such as the left-right and the ‘new politics’/cultural divide? Or do they form one unique or several EU-specific dimensions, e.g. national sovereignty versus integration? To address these questions, this paper starts from the assumption that the debate on European integration is multidimensional in nature and therefore entails a multitude of issue areas. In other words, it does not look at how socio-economic and cultural issues are related to European integration but focuses on its components, i.e. particular EU-specific policies such as EU-wide employment, environment, immigration and monetary policy. The paper builds on cleavage theory of political divisions and on different approaches transferring it to EU politics. Two points should be noted. First, this paper does not compare the debate on European integration issues between the national level and the EU level, but asks whether domestic divisions are reflected at the EU level. Second, it is not concerned with the general ideological profile of political parties on EU integration issues, but with the EU issues that parties communicated through press releases. In doing so, the paper is concerned with the salient EU issues that parties touch upon.
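
One standard way to probe the dimensionality question raised above is a principal component (or factor) analysis of party positions on the EU-specific issues. The sketch below is a bare-bones illustration with invented data, not the paper's method or dataset: a single dominant component would point to one underlying integration dimension, several sizeable components to a multidimensional debate.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA

# Invented positions of 40 parties on four EU-specific policy issues.
issues = ["employment", "environment", "immigration", "monetary"]
rng = np.random.default_rng(2)
positions = pd.DataFrame(rng.normal(size=(40, 4)), columns=issues)

pca = PCA().fit(positions)
print(pca.explained_variance_ratio_)  # share of variance per dimension
```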

Relevance: 40.00%

Abstract:

Play has been proposed as a promising indicator of positive animal welfare. We aimed to study play in rats across contexts (conspecific/heterospecific) and types (social: pinning, being pinned; solitary: scampering), and we investigated its structure using behavioral sequence analysis. Group-housed (three per cage) adolescent male Lister Hooded rats (n = 21) were subjected to a Play-In-Pairs test: after a 3-hour isolation period, a pair of cage-mates was returned to the home cage and both social and solitary play were scored for 20 min. This procedure was repeated for each pair combination across three consecutive days, and individual play scores were calculated. Heterospecific play was measured using a Tickling test: rats were individually tickled by the experimenter through bouts of gentle, rapid finger movements on their underside, and the number of positive 50-kHz frequency-modulated vocalizations and experimenter-directed approach behaviors were recorded. Both of the above tests were compared with social play in the home cage. While conspecific play in the Play-In-Pairs test and in the home cage was correlated, neither seemed to be related to heterospecific play in the Tickling test. During the Play-In-Pairs test, although both solitary and social play types occurred, they were unrelated, and the solitary locomotor play of one rat did not predict the subsequent play behavior of its cage mate. Analysis of play structure revealed that social play occurred more often in bouts of repeated behaviors, while solitary play sequences did not follow a specific pattern. If play is to be used as an indicator of positive welfare in rats, differences in context, type and structure should be taken into account.
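
A minimal form of the behavioral sequence analysis mentioned above is a first-order transition matrix over the scored play types. The sketch below uses an invented toy sequence, not the study's data; a bout structure shows up as large diagonal entries (behaviors followed by themselves).

```python
from collections import Counter

import numpy as np

behaviors = ["pinning", "being_pinned", "scampering"]
sequence = ["pinning", "being_pinned", "pinning", "scampering",
            "scampering", "pinning", "being_pinned", "pinning"]

# Count consecutive pairs and row-normalize into transition probabilities.
counts = Counter(zip(sequence, sequence[1:]))
matrix = np.array([[counts[(a, b)] for b in behaviors] for a in behaviors],
                  dtype=float)
matrix /= matrix.sum(axis=1, keepdims=True)
print(matrix)  # large diagonal values indicate bouts of repeated behavior
```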

Relevance: 40.00%

Abstract:

Point Distribution Models (PDM) are among the most popular shape description techniques, and their usefulness has been demonstrated in a wide variety of medical imaging applications. However, to adequately characterize the underlying modeled population it is essential to have a representative number of training samples, which is not always possible. This problem is especially relevant as the complexity of the modeled structure increases, with the modeling of ensembles of multiple 3D organs being one of the most challenging cases. In this paper, we introduce a new GEneralized Multi-resolution PDM (GEM-PDM) in the context of multi-organ analysis, able to efficiently characterize the different inter-object relations as well as the particular locality of each object separately. Importantly, unlike previous approaches, the configuration of the algorithm is automated thanks to a new agglomerative landmark clustering method proposed here, which also allows us to identify smaller anatomically significant regions within organs. The significant advantage of the GEM-PDM method over two previous approaches (PDM and hierarchical PDM) in terms of shape modeling accuracy and robustness to noise has been successfully verified on two different databases of sets of multiple organs: six subcortical brain structures, and seven abdominal organs. Finally, we propose the integration of the new shape modeling framework into an active-shape-model-based segmentation algorithm. The resulting algorithm, named GEMA, provides better overall performance than the two classical approaches tested, ASM and hierarchical ASM, when applied to the segmentation of 3D brain MRI.
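
For readers unfamiliar with the underlying technique, here is a minimal sketch of a classical PDM, the building block that GEM-PDM generalizes: landmark configurations are aligned, and PCA on their covariance yields the mean shape plus modes of variation. The synthetic 2D shapes stand in for the 3D multi-organ landmark sets used in the paper, and simple centering stands in for full Procrustes alignment.

```python
import numpy as np

rng = np.random.default_rng(3)
n_shapes, n_landmarks = 30, 10
base = rng.normal(size=(n_landmarks, 2))
shapes = base + 0.05 * rng.normal(size=(n_shapes, n_landmarks, 2))

shapes -= shapes.mean(axis=1, keepdims=True)   # crude alignment (centering)
X = shapes.reshape(n_shapes, -1)               # one row per training shape

mean_shape = X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
order = np.argsort(eigvals)[::-1]              # modes by decreasing variance

# Plausible new shapes: mean + b_k * phi_k with |b_k| <= 3 * sqrt(lambda_k).
b = 2.0 * np.sqrt(eigvals[order[0]])
new_shape = mean_shape + b * eigvecs[:, order[0]]
print(new_shape.reshape(n_landmarks, 2))
```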

Relevance: 40.00%

Abstract:

Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories’ efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare.

Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker’s quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker’s type. The cost could be pure time cost from delaying agreement or the cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker states to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions.

I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality, and the quality of the good is only known to the seller. Indeed, without the possibility to make repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When allowing for repeated offers, however, both types of goods trade with probability one in equilibrium. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model’s qualitative predictions. We also observe a persistent over-delay before trade occurs, which reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions.

Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information; these findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.

In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on (i) the degree to which players can renegotiate and gradually build up agreements and (ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
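
The partition-function representation used in Chapter 3 is easy to make concrete. The sketch below encodes an invented three-player example in Python: each embedded coalition (a coalition together with the partition it is part of) is assigned a worth, and the numbers are chosen so that a singleton is worth more when the other two players cooperate, i.e. a positive externality that creates an incentive to free ride.

```python
def key(coalition, partition):
    """Hashable key for an embedded coalition (coalition, partition)."""
    return (frozenset(coalition), frozenset(frozenset(c) for c in partition))

v = {}
# Grand coalition forms: total worth 12.
v[key({1, 2, 3}, [{1, 2, 3}])] = 12.0
# Players 1 and 2 cooperate, player 3 stays out and free rides.
v[key({1, 2}, [{1, 2}, {3}])] = 6.0
v[key({3}, [{1, 2}, {3}])] = 5.0
# No cooperation anywhere.
for i in (1, 2, 3):
    v[key({i}, [{1}, {2}, {3}])] = 3.0

# Positive externality: {3}'s worth rises from 3 to 5 when 1 and 2 merge,
# so player 3 prefers to stay outside any agreement.
print(v[key({3}, [{1, 2}, {3}])] > v[key({3}, [{1}, {2}, {3}])])  # True
```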