927 results for Multiple Change-point Analysis


Relevance: 40.00%

Abstract:

The primary purpose of this exploratory empirical study is to examine the structural stability of a limited number of alternative explanatory factors of strategic change. On the basis of theoretical arguments and prior empirical evidence from two traditional perspectives, we propose an original empirical framework to analyse whether these potential explanatory factors have remained stable over time in a highly turbulent environment. This question is explored in a particular setting: the population of Spanish private banks. The firms in this industry have experienced a high level of strategic mobility as a consequence of fundamental changes in their environmental conditions over the last two decades (mainly changes related to the new banking and financial regulation process). Our results consistently indicate that the effects of most of the explanatory factors of strategic mobility considered did not remain stable over the whole period of analysis. From this point of view, the study sheds new light on major debates and dilemmas in the field of strategy regarding why firms change their competitive patterns over time and, hence, to what extent the "context-dependency" of alternative views of strategic change, as well as their relative validation, can vary over time for a given population. Methodologically, this research makes two major contributions to the study of potential determinants of strategic change. First, it defines and measures strategic change using a new grouping method, the Model-based Cluster Method (MCLUST). Second, to assess the possible effects of the determinants of strategic mobility, we control for non-observable heterogeneity using logistic regression models for panel data.
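As an illustrative sketch only, the model-based clustering idea behind MCLUST (fitting Gaussian mixtures and selecting the number of groups by an information criterion) can be reproduced with scikit-learn's GaussianMixture. The data, dimensions, and group structure below are invented; the study itself used MCLUST, not this code.

```python
# Hypothetical sketch of model-based clustering in the spirit of MCLUST;
# the toy data stand in for firms' strategy profiles.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# 60 hypothetical banks described by 3 invented strategy ratios,
# drawn from two well-separated "strategic groups"
X = np.vstack([rng.normal(0.2, 0.05, (30, 3)),
               rng.normal(0.7, 0.05, (30, 3))])

# Fit mixtures with 1..5 components and pick the one minimising BIC,
# mirroring how model-based clustering chooses the number of groups
models = [GaussianMixture(n_components=k, random_state=0).fit(X)
          for k in range(1, 6)]
best = min(models, key=lambda m: m.bic(X))
labels = best.predict(X)  # cluster membership = strategic group assignment
print(best.n_components, np.bincount(labels))
```

BIC-based selection recovers the two planted groups here; on real strategy data the choice of variables and covariance structure would matter far more.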

Relevance: 40.00%

Abstract:

This paper shows the numerous problems of conventional economic analysis in the evaluation of climate change mitigation policies. It points out the many limitations, omissions, and arbitrary choices that have characterized most evaluation models applied to date. These shortcomings have overwhelmingly biased results towards recommending less aggressive emission mitigation policies. Consequently, this paper questions whether these results provide an appropriate answer to the problem. Finally, several points that an analysis coherent with sustainable development should take into account are presented.

Relevance: 40.00%

Abstract:

Macroeconomic activity has become less volatile over the past three decades in most G7 economies. The current literature focuses on characterizing the volatility reduction and explaining this so-called "moderation" in each G7 economy separately. In contrast to individual-country and individual-variable analyses, this paper focuses on common characteristics of the reduction and common explanations for the moderation across the G7 countries. In particular, we study three explanations: structural changes in the economy, changes in common international shocks, and changes in domestic shocks. We study these explanations in a unified model structure. To this end, we propose a Bayesian factor structural vector autoregressive model. Using the proposed model, we investigate whether we can find common explanations for all G7 economies when information is pooled from multiple domestic and international sources. Our empirical analysis suggests that volatility reductions can largely be attributed to the decline in the magnitudes of the shocks in most G7 countries, while only for the U.K., the U.S. and Italy can they partially be attributed to structural changes in the economy. Analyzing the components of the volatility, we also find that domestic shocks, rather than common international shocks, account for a large part of the volatility reduction in most of the G7 countries. Finally, we find that after the mid-1980s the structure of the economy changed substantially in five of the G7 countries: Germany, Italy, Japan, the U.K. and the U.S.
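The intuition behind separating common international from domestic shocks can be illustrated with a toy static factor model. This is a loose stand-in for the paper's Bayesian factor SVAR; all numbers, loadings, and variable names here are invented.

```python
# Toy factor decomposition: each country's growth loads on one common
# international shock plus a country-specific (domestic) shock.
import numpy as np

rng = np.random.default_rng(2)
T, N = 400, 7                              # periods, G7 countries
common = rng.normal(0.0, 1.0, T)           # common international shock
loadings = rng.uniform(0.5, 1.0, N)        # country exposures to it
domestic = rng.normal(0.0, 0.5, (T, N))    # country-specific shocks
growth = common[:, None] * loadings + domestic

# Share of each country's variance attributable to the common factor;
# the remainder is driven by domestic shocks
var_common = (loadings ** 2) * common.var()
share = var_common / growth.var(axis=0)
print(share.round(2))
```

In the paper, the analogous decomposition (with dynamics and Bayesian estimation) is what attributes most of the volatility reduction to smaller domestic shocks rather than smaller common shocks.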

Relevance: 40.00%

Abstract:

This article provides a fresh methodological and empirical approach for assessing price level convergence and its relation to purchasing power parity (PPP) using annual price data for seventeen US cities. We suggest a new procedure that can handle a wide range of PPP concepts in the presence of multiple structural breaks, using all possible pairs of real exchange rates. To deal with cross-sectional dependence, we use both cross-sectionally demeaned data and a parametric bootstrap approach. In general, we find more evidence for stationarity when the parity restriction is not imposed, while imposing the parity restriction leads toward rejection of panel stationarity. Our results are consistent with the Balassa-Samuelson view, but with the slope of the time trend allowed to change in the long run. The median half-life point estimates are found to be lower than the consensus view regardless of the parity restriction.
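A half-life in this literature is the time until half of a PPP deviation has dissipated; for an AR(1) process with persistence rho it is ln(0.5)/ln(rho). A minimal illustration follows (the paper's actual panel estimator is more involved; the persistence value below is only an example):

```python
# Half-life of a deviation under an AR(1) process with persistence rho.
import numpy as np

def half_life(rho: float) -> float:
    """Number of periods until half of a shock has dissipated."""
    return np.log(0.5) / np.log(rho)

# The 'consensus view' for PPP deviations is a half-life of 3-5 years;
# e.g. persistence rho = 0.87 with annual data implies about 5 years.
print(round(half_life(0.87), 1))  # -> 5.0
```

Median half-life estimates below this range, as the article reports, thus correspond to persistence parameters noticeably smaller than 0.87.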

Relevance: 40.00%

Abstract:

Context: Understanding the process through which adolescents and young adults first try legal and illegal substances is crucial for developing tailored prevention and treatment programs. However, patterns of first substance use can be very complex when multiple substances are considered, requiring reduction into a small number of meaningful categories. Data: We used data from a survey on adolescent and young adult health conducted in 2002 in Switzerland. Answers from 2212 subjects aged 19 and 20 were included. The first-ever consumption of 10 substances (tobacco, cannabis, medicines to get high, sniffing (volatile substances and inhalants), ecstasy, GHB, LSD, cocaine, methadone, and heroin) was considered, for a total of 516 different patterns. Methods: In a first step, automatic clustering was used to reduce the number of patterns to 50. Then, two groups of substance use experts (three social field workers, and three toxicologists and health professionals) were asked to reduce them to a maximum of 10 meaningful categories. Results: Classifications obtained through our methodology are of practical interest because they reveal associations invisible to purely automatic algorithms. The article includes a detailed analysis of both final classifications and a discussion of the advantages and limitations of our approach.
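The automatic reduction step (516 patterns down to 50) could, for instance, be approximated by hierarchical clustering of binary first-use patterns under Hamming distance. This is an illustrative stand-in, not the authors' exact algorithm; the toy patterns and cluster cap below are invented.

```python
# Illustrative sketch: group binary substance-first-use patterns by
# hierarchical clustering, capping the result at a fixed cluster count.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
patterns = rng.integers(0, 2, size=(40, 10))  # 40 toy patterns, 10 substances

d = pdist(patterns, metric="hamming")          # dissimilarity between patterns
Z = linkage(d, method="average")               # average-linkage dendrogram
groups = fcluster(Z, t=8, criterion="maxclust")  # cut into at most 8 groups
print(len(set(groups)))
```

The study's second, expert-driven step then merges such machine-made groups into categories that are clinically meaningful, which is exactly what a purely automatic cut cannot guarantee.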

Relevance: 40.00%

Abstract:

Domestic action on climate change is increasingly important in the light of the difficulties with international agreements, and requires a combination of solutions in terms of institutions and policy instruments. One way of achieving government carbon policy goals may be the creation of an independent body to advise on, set or monitor policy. This paper critically assesses the Committee on Climate Change (CCC), which was created in 2008 as an independent body to help move the UK towards a low carbon economy. We look at the motivation for its creation in terms of information provision, advice, monitoring, and policy delegation. In particular we consider its ability to overcome a time inconsistency problem by comparing and contrasting it with another independent body, the Monetary Policy Committee of the Bank of England. In practice the Committee on Climate Change appears to be the ‘inverse’ of the Monetary Policy Committee, in that it advises on what the policy goal should be rather than being responsible for achieving it. The CCC incorporates both advisory and monitoring functions to inform government and achieve a credible carbon policy over a long time frame. This is a similar framework to that adopted by Stern (2006), but the CCC operates on a continuing basis; we therefore believe the CCC is best viewed as a "Rolling Stern plus" body. There are also concerns as to how binding the budgets actually are and how they interact with other energy policy goals and instruments, such as Renewable Obligation Contracts and the EU Emissions Trading Scheme. The CCC could potentially be reformed to include an explicit information-provision role, consumption-based accounting of emissions, and control of a policy instrument such as a balanced-budget carbon tax.

Relevance: 40.00%

Abstract:

A statistical methodology is developed by which realised outcomes can be used to identify, for calendar years between 1974 and 2012, when policy makers in ‘advanced’ economies have successfully pursued single objectives of different kinds, or multiple objectives. A simple criterion is then used to distinguish between multiple objectives pure and simple, and multiple objectives subject to a price stability constraint. The overall and individual-country results that this methodology produces seem broadly plausible. Unconditional and conditional analyses of the inflation and growth associated with different types of objectives reveal that multiple objectives subject to a price stability constraint are associated with roughly as good economic performance as the single objective of inflation. A proposal is then made as to how the remit of an inflation-targeting central bank could be adjusted to allow it to pursue other objectives in extremis without losing the credibility effects associated with inflation targeting.

Relevance: 40.00%

Abstract:

SUMMARY: Eukaryotic DNA interacts with nuclear proteins through non-covalent ionic interactions. Proteins can recognize specific nucleotide sequences through steric interactions with the DNA, and these specific protein-DNA interactions are the basis for many nuclear processes, e.g. gene transcription, chromosomal replication, and recombination. A new technology termed ChIP-Seq has recently been developed for the analysis of protein-DNA interactions on a whole-genome scale; it is based on chromatin immunoprecipitation followed by high-throughput DNA sequencing. ChIP-Seq is a novel technique with great potential to replace older techniques for mapping protein-DNA interactions. In this thesis, we bring some new insights into ChIP-Seq data analysis. First, we point out some common and previously unrecognized artifacts of the method. The distribution of sequence tags in the genome is not uniform, and we found extreme hot-spots of tag accumulation over specific loci in the human and mouse genomes. These artifactual tag accumulations create false peaks in every ChIP-Seq dataset, and we propose different filtering methods to reduce the number of false positives. Next, we propose random sampling as a powerful analytical tool for ChIP-Seq data analysis that can be used to infer biological knowledge from massive ChIP-Seq datasets. We created an unbiased random sampling algorithm and used this methodology to reveal some important biological properties of Nuclear Factor I DNA-binding proteins. Finally, by analyzing the ChIP-Seq data in detail, we revealed that Nuclear Factor I transcription factors mainly act as activators of transcription, and that they are associated with specific chromatin modifications that are markers of open chromatin. We speculate that NFI factors only interact with DNA wrapped around the nucleosome.
We also found multiple loci that indicate possible chromatin barrier activity of NFI proteins, which could suggest the use of NFI binding sequences as chromatin insulators in biotechnology applications.
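The two analytical ideas above, hot-spot filtering and unbiased random subsampling of tags, can be sketched roughly as follows. The bin counts, Poisson background, and threshold are illustrative assumptions, not the thesis's actual pipeline.

```python
# Rough sketch: flag artifactual hot-spot bins whose tag counts are
# improbably high under a Poisson background, then draw an unbiased
# random subsample of the tags.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
counts = rng.poisson(5, size=1000)      # sequence-tag counts per genomic bin
counts[[10, 500]] = 400                 # two simulated hot-spot artifacts

lam = np.median(counts)                 # robust estimate of background rate
cutoff = poisson.ppf(1 - 1e-6, lam)     # counts above this are implausible
hotspots = np.flatnonzero(counts > cutoff)
print(hotspots)                          # indices of the simulated hot-spots

tags = np.repeat(np.arange(counts.size), counts)   # one entry per tag
subsample = rng.choice(tags.size, size=tags.size // 10, replace=False)
print(subsample.size)                    # unbiased 10% subsample of tags
```

Filtering such bins before peak calling removes false peaks that would otherwise appear in every dataset, while uniform subsampling preserves the relative tag distribution for downstream analysis.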

Relevance: 40.00%

Abstract:

BACKGROUND: Management of blood pressure (BP) in acute ischemic stroke is controversial. The present study explores the association between baseline BP levels, BP change, and outcome in the overall stroke population and in specific subgroups defined by the presence of arterial hypertensive disease and prior antihypertensive treatment. METHODS: All patients registered in the Acute STroke Registry and Analysis of Lausanne (ASTRAL) between 2003 and 2009 were analyzed. Unfavorable outcome was defined as a modified Rankin Scale score greater than 2. A local polynomial surface algorithm was used to assess the effect of BP values on outcome in the overall population and in predefined subgroups. RESULTS: Up to a certain point, as initial BP increased, optimal outcome was seen with a progressively more substantial BP decrease over the next 24-48 h. Patients without hypertensive disease and with an initially low BP seemed to benefit from an increase in BP. In patients with hypertensive disease, initial BP and its subsequent changes seemed to have less influence on clinical outcome. Patients who were previously treated with antihypertensives did not tolerate initially low BPs well. CONCLUSION: Optimal outcome in acute ischemic stroke may be determined not only by initial BP levels but also by the direction and magnitude of the associated BP change over the first 24-48 h.

Relevance: 40.00%

Abstract:

Despite the advantage of avoiding the costs of sexual reproduction, asexual vertebrates are very rare and often considered evolutionarily disadvantaged compared to sexual species. Asexual species, however, may have advantages when colonizing (new) habitats or competing with sexual counterparts. They are also evolutionarily older than expected, raising the question of whether asexual vertebrates are rare not only because of their 'inferior' mode of reproduction but also for other reasons. A paradigmatic model system is the unisexual Amazon molly, Poecilia formosa, which arose by hybridization of the Atlantic molly, Poecilia mexicana, as the maternal ancestor, and the sailfin molly, Poecilia latipinna, as the paternal ancestor. Our extensive crossing experiments failed to resynthesize asexually reproducing (gynogenetic) hybrids, confirming the results of previous studies. However, by producing diploid eggs, female F1 hybrids showed apparent preadaptation to gynogenesis. In a range-wide analysis of mitochondrial sequences, we examined the origin of P. formosa. Our analyses point to very few origins, or even a single origin, of its lineage, which is estimated to be approximately 120,000 years old. A monophyletic origin was supported by nuclear microsatellite data. Furthermore, a considerable degree of genetic variation, apparent in high levels of clonal microsatellite diversity, was found. Our molecular phylogenetic evidence and the failure to resynthesize the gynogenetic P. formosa, together with the old age of the species, indicate that some unisexual vertebrates might be rare not because they suffer the long-term consequences of clonal reproduction but because they are only very rarely formed, as a result of the complex genetic preconditions necessary to produce viable and fertile clonal genomes and phenotypes (the 'rare formation hypothesis').

Relevance: 40.00%

Abstract:

BACKGROUND: The course of alcohol consumption and cognitive dimensions of behavior change (readiness to change, importance of changing and confidence in ability to change) in primary care patients are not well described. The objective of the study was to determine changes in readiness, importance and confidence after a primary care visit, and 6-month improvements in both drinking and cognitive dimensions of behavior change, in patients with unhealthy alcohol use. METHODS: Prospective cohort study of patients with unhealthy alcohol use visiting primary care physicians, with repeated assessments of readiness, importance, and confidence (visual analogue scale (VAS), score range 1-10 points). Improvements 6 months later were defined as no unhealthy alcohol use or any increase in readiness, importance, or confidence. Regression models accounted for clustering by physician and adjusted for demographics, alcohol consumption and related problems, and discussion with the physician about alcohol. RESULTS: From before to immediately after the primary care physician visit, patients (n = 173) had increases in readiness (mean +1.0 point), importance (+0.2), and confidence (+0.5) (all p < 0.002). In adjusted models, discussion with the physician about alcohol was associated with increased readiness (+0.8, p = 0.04). At 6 months, many participants had improvements in drinking or readiness (62%), drinking or importance (58%), or drinking or confidence (56%). CONCLUSION: Readiness, importance and confidence improve in many patients with unhealthy alcohol use immediately after a primary care visit. Six months after a visit, most patients have improvements in either drinking or these cognitive dimensions of behavior change.

Relevance: 40.00%

Abstract:

Coltop3D is a software package that performs structural analysis using digital elevation models (DEMs) and 3D point clouds acquired with terrestrial laser scanners. A color representation merging slope aspect and slope angle yields a unique color code for each orientation of a local slope, so a continuous planar structure appears in a single color. Several tools are included to create stereonets, draw traces of discontinuities, or automatically compute density stereonets. Examples are shown to demonstrate the efficiency of the method.
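The aspect-plus-angle color coding can be sketched with a simple HSV mapping: compass aspect drives the hue and steepness drives the saturation, so every orientation maps to one color. This is a loose illustration of the idea only; Coltop3D's actual color wheel differs in detail, and the function below is hypothetical.

```python
# Illustrative orientation-to-color mapping: aspect -> hue, slope -> saturation.
import colorsys

def orientation_color(aspect_deg: float, slope_deg: float) -> tuple:
    """RGB color (0-1 floats) encoding a local slope orientation."""
    hue = (aspect_deg % 360.0) / 360.0     # compass direction around the hue wheel
    sat = min(slope_deg, 90.0) / 90.0      # steeper slope -> more saturated color
    return colorsys.hsv_to_rgb(hue, sat, 1.0)

# A flat surface is white; a vertical north-facing face is fully red.
print(orientation_color(0.0, 0.0))   # -> (1.0, 1.0, 1.0)
print(orientation_color(0.0, 90.0))  # -> (1.0, 0.0, 0.0)
```

Because coplanar points share aspect and angle, they receive identical colors, which is what makes a continuous planar structure stand out as one uniform patch.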

Relevance: 40.00%

Abstract:

Background and purpose: Decision making (DM) has been defined as the process through which a person forms preferences, selects and executes actions, and evaluates the outcome related to a selected choice. This ability is an important factor for adequate behaviour in everyday life. DM impairment in multiple sclerosis (MS) has been previously reported. The purpose of the present study was to assess DM in patients with MS at the earliest clinically detectable time point of the disease. Methods: Patients with definite (n=109) or possible (clinically isolated syndrome, CIS; n=56) MS, a short disease duration (mean 2.3 years) and minor neurological disability (mean EDSS 1.8) were compared to 50 healthy controls aged 18 to 60 years (mean age 32.2) using the Iowa Gambling Task (IGT). Subjects had to select a card from any of 4 decks (A/B [disadvantageous]; C/D [advantageous]). The game consisted of 100 trials, grouped into blocks of 20 cards for data analysis. Skill in DM was assessed by means of a learning index (LI), defined as the difference between the average of the last three block indexes (BI) and the average of the first two (LI = [(BI3 + BI4 + BI5)/3 - (BI1 + BI2)/2]). Non-parametric tests were used for statistical analysis. Results: LI was higher in the control group (0.24, SD 0.44) than in the MS group (0.21, SD 0.38), although without reaching statistical significance (p=0.7). Interesting differences were detected when MS patients were grouped according to phenotype. A trend towards a difference between MS subgroups and controls was observed for LI (p=0.06), which became significant between MS subgroups (p=0.03). CIS patients whose MS diagnosis was confirmed by a second relapse after study entry showed a dysfunction on the IGT in comparison to the other CIS (p=0.01) and definite MS (p=0.04) patients. In contrast, CIS patients who did not entirely fulfil the McDonald criteria at inclusion and had no relapse during the study showed a normal learning pattern on the IGT. Finally, comparing MS patients who developed relapses after study entry, those who remained clinically stable, and controls, we observed impaired performance only in relapsing patients in comparison to stable patients (p=0.008) and controls (p=0.03). Discussion: These results suggest a role for both MS relapse activity and disease heterogeneity (i.e. subclinical severity or activity of MS) in impaired decision making.
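The learning index can be computed directly from the five block indexes; a small worked example with invented block values:

```python
# Learning index for the Iowa Gambling Task, as defined in the abstract:
# LI = (BI3 + BI4 + BI5)/3 - (BI1 + BI2)/2, where BIk is the block index
# for the k-th block of 20 cards. The block values below are illustrative.
def learning_index(block_indexes):
    b1, b2, b3, b4, b5 = block_indexes
    return (b3 + b4 + b5) / 3 - (b1 + b2) / 2

# A subject who starts near chance and learns to prefer decks C/D:
print(round(learning_index([-0.1, 0.0, 0.2, 0.3, 0.4]), 2))  # -> 0.35
```

A positive LI thus indicates that card choices improved over the course of the 100 trials, which is the learning signal compared across groups above.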

Relevance: 40.00%

Abstract:

Twenty-one Mycobacterium avium isolates, multiple from each of ten human immunodeficiency virus-infected patients, were typed by restriction fragment length polymorphism using IS1245 as a marker and characterized by minimum inhibitory concentration for nine different antibiotics. Two of the four patients harboring isolates with different fingerprint profiles were considered to have a polyclonal infection, since their isolates were taken from sterile sites. This result confirms that polyclonal infection caused by M. avium occurs with non-negligible frequency. Analyzing the susceptibility profiles of each patient's isolates, we observed that most patients were infected with strains having appreciably different antimicrobial susceptibility patterns, regardless of the genotypic pattern of the strains. These results have strong implications for the treatment of these patients.