108 results for APPROXIMATE PROGRAMMING STRATEGY


Relevance: 20.00%

Abstract:

Managers can craft effective integrated strategy by properly assessing regulatory uncertainty. Leveraging the existing political markets literature, we predict regulatory uncertainty from the novel interaction of demand- and supply-side rivalries across a range of political markets. We argue for two primary drivers of regulatory uncertainty: ideology-motivated interests opposed to the firm and a lack of competition for power among political actors supplying public policy. We align three previously disparate dimensions of nonmarket strategy - profile level, coalition breadth, and pivotal target - with levels of regulatory uncertainty. Through this framework, we demonstrate how and when firms employ different nonmarket strategies. To illustrate variation in nonmarket strategy across levels of regulatory uncertainty, we analyze several market entry decisions of foreign firms operating in the global telecommunications sector.

Relevance: 20.00%

Abstract:

Previous studies have shown that regulated firms diversify for reasons different from those of unregulated firms. We explore some of these differences by providing a theoretical model that begins by treating the firm-regulator relationship as an incomplete-information problem, in which a regulated incumbent has knowledge that the regulator does not have, but the firm cannot convey hard information about this knowledge. The incumbent faces both market and nonmarket competition from a new entrant. In that context, we show that when the firm faces tough nonmarket competition domestically, going abroad can create a mechanism that makes information transmission to the regulator more credible. International expansion can thus be a way to solve domestic nonmarket issues in addition to being a catalyst for growth.

Relevance: 20.00%

Abstract:

The perceived low levels of genetic diversity, poor interspecific competitive and defensive ability, and loss of dispersal capacities of insular lineages have driven the view that oceanic islands are evolutionary dead ends. Focusing on the Atlantic bryophyte flora distributed across the archipelagos of the Azores, Madeira, the Canary Islands, Western Europe, and northwestern Africa, we used an integrative approach with species distribution modeling and population genetic analyses based on approximate Bayesian computation to determine whether this view applies to organisms with inherently high dispersal capacities. Genetic diversity was found to be higher in island than in continental populations, contributing to mounting evidence that, contrary to theoretical expectations, island populations are not necessarily genetically depauperate. Patterns of genetic variation among island and continental populations consistently fitted those simulated under a scenario of de novo foundation of continental populations from insular ancestors better than those expected if islands represented a sink or a refugium of continental biodiversity. We suggest that the northeastern Atlantic archipelagos have played a key role as a stepping stone for transoceanic migrants. Our results challenge the traditional notion that oceanic islands are the end of the colonization road and illustrate the significant role of oceanic islands as reservoirs of novel biodiversity for the assembly of continental floras.
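For readers unfamiliar with the method, the approximate Bayesian computation (ABC) step can be sketched as a rejection sampler. This is a minimal illustration with a hypothetical one-parameter toy model and made-up numbers, not the coalescent machinery or data of the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observed summary statistic (e.g., a genetic-diversity index).
observed = 0.62

def simulate(theta, rng):
    # Stand-in for an expensive population-genetic simulation under a
    # colonization scenario governed by the parameter theta.
    return theta + rng.normal(0.0, 0.05)

def rejection_abc(observed, n_draws, tol, rng):
    """Draw parameters from the prior, simulate, and keep draws whose
    simulated summary statistic lies within tol of the observed one."""
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(0.0, 1.0)  # uniform prior on [0, 1]
        if abs(simulate(theta, rng) - observed) < tol:
            accepted.append(theta)
    return np.array(accepted)

posterior = rejection_abc(observed, n_draws=20_000, tol=0.02, rng=rng)
```

The accepted draws approximate the posterior under one scenario; competing colonization scenarios (island-to-continent versus continent-to-island) are then compared by how well each scenario's simulations reproduce the observed statistics.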

Relevance: 20.00%

Abstract:

Our consumption of groundwater, in particular as drinking water or for irrigation, has increased considerably over the years. Numerous problems have consequently appeared, ranging from the prospection of new resources to the remediation of polluted aquifers. Regardless of the hydrogeological problem considered, the main challenge remains the characterization of the subsurface properties. A stochastic approach is therefore necessary to represent this uncertainty, by considering multiple geological scenarios and generating a large number of geostatistical realizations. The main limitation of these approaches is the computational cost of simulating the complex flow processes for each realization. In the first part of the thesis, this problem is investigated in the context of uncertainty propagation, where an ensemble of realizations is identified as representing the subsurface properties. To propagate this uncertainty to the quantity of interest while limiting the computational cost, current methods rely on approximate flow models. This allows the identification of a subset of realizations that represents the variability of the initial ensemble. The complex flow model is then evaluated only for this subset, and inference is made on the basis of these complex responses. Our objective is to improve the performance of this approach by using all of the available information. To this end, the subset of approximate and exact responses is used to build an error model, which then serves to correct the remaining approximate responses and to predict the response of the complex model. This method maximizes the use of the available information without any perceptible increase in computation time, making the uncertainty propagation more accurate and more robust.
The strategy explored in the first chapter consists in learning, from a subset of realizations, the relationship between the approximate and complex flow models. In the second part of the thesis, this methodology is formalized mathematically by introducing a regression model between functional responses. As this problem is ill-posed, its dimensionality must be reduced. In this perspective, the novelty of the work presented comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also makes it possible to diagnose the quality of the error model in this functional space. The proposed methodology is applied to a pollution problem involving a non-aqueous phase liquid, and the results obtained show that the error model enables a strong reduction of the computation time while correctly estimating the uncertainty. Moreover, for each approximate response, a prediction of the complex response is provided by the error model. The concept of a functional error model is therefore relevant not only for uncertainty propagation but also for Bayesian inference problems. Markov chain Monte Carlo (MCMC) methods are the algorithms most commonly used to generate geostatistical realizations in accordance with the observations. However, these methods suffer from a very low acceptance rate for high-dimensional problems, resulting in a large number of wasted flow simulations. A two-step approach, "two-stage MCMC", was introduced to avoid unnecessary simulations of the complex model through a preliminary evaluation of the proposal. In the third part of the thesis, the approximate flow model coupled with an error model serves as the preliminary evaluation for the two-stage MCMC.
We demonstrate an increase in the acceptance rate by a factor of 1.5 to 3 compared with a classical MCMC implementation. An open question remains: how to choose the size of the training set and how to identify the realizations that optimize the construction of the error model. This requires an iterative strategy so that, with each new flow simulation, the error model is improved by incorporating the new information. This is developed in the fourth part of the thesis, where the methodology is applied to a saline intrusion problem in a coastal aquifer. -- Our consumption of groundwater, in particular as drinking water and for irrigation, has considerably increased over the years, and groundwater is becoming an increasingly scarce and endangered resource. Nowadays, we are facing many problems ranging from water prospection to sustainable management and remediation of polluted aquifers. Independently of the hydrogeological problem, the main challenge remains dealing with the incomplete knowledge of the underground properties. Stochastic approaches have been developed to represent this uncertainty by considering multiple geological scenarios and generating a large number of realizations. The main limitation of this approach is the computational cost associated with performing complex flow simulations in each realization. In the first part of the thesis, we explore this issue in the context of uncertainty propagation, where an ensemble of geostatistical realizations is identified as representative of the subsurface uncertainty. To propagate this lack of knowledge to the quantity of interest (e.g., the concentration of pollutant in extracted water), it is necessary to evaluate the flow response of each realization.
Due to computational constraints, state-of-the-art methods make use of approximate flow simulations to identify a subset of realizations that represents the variability of the ensemble. The complex and computationally heavy flow model is then run for this subset, based on which inference is made. Our objective is to increase the performance of this approach by using all of the available information and not solely the subset of exact responses. Two error models are proposed to correct the approximate responses following a machine learning approach. For the subset identified by a classical approach (here the distance kernel method), both the approximate and the exact responses are known. This information is used to construct an error model and correct the ensemble of approximate responses to predict the "expected" responses of the exact model. The proposed methodology makes use of all the available information without perceptible additional computational costs and leads to an increase in accuracy and robustness of the uncertainty propagation. The strategy explored in the first chapter consists in learning, from a subset of realizations, the relationship between proxy and exact curves. In the second part of this thesis, the strategy is formalized in a rigorous mathematical framework by defining a regression model between functions. As this problem is ill-posed, it is necessary to reduce its dimensionality. The novelty of the work comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also allows a diagnostic of the quality of the error model in the functional space. The proposed methodology is applied to a pollution problem involving a non-aqueous phase liquid. The error model allows a strong reduction of the computational cost while providing a good estimate of the uncertainty.
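A minimal sketch of such an error model, using ordinary PCA on discretized curves plus linear regression as a stand-in for the FPCA framework described above (the synthetic ensemble and all numbers are illustrative assumptions, not the thesis test case):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Synthetic ensemble: 200 realizations, each response a curve on 50 time steps.
n = 200
t = np.linspace(0.0, 1.0, 50)
params = rng.uniform(0.5, 1.5, size=n)
exact = np.array([p * np.sin(2 * np.pi * t) for p in params])  # expensive model
proxy = 0.8 * exact + 0.1                                      # cheap, biased model

# Only a small subset is ever run through the exact model.
train = rng.choice(n, size=20, replace=False)

# Reduce both families of curves to a few principal-component scores.
pca_proxy = PCA(n_components=3).fit(proxy[train])
pca_exact = PCA(n_components=3).fit(exact[train])
scores_proxy = pca_proxy.transform(proxy[train])
scores_exact = pca_exact.transform(exact[train])

# The error model: regress exact scores on proxy scores.
error_model = LinearRegression().fit(scores_proxy, scores_exact)

# Correct every approximate response without any further exact simulations.
predicted = pca_exact.inverse_transform(
    error_model.predict(pca_proxy.transform(proxy)))
rmse = np.sqrt(np.mean((predicted - exact) ** 2))
```

Only 20 of the 200 realizations require the exact model; the remaining 180 corrected curves come essentially for free, which is the source of the computational savings claimed above.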
The individual correction of the proxy response by the error model leads to an excellent prediction of the exact response, opening the door to many applications. The concept of a functional error model is useful not only in the context of uncertainty propagation, but also, and maybe even more so, for performing Bayesian inference. Markov chain Monte Carlo (MCMC) algorithms are the most common choice to ensure that the generated realizations are sampled in accordance with the observations. However, this approach suffers from a low acceptance rate in high-dimensional problems, resulting in a large number of wasted flow simulations. This led to the introduction of two-stage MCMC, where the computational cost is decreased by avoiding unnecessary simulations of the exact model thanks to a preliminary evaluation of the proposal. In the third part of the thesis, a proxy is coupled to an error model to provide an approximate response for the two-stage MCMC set-up. We demonstrate an increase in acceptance rate by a factor of three with respect to one-stage MCMC results. An open question remains: how do we choose the size of the learning set and identify the realizations that optimize the construction of the error model? This requires devising an iterative strategy to construct the error model, such that, as new flow simulations are performed, the error model is iteratively improved by incorporating the new information. This is discussed in the fourth part of the thesis, in which we apply this methodology to a problem of saline intrusion in a coastal aquifer.
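The two-stage (delayed-acceptance) MCMC idea can be sketched on a toy one-dimensional target; the two densities below are hypothetical stand-ins for the exact flow model and the error-model-corrected proxy, not the thesis models:

```python
import numpy as np

rng = np.random.default_rng(2)

def log_post_exact(theta):
    # Stand-in for the expensive exact flow simulation + likelihood:
    # a standard normal log-posterior.
    return -0.5 * theta ** 2

def log_post_proxy(theta):
    # Cheap corrected proxy: close to, but not equal to, the exact posterior.
    return -0.5 * (1.05 * theta) ** 2

def two_stage_mcmc(n_iter, step, rng):
    theta = 0.0
    lp_exact = log_post_exact(theta)
    lp_proxy = log_post_proxy(theta)
    chain, exact_calls = [], 0
    for _ in range(n_iter):
        prop = theta + step * rng.normal()
        lq_proxy = log_post_proxy(prop)
        # Stage 1: screen the proposal with the cheap proxy only.
        if np.log(rng.uniform()) < lq_proxy - lp_proxy:
            # Stage 2: pay for the exact model only for promoted proposals;
            # the ratio is corrected so the chain targets the exact posterior.
            exact_calls += 1
            lq_exact = log_post_exact(prop)
            if np.log(rng.uniform()) < (lq_exact - lp_exact) - (lq_proxy - lp_proxy):
                theta, lp_exact, lp_proxy = prop, lq_exact, lq_proxy
        chain.append(theta)
    return np.array(chain), exact_calls

chain, exact_calls = two_stage_mcmc(n_iter=5_000, step=1.0, rng=rng)
```

Proposals rejected at stage 1 never trigger an exact simulation, which is where the savings come from; the stage-2 correction term keeps the stationary distribution equal to the exact posterior despite the proxy screening.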

Relevance: 20.00%

Abstract:

Demyelinating diseases are characterized by a loss of oligodendrocytes leading to axonal degeneration and impaired brain function. Current strategies used for the treatment of demyelinating disease such as multiple sclerosis largely rely on modulation of the immune system. Only limited treatment options are available for treating the later stages of the disease, and these treatments require regenerative therapies to ameliorate the consequences of oligodendrocyte loss and axonal impairment. Directed differentiation of adult hippocampal neural stem/progenitor cells (NSPCs) into oligodendrocytes may represent an endogenous source of glial cells for cell-replacement strategies aiming to treat demyelinating disease. Here, we show that Ascl1-mediated conversion of hippocampal NSPCs into mature oligodendrocytes enhances remyelination in a diphtheria-toxin (DT)-inducible, genetic model for demyelination. These findings highlight the potential of targeting hippocampal NSPCs for the treatment of demyelinated lesions in the adult brain.

Relevance: 20.00%

Abstract:

PURPOSE OF REVIEW: To provide an overview of available evidence of the potential role of epigenetics in the pathogenesis of hypertension and vascular dysfunction. RECENT FINDINGS: Arterial hypertension is a highly heritable condition. Surprisingly, however, genetic variants only explain a tiny fraction of the phenotypic variation and the term 'missing heritability' has been coined to describe this phenomenon. Recent evidence suggests that phenotypic alteration that is unrelated to changes in DNA sequence (thereby escaping detection by classic genetic methodology) offers a potential explanation. Here, we present some basic information on epigenetics and review recent work consistent with the hypothesis of epigenetically induced arterial hypertension. SUMMARY: New technologies that enable the rigorous assessment of epigenetic changes and their phenotypic consequences may provide the basis for explaining the missing heritability of arterial hypertension and offer new possibilities for treatment and/or prevention.

Relevance: 20.00%

Abstract:

Among the various strategies to reduce the incidence of non-communicable diseases, reduction of sodium intake in the general population has been recognized as one of the most cost-effective means because of its potential impact on the development of hypertension and cardiovascular diseases. Yet, this strategic health recommendation of the WHO and many other international organizations is far from being universally accepted. Indeed, several unresolved scientific and epidemiological questions maintain an ongoing debate: what is the adequate low level of sodium intake to recommend to the general population, and should national strategies be oriented toward the overall population or only toward higher-risk fractions of the population, such as salt-sensitive patients? In this paper, we review the recent results of the literature regarding salt, blood pressure and cardiovascular risk, and we present the recommendations recently proposed by a group of experts in Switzerland. The propositions of the participating medical societies are to encourage national health authorities to continue their discussion with the food industry in order to reduce the sodium intake of food products, with a target mean salt intake of 5-6 grams per day in the population. Moreover, all initiatives to increase the information on the effect of salt on health and on the salt content of food are supported.

Relevance: 20.00%

Abstract:

Previous research has shown that power increases focus on the main goal when distractor information is present. As a result, high-power people have been described as goal-focused. In real life, one typically wants to pursue multiple goals at the same time. There is a lack of research on how power affects how people deal with situations in which multiple important goals are present. To address this question, 158 participants were primed with high or low power or assigned to a control condition, and were asked to perform a dual-goal task with three difficulty levels. We hypothesized and found that high-power primed people prioritize when confronted with a multiple-goal situation. More specifically, when task demands were relatively low, power had no effect; participants generally pursued multiple goals in parallel. However, when task demands were high, the participants in the high-power condition focused on a single goal whereas participants in the low-power condition continued using a dual-task strategy. This study extends existing power theories and research in the domain of goal pursuit.

Relevance: 20.00%

Abstract:

In the context of globalized competition among territories, cities, regions and countries have to find new ways to be attractive to companies, investors, tourists and residents. In that perspective, major sports events (such as the Olympic Games or the FIFA World Cup) are often seen as a lever for territorial development. Based on that idea, many sports-event hosting strategies emerged in the 1980s and 1990s. However, the growing competition in the sports events market and the gigantism of those major events have forced some territories to turn to smaller events. This necessary resizing of their strategy raises the question of their capacity to meet the initial objectives, which usually aim at developing the economy and promoting the image of the host destination. This essay sketches out the evolution of a sports-event hosting strategy in a city that does not have the resources (financial, human or in terms of infrastructure) to attract major international sports events. The challenges such cities face and a possible solution based on the event portfolio perspective are discussed throughout the article.

Relevance: 20.00%

Abstract:

This review presents the evolution of steroid analytical techniques, including gas chromatography coupled to mass spectrometry (GC-MS), immunoassay (IA) and targeted liquid chromatography coupled to mass spectrometry (LC-MS), and it evaluates the potential of extended steroid profiles by a metabolomics-based approach, namely steroidomics. Steroids regulate essential biological functions including growth and reproduction, and perturbations of the steroid homeostasis can generate serious physiological issues; therefore, specific and sensitive methods have been developed to measure steroid concentrations. GC-MS measuring several steroids simultaneously was considered the first historical standard method for analysis. Steroids were then quantified by immunoassay, allowing a higher throughput; however, major drawbacks included the measurement of a single compound instead of a panel and cross-reactivity reactions. Targeted LC-MS methods with selected reaction monitoring (SRM) were then introduced for quantifying a small steroid subset without the problems of cross-reactivity. The next step was the integration of metabolomic approaches in the context of steroid analyses. As metabolomics tends to identify and quantify all the metabolites (i.e., the metabolome) in a specific system, appropriate strategies were proposed for discovering new biomarkers. Steroidomics, defined as the untargeted analysis of the steroid content in a sample, was implemented in several fields, including doping analysis, clinical studies, in vivo or in vitro toxicology assays, and more. This review discusses the current analytical methods for assessing steroid changes and compares them to steroidomics. Steroids, their pathways, their implications in diseases and the biological matrices in which they are analysed will first be described. Then, the different analytical strategies will be presented with a focus on their ability to obtain relevant information on the steroid pattern. 
The future technical requirements for improving steroid analysis will also be presented.

Relevance: 20.00%

Abstract:

Key Messages: A fundamental failure of high-risk prevention strategies is their inability to prevent disease in the large part of the population that is at relatively small average risk but from which most cases of disease originate. The development of individual predictive medicine and the widening of high-risk categories for numerous (chronic) conditions lead to the application of pseudo-high-risk prevention strategies. Widening the criteria justifying individual preventive interventions and the related pseudo-high-risk strategies leads to treating, individually, ever healthier and larger strata of the population. The pseudo-high-risk prevention strategies raise problems similar to those of high-risk strategies, but on a larger scale and without any of the benefits of population-based strategies. Some 30 years ago, the strengths and weaknesses of population-based and high-risk prevention strategies were brilliantly delineated by Geoffrey Rose in several seminal publications (Table 1).1,2 His work had major implications not only for epidemiology and public health but also for clinical medicine. In particular, Rose demonstrated the fundamental failure of high-risk prevention strategies, that is, their missing a large number of preventable cases.

Relevance: 20.00%

Abstract:

The light spectrum perceived by plants is affected by crowding, which results in the shade avoidance syndrome (SAS). Findings presented by Pedmale et al. bring cryptochromes to the forefront of SAS and elucidate a fascinating molecular crosstalk between photoreceptor systems operating in different wavebands.