964 results for Argentine default


Relevance: 10.00%

Abstract:

Mechanochemical transduction enables an extraordinary range of physiological processes such as the sense of touch, hearing, balance, muscle contraction, and the growth and remodelling of tissue and bone1–6. Although biology is replete with materials systems that actively and functionally respond to mechanical stimuli, the default mechanochemical reaction of bulk polymers to large external stress is the unselective scission of covalent bonds, resulting in damage or failure7. An alternative to this degradation process is the rational molecular design of synthetic materials such that mechanical stress favourably alters material properties. A few mechanosensitive polymers with this property have been developed8–14; but their active response is mediated through non-covalent processes, which may limit the extent to which properties can be modified and the long-term stability in structural materials. Previously, we have shown with dissolved polymer strands incorporating mechanically sensitive chemical groups—so-called mechanophores—that the directional nature of mechanical forces can selectively break and re-form covalent bonds15,16. We now demonstrate that such force-induced covalent-bond activation can also be realized with mechanophore-linked elastomeric and glassy polymers, by using a mechanophore that changes colour as it undergoes a reversible electrocyclic ring-opening reaction under tensile stress and thus allows us to directly and locally visualize the mechanochemical reaction. We find that pronounced changes in colour and fluorescence emerge with the accumulation of plastic deformation, indicating that in these polymeric materials the transduction of mechanical force into the ring-opening reaction is an activated process. We anticipate that force activation of covalent bonds can serve as a general strategy for the development of new mechanophore building blocks that impart polymeric materials with desirable functionalities ranging from damage sensing to fully regenerative self-healing.

Relevance: 10.00%

Abstract:

We report a first study of brain activity linked to task switching in individuals with Prader-Willi syndrome (PWS). Individuals with PWS show a specific cognitive deficit in task switching, which may be associated with the display of temper outbursts and repetitive questioning. The performance of participants with PWS and typically developing controls was matched in a cued task-switching procedure, and brain activity was contrasted on switching and non-switching blocks using fMRI. Individuals with PWS did not show the typical frontal-parietal pattern of neural activity associated with switching blocks, with significantly reduced activation in regions of the posterior parietal and ventromedial prefrontal cortices. We suggest that this is linked to a difficulty in PWS in setting appropriate attentional weights to enable task-set reconfiguration. In addition, individuals with PWS did not show the typical pattern of deactivation, with significantly less deactivation in an anterior region of the ventromedial prefrontal cortex. One plausible explanation is that individuals with PWS show dysfunction within the default mode network, which has been linked to attentional control. The data point to functional changes in the neural circuitry supporting task switching in PWS even when behavioural performance is matched to controls, and thus highlight neural mechanisms that may be involved in a specific pathway between genes, cognition and behaviour. (C) 2010 Elsevier B.V. All rights reserved.

Relevance: 10.00%

Abstract:

We explored the development of sensitivity to causal relations in children’s inductive reasoning. Children (5-, 8-, and 12-year-olds) and adults were given trials in which they decided whether a property known to be possessed by members of one category was also possessed by members of (a) a taxonomically related category or (b) a causally related category. The direction of the causal link was either predictive (prey → predator) or diagnostic (predator → prey), and the property that participants reasoned about established either a taxonomic or causal context. There was a causal asymmetry effect across all age groups, with more causal choices when the causal link was predictive than when it was diagnostic. Furthermore, context-sensitive causal reasoning showed a curvilinear development, with causal choices being most frequent for 8-year-olds regardless of context. Causal inductions decreased thereafter because 12-year-olds and adults made more taxonomic choices when reasoning in the taxonomic context. These findings suggest that simple causal relations may often be the default knowledge structure in young children’s inductive reasoning, that sensitivity to causal direction is present early on, and that children over-generalize their causal knowledge when reasoning.

Relevance: 10.00%

Abstract:

Mathematical modelling has become an essential tool in the design of modern catalytic systems. Emissions legislation is becoming increasingly stringent, and so mathematical models of aftertreatment systems must become more accurate in order to provide confidence that a catalyst will convert pollutants over the required range of conditions. 
Automotive catalytic converter models contain several sub-models that represent processes such as mass and heat transfer, and the rates at which the reactions proceed on the surface of the precious metal. Of these sub-models, the prediction of the surface reaction rates is by far the most challenging, due to the complexity of the reaction system and the large number of gas species involved. The reaction rate sub-model uses global reaction kinetics to describe the surface reaction rate of the gas species and is based on the Langmuir–Hinshelwood equation as further developed by Voltz et al. [1]. The reactions are modelled using the pre-exponential factors and activation energies of the Arrhenius equations, together with the inhibition terms.
The reaction kinetic parameters of aftertreatment models are found from experimental data, where a measured light-off curve is compared against a predicted curve produced by a mathematical model. The kinetic parameters are usually manually tuned to minimize the error between the measured and predicted data. This process is most commonly long, laborious and prone to misinterpretation due to the large number of parameters and the risk of multiple sets of parameters giving acceptable fits. Moreover, the number of coefficients increases greatly with the number of reactions. Therefore, with the growing number of reactions, the task of manually tuning the coefficients is becoming increasingly challenging. 
In the presented work, the authors have developed and implemented a multi-objective genetic algorithm to automatically optimize reaction parameters in AxiSuite® [2], a commercial aftertreatment model. The genetic algorithm was developed and expanded from the code presented by Michalewicz et al. [3] and was linked to AxiSuite using the Simulink add-on for Matlab.
The default kinetic values stored within the AxiSuite model were used to generate a series of light-off curves under rich conditions for a number of gas species, including CO, NO, C3H8 and C3H6. These light-off curves were used to generate an objective function. 
This objective function was used to generate a measure of fit for the kinetic parameters. The multi-objective genetic algorithm was then used to search between specified limits for the parameter values that best match the objective function. In total, the pre-exponential factors and activation energies of ten reactions were simultaneously optimized.
The results reported here demonstrate that, given accurate experimental data, the optimization algorithm is successful and robust in defining the correct kinetic parameters of a global kinetic model describing aftertreatment processes.
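As an illustration of the tuning loop described above, the sketch below fits the pre-exponential factor and activation energy of a single toy Arrhenius reaction to a synthetic light-off curve using a simple real-coded genetic algorithm. The conversion model, residence time, parameter bounds and GA settings are all assumptions for illustration; this is not the AxiSuite model or the authors' actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
R = 8.314  # gas constant, J/(mol K)

def light_off_curve(log10_A, Ea, T):
    """Toy first-order conversion over a temperature sweep. Stands in for
    the full aftertreatment simulation; the 0.05 s residence time is assumed."""
    k = 10.0 ** log10_A * np.exp(-Ea / (R * T))
    return 1.0 - np.exp(-k * 0.05)

T = np.linspace(400.0, 800.0, 50)           # inlet temperatures, K
measured = light_off_curve(8.0, 9.0e4, T)   # "measured" curve from known parameters

def fitness(params):
    # Negative sum of squared errors between predicted and measured curves.
    return -np.sum((light_off_curve(params[0], params[1], T) - measured) ** 2)

# Real-coded genetic algorithm: tournament selection, blend crossover,
# Gaussian mutation and elitism, searching between specified limits.
lo = np.array([6.0, 5.0e4])    # lower bounds: log10(A), Ea
hi = np.array([10.0, 1.5e5])   # upper bounds
pop = rng.uniform(lo, hi, size=(60, 2))
for generation in range(120):
    fit = np.array([fitness(p) for p in pop])
    new_pop = [pop[np.argmax(fit)]]          # elitism: carry the best forward
    while len(new_pop) < len(pop):
        i, j = rng.integers(len(pop), size=2)
        parent_a = pop[i] if fit[i] > fit[j] else pop[j]   # tournament winner 1
        i, j = rng.integers(len(pop), size=2)
        parent_b = pop[i] if fit[i] > fit[j] else pop[j]   # tournament winner 2
        w = rng.uniform(size=2)
        child = w * parent_a + (1.0 - w) * parent_b          # blend crossover
        child += rng.normal(0.0, 0.02, size=2) * (hi - lo)   # Gaussian mutation
        new_pop.append(np.clip(child, lo, hi))
    pop = np.array(new_pop)

best = pop[np.argmax([fitness(p) for p in pop])]
print("log10(A), Ea:", best, "SSE:", -fitness(best))
```

With ten reactions the search space grows to twenty coupled parameters and multiple objectives (one light-off curve per species), which is exactly why manual tuning becomes impractical.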

Relevance: 10.00%

Abstract:

Why do some banks fail in financial crises while others survive? This article answers this question by analysing the effect of the Dutch financial crisis of the 1920s on 142 banks, of which 33 failed. We find that choices of balance sheet composition and product market strategy made in the lead-up to the crisis had a significant impact on banks’ subsequent chances of experiencing distress. We document that high-risk banks – those operating highly-leveraged portfolios and attracting large quantities of deposits – were more likely to fail. Branching and international activities also increased banks’ default probabilities. We measure the effects of board interlocks, which have been characterized in the extant literature as contributing to the Dutch crisis. We find that boards mattered: failing banks had smaller boards, shared directors with smaller and very profitable banks and had a lower concentration of interlocking directorates in non-financial firms.

Relevance: 10.00%

Abstract:

The European Union has set a target for 10% renewable energy in transport by 2020, to be met using biofuels and electric vehicles. In the case of biofuels, the biofuel must achieve greenhouse gas savings of 35% relative to the fossil fuel replaced; these savings can be calculated using life cycle analysis or the European Union default values. In contrast, all electricity used in transport is considered to be the same, regardless of the source or the type of electric vehicle. However, the choice of electric vehicle and electricity source has a major impact on the greenhouse gas saving. In this paper the initial findings of a well-to-wheel analysis of electric vehicle deployment in Northern Ireland are presented. The key finding is that, in comparison with standard internal combustion engine and hybrid vehicles, electric vehicles require the least energy per mile on a well-to-wheel basis and consume the fewest resources, even allowing for inefficient fuel production.
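The 35% saving criterion mentioned above reduces to a one-line calculation once the life-cycle emission intensities are known. A minimal sketch: the 83.8 gCO2eq/MJ fossil fuel comparator is the Directive's value, while the biofuel figure here is invented for illustration.

```python
def ghg_saving(e_biofuel, e_fossil):
    """Greenhouse gas saving of a biofuel relative to the fossil fuel it
    replaces, as defined in the EU Renewable Energy Directive:
    saving = (E_F - E_B) / E_F."""
    return (e_fossil - e_biofuel) / e_fossil

e_fossil = 83.8    # fossil fuel comparator, gCO2eq/MJ (Directive value)
e_biofuel = 46.0   # life-cycle emissions of the biofuel, gCO2eq/MJ (illustrative)

saving = ghg_saving(e_biofuel, e_fossil)
print(f"saving = {saving:.1%}, meets 35% threshold: {saving >= 0.35}")
```

No analogous per-pathway calculation is prescribed for electricity, which is the asymmetry the paper highlights.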

Relevance: 10.00%

Abstract:

Quantile normalization (QN) is a technique for microarray data processing and is the default normalization method in the Robust Multi-array Average (RMA) procedure, which was primarily designed for analysing gene expression data from Affymetrix arrays. Given the abundance of Affymetrix microarrays and the popularity of the RMA method, it is crucially important that the normalization procedure is applied appropriately. In this study we carried out simulation experiments and also analysed real microarray data to investigate the suitability of RMA when it is applied to datasets with different groups of biological samples. Our experiments showed that RMA with QN does not preserve the biological signal included in each group, but rather mixes the signals between the groups. We also showed that the Median Polish method in the summarization step of RMA has a similar mixing effect. RMA is one of the most widely used methods in microarray data processing and has been applied to a vast volume of data in biomedical research. The problematic behaviour of this method suggests that previous studies employing RMA could have been misadvised or adversely affected, so it is crucially important that the research community recognizes the issue and starts to address it. The two core elements of the RMA method, quantile normalization and Median Polish, both have the undesirable effect of mixing biological signals between different sample groups, which can be detrimental to drawing valid biological conclusions and to any subsequent analyses. Based on the evidence presented here and in the literature, we recommend exercising caution when using RMA to process microarray gene expression data, particularly where there are likely to be unknown subgroups of samples.
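To make the mixing effect concrete, here is a minimal sketch of quantile normalization on an invented two-group dataset (ties are ignored for simplicity). Because QN forces every array to share one distribution, a genuine between-group shift in a subset of genes is flattened toward the pooled average.

```python
import numpy as np

def quantile_normalize(x):
    """Quantile normalization: rank each column (array), then replace every
    value by the mean of the values sharing that rank across columns, so all
    columns end up with an identical distribution."""
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)
    mean_quantiles = np.sort(x, axis=0).mean(axis=1)
    return mean_quantiles[ranks]

# Toy data: rows are genes, columns are arrays. Genes 0-1 are genuinely
# expressed higher in group B (columns 2-3) than in group A (columns 0-1).
x = np.array([[5.0, 5.2, 9.0, 9.1],
              [4.0, 4.1, 8.0, 8.2],
              [3.0, 3.1, 3.0, 3.2],
              [2.0, 2.2, 2.1, 2.0]])

xn = quantile_normalize(x)
# Gene 0 had a between-group difference of about 4 units; after QN its
# values occupy the same rank in every column, so the difference vanishes.
print("before:", x[0], "after:", xn[0])
```

The same logic explains why the effect is harmless when groups have similar distributions but destructive when a subgroup genuinely differs.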

Relevance: 10.00%

Abstract:

The West has failed to properly integrate Russia into its worldview since 1991, and there is an obvious vacuum of ideas for how to deal with it. The default reaction is to fall back on the Cold War paradigm - sanctions, containment, and hopes of Russian regime change.

This is folly. There’s no knowing how long it will take for Russia to change tack, if it ever does; nothing guarantees that a new regime in Russia would be any more pro-Western. There’s also apparently no idea how to handle Russia in the meantime, especially while it remains a crucial part of crises like those in Iran and Syria.

Ukraine has shown that the placeholder post-Cold War order Europe and Russia inherited urgently needs replacing. With a ceasefire in place at last, the search for an alternative is on. The Geneva talks in April this year could be its basis; but nothing truly transformative will be achieved until the US, EU, Russia and Ukraine all recognise the need for compromise.

Relevance: 10.00%

Abstract:

Libertarian paternalism, as advanced by Cass Sunstein, is seriously flawed, but not primarily for the reasons that most commentators suggest. Libertarian paternalism and its attendant regulatory implications are too libertarian, not too paternalistic, and as a result are in considerable tension with ‘thick’ conceptions of human dignity. We make four arguments. The first is that there is no justification for a presumption in favor of nudging as a default regulatory strategy, as Sunstein asserts. It is ordinarily less effective than mandates; such mandates rarely offend personal autonomy; and the central reliance on cognitive failures in the nudging program is more likely to offend human dignity than the mandates it seeks to replace. Secondly, we argue that nudging as a regulatory strategy fits both overtly and covertly, often insidiously, into a more general libertarian program of political economy. Thirdly, while we are on the whole more concerned to reject the libertarian than the paternalistic elements of this philosophy, Sunstein’s work, both in Why Nudge?, and earlier, fails to appreciate how nudging may be manipulative if not designed with more care than he acknowledges. Lastly, because of these characteristics, nudging might even be subject to legal challenges that would give us the worst of all possible regulatory worlds: a weak regulatory intervention that is liable to be challenged in the courts by well-resourced interest groups. In such a scenario, and contrary to the ‘common sense’ ethos contended for in Why Nudge?, nudges might not even clear the excessively low bar of doing something rather than nothing. Those seeking to pursue progressive politics, under law, should reject nudging in favor of regulation that is more congruent with principles of legality, more transparent, more effective, more democratic, and allows us more fully to act as moral agents. 
Such a system may have a place for (some) nudging, but not one that departs significantly from how labeling, warnings and the like already function, and nothing that compares with Sunstein’s apparent ambitions for his new movement.

Relevance: 10.00%

Abstract:

A comprehensive continuum damage mechanics model [1] had been developed to capture the detailed behaviour of a composite structure under a crushing load. This paper explores some of the difficulties encountered in the implementation of this model and their mitigation. The use of reduced-integration elements and a strain-softening model both negatively affect the accuracy and stability of the simulation. Damage localisation effects demanded an accurate measure of characteristic length, and a robust algorithm for determining the characteristic length was implemented. Testing showed that this algorithm produced marked improvements over the use of the default characteristic length provided by Abaqus. Zero-energy (hourglass) modes in reduced-integration elements led to reduced resistance to bending. This was compounded by the strain-softening model, which led to the formation of elements with little resistance to deformation that could invert if left unchecked. It was shown, through benchmark testing, that these issues can be alleviated by deleting excessively distorted elements and controlling the mesh using inbuilt distortion/hourglass controls. These techniques contributed significantly to the viability and usability of the damage model.
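The paper's robust characteristic-length algorithm is not detailed in this abstract, but the baseline it improves upon is commonly taken as the n-th root of the element volume (or area in 2D), which smeared-crack models use to regularise the dissipated fracture energy against mesh refinement. A sketch of that baseline only, with purely illustrative figures:

```python
def characteristic_length(volume, n_dims=3):
    """Baseline characteristic-length estimate (similar in spirit to the
    Abaqus default): the n-th root of the element volume or area. The
    paper's more robust, direction-aware algorithm is not reproduced here;
    this is the simple measure it improves upon."""
    return volume ** (1.0 / n_dims)

# Smeared-crack energy regularisation: dividing the fracture toughness by
# the characteristic length gives an energy density, so the total energy
# dissipated by a cracking band is independent of element size.
G_f = 50.0e3                            # fracture toughness, J/m^2 (illustrative)
l_c = characteristic_length(8.0e-9)     # 2 mm cubic element, volume in m^3
g_f = G_f / l_c                         # dissipated energy density, J/m^3
print(l_c, g_f)
```

For distorted or elongated elements this isotropic estimate can be badly wrong relative to the crack-band width, which is the motivation for a more robust, orientation-aware measure.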

Relevance: 10.00%

Abstract:

Where either the seller or buyer of landed property fails to complete a contract to sell land the non-breaching party has a right to seek specific performance of the contract. This remedy would compel the party in default to perform the contract on pain of being held in contempt of court if the court's order is not obeyed. The defaulting party would not be able to satisfy its obligations under the law by paying a sum of money as damages for breach of contract. This paper considers the impecuniosity defence to specific performance as recognised by courts in Northern Ireland, the Republic of Ireland, Australia and New Zealand. Where the buyer demonstrates that he or she simply cannot raise the funds to buy the property specific performance will not be decreed and the court will make an award of damages for breach of contract measured by the difference between the contract price and the market price of the property at the time of default. The paper considers the nature and parameters of this defence and how it differs (if at all) from the alternative defence of extreme hardship. The paper addresses the question of whether it might be better to move to a position where sellers of land in all cases no longer enjoy a presumption of specific performance but have to demonstrate that the alternative remedy of damages is clearly inadequate. If this should be so the paper goes on to consider whether abolition of the presumption in favour of specific performance for sellers should lead to abolition of the presumption of specific performance for buyers, as is the position in Canada following the Supreme Court's decision in Semelhago v Paramadevan [1996] 2 SCR 415.

Relevance: 10.00%

Abstract:

This article is a reflexive and critical examination of recent empirical research on effective practice in the management and ‘transformation’ of contested urban space at sectarian interfaces in Belfast. By considering the development of interfaces, the areas around them and policy responses to their persistence, the reality of contested space in the context of ‘peace building’ becomes apparent, with implications for local government as central to the statutory response. Belfast has developed an inbuilt absence of connectivity, where freedom of movement is particularly restricted and separation of contested space is the default policy position. Empirical research findings focus attention on the significance of social and economic regeneration and fall into three specific areas that reflect both long-term concerns within neighbourhoods and the need for adequate policy responses and action ‘on the ground’. Drawing on Elden and Sassen, we reconfigure the analytical framework by which interfaces are defined, with implications for policy and practice in post-conflict Belfast. Past and current policy for peace-building in Northern Ireland, and for transforming the most contested space at interfaces in Belfast, is deliberately ambiguous and offers little substance, having failed to advance from funding-led linguistic compliance to a sustainable peace-building methodology.

Relevance: 10.00%

Abstract:

In Portugal, an estimated 1.95 Mton/year of wood is used in residential wood burning for heating and cooking. In addition, burnt forest area has been increasing over recent decades. These combustions emit high levels of toxic air pollutants, strongly perturb atmospheric chemistry, interfere with climate and have adverse effects on health. Accurate quantification of the amounts of trace gases and particulate matter emitted from residential wood burning, agriculture/garden waste burning and forest fires on a regional and global basis is essential for various purposes, including the investigation of several atmospheric processes, the reporting of greenhouse gas emissions, and the quantification of the air pollution sources that affect human health at regional scales. In Southern Europe, detailed emission factors for biomass burning are scarce, so emission inventories, source apportionment studies, and photochemical and climate change models use default values obtained for US and Northern European biofuels; more specific, locally measured data would be preferable. The objective of this study is to characterise and quantify the contribution of biomass combustion sources to atmospheric trace gases and aerosol concentrations in a way more representative of the national reality. Laboratory (residential wood combustion) and field (agriculture/garden waste burning and experimental wildland fires) sampling experiments were carried out. In the laboratory, after selection of the most representative wood species and combustion equipment in Portugal, a sampling programme was set up to determine gaseous and particulate matter emission rates, including the organic and inorganic aerosol composition. In the field, the smoke plumes from agriculture/garden waste burning and experimental wildland fires were sampled.
The results of this study show that the combustion equipment and biofuel type play an important role in emission levels and composition. Significant differences between traditional and modern combustion equipment were also observed; these reflect the higher combustion efficiency of modern equipment, which releases smaller amounts of particulate matter, organic carbon and carbon monoxide. With regard to experimental wildland fires in shrub-dominated areas, the largest organic fraction in the samples studied was mainly composed of vegetation pyrolysis products. The major organic components in the smoke samples were pyrolysates of vegetation cuticles, mainly comprising steradienes and sterol derivatives, carbohydrates from the breakdown of cellulose, aliphatic lipids from vegetation waxes and methoxyphenols from the thermal degradation of lignin. Despite being a banned practice in Portugal, agriculture/garden waste burning remains quite common. To assess the particulate matter composition, the smoke from three different agriculture/garden residues was sampled in three size fractions (PM2.5, PM2.5-10 and PM>10). Although the distribution patterns of organic compounds in particulate matter varied among residues, the amounts of phenolics (polyphenol and guaiacyl derivatives) and organic acids were always predominant over other organic compounds in the organosoluble fraction of the smoke. Among the biomarkers, levoglucosan, β-sitosterol and phytol were detected in appreciable amounts in the smoke of all agriculture/garden residues; in addition, inositol may be considered a possible tracer for the smoke from potato haulm burning. The prevailing ambient conditions (such as high atmospheric humidity) likely contributed to atmospheric processes (e.g. coagulation and hygroscopic growth) that influenced the particle size characteristics of the smoke tracers, shifting their distribution to larger diameters. An assessment of household biomass consumption was also made through a national-scale survey. The information obtained from the survey, combined with the databases of emission factors from the laboratory and field tests, allowed us to estimate the pollutant amounts emitted in each Portuguese district. Besides contributing to the improvement of emission inventories, the emission factors obtained for tracer compounds in this study can be applied in receptor models to assess the contribution of biomass burning to the levels of atmospheric aerosols and their constituents obtained in monitoring campaigns in Mediterranean Europe.
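The district-level estimates described above follow the usual bottom-up pattern: fuel consumed multiplied by an emission factor. A minimal sketch with invented consumption figures and an invented PM2.5 emission factor; these are placeholders, not the study's measured values.

```python
def district_emissions(consumption_t, emission_factor_g_per_kg):
    """Bottom-up emission estimate for one district: biomass burned
    (tonnes/year) times an emission factor (g pollutant per kg of fuel),
    returning kg of pollutant per year."""
    fuel_kg = consumption_t * 1000.0
    pollutant_g = fuel_kg * emission_factor_g_per_kg
    return pollutant_g / 1000.0

# Hypothetical survey-derived wood consumption per district, t/year.
consumption = {"Aveiro": 120e3, "Porto": 310e3}
ef_pm25 = 7.5   # g PM2.5 per kg of wood burned, hypothetical

for district, c in consumption.items():
    print(district, district_emissions(c, ef_pm25), "kg PM2.5/yr")
```

Swapping in equipment-specific emission factors (traditional stove versus modern appliance) is what makes the locally measured database more informative than a single default value.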

Relevance: 10.00%

Abstract:

Credit institutions have always had an interest in determining the default risk associated with a firm in order to assess its profile. This information is, however, useful to all of a firm's stakeholders, since they too commit part of themselves when interacting with it. The rise in the number of insolvencies in recent years has reinforced the need to broaden and deepen research on financial distress. Identifying the factors that influence asset pricing has always been of interest to all stakeholders, so that they can anticipate variations in returns and act accordingly. This dissertation studies the influence of default risk on equity returns, using as an indicator of default risk the probability of default obtained from Merton's (1974) option model. The analysis covers the period from February 2002 to December 2011, using data on Portuguese, Spanish and Greek firms. The results show a negative relationship between default risk and equity returns, which is driven by a momentum effect and by volatility. They also show that size and book-to-market are not proxies for default risk in the sample used here, contrary to what Fama & French (1992; 1996) claimed.
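The probability of default in Merton's (1974) model follows from treating equity as a call option on the firm's assets, with the face value of debt as the strike: default occurs if asset value falls below the debt at maturity, giving P(default) = N(-d2). A minimal sketch with purely illustrative inputs (in practice asset value and volatility must themselves be backed out from equity data):

```python
from math import log, sqrt, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def merton_default_probability(V, D, r, sigma, T):
    """Risk-neutral default probability under Merton (1974): equity is a
    call option on firm assets V with debt face value D as strike, so
    P(V_T < D) = N(-d2)."""
    d2 = (log(V / D) + (r - 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    return norm_cdf(-d2)

# Illustrative firm: assets 120, debt 100 due in 1 year, 2% rate, 30% asset vol.
pd_1y = merton_default_probability(V=120.0, D=100.0, r=0.02, sigma=0.30, T=1.0)
print(round(pd_1y, 4))
```

Sorting firms by this probability is what allows the dissertation to relate default risk to subsequent equity returns.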