972 results for Default penalties


Relevance:

10.00%

Publisher:

Abstract:

Low-velocity impact damage can drastically reduce the residual strength of a composite structure even when the damage is barely visible. The ability to computationally predict the extent of damage and the compression-after-impact (CAI) strength of a composite structure can potentially lead to the exploration of a larger design space without incurring significant time and cost penalties. A high-fidelity three-dimensional composite damage model, predicting both low-velocity impact damage and the CAI strength of composite laminates, has been developed and implemented as a user material subroutine in the commercial finite element package ABAQUS/Explicit. The intralaminar damage model component accounts for physically based tensile and compressive failure mechanisms of the fibres and matrix under a three-dimensional stress state. Cohesive behaviour was employed to model interlaminar failure between plies, with a bi-linear traction–separation law capturing damage onset and subsequent damage evolution. The virtual tests, set up in ABAQUS/Explicit, were executed in three steps: the first to capture the impact damage, the second to stabilize the specimen by imposing the new boundary conditions required for compression testing, and the third to predict the CAI strength. The observed intralaminar damage features and delamination damage area, as well as the residual strength, are discussed. The predicted impact damage and CAI strength correlated well with experimental testing, without the need for the model calibration that is often required with other damage models.
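The bi-linear traction–separation law mentioned above can be illustrated with a short sketch — a minimal mode-I version with a single scalar damage variable; the stiffness `K` and the onset/failure separations below are hypothetical values, not parameters from the model described in the abstract:

```python
def bilinear_traction(delta, delta0, deltaf, K):
    """Bi-linear traction-separation law: linear elastic up to the
    damage-onset separation delta0, then linear softening until full
    decohesion at the failure separation deltaf."""
    if delta <= delta0:
        return K * delta                      # undamaged elastic branch
    if delta >= deltaf:
        return 0.0                            # fully debonded
    # Scalar damage variable d grows from 0 at delta0 to 1 at deltaf
    d = deltaf * (delta - delta0) / (delta * (deltaf - delta0))
    return (1.0 - d) * K * delta              # softening branch

# With onset at 0.01 mm and failure at 0.05 mm, the traction falls
# linearly from its peak back to zero.
peak = bilinear_traction(0.01, 0.01, 0.05, 1000.0)   # 10.0
mid = bilinear_traction(0.03, 0.01, 0.05, 1000.0)    # 5.0, halfway down
```

The area under this curve is the fracture toughness dissipated at the interface, which is why damage-onset and final-failure separations (rather than a single strength value) are needed to characterise the cohesive behaviour.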

Low-velocity impact damage can drastically reduce the residual mechanical properties of a composite structure even when the impact damage is barely visible. The ability to computationally predict the extent of damage and the compression-after-impact (CAI) strength of a composite structure can potentially lead to the exploration of a larger design space without incurring significant development time and cost penalties. A three-dimensional damage model, predicting both low-velocity impact damage and the CAI strength of composite laminates, has been developed and implemented as a user material subroutine in the commercial finite element package ABAQUS/Explicit. The virtual tests were executed in two steps: one to capture the impact damage and the other to predict the CAI strength. The observed intra-laminar damage features and delamination damage area, as well as the residual strength, are discussed. The predicted impact damage and CAI strength correlated well with experimental testing.

Quantile normalization (QN) is a technique for microarray data processing and is the default normalization method in the Robust Multi-array Average (RMA) procedure, which was primarily designed for analysing gene expression data from Affymetrix arrays. Given the abundance of Affymetrix microarrays and the popularity of the RMA method, it is crucially important that the normalization procedure is applied appropriately. In this study we carried out simulation experiments and analysed real microarray data to investigate the suitability of RMA when it is applied to datasets with different groups of biological samples. Our experiments show that RMA with QN does not preserve the biological signal within each group, but rather mixes the signals between the groups. We also show that the Median Polish method in the summarization step of RMA has a similar mixing effect. RMA is one of the most widely used methods in microarray data processing and has been applied to a vast volume of data in biomedical research. The problematic behaviour of this method suggests that previous studies employing RMA could have been misadvised or adversely affected. We therefore think it is crucially important that the research community recognizes the issue and starts to address it. The two core elements of the RMA method, quantile normalization and Median Polish, both have the undesirable effect of mixing biological signals between different sample groups, which can be detrimental to drawing valid biological conclusions and to any subsequent analyses. Based on the evidence presented here and in the literature, we recommend exercising caution when using RMA to process microarray gene expression data, particularly in situations where there are likely to be unknown subgroups of samples.
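The group-mixing effect described above can be demonstrated with a minimal quantile-normalization sketch — a simplified stand-in for the RMA normalization step, with made-up expression values rather than real array data:

```python
import numpy as np

def quantile_normalize(X):
    """Quantile normalization of a genes-x-samples matrix: every sample
    (column) is forced onto the same reference distribution, taken as
    the mean of the per-sample sorted expression values."""
    order = np.argsort(X, axis=0)
    ranks = np.argsort(order, axis=0)            # rank of each value in its column
    reference = np.sort(X, axis=0).mean(axis=1)  # shared reference distribution
    return reference[ranks]

# Gene 0 is genuinely upregulated in group B (last two samples), but
# after QN all samples share one distribution and the group difference
# vanishes -- the "mixing" effect described above.
X = np.array([[10.0, 10.0, 20.0, 20.0],   # gene 0: group A vs group B
              [ 5.0,  5.0,  5.0,  5.0],   # gene 1: flat
              [ 1.0,  1.0,  1.0,  1.0]])  # gene 2: flat
Xn = quantile_normalize(X)                # gene 0 becomes 15.0 everywhere
```

Because the top-ranked value in every column is mapped to the same reference quantile (15.0 here), the genuine between-group difference in gene 0 is erased before any downstream analysis sees it.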

The West has failed to properly integrate Russia into its worldview since 1991, and there is an obvious vacuum of ideas for how to deal with it. The default reaction is to fall back on the Cold War paradigm - sanctions, containment, and hopes of Russian regime change.

This is folly. There’s no knowing how long it will take for Russia to change tack, if it ever does; nothing guarantees that a new regime in Russia would be any more pro-Western. There’s also apparently no idea how to handle Russia in the meantime, especially while it remains a crucial part of crises like those in Iran and Syria.

Ukraine has shown that the placeholder post-Cold War order Europe and Russia inherited urgently needs replacing. With a ceasefire in place at last, the search for an alternative is on. The Geneva talks in April this year could be its basis; but nothing truly transformative will be achieved until the US, EU, Russia and Ukraine all recognise the need for compromise.

Porous solids such as zeolites and metal-organic frameworks are useful in molecular separation and in catalysis, but their solid nature can impose limitations. For example, liquid solvents, rather than porous solids, are the most mature technology for post-combustion capture of carbon dioxide because liquid circulation systems are more easily retrofitted to existing plants. Solid porous adsorbents offer major benefits, such as lower energy penalties in adsorption-desorption cycles, but they are difficult to implement in conventional flow processes. Materials that combine the properties of fluidity and permanent porosity could therefore offer technological advantages, but permanent porosity is not associated with conventional liquids. Here we report free-flowing liquids whose bulk properties are determined by their permanent porosity. To achieve this, we designed cage molecules that provide a well-defined pore space and that are highly soluble in solvents whose molecules are too large to enter the pores. The concentration of unoccupied cages can thus be around 500 times greater than in other molecular solutions that contain cavities, resulting in a marked change in bulk properties, such as an eightfold increase in the solubility of methane gas. Our results provide the basis for development of a new class of functional porous materials for chemical processes, and we present a one-step, multigram scale-up route for highly soluble 'scrambled' porous cages prepared from a mixture of commercially available reagents. The unifying design principle for these materials is the avoidance of functional groups that can penetrate into the molecular cage cavities.

Libertarian paternalism, as advanced by Cass Sunstein, is seriously flawed, but not primarily for the reasons that most commentators suggest. Libertarian paternalism and its attendant regulatory implications are too libertarian, not too paternalistic, and as a result are in considerable tension with ‘thick’ conceptions of human dignity. We make four arguments. The first is that there is no justification for a presumption in favor of nudging as a default regulatory strategy, as Sunstein asserts. It is ordinarily less effective than mandates; such mandates rarely offend personal autonomy; and the central reliance on cognitive failures in the nudging program is more likely to offend human dignity than the mandates it seeks to replace. Secondly, we argue that nudging as a regulatory strategy fits both overtly and covertly, often insidiously, into a more general libertarian program of political economy. Thirdly, while we are on the whole more concerned to reject the libertarian than the paternalistic elements of this philosophy, Sunstein’s work, both in Why Nudge? and earlier, fails to appreciate how nudging may be manipulative if not designed with more care than he acknowledges. Lastly, because of these characteristics, nudging might even be subject to legal challenges that would give us the worst of all possible regulatory worlds: a weak regulatory intervention that is liable to be challenged in the courts by well-resourced interest groups. In such a scenario, and contrary to the ‘common sense’ ethos contended for in Why Nudge?, nudges might not even clear the excessively low bar of doing something rather than nothing. Those seeking to pursue progressive politics, under law, should reject nudging in favor of regulation that is more congruent with principles of legality, more transparent, more effective, more democratic, and allows us more fully to act as moral agents. Such a system may have a place for (some) nudging, but not one that departs significantly from how labeling, warnings and the like already function, and nothing that compares with Sunstein’s apparent ambitions for his new movement.

Limited access to bank branches excludes over one billion people from financial services in developing countries. Digital financial services offered by banks and mobile money providers through agents can solve this problem without the need for complex and costly physical banking infrastructure. Delivering digital financial services through agents requires a legal framework to regulate liability. This article analyses whether vicarious liability of the principal is a more efficient regulatory approach than personal liability of the agent. Agent liability in Kenya, Fiji, and Malawi is analysed to demonstrate that vicarious liability of the principal, coupled with an explicit agreement as to agent rewards and penalties, is the more efficient regulatory approach.

A comprehensive continuum damage mechanics model [1] has been developed to capture the detailed behaviour of a composite structure under a crushing load. This paper explores some of the difficulties encountered in the implementation of this model and their mitigation. The use of reduced-integration elements and a strain-softening model both negatively affect the accuracy and stability of the simulation. Damage localisation effects demanded an accurate measure of characteristic length, so a robust algorithm for determining the characteristic length was implemented. Testing showed that this algorithm produced marked improvements over the default characteristic length provided by Abaqus. Zero-energy (hourglass) modes in reduced-integration elements led to reduced resistance to bending. This was compounded by the strain-softening model, which led to the formation of elements with little resistance to deformation that could invert if left unchecked. Benchmark testing showed that deleting excessively distorted elements and controlling the mesh with the inbuilt distortion/hourglass controls alleviates these issues. Together, these techniques contributed significantly to the viability and usability of the damage model.
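As a rough illustration of what a characteristic length is, the sketch below uses the simplest common heuristic for solid elements — the cube root of the element volume, here approximated by the nodal bounding box. This is a generic crack-band regularisation device, not the robust algorithm developed in the paper:

```python
import numpy as np

def characteristic_length(nodes):
    """Crude characteristic length for a near-rectangular solid element:
    cube root of the bounding-box volume of its nodes. Strain-softening
    laws scale the dissipated energy by such a length so that the
    predicted fracture energy does not depend on mesh size."""
    nodes = np.asarray(nodes, dtype=float)
    extents = nodes.max(axis=0) - nodes.min(axis=0)   # dx, dy, dz
    return float(np.prod(extents)) ** (1.0 / 3.0)

# A 2 x 1 x 0.5 mm brick element -> volume 1 mm^3 -> length 1.0 mm
brick = [(0, 0, 0), (2, 0, 0), (2, 1, 0), (0, 1, 0),
         (0, 0, 0.5), (2, 0, 0.5), (2, 1, 0.5), (0, 1, 0.5)]
```

The bounding-box heuristic breaks down for distorted or elongated elements, which is precisely why a more robust, orientation-aware measure was needed in the work described above.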

Where either the seller or the buyer of landed property fails to complete a contract for the sale of land, the non-breaching party has a right to seek specific performance of the contract. This remedy compels the party in default to perform the contract on pain of being held in contempt of court if the court's order is not obeyed; the defaulting party cannot satisfy its obligations under the law by paying a sum of money as damages for breach of contract. This paper considers the impecuniosity defence to specific performance as recognised by courts in Northern Ireland, the Republic of Ireland, Australia and New Zealand. Where the buyer demonstrates that he or she simply cannot raise the funds to buy the property, specific performance will not be decreed and the court will make an award of damages for breach of contract, measured by the difference between the contract price and the market price of the property at the time of default. The paper considers the nature and parameters of this defence and how it differs (if at all) from the alternative defence of extreme hardship. It addresses the question of whether it might be better to move to a position where sellers of land in all cases no longer enjoy a presumption of specific performance but have to demonstrate that the alternative remedy of damages is clearly inadequate. If so, the paper goes on to consider whether abolition of the presumption in favour of specific performance for sellers should lead to abolition of the presumption of specific performance for buyers, as is the position in Canada following the Supreme Court's decision in Semelhago v Paramadevan [1996] 2 SCR 415.

This article is a reflexive and critical examination of recent empirical research on effective practice in the management and ‘transformation’ of contested urban space at sectarian interfaces in Belfast. By considering the development of interfaces, the areas around them and policy responses to their persistence, the reality of contested space in the context of ‘peace building’ becomes apparent, with implications for local government as central to the statutory response. Belfast has developed an inbuilt absence of connectivity, where freedom of movement is particularly restricted and separation of contested space is the default policy position. The empirical research findings focus attention on the significance of social and economic regeneration and fall into three specific areas that reflect both long-term concerns within neighbourhoods and the need for adequate policy responses and action ‘on the ground’. Drawing on Elden and Sassen, we reconfigure the analytical framework by which interfaces are defined, with implications for policy and practice in post-conflict Belfast. Past and current policy for peace-building in Northern Ireland, and for transforming the most contested space at interfaces in Belfast, is deliberately ambiguous and offers little substance, having failed to advance from funding-led linguistic compliance to a sustainable peace-building methodology.

This work arises from the interest in replacing optical network nodes based mostly on electronics with nodes based on optical technology. Optical technology is expected to enable higher bit rates in the network, greater transparency and greater efficiency through new switching paradigms. Following this vision, the MZI-SOA, a hybridly integrated semiconductor device, was used to implement the optical signal-processing functionalities required in next-generation optical network nodes. New optical networks use advanced modulation formats with phase management, so the impact of these formats on the performance of the MZI-SOA in wavelength and format conversion was studied experimentally and by simulation, under various operating conditions, and rules for optimal operation were derived. The impact of the signal pulse shape on device performance was also studied. The MZI-SOA was then used to implement time-domain functionalities at the bit and packet level. The operation of a wavelength-division-multiplexing to optical-time-division-multiplexing converter was investigated experimentally and by simulation, and that of a packet compressor and decompressor by simulation. For the latter, operation was investigated with the MZI-SOA based on semiconductor optical amplifiers with quantum-well and quantum-dot geometries. A time-slot interchanger was also demonstrated experimentally, exploiting the MZI-SOA as a wavelength converter and using a bank of optical delay lines to introduce a selectable delay in the signal. Finally, the impact of crosstalk in optical networks was studied analytically, experimentally and by simulation in several situations. An analytical performance-estimation model was extended to handle signals that are both distorted and affected by crosstalk. The case of strongly filtered signals affected by crosstalk was studied, and it was shown that, to determine the resulting penalties correctly, both effects must be considered simultaneously rather than separately. The crosstalk-limited scalability of a time-slot interchanger based on the MZI-SOA operating as a space switch was studied. It was also shown that signals strongly affected by nonlinearities can cause higher crosstalk penalties than signals unaffected by nonlinearities. This work demonstrated that the MZI-SOA enables the construction of several relevant optical circuits, serving as a fundamental building block, and its performance was analysed from the component level up to the system level. Taking into account the advantages and disadvantages of the MZI-SOA and recent developments in other technologies, research topics were suggested with the aim of evolving towards next-generation optical networks.
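As an illustration of how crosstalk penalties of the kind studied here are commonly estimated, the sketch below uses the standard worst-case eye-closure approximation for a single coherent interferometric crosstalk term — a textbook first-order formula, not the extended analytical model developed in this work:

```python
import math

def crosstalk_penalty_db(eps):
    """Worst-case eye-closure power penalty (dB) for one coherent
    interferometric crosstalk term with crosstalk-to-signal power
    ratio eps: the interfering field (amplitude sqrt(eps)) beats
    against the signal and closes the eye by about 2*sqrt(eps)."""
    if eps < 0.0 or 2.0 * math.sqrt(eps) >= 1.0:
        raise ValueError("requires 0 <= eps < 0.25")
    return -10.0 * math.log10(1.0 - 2.0 * math.sqrt(eps))

# -30 dB of crosstalk already costs roughly 0.28 dB of margin,
# which is why crosstalk limits the scalability of switch fabrics.
penalty = crosstalk_penalty_db(10.0 ** (-30.0 / 10.0))
```

Because the penalty scales with the crosstalk field amplitude rather than its power, even crosstalk tens of dB below the signal accumulates quickly across cascaded switching stages.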

In Portugal, it was estimated that around 1.95 Mton/year of wood is used in residential wood burning for heating and cooking. In addition, burnt forest area has been increasing over the last decades. These combustion processes emit high levels of toxic air pollutants, strongly perturb atmospheric chemistry, interfere with climate and have adverse effects on health. Accurate quantification of the amounts of trace gases and particulate matter emitted from residential wood burning, agriculture and garden waste burning and forest fires on a regional and global basis is essential for various purposes, including the investigation of several atmospheric processes, the reporting of greenhouse gas emissions, and the quantification of the air pollution sources that affect human health at regional scales. In Southern Europe, detailed emission factor data for biomass burning are virtually non-existent, so emission inventories, source apportionment studies, and photochemical and climate-change models use default values obtained for US and Northern European biofuels. It is therefore desirable to use more specific, locally available data. The objective of this study is to characterise and quantify the contribution of biomass combustion sources to atmospheric trace gas and aerosol concentrations in a way that is more representative of the national reality. Laboratory (residential wood combustion) and field (agriculture/garden waste burning and experimental wildland fires) sampling experiments were carried out. In the laboratory, after selecting the wood species and combustion equipment most representative of Portugal, a sampling programme was set up to determine gaseous and particulate matter emission rates, including the organic and inorganic aerosol composition. In the field, the smoke plumes from agriculture/garden waste burning and experimental wildland fires were sampled.

The results of this study show that the combustion equipment and biofuel type used play an important role in the emission levels and composition. Significant differences between traditional and modern combustion equipment were also observed. These differences are due to the higher combustion efficiency of modern equipment, reflected in the smaller amounts of particulate matter, organic carbon and carbon monoxide released. With regard to the experimental wildland fires in shrub-dominated areas, the largest organic fraction in the samples studied was mainly composed of vegetation pyrolysis products. The major organic components in the smoke samples were pyrolysates of vegetation cuticles, mainly comprising steradienes and sterol derivatives, carbohydrates from the breakdown of cellulose, aliphatic lipids from vegetation waxes and methoxyphenols from the thermal degradation of lignin. Despite being a banned practice in Portugal, agriculture/garden waste burning is actually quite common. To assess the particulate matter composition, the smoke from three different agriculture/garden residues was sampled into three size fractions (PM2.5, PM2.5-10 and PM>10). Although the distribution patterns of organic compounds in particulate matter varied among residues, the amounts of phenolics (polyphenol and guaiacyl derivatives) and organic acids were always predominant over other organic compounds in the organosoluble fraction of the smoke. Among the biomarkers, levoglucosan, β-sitosterol and phytol were detected in appreciable amounts in the smoke of all agriculture/garden residues. In addition, inositol may be considered a possible tracer for the smoke from potato haulm burning. It was shown that the prevailing ambient conditions (such as high atmospheric humidity) likely contributed to atmospheric processes (e.g. coagulation and hygroscopic growth) that influenced the particle size characteristics of the smoke tracers, shifting their distribution to larger diameters. An assessment of household biomass consumption was also made through a national-scale survey. The information obtained from the survey, combined with the emission factor databases from the laboratory and field tests, allowed us to estimate the pollutant amounts emitted in each Portuguese district. In addition to a likely contribution to the improvement of emission inventories, the emission factors obtained for tracer compounds in this study can be applied in receptor models to assess the contribution of biomass burning to the levels of atmospheric aerosols and their constituents obtained in monitoring campaigns in Mediterranean Europe.
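District-level estimates of the kind described above follow the usual bottom-up pattern: emissions = activity data × emission factor. A minimal sketch, with made-up activity and emission-factor values rather than the study's measured ones:

```python
def estimate_emissions(biomass_burned_kg, emission_factors_g_per_kg):
    """Bottom-up emission estimate: for each pollutant, mass emitted (kg)
    = biomass burned (kg) * emission factor (g pollutant per kg fuel),
    converted from grams to kilograms."""
    return {pollutant: biomass_burned_kg * ef / 1000.0
            for pollutant, ef in emission_factors_g_per_kg.items()}

# Hypothetical district burning 1,000 t of wood per year, with assumed
# (not measured) emission factors in g/kg of dry fuel
emissions = estimate_emissions(1_000_000, {"PM2.5": 10.0, "CO": 80.0})
# emissions == {"PM2.5": 10000.0, "CO": 80000.0}  (kg/year)
```

This is why locally measured emission factors matter: in the multiplication above, an emission factor borrowed from US or Northern European biofuels propagates directly into the inventory total.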

It has always been in the interest of credit institutions to determine the default risk associated with a firm in order to assess its profile. This information, however, is useful to all of a firm's stakeholders, since they too commit a part of themselves when interacting with it. The rise in the number of insolvencies in recent years has reaffirmed the need to broaden and deepen research on financial distress. Identifying the factors that influence asset pricing has always been of interest to all stakeholders, so that they can anticipate variations in returns and act accordingly. This dissertation studies the influence of default risk on equity returns, using as the default-risk indicator the default probability obtained from Merton's (1974) option-based model. The analysis covers the period from February 2002 to December 2011, using data on Portuguese, Spanish and Greek firms. The results show a negative relationship between default risk and equity returns, driven by a momentum effect and by volatility. They also show that size and book-to-market do not proxy for default risk in the sample used here, contrary to the claims of Fama & French (1992; 1996).
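The default probability used as the risk indicator comes from Merton's (1974) option-theoretic model, in which equity is a call option on the firm's assets and default occurs if the asset value falls below the face value of debt at the horizon. A minimal sketch, taking the asset value and volatility as given (in practice they must be backed out from equity data); all numbers are illustrative:

```python
from math import log, sqrt
from statistics import NormalDist

def merton_default_probability(V, F, mu, sigma, T=1.0):
    """Physical default probability under Merton (1974):
    PD = N(-d2),  d2 = [ln(V/F) + (mu - sigma^2/2) * T] / (sigma * sqrt(T))
    V: asset value, F: face value of debt, mu: asset drift,
    sigma: asset volatility, T: horizon in years."""
    d2 = (log(V / F) + (mu - 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    return NormalDist().cdf(-d2)

# Assets 150, debt 100, 5% drift, 25% asset volatility, 1-year horizon
pd_1y = merton_default_probability(150.0, 100.0, 0.05, 0.25)  # about 4.5%
```

The quantity `d2` is the distance to default in standard deviations, so ranking firms by this PD is equivalent to ranking them by how many volatility units their assets sit above the default barrier.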

At the National Monument to the Overseas Combatants (Monumento Nacional aos Combatentes do Ultramar) in Belém, the names of the soldiers killed in that thirteen-year conflict are laid out by year and in alphabetical order. This statement is the starting point for an artistic project that, while not physically built from documentary sources or artefacts related to the historical facts, will develop on the basis of conceptual premises so as to trigger the sharing of that memory. The artistic project is itself the creation of a new document that looks at the past and seeks to project it into the future from the "PRESENT" moment. In this paper we propose, methodologically, to discuss the construction process of an artistic project which, with the award of the Bolsa Estação Imagem | Mora 2014 prize, will give rise to a public exhibition and the publication of a book, relating it to a set of possibilities that question the potential of artistic creation to contaminate questions of musealisation, so as to contribute to the emergence of new approaches and narratives in the practice of materialising exhibitions as a medium and place of artistic creation. Through the processual consideration of this project we seek to grasp the meaning of memory in processes of artistic mediation, where images, renouncing the possibility of being simulacrum or phantasmagoria, symbolise each thing and its opposite, approaching non-representation; at this limit, we ask what role the museum plays in these practices of mediation.

This article contends that what appear to be the dystopic conditions of affective capitalism are just as likely to be felt in various joyful encounters as they are in atmospheres of fear associated with post-9/11 securitization. Moreover, rather than grasping these joyful encounters with capitalism as an ideological trick working directly on cognitive systems of belief, they are approached here by way of a repressive affective relation a population establishes between politicized sensory environments and what Deleuze and Guattari (1994) call a brain-becoming-subject. This is a radical relationality (Protevi, 2010), understood in this context as a mostly nonconscious brain-somatic process of subjectification occurring in the contagious sensory environments in which populations become politically situated. The joyful encounter is therefore not merely an ideological manipulation of belief; following Gabriel Tarde (as developed in Sampson, 2012), belief is always the object of desire. The discussion starts by comparing recent efforts by Facebook to manipulate mass emotional contagion to a Huxleyesque control through appeals to joy. Attention is then turned toward further manifestations of affective capitalism, beginning with the so-called emotional turn in the neurosciences, which has greatly influenced marketing strategies intended to unconsciously influence consumer mood (and choice), and ending with a further comparison between encounters with Nazi joy in the 1930s (Protevi, 2010) and the recent spreading of right-wing populism similarly loaded with political affect. Indeed, the dystopian presence of a repressive political affect in all of these examples prompts an initial question concerning what can be done to a brain so that it involuntarily conforms to the joyful encounter. That is to say, what can affect theory say about an apparent brain-somatic vulnerability to affective suggestibility and a tendency toward mass repression?
However, the paper goes on to frame a second (and perhaps more significant) question concerning what can a brain do. Through the work of John Protevi (in Hauptmann and Neidich (eds.), 2010: 168-183), Catherine Malabou (2009) and Christian Borch (2005), the article discusses how affect theory can conceive of a brain-somatic relation to sensory environments that might be freed from its coincidence with capitalism. This second question not only leads to a different kind of illusion to that understood as a product of an ideological trick, but also abnegates a model of the brain which limits subjectivity in the making to a phenomenological inner self or Being in the world.