30 results for Lower level relaxation
in Aston University Research Archive
Abstract:
Firms worldwide are taking major initiatives to reduce the carbon footprint of their supply chains in response to growing governmental and consumer pressures. In real life, these supply chains face stochastic and non-stationary demand, but most studies of the inventory lot-sizing problem with emission concerns consider deterministic demand. In this paper, we study the inventory lot-sizing problem under non-stationary stochastic demand with emission and cycle service level constraints, under a carbon cap-and-trade regulatory mechanism. Using a mixed integer linear programming model, this paper investigates the effects of emission parameters and of product- and system-related features on supply chain performance through extensive computational experiments covering general business settings rather than a specific scenario. Results show that cycle service level and demand coefficient of variation have significant impacts on total cost and emissions irrespective of the level of demand variability, while the impact of the product's demand pattern is significant only at lower levels of demand variability. Finally, results also show that an increasing carbon price reduces total cost, total emissions and total inventory, and that the scope for emission reduction through a higher carbon price is greater at higher levels of cycle service level and demand coefficient of variation. The analysis of these results helps supply chain managers take the right decisions in different demand and service-level situations.
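For readers unfamiliar with this class of model, the sketch below shows the general shape of a single-item lot-sizing MILP with a cap-and-trade emission term, written in Python with PuLP. All data, cost and emission parameters are invented for illustration, and the paper's stochastic demand and cycle service level constraints are not reproduced; this is a minimal deterministic analogue, not the authors' model.

# Minimal lot-sizing MILP with a cap-and-trade term (illustrative only).
import pulp

T = 4                                  # hypothetical planning horizon
demand = [80, 120, 60, 100]            # hypothetical per-period demand
setup_cost, hold_cost = 500.0, 2.0     # hypothetical cost parameters
setup_em, unit_em, hold_em = 40.0, 0.5, 0.1   # hypothetical emission factors
cap, carbon_price = 300.0, 3.0         # emission cap and carbon price
M = sum(demand)                        # big-M linking production to setups

m = pulp.LpProblem("lot_sizing_cap_and_trade", pulp.LpMinimize)
x = [pulp.LpVariable(f"x{t}", lowBound=0) for t in range(T)]    # production
s = [pulp.LpVariable(f"s{t}", lowBound=0) for t in range(T)]    # inventory
y = [pulp.LpVariable(f"y{t}", cat="Binary") for t in range(T)]  # setups
e_buy = pulp.LpVariable("e_buy", lowBound=0)    # carbon credits bought
e_sell = pulp.LpVariable("e_sell", lowBound=0)  # carbon credits sold

for t in range(T):
    prev = s[t - 1] if t > 0 else 0
    m += prev + x[t] - demand[t] == s[t]        # inventory balance
    m += x[t] <= M * y[t]                       # produce only after a setup

total_em = pulp.lpSum(setup_em * y[t] + unit_em * x[t] + hold_em * s[t]
                      for t in range(T))
m += total_em <= cap + e_buy - e_sell           # cap-and-trade balance
m += pulp.lpSum(setup_cost * y[t] + hold_cost * s[t] for t in range(T)) \
     + carbon_price * (e_buy - e_sell)          # objective: cost + net credits

m.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.LpStatus[m.status], pulp.value(m.objective))

Raising carbon_price in such a sketch shifts the optimal plan toward fewer setups and less inventory, which mirrors the direction of the effects the abstract reports.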
Abstract:
The main argument of this paper is that Natural Language Processing (NLP) does, and will continue to, underlie the Semantic Web (SW), including its initial construction from unstructured sources like the World Wide Web (WWW), whether its advocates realise this or not. Chiefly, we argue, such NLP activity is the only way up to a defensible notion of meaning at conceptual levels (in the original SW diagram) based on lower level empirical computations over usage. Our aim is definitely not to claim logic-bad, NLP-good in any simple-minded way, but to argue that the SW will be a fascinating interaction of these two methodologies, again like the WWW (which has been basically a field for statistical NLP research) but with deeper content. Only NLP technologies (and chiefly information extraction) will be able to provide the requisite RDF knowledge stores for the SW from existing unstructured text databases in the WWW, and in the vast quantities needed. There is no alternative at this point, since a wholly or mostly hand-crafted SW is also unthinkable, as is a SW built from scratch and without reference to the WWW. We also assume that, whatever the limitations on current SW representational power we have drawn attention to here, the SW will continue to grow in a distributed manner so as to serve the needs of scientists, even if it is not perfect. The WWW has already shown how an imperfect artefact can become indispensable.
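As a toy illustration of the extraction-to-RDF pipeline this argument rests on, the Python sketch below uses a crude regular-expression "extractor" and rdflib to emit triples. The pattern and the example.org vocabulary are invented for the example; real information extraction systems are of course far more sophisticated.

# Toy information extraction emitting RDF triples (illustrative only).
import re
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/")   # hypothetical vocabulary
text = "Aston University is located in Birmingham. Birmingham is a city."

g = Graph()
for subj, pred, obj in re.findall(
        r"(\w[\w ]*?) (is located in|is a) (\w[\w ]*?)\.", text):
    prop = EX["locatedIn" if "located" in pred else "type"]
    g.add((EX[subj.replace(" ", "_")], prop, EX[obj.replace(" ", "_")]))

print(g.serialize(format="turtle"))     # two triples in Turtle syntax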
Abstract:
In data visualization, characterizing local geometric properties of non-linear projection manifolds provides the user with valuable additional information that can influence further steps in the data analysis. We take advantage of the smooth character of the GTM projection manifold and analytically calculate its local directional curvatures. Curvature plots are useful for detecting regions where geometry is distorted, for changing the amount of regularization in non-linear projection manifolds, and for choosing regions of interest when constructing detailed lower-level visualization plots.
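For reference, the standard differential-geometric construction behind such curvatures can be sketched as follows; this is a generic formulation consistent with the abstract, not necessarily the paper's exact derivation. For a latent point x and a unit direction h, consider the curve traced on the manifold:

\[
\gamma(t) = y(x + t\,h), \qquad
\dot\gamma(0) = J\,h, \qquad
\ddot\gamma(0) = \sum_{i,j} h_i h_j\, \frac{\partial^2 y}{\partial x_i\, \partial x_j},
\]
\[
k(x; h) = \frac{\bigl\|(I - P_T)\,\ddot\gamma(0)\bigr\|}{\|\dot\gamma(0)\|^{2}},
\qquad P_T = J\,(J^\top J)^{-1} J^\top,
\]

where J is the Jacobian of the mapping y at x and P_T projects onto the manifold's tangent space. Because GTM uses y(x) = W\phi(x) with fixed basis functions \phi, both J and the second derivatives are available in closed form, which is what makes the curvature analytically computable.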
Abstract:
Reviews of the dyslexia literature often seem to suggest that children with dyslexia perform at a lower level on almost any task. Richards et al. (Dyslexia 2002; 8: 1-8) note the importance of being able to demonstrate dissociations between tasks. However, increasingly elegant experiments, in which dissociations are found, almost inevitably find that the performance of children with dyslexia is lower as tasks become more difficult! By looking for deficits in dyslexia, could we be barking up the wrong tree? A methodological approach for circumventing this potential problem is discussed. Copyright © 2004 John Wiley & Sons, Ltd.
Abstract:
A key objective of autonomic computing is to reduce the cost and expertise required for the management of complex IT systems. As a growing number of these systems are implemented as hierarchies or federations of lower-level systems, techniques that support the development of autonomic systems of systems are required. This article introduces one such technique, which involves the run-time synthesis of autonomic system connectors. These connectors are specified by means of a new type of autonomic computing policy termed a resource definition policy, and enable the dynamic realisation of collections of collaborating autonomic systems, as envisaged by the original vision of autonomic computing. We propose a framework for the formal specification of autonomic computing policies, and use it to define the new policy type and to describe its application to the development of autonomic systems of systems. To validate the approach, we present a sample data-centre application that was built using connectors synthesised from resource definition policies.
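As a rough illustration of the idea (not the article's formal policy notation), the Python sketch below shows a toy "resource definition policy" driving the synthesis of a connector whose operations depend on the caller's role; all names and fields are invented.

# Toy resource definition policy and connector synthesis (illustrative).
from dataclasses import dataclass

@dataclass
class ResourceDefinitionPolicy:
    """Describes a resource a lower-level system exposes to its peers."""
    resource: str          # e.g. "cpu_allocation"
    readable_by: set       # roles allowed to observe the resource
    writable_by: set       # roles allowed to modify the resource

def synthesise_connector(policy, role):
    """Build a connector exposing only the operations the policy permits."""
    ops = {}
    if role in policy.readable_by:
        ops["get"] = lambda state: state.get(policy.resource)
    if role in policy.writable_by:
        ops["set"] = lambda state, v: state.__setitem__(policy.resource, v)
    return ops

# Usage: a data-centre manager may read and write; a monitor may only read.
policy = ResourceDefinitionPolicy("cpu_allocation",
                                  {"manager", "monitor"}, {"manager"})
state = {"cpu_allocation": 4}
connector = synthesise_connector(policy, "monitor")
print(sorted(connector))        # ['get'] -- a read-only connector
print(connector["get"](state))  # 4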
Abstract:
Simplification of texts has traditionally been carried out by replacing words and structures with appropriate semantic equivalents in the learner's interlanguage, omitting whichever items prove intractable, and thereby bringing the language of the original within the scope of the learner's transitional linguistic competence. This kind of simplification focuses mainly on the formal features of language. The simplifier can, on the other hand, concentrate on making explicit the propositional content of the original and its presentation, in order to bring what is communicated in the original within the scope of the learner's transitional communicative competence. In this case, simplification focuses on the communicative function of the language. Up to now, however, approaches to the problem of simplification have been mainly concerned with the first kind, using the simplifier's intuition as to what constitutes difficulty for the learner; there appear to be few objective principles underlying this process. The main aim of this study is to investigate the effect of simplification on the communicative aspects of narrative texts, including the manner in which narrative units at higher levels of organisation are structured and presented, as well as the temporal and logical relationships between lower-level structures such as sentences and clauses. The intention is to establish an objective approach to simplification based on a set of principled procedures which could serve as a guideline in the simplification of material for foreign students at an advanced level.
Abstract:
Software development methodologies are becoming increasingly abstract, progressing from low-level assembly and implementation languages such as C and Ada, to component-based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model-driven approaches emphasise the role of higher-level models and notations, and embody a process of automatically deriving lower-level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data-driven application systems. The combination of empowered data formats and high-level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low-level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component-based software development and game development technologies in order to define novel component technologies for the description of data-driven component-based applications. The thesis makes explicit contributions to the fields of component-based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. The thesis also proposes a number of developments in dynamic data-driven application systems in order to further empower the role of data in this field.
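The data-driven, component-based pattern described here can be made concrete with a short sketch: an XML content description drives which components are instantiated and how they behave. The schema and component names below are invented for illustration and are not Fluid's actual formats.

# Content data driving component instantiation (illustrative only).
import xml.etree.ElementTree as ET

CONTENT = """
<scene>
  <component type="sprite" x="10" y="20"/>
  <component type="timer" period="0.5"/>
</scene>
"""

# Registry mapping component type names (from data) to constructors (code).
REGISTRY = {
    "sprite": lambda attrs: f"Sprite at ({attrs['x']}, {attrs['y']})",
    "timer":  lambda attrs: f"Timer firing every {attrs['period']}s",
}

def build_scene(xml_text):
    """Instantiate components named by the content data, not by code."""
    root = ET.fromstring(xml_text)
    return [REGISTRY[c.get("type")](c.attrib)
            for c in root.findall("component")]

for component in build_scene(CONTENT):
    print(component)

Changing the XML changes the application's runtime behaviour without recompiling any code, which is the sense in which the data format is "empowered".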
Abstract:
The human NT2.D1 cell line was differentiated to form both a 1:2 co-culture of post-mitotic NT2 neuronal and NT2 astrocytic (NT2.N/A) cells and a pure NT2.N culture. The respective sensitivities to several test chemicals of the NT2.N/A, the NT2.N, and the NT2.D1 cells were evaluated and compared with the CCF-STTG1 astrocytoma cell line, using a combination of basal cytotoxicity and biochemical endpoints. Using the MTT assay, the basal cytotoxicity data estimated the comparative toxicities of the test chemicals (the chronic neurotoxin 2,5-hexanedione, the cytotoxins 2,3- and 3,4-hexanedione, and the acute neurotoxins tributyltin- and trimethyltin-chloride) and also provided the non-cytotoxic concentration range for each compound. Biochemical endpoints examined over the non-cytotoxic range included assays for ATP levels, oxidative status (H2O2 and GSH levels) and caspase-3 levels as an indicator of apoptosis. Although the endpoints did not demonstrate the known neurotoxicants to be consistently more toxic to the cell systems with the greatest number of neuronal properties, the NT2 astrocytes appeared to contribute positively to NT2 neuronal health following exposure to all the test chemicals. The NT2.N/A co-culture generally maintained superior ATP and GSH levels and reduced H2O2 levels in comparison with the NT2.N mono-culture. In addition, the pure NT2.N culture showed a significantly lower level of caspase-3 activation compared with the co-culture, suggesting NT2 astrocytes may be important in modulating the mode of cell death following toxic insult. Overall, these studies provide evidence that an in vitro integrated population of post-mitotic human neurons and astrocytes may offer significant relevance to the human in vivo heterogeneous nervous system, when initially screening compounds for acute neurotoxic potential.
Abstract:
In the last few decades, the world has witnessed enormous growth in the volume of foreign direct investment (FDI). The global stock of FDI reached US$7.5 trillion in 2003 and accounted for 11% of world Gross Domestic Product, up from 7% in 1990. The sales of multinational enterprises, at around US$19 trillion, were more than double the level of world exports. Substantial FDI inflows went into transition countries. Inflows into one of the region's largest recipients, the Russian Federation, almost doubled, enabling Russia to become one of the five top FDI destinations in 2005-2006. FDI inflows into Russia increased almost threefold, from 13.6% in 2003 to 35% in 2007. In 2003, these flows were twice as large as those into China, whilst in 2007 they were six times larger. Russia's FDI inflows were also about 2.5 times greater than those of Brazil. Many economists argue that efficient government institutions foster FDI and, as a result, growth. However, the magnitude of this effect has yet to be measured. This thesis takes a Political Economy approach to explore, empirically, the potential impact of malfunctioning governmental institutions, proxied by three indices of perceived corruption, on FDI stock accumulation and distribution within Russia over the period 2002-2004. Using a regional data-set, it concentrates on three areas relating to FDI. Firstly, it considers the significance, size and sign of the impact of perceived corruption on the accumulation of FDI stocks within Russia. Secondly, it quantifies the impact of perceived corruption on the volume of FDI stocks, simultaneously estimating the impact of investment in public capital, such as telecommunications and transportation networks, on FDI in the presence of corruption. In particular, it addresses the question of whether the more corrupt regions in Russia are also those that have accumulated more FDI stocks, and investigates whether those 'more corrupt' regions have had lower levels of public capital investment. Finally, it examines whether decentralisation increases or decreases corruption, and whether a larger extent of decentralisation has a positive or negative impact on FDI stocks. The results of the three studies are as follows. Firstly, along with market potential, corruption is found to be one of the key factors explaining FDI distribution within Russia between 2002 and 2004. Secondly, corruption is on average found to be positively related to FDI, suggesting that it may act as 'speed money': to save time, foreign direct investors may be willing to bribe regional authorities so as to move to the front of bureaucratic queues. Thirdly, although the impact of corruption on unobservable FDI is found to be on average positive when corruption is controlled for, no association between FDI and public investment is observed, with the sole exception of transportation infrastructure (i.e., railways). The results might therefore suggest not only that regions with high levels of perceived corruption attract more FDI, but also that expansions in public capital investment are not accompanied by an increase in the volume of FDI stocks in regions with high levels of corruption. This casts some doubt on the productivity of public capital investment in these regions, as bureaucrats may prefer to use these infrastructural projects for rent extraction.
Finally, we find decentralisation to have a significant and positive impact on both FDI stock accumulation and corruption, suggesting that local governments may spend more on public goods to make their area more attractive to foreign investors while at the same time being interested in extracting rents from those investors. These results support the idea that the regulation of FDI is associated with, and facilitated by, a larger public sector, which distorts competition and introduces opportunities for rent-seeking by particular economic and political actors.
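A stylised version of the kind of regional regression such a thesis estimates might look as follows; the variables and functional form here are illustrative, not the thesis's exact specification:

\[
\ln \mathit{FDI}_{it} = \beta_0
  + \beta_1\,\mathit{Corruption}_{it}
  + \beta_2\,\mathit{PublicCapital}_{it}
  + \beta_3\,\mathit{MarketPotential}_{it}
  + \gamma^{\top} X_{it} + \varepsilon_{it},
\]

where i indexes Russian regions, t runs over 2002-2004, and X_{it} collects further controls; a positive estimate of \beta_1 corresponds to the "speed money" reading reported above.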
Abstract:
An initial aim of this project was to evaluate the conventional techniques used in the analysis of newly prepared environmentally friendly water-borne automotive coatings and compare them with solvent-borne coatings having comparable formulations. The investigation was carried out on microtomed layers as well as on complete automotive multi-layer paint systems. Methods used included the very traditional methods of gloss and hardness and the commonly used photo-oxidation index (from FTIR spectral analysis). All methods enabled the durability to weathering of the automotive coatings to be initially investigated. However, a primary aim of this work was to develop methods for analysing the early stages of chemical and property changes in both the solvent-borne and water-borne coating systems that take place during outdoor natural weathering exposures and under accelerated artificial exposures. This was achieved by using dynamic mechanical analysis (DMA), in both tension mode on the microtomed films (at all depths of the coating systems, from the uppermost clear-coat right down to the electro-coat) and bending mode on the full (unmicrotomed) systems, as well as MALDI-ToF analysis of the movement of the stabilisers in the full systems. Changes in glass transition temperature and relative cross-link density were determined after weathering, and these were related to changes in the chemistries of the binder systems of the coatings after weathering. Concentration profiles of the UV stabilisers (UVA and HALS) in the coating systems, arising from migration, were analysed in separate microtomed layers of the paint samples (depth profiling) after weathering, and diffusion coefficients and solubility parameters were determined for the UV stabilisers in the coating systems. The methods developed were used to determine the various physical and chemical changes that take place during weathering (photo-oxidation) of the different (water-borne and solvent-borne) systems. The solvent-borne formulations showed fewer changes after weathering (both natural and accelerated) than the corresponding water-borne formulations, due to the lower level of cross-links in the binders of the water-borne systems. The silver systems examined were more durable than the blue systems, due to the reflecting power of the aluminium and the lower temperature of the silver coatings.
Abstract:
The Irish have been relentlessly racialized in their diaspora settings, yet little historical work engages with "race" to understand Irish history on the island of Ireland. This article provides an interpretation of two key periods of Irish history—the second half of the sixteenth century and the period since 1996—through the lens of racialization. I argue that Ireland's history is exceptional in its capacity to reveal key elements of the history of the development of race as an idea and a set of practices. The English colonization of Ireland was underpinned by a form of racism reliant on linking bodies to unchanging hierarchically stacked cultures, without reference to physical differences. For example, the putative unproductiveness of the Gaelic Irish not only placed them at a lower level of civilization than the industrious English but also authorized increasingly draconian ways of dealing with the Irish populace. The period since 1996, during which Ireland has become a country of immigration, illustrates how racism has undergone a transformation into the object of official state policies to eliminate it. Yet it flourishes as part of a globalized set of power relations that has brought immigrants to the developing Irish economy. In response to immigration the state simultaneously exerts neoliberal controls and reduces pathways to citizenship through residence while passing antiracism legislation. Today, the indigenous nomadic Travellers and asylum seekers are the ones seen as pathologically unproductive. Irish history thus demonstrates that race is not only about color but also very much about culture. It also illustrates notable elements of the West's journey from racism without race to racism without racists.
Abstract:
By applying regulatory focus theory, this paper investigates the impact of both initial confidence and the exactness of growth expectations on the subsequent financial performance of small firms. Drawing on a unique data set based on a repeated survey design, we make one of the first attempts to explore the complexity of this relationship empirically. Overall, the findings suggest that, controlling for other relevant factors including actual growth, entrepreneurs with higher growth expectations subsequently perform significantly better in terms of profitability. In addition, education has a strong modifying effect: the impact of high growth expectations on subsequent profit performance is stronger for entrepreneurs with lower levels of education.
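Read as a moderated regression, the reported modifying effect of education corresponds to an interaction term; the stylised specification below is illustrative, not the paper's exact model:

\[
\mathit{Profit}_{i,t+1} = \beta_0
  + \beta_1\,\mathit{Expect}_{it}
  + \beta_2\,\mathit{Educ}_{i}
  + \beta_3\,(\mathit{Expect}_{it} \times \mathit{Educ}_{i})
  + \gamma^{\top} X_{it} + \varepsilon_{it},
\]

with \beta_3 < 0 capturing the finding that high growth expectations raise subsequent profitability more strongly for entrepreneurs with lower levels of education.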
Abstract:
Astrocytes are essential for neuronal function and survival, so both cell types were included in a human neurotoxicity test-system to assess the protective effects of astrocytes on neurons, compared with a culture of neurons alone. The human NT2.D1 cell line was differentiated to form either a co-culture of post-mitotic NT2.N neuronal (TUJ1, NF68 and NSE positive) and NT2.A astrocytic (GFAP positive) cells (∼2:1 NT2.A:NT2.N), or an NT2.N mono-culture. Cultures were exposed to human toxins for 4 h at sub-cytotoxic concentrations, in order to compare levels of compromised cell function and thus evidence of an astrocytic protective effect. Functional endpoints examined included assays for cellular energy (ATP) and glutathione (GSH) levels, generation of hydrogen peroxide (H2O2) and caspase-3 activation. Generally, the NT2.N/A co-culture was more resistant to toxicity, maintaining superior ATP and GSH levels and sustaining smaller significant increases in H2O2 levels compared with neurons alone. However, the pure neuronal culture showed a significantly lower level of caspase activation. These data suggest that besides their support for neurons through maintenance of ATP and GSH and control of H2O2 levels, following exposure to some substances, astrocytes may promote an apoptotic mode of cell death. Thus, it appears the use of astrocytes in an in vitro predictive neurotoxicity test-system may be more relevant to human CNS structure and function than neuronal cells alone. © 2007 Elsevier Ltd. All rights reserved.
Abstract:
Metallocene ethylene-1-octene copolymers having different densities and comonomer contents ranging from 11 to 36 wt% (m-LLDPE), and a Ziegler copolymer (z-LLDPE) containing the same level of short-chain branching (SCB) as one of the m-LLDPE polymers, were subjected to extrusion. The effects of temperature (210-285 °C) and multi-pass extrusion (up to five passes) on the rheological and structural characteristics of these polymers were investigated using melt index and capillary rheometry, along with spectroscopic characterisation of the evolution of various products by FTIR, ¹³C-NMR and colour measurements. The aim is to develop a better understanding of the effects of processing variables on the structure and thermal degradation of these polymers. Results from rheology show that both extrusion temperature and the amount of comonomer have a significant influence on the thermo-oxidative behaviour of the polymer melt. At low to intermediate processing temperatures, all m-LLDPE polymers exhibited similar behaviour, with crosslinking reactions dominating their thermal oxidation. By contrast, at higher processing temperatures, the behaviour of the metallocene polymers changed depending on the level of comonomer content: higher SCB gave rise to predominantly chain-scission reactions, whereas polymers with lower levels of SCB continued to be dominated by crosslinking. This temperature dependence was attributed to differences in the evolution of carbonyl and unsaturated compounds, including vinyl, vinylidene and trans-vinylene groups. © 2007 Elsevier Ltd. All rights reserved.
Abstract:
Current tools for assessing risks associated with mental-health problems require assessors to make high-level judgements based on clinical experience. This paper describes how new technologies can enhance qualitative research methods to identify lower-level cues underlying these judgements, which can be collected by people without a specialist mental-health background. Content analysis of interviews with 46 multidisciplinary mental-health experts exposed the cues and their interrelationships, which were represented by a mind map using software that stores maps as XML. All 46 mind maps were integrated into a single XML knowledge structure and analysed by a Lisp program to generate quantitative information about the numbers of experts associated with each part of it. The knowledge was refined by the experts, using software developed in Flash to record their collective views within the XML itself. These views specified how the XML should be transformed by XSLT, a technology for rendering XML, which resulted in a validated hierarchical knowledge structure associating patient cues with risks. Changing knowledge elicitation requirements were accommodated by flexible transformations of XML data using XSLT, which also facilitated generation of multiple data-gathering tools suiting different assessment circumstances and levels of mental-health knowledge. © 2007 Informa UK Ltd. All rights reserved.
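As a minimal illustration of the XML-plus-XSLT pattern the paper describes, the Python sketch below uses lxml to transform a toy knowledge structure into a checklist of cues endorsed by at least five experts. The element and attribute names are invented, and the study's real knowledge structure is far richer.

# Toy XML knowledge structure rendered via XSLT (illustrative only).
from lxml import etree

knowledge = etree.XML(
    "<cues>"
    "<cue name='social withdrawal' experts='12'/>"
    "<cue name='sleep disturbance' experts='3'/>"
    "</cues>"
)

# XSLT that renders only cues endorsed by at least five experts.
xslt = etree.XML("""
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/cues">
    <checklist>
      <xsl:for-each select="cue[@experts &gt;= 5]">
        <item><xsl:value-of select="@name"/></item>
      </xsl:for-each>
    </checklist>
  </xsl:template>
</xsl:stylesheet>
""")

transform = etree.XSLT(xslt)
print(str(transform(knowledge)))  # a <checklist> containing 'social withdrawal'

Swapping in a different stylesheet yields a different data-gathering view of the same knowledge structure, which is the flexibility the abstract attributes to XSLT.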