928 results for 3D QSAR, heat of formation, LUMO, antibacterial agent, aryloxazolidinone


Relevance:

100.00%

Publisher:

Abstract:

In recent years there has been growing interest in the use of dimethyl ether (DME) as an alternative fuel. In this study, the adsorption of DME on molecular sieves 4Å (Mol4A) and 5Å (Mol5A) was investigated experimentally using the volumetric adsorption method. Data on the adsorption isotherms, heats of adsorption, and adsorption kinetics were obtained and used to draw conclusions and compare the performance of the two adsorbents. Within the conditions considered, the adsorption capacity of Mol5A was found to be around eight times higher than that of Mol4A. Low-temperature adsorption and thermal pre-treatment of the adsorbents under vacuum were observed to favour increased adsorption capacity. The adsorption isotherms for both adsorbents were fitted to the Freundlich model and the corresponding model parameters are proposed. The kinetic analysis suggests that DME adsorption on Mol5A is controlled by intracrystalline diffusion resistance, while on Mol4A it is mainly controlled by surface-layering resistance, with diffusion taking place only at the start of adsorption and for a very short time. The heats of adsorption were calculated by a calorimetric method based on direct temperature measurements inside the adsorption cell. Isosteric heats calculated by the thermodynamic approach (Clausius-Clapeyron equation) were consistently lower. The maximum heat of adsorption was found to be 25.9 kJ mol−1 and 20.1 kJ mol−1 on Mol4A and Mol5A, respectively, indicating a physisorption type of interaction. © 2014 Elsevier B.V.
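As a minimal sketch of the isotherm-fitting step, the snippet below fits the two-parameter Freundlich model, q = K_F·p^(1/n), to invented equilibrium points; the data values and units are placeholders, not the study's measurements.

```python
# Minimal sketch: fitting the Freundlich isotherm q = K_F * p**(1/n)
# to hypothetical equilibrium data (NOT the study's measurements).
import numpy as np
from scipy.optimize import curve_fit

def freundlich(p, K_F, n):
    """Freundlich isotherm: adsorbed amount q as a function of pressure p."""
    return K_F * p ** (1.0 / n)

# Hypothetical equilibrium points: pressure (kPa) and loading (mmol/g)
p = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
q = np.array([0.42, 0.58, 0.80, 1.10, 1.52])

(K_F, n), _ = curve_fit(freundlich, p, q, p0=(0.2, 2.0))
print(f"K_F = {K_F:.3f}, n = {n:.2f}")
```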

Relevance:

100.00%

Publisher:

Abstract:

Pre-eclampsia, a pregnancy-specific multi-organ syndrome characterized by widespread endothelial damage, is a new risk factor for cardiovascular disease. No therapies exist to prevent or treat this condition, even to achieve a modest improvement in pregnancy length or birth weight. Co-administration of soluble VEGFR-1 [VEGF (vascular endothelial growth factor) receptor-1; more commonly known as sFlt-1 (soluble Fms-like tyrosine kinase-1)] and sEng (soluble endoglin) to pregnant rats elicits severe pre-eclampsia-like symptoms. These two anti-angiogenic factors are increased dramatically prior to the clinical onset of pre-eclampsia and are quite possibly the 'final common pathway' responsible for the accompanying signs of hypertension and proteinuria as they can be reversed by VEGF administration in animal models. HO-1 (haem oxygenase-1), an anti-inflammatory enzyme, and its metabolite, CO (carbon monoxide), exert protective effects in several organs against oxidative stimuli. In a landmark publication, we showed that the HO-1 pathway inhibits sFlt-1 and sEng in cultured cells and human placental tissue explants. Both CO and NO (nitric oxide) promote vascular homoeostasis and vasodilatation, and activation of VEGFR-1 or VEGFR-2 induced eNOS (endothelial nitric oxide synthase) phosphorylation, NO release and HO-1 expression. Our studies established the HO-1/CO pathway as a negative regulator of cytokine-induced sFlt-1 and sEng release and eNOS as a positive regulator of VEGF-mediated vascular morphogenesis. These findings provide compelling evidence for a protective role of HO-1 in pregnancy and identify it as a target for the treatment of pre-eclampsia. Any agent that is known to up-regulate HO-1, such as statins, may have potential as a therapy. Any intervention achieving even a modest prolongation of pregnancy or amelioration of the condition could have a significant beneficial health impact worldwide.

Relevance:

100.00%

Publisher:

Abstract:

A high-surface-area silicon oximide-based gel [SiOC(H)=NSi]m[Si2N-C(H)=O]n[SiN(H)-C(H)=O]p[SiOC(H)=NH]q[SiNH]r[SiNH2]s[SiNMe2]t was prepared via a formamide-based aminolysis of tris(dimethylamino)silylamine, (Me2N)3SiNH2. The structure of the gel and the mechanism of formation are elucidated. Pyrolysis of the gel at 1000 °C under N2 flow gave an amorphous microporous oxynitride-based glass with a BET surface area of 195 m2 g−1. © The Royal Society of Chemistry 2005.

Relevance:

100.00%

Publisher:

Abstract:

Background - MHC Class I molecules present antigenic peptides to cytotoxic T cells, a process that forms an integral part of the adaptive immune response. Peptides are bound within a groove formed by the MHC heavy chain. Previous approaches to MHC Class I-peptide binding prediction have largely concentrated on the peptide anchor residues located at the P2 and C-terminal positions. Results - A large dataset comprising MHC-peptide structural complexes was created by re-modelling pre-determined X-ray crystallographic structures. Static energetic analysis, following energy minimisation, was performed on the dataset in order to characterise the interactions between bound peptides and the MHC Class I molecule, partitioning the interactions within the groove into van der Waals, electrostatic and total non-bonded energy contributions. Conclusion - The QSAR techniques of Genetic Function Approximation (GFA) and Genetic Partial Least Squares (G/PLS) were used to identify key interactions between the two molecules by comparing the calculated energy values with experimentally determined BL50 data. Although the peptide-termini binding interactions help ensure the stability of the MHC Class I-peptide complex, the central region of the peptide is also important in defining the specificity of the interaction. As thermodynamic studies indicate that peptide association and dissociation may be driven entropically, it may be necessary to incorporate entropic contributions into future calculations.
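A simplified sketch of the regression step is given below. scikit-learn's PLSRegression is used as a stand-in for the Genetic Partial Least Squares (G/PLS) algorithm (the genetic variable-selection layer is omitted), and the per-position energy matrix and affinity values are synthetic placeholders.

```python
# Sketch of the regression step: relating per-position interaction-energy
# descriptors to binding data. PLSRegression is a simplified stand-in for
# G/PLS; the energy matrix and mock BL50 values below are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
# 30 peptides x 27 descriptors (vdW, electrostatic and total non-bonded
# contributions for each of 9 peptide positions) -- synthetic values.
X = rng.normal(size=(30, 27))
y = X[:, :3].sum(axis=1) + 0.1 * rng.normal(size=30)  # mock -log(BL50)

pls = PLSRegression(n_components=3)
pls.fit(X, y)
print("R^2 on training data:", pls.score(X, y))
```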

Relevance:

100.00%

Publisher:

Abstract:

Quantitative structure–activity relationship (QSAR) analysis is a cornerstone of modern informatics disciplines. Predictive computational models of peptide-major histocompatibility complex (MHC) binding affinity, based on QSAR technology, have become a vital component of modern computational immunovaccinology. Historically, such approaches were built around semi-qualitative classification methods, but these are now giving way to quantitative regression methods. The additive method, an established immunoinformatics technique for the quantitative prediction of peptide–protein affinity, was used here to identify the sequence dependence of peptide binding specificity for three mouse class I MHC alleles: H2–Db, H2–Kb and H2–Kk. As we show, in terms of reliability the resulting models represent a significant advance on existing methods. They can be used for the accurate prediction of T-cell epitopes and are freely available online (http://www.jenner.ac.uk/MHCPred).
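The core of the additive method, binding affinity modelled as a sum of position-specific amino-acid contributions, can be sketched as a linear regression on one-hot-encoded 9-mers. This is a simplified reconstruction under stated assumptions, not MHCPred's actual implementation; the peptides and affinities below are invented.

```python
# Simplified sketch of the additive method's core idea: affinity as a sum
# of position-specific amino-acid contributions, recovered by regression
# on one-hot-encoded 9-mers. Peptides and affinities are invented.
import numpy as np
from sklearn.linear_model import Ridge

AA = "ACDEFGHIKLMNPQRSTVWY"

def one_hot(peptide):
    """20 indicators per position -> 180-dim vector for a 9-mer."""
    v = np.zeros(len(peptide) * 20)
    for i, aa in enumerate(peptide):
        v[i * 20 + AA.index(aa)] = 1.0
    return v

peptides = ["ASNENMETM", "FAPGNYPAL", "SSYRRPVGI", "KVPRNQDWL"]
pIC50 = np.array([7.2, 6.1, 5.4, 6.8])  # hypothetical affinities

X = np.array([one_hot(p) for p in peptides])
model = Ridge(alpha=1.0).fit(X, pIC50)
# model.coef_ now holds one additive contribution per amino acid/position.
```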

Relevance:

100.00%

Publisher:

Abstract:

The approaches to the analysis of various information resources pertinent to user requirements at a semantic level are determined by the thesauri of the appropriate subject domains. Algorithms for the formation and normalization of a multilingual thesaurus, as well as methods for comparing thesauri, are given.

Relevance:

100.00%

Publisher:

Abstract:

The feasibility of forming donor-acceptor charge-transfer (CT) complexes between melanin and 2,4,7-trinitrofluorenone (TNF), a good electron acceptor, has been studied in solution by means of absorption and photoluminescence (PL) spectroscopy. A model of the electronic transitions in a melanin-TNF composite solution has been proposed. © 2014 Copyright Taylor & Francis Group, LLC.

Relevance:

100.00%

Publisher:

Abstract:

Technology discloses man’s mode of dealing with Nature, the process of production by which he sustains his life, and thereby also lays bare the mode of formation of his social relations, and of the mental conceptions that flow from them (Marx, 1990: 372). My thesis is a sociological analysis of UK policy discourse for educational technology during the last 15 years. My framework is a dialogue between the Marxist-based critical social theory of Lieras and a corpus-based Critical Discourse Analysis (CDA) of UK policy for Technology Enhanced Learning (TEL) in higher education. Embedded in TEL is a presupposition: a deterministic assumption that technology has enhanced learning. This conceals a necessary debate that reminds us it is humans, not technology, that design learning. By omitting people, TEL provides a vehicle for strongly hierarchical or neoliberal agendas to make simplified political claims in the name of technology. My research has two main aims. Firstly, I share a replicable, mixed methodological approach for linguistic analysis of the political discourse of TEL. Quantitatively, I examine patterns in my corpus to question forms of ‘use’ around technology that structure a rigid basic argument which ‘enframes’ educational technology (Heidegger, 1977: 38). In a qualitative analysis of the findings, I ask to what extent policy discourse evaluates technology in one way only, to support a Knowledge Based Economy (KBE) in a political economy of neoliberalism (Jessop, 2004; Fairclough, 2006). If technology is commodified as an external enhancement, it is expected to provide an ‘exchange value’ for learners (Marx, 1867). I therefore examine more closely what is prioritised and devalued in these texts. Secondly, I disclose a form of austerity in the discourse, where technology, as an abstract force, undertakes tasks usually ascribed to humans (Lieras, 1996; Brey, 2003:2). This risks desubjectivisation and loss of power, and limits people’s relationships with technology and with each other. A view of technology in political discourse as complete without people closes off possibilities for broader dialectical (Fairclough, 2001, 2007) and ‘convivial’ (Illich, 1973) understandings of the intimate, material practice of engaging with technology in education. In opening the ‘black box’ of TEL via CDA, I reveal talking points that are otherwise concealed. This allows me to be reflexive and self-critical through praxis, to confront my own assumptions about what the discourse conceals and what forms of resistance might be required. In so doing, I contribute to ongoing debates about networked learning, providing a context to explore educational technology as a technology, language and learning nexus.
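As a toy illustration of the quantitative corpus step (not the thesis's actual tooling), the snippet below counts forms of 'use' occurring near 'technology'; the two-sentence corpus is invented.

```python
# Toy illustration of the corpus step: counting forms of "use" occurring
# near "technology". The two sentences stand in for the actual policy
# documents, which are not reproduced here.
import re
from collections import Counter

corpus = [
    "Institutions should use technology to enhance learning.",
    "The use of technology enhanced learning across the sector.",
]

pattern = re.compile(r"\b(use[sd]?|using)\b\W+(?:\w+\W+){0,3}?technology", re.I)
hits = Counter(m.group(1).lower() for text in corpus for m in pattern.finditer(text))
print(hits)
```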

Relevance:

100.00%

Publisher:

Abstract:

Against a backdrop of ongoing educational reforms that seek to introduce Communicative Language Teaching (CLT) in Albanian primary and secondary state schools, Albanian teachers, among others, are officially required to use communication-based textbooks in their classes. Authorities in a growing number of countries seeking to improve and westernise their educational systems are also using communication-based textbooks as agents of change. Behind these actions lies the commonly held belief that textbooks can support teacher learning by providing a visible framework teachers can follow. Communication-based textbooks are used in thousands of EFL classrooms around the world to help teachers to “fully understand and routinize change” (Hutchinson and Torres, 1994:323). However, empirical research on the role materials play in the classroom, and in particular on the role of the textbook as an agent of change, is still scarce, and what does exist is rather inconclusive. This study aims to fill that gap. It is a predominantly qualitative investigation into how and why four Albanian EFL teachers use Western teaching resources in their classes. By investigating the decision-making processes that teachers go through in their teaching, and specifically the relationship between Western-published textbooks, teachers’ decision making, and teachers’ classroom delivery, the current study contributes to an extensive discussion on the development of communicative L2 teaching concepts and methods and on teacher decision making, as well as to a growing discussion on how best to make institutional reforms effective, particularly in East European ex-communist countries and in other developing countries. Findings from this research indicate that, prompted by the content of Western-published textbooks, the four research participants, who had received little formal training in CLT, accommodated some communicative teaching behaviours into their teaching. The use of communicative textbooks, however, does not seem to account for radical methodological changes in teachers’ practices. Teacher cognitions based on teachers’ previous learning experience are likely to act as a lens through which teachers judge classroom realities. As such, they shape, to a great degree, the decisions teachers make regarding the use of Western-published textbooks in their classes.

Relevance:

100.00%

Publisher:

Abstract:

The present paper offers a methodological approach to the estimation and definition of the enthalpies constituting an energy balance around a fast pyrolysis experiment conducted in a laboratory-scale fluid bed with a capacity of 1 kg/h. Pure N2 was used as the fluidization medium at atmospheric pressure, and the operating temperature (∼500°C) was maintained with electrical resistors. The biomass feedstock was beech wood. A satisfactory 92.5% recovery of products (dry-basis mass balance) was achieved, with the differences mainly attributed to the loss of some bio-oil constituents into the quenching medium, ISOPAR™. The chemical enthalpy recovery for bio-oil, char and permanent gases was calculated at 64.6%, 14.5% and 7.1%, respectively. All the energy losses from the experimental unit to the environment (the pyrolyser, cooling unit, etc.) are discussed and compared to the heat of fast pyrolysis, which was calculated at 1123.5 kJ per kg of beech wood. This represents only 2.4% of the biomass total enthalpy, or 6.5% on an HHV basis. For the estimation of some important thermo-physical properties, such as heat capacity and density, it was found that using data based on the compounds identified by GC/MS analysis gives values very close to the reference ones, despite the small fraction of bio-oil components detected. The methodology and results can serve as a starting point for the proper design of fast pyrolysis experiments and pilot- and/or industrial-scale plants.
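The chemical-enthalpy recovery figures are simple ratios of stream enthalpy to feed enthalpy, as the sketch below illustrates. The yields and heating values are assumed placeholders, chosen only so the arithmetic reproduces the reported 64.6/14.5/7.1% split.

```python
# Back-of-envelope sketch of the chemical-enthalpy recovery calculation.
# Yields and heating values are hypothetical placeholders, not the
# study's measured data.
biomass_hhv = 18.6e3          # kJ per kg dry beech wood (assumed)

streams = {                   # yield (kg/kg feed), HHV (kJ/kg) -- assumed
    "bio-oil": (0.65, 18.5e3),
    "char":    (0.15, 18.0e3),
    "gas":     (0.12, 11.0e3),
}

for name, (yield_frac, hhv) in streams.items():
    recovery = yield_frac * hhv / biomass_hhv * 100
    print(f"{name}: {recovery:.1f}% of feed chemical enthalpy")
```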

Relevance:

100.00%

Publisher:

Abstract:

Radio frequency identification (RFID) technology has gained increasing popularity in business as a means to improve operational efficiency and maximise cost savings. However, there is a gap in the literature on exploiting RFID to add substantial value to supply chain operations, especially beyond what RFID vendors offer. This paper presents a multi-agent system, incorporating RFID technology, aimed at filling that gap. The system models supply chain activities (in particular, logistics operations) and comprises autonomous, intelligent agents representing the key entities in the supply chain. With the advanced characteristics of RFID incorporated, the agent system examines how logistics operations (the distribution network in particular) can be efficiently reconfigured and optimised in response to dynamic changes in the market, in production, and at any stage in the supply chain. © 2012 IEEE.
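As a toy illustration of the agent idea (not the paper's actual platform or API), the sketch below models one supply-chain entity as an agent that reacts to RFID read events; all class, field and event names are hypothetical.

```python
# Minimal sketch of the multi-agent idea: a supply-chain entity modelled
# as an agent reacting to RFID read events. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class RFIDEvent:
    tag_id: str
    location: str

class DistributionAgent:
    """Flags stock for rerouting when an RFID read reveals it at the wrong node."""
    def __init__(self, name, expected_location):
        self.name = name
        self.expected_location = expected_location

    def on_read(self, event: RFIDEvent):
        if event.location != self.expected_location:
            print(f"{self.name}: {event.tag_id} seen at {event.location}, "
                  f"rerouting to {self.expected_location}")

agent = DistributionAgent("warehouse-A", expected_location="hub-1")
agent.on_read(RFIDEvent(tag_id="TAG-042", location="hub-2"))
```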

Relevance:

100.00%

Publisher:

Abstract:

Agency costs are said to arise as a result of the separation of ownership from control inherent in the corporate form of ownership. One such agency problem concerns the potential variance between the time horizons of principal shareholders and agent managers. Agency theory suggests that these costs can be alleviated or controlled through performance-based Chief Executive Officer (CEO) contracting. However, components of a CEO's compensation contract can exacerbate or mitigate agency-related problems (Antle and Smith, 1985). According to the horizon hypothesis, a self-serving CEO reduces discretionary research and development (R&D) expenditures to increase earnings and earnings-based bonus compensation. Agency theorists contend that a CEO's market-based compensation component can mitigate horizon problems. This study seeks to determine whether there is a relationship between CEO earnings- and market-based compensation components and R&D expenditures in the largest United States industrial firms from 1987 to 1993. Consistent with the horizon hypothesis, the results provide evidence of a negative and statistically significant relationship between CEO cash compensation (i.e., salary and bonus) and the firm's R&D expenditures. Consistent with the expectations of agency theory, the results provide evidence of a positive and statistically significant relationship between market-based CEO compensation and R&D. Further results provide evidence of a positive and statistically significant relationship between CEO tenure and the firm's R&D expenditures. Although there is a negative relationship between CEO age and the firm's R&D, it was not statistically significant at the 0.05 level.
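A minimal sketch of the kind of cross-sectional regression such a study implies is shown below, using ordinary least squares on synthetic data; the variable names and coefficients are placeholders and carry no empirical meaning.

```python
# Illustrative sketch of the regression the study describes: R&D
# expenditure regressed on cash and market-based CEO compensation plus
# tenure. Data are synthetic; results carry no empirical meaning.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
cash_comp = rng.normal(size=n)
market_comp = rng.normal(size=n)
tenure = rng.normal(size=n)
rd = -0.3 * cash_comp + 0.5 * market_comp + 0.2 * tenure + rng.normal(size=n)

X = sm.add_constant(np.column_stack([cash_comp, market_comp, tenure]))
print(sm.OLS(rd, X).fit().summary())
```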

Relevance:

100.00%

Publisher:

Abstract:

The phenomenal growth of the Internet has connected us to a vast amount of computation and information resources around the world. However, making use of these resources is difficult due to the unparalleled massiveness, high communication latency, shared-nothing architecture and unreliable connections of the Internet. In this dissertation, we present a distributed software agent approach, which brings a new distributed problem-solving paradigm to Internet computing research, with an enhanced client-server scheme, inherent scalability and heterogeneity. Our study discusses the role of a distributed software agent in Internet computing and classifies it into three major categories by the objects it interacts with: computation agent, information agent and interface agent. The problem domain and the deployment of the computation agent and the information agent are presented alongside the analysis, design and implementation of experimental systems in high-performance Internet computing and in scalable Web searching. In the computation agent study, high-performance Internet computing is achieved with our proposed Java massive computation agent (JAM) model. We analyzed the JAM computing scheme and built a brute-force ciphertext decryption prototype. In the information agent study, we discuss the scalability problem of existing Web search engines and design an approach to Web searching with distributed collaborative index agents. This approach can be used to construct a more accurate, reusable and scalable solution to deal with the growth of the Web and of the information on it. Our research reveals that, with the deployment of distributed software agents in Internet computing, we have a more cost-effective approach to making better use of the gigantic network of computation and information resources on the Internet. The case studies in our research show that we are now able to solve many practically hard or previously unsolvable problems caused by the inherent difficulties of Internet computing.
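The master/worker partitioning behind a JAM-style brute-force prototype can be sketched in a few lines. Python's multiprocessing stands in here for the dissertation's Java agents, and the single-byte XOR "cipher" is a toy placeholder, not the prototype's actual cipher.

```python
# Sketch of the master/worker partitioning behind a JAM-style brute-force
# search: the key space is split into ranges handed to parallel workers.
# Python multiprocessing stands in for Java agents; XOR is a toy cipher.
from multiprocessing import Pool

CIPHERTEXT = bytes([0x1f, 0x12, 0x1b, 0x1b, 0x18])  # "hello" XOR 0x77
KNOWN_PLAINTEXT = b"hello"

def search(key_range):
    """Try every single-byte key in the assigned range."""
    for key in key_range:
        if bytes(b ^ key for b in CIPHERTEXT) == KNOWN_PLAINTEXT:
            return key
    return None

if __name__ == "__main__":
    chunks = [range(i, i + 64) for i in range(0, 256, 64)]
    with Pool(4) as pool:
        hits = [k for k in pool.map(search, chunks) if k is not None]
    print("recovered key:", hits)
```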

Relevance:

100.00%

Publisher:

Abstract:

The detailed organic composition of atmospheric fine particles with an aerodynamic diameter smaller than or equal to 2.5 micrometers (PM2.5) is an integral part of the knowledge needed to fully characterize their sources and transformation in the environment. For the study presented here, samples were collected at 3-hour intervals. This high time resolution gives unique insight into the influence of short- and long-range transport phenomena and of dynamic atmospheric processes. A specially designed sequential sampler was deployed at the 2002-2003 Baltimore PM-Supersite to collect PM2.5 samples at 3-hourly resolution for extended periods of consecutive days, during both the summer and winter seasons. Established solvent-extraction and GC-MS techniques were used to extract and analyze the organic compounds in 119 samples from each season. Over 100 individual compounds were quantified in each sample. For primary organics, averaging the diurnal ambient concentrations over the sampled periods revealed ambient patterns that relate to the diurnal emission patterns of major source classes. Several short-term releases of pollutants from local sources were detected, and local meteorological data were used to pinpoint possible source regions. Biogenic secondary organic compounds were detected as well, and possible mechanisms of formation were evaluated. The relationships between the observed continuous variations in the concentrations of selected organic markers and both the on-site meteorological measurements conducted in parallel with the PM2.5 sampling and the synoptic patterns of weather and wind conditions were also examined. Several one-to-two-day episodes were identified from the sequential variation of the concentrations observed for specific marker compounds and marker ratios. The influence of meteorological events on the concentrations of the organic compounds during selected episodes is discussed. During the summer, under the pervasive influence of air masses originating from the west/northwest, some organic species displayed characteristics consistent with the measured PM2.5 being strongly influenced by the aged nature of these long-traveling background parcels. During the winter, intrusions of more regional air masses originating from the south and southwest were more important.
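The diurnal-averaging step lends itself to a short sketch: grouping a 3-hourly concentration series by time of day. The series below is synthetic, not the Supersite data, and pandas is an assumed tool choice.

```python
# Sketch of the diurnal-averaging step: 3-hourly marker concentrations
# grouped by time of day across sampling days. Synthetic data only.
import numpy as np
import pandas as pd

idx = pd.date_range("2002-07-01", periods=8 * 10, freq="3h")  # 10 days
conc = pd.Series(np.random.default_rng(2).lognormal(size=len(idx)), index=idx)

diurnal_mean = conc.groupby(conc.index.hour).mean()
print(diurnal_mean)  # average concentration for each 3-h slot (0, 3, ..., 21)
```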

Relevance:

100.00%

Publisher:

Abstract:

Based on theoretical considerations, an explanation for the temperature dependence of the thermal expansion and the bulk modulus is proposed, and a new equation of state (EoS) is derived. Additionally, a physical explanation for the latent heat of fusion is presented. These theoretical predictions are tested against experiments on highly symmetrical monatomic structures. The volume is not an independent variable and must be broken down into its fundamental components when its relationships to pressure and temperature are defined. Using a zero-pressure, zero-temperature reference frame, the initial parameters are defined: the volume at zero pressure and temperature [V°], the bulk modulus at zero temperature [K°] and the volume coefficient of thermal expansion at zero pressure [α°]. The newly derived EoS is tested against experiments on perovskite and epsilon iron. The root-mean-square deviations (RMSD) of the residuals of the molar volume, pressure and temperature are within the range of the experimental uncertainties. Separating the experiments into 200 K ranges, the new EoS was compared to the most widely used finite-strain, interatomic-potential and empirical isothermal EoSs, namely the Birch-Murnaghan, the Vinet and the Roy-Roy, respectively. Correlation coefficients, RMSDs of the residuals and the Akaike Information Criterion were used to evaluate the fits. Based on these fitting parameters, the new p-V-T EoS is superior in every temperature range to the conventional isothermal EoSs investigated. The new EoS for epsilon iron reproduces the Preliminary Reference Earth Model (PREM) densities at 6100-7400 K, indicating that the presence of light elements might not be necessary to explain the Earth's inner-core densities. It is suggested that the latent heat of fusion supplies the energy required to overcome the viscous drag resistance of the atoms. The calculated energies for melts formed from highly symmetrical packing arrangements correlate very well with experimentally determined latent heat values. An optical investigation of carbonado-diamond is also part of the dissertation. The first complete infrared (FTIR) absorption spectra collected for carbonado-diamond confirm an interstellar origin for the most enigmatic diamonds, known as carbonado.
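For reference, the third-order Birch-Murnaghan finite-strain EoS used as one of the comparison baselines has the standard isothermal form (V₀ the zero-pressure volume, K₀ the bulk modulus and K₀′ its pressure derivative; this is the textbook expression, not the dissertation's new EoS):

```latex
P(V) = \frac{3K_0}{2}\left[\left(\frac{V_0}{V}\right)^{7/3}
     - \left(\frac{V_0}{V}\right)^{5/3}\right]
       \left\{1 + \frac{3}{4}\left(K_0' - 4\right)
       \left[\left(\frac{V_0}{V}\right)^{2/3} - 1\right]\right\}
```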