39 results for Titration and off-gas analysis


Relevance:

100.00%

Abstract:

An important goal in computational neuroanatomy is the complete and accurate simulation of neuronal morphology. We are developing computational tools to model three-dimensional dendritic structures based on sets of stochastic rules. This paper reports an extensive, quantitative anatomical characterization of simulated motoneurons and Purkinje cells. We used several local and global algorithms implemented in the L-Neuron and ArborVitae programs to generate sets of virtual neurons. Parameter statistics for all algorithms were measured from experimental data, thus providing a compact and consistent description of these morphological classes. We compared the emergent anatomical features of each group of virtual neurons with those of the experimental database in order to gain insights into the plausibility of the model assumptions, potential improvements to the algorithms, and non-trivial relations among morphological parameters. Algorithms based mainly on local constraints (e.g., branch diameter) were successful in reproducing many morphological properties of both motoneurons and Purkinje cells (e.g., total length, asymmetry, number of bifurcations). The addition of global constraints (e.g., trophic factors) improved the angle-dependent emergent characteristics (average Euclidean distance from the soma to the dendritic terminations, dendritic spread). Virtual neurons systematically displayed greater anatomical variability than real cells, suggesting the need for additional constraints in the models. For several emergent anatomical properties, a specific algorithm reproduced the experimental statistics better than the others did. However, relative performances were often reversed for different anatomical properties and/or morphological classes. Thus, combining the strengths of alternative generative models could lead to comprehensive algorithms for the complete and accurate simulation of dendritic morphology.
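
To illustrate the flavour of such local, diameter-driven stochastic rules, here is a minimal sketch of recursive dendrite growth. It is not the L-Neuron or ArborVitae code; the branching probability, the Rall-style diameter rule, and all parameter values are hypothetical choices made only for illustration.

    import random

    def grow_branch(diameter, depth=0, min_diameter=0.3, max_depth=12):
        """Recursively grow a dendritic subtree using local, stochastic rules."""
        if diameter <= min_diameter or depth >= max_depth:
            return {"diameter": diameter, "children": []}
        # Local rule (illustrative): thicker branches bifurcate more often.
        if random.random() < min(0.9, 0.25 * diameter):
            # Daughter diameters follow a Rall-style power rule (exponent 1.5)
            # with a stochastic asymmetry ratio.
            ratio = random.uniform(0.4, 0.6)
            d1 = diameter * ratio ** (1 / 1.5)
            d2 = diameter * (1 - ratio) ** (1 / 1.5)
            children = [grow_branch(d1, depth + 1), grow_branch(d2, depth + 1)]
        else:
            # Otherwise continue as a single, slightly tapered segment.
            children = [grow_branch(diameter * 0.95, depth + 1)]
        return {"diameter": diameter, "children": children}

    def count_bifurcations(node):
        """Count bifurcation points, one emergent measure compared in the paper."""
        own = 1 if len(node["children"]) == 2 else 0
        return own + sum(count_bifurcations(child) for child in node["children"])

    virtual_neuron = grow_branch(diameter=3.0)
    print("bifurcations:", count_bifurcations(virtual_neuron))

Emergent statistics such as the bifurcation count of many generated trees can then be compared against experimental distributions, as the paper does for total length, asymmetry, and spread.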

Relevance:

100.00%

Abstract:

Four new cadmium(II) complexes, [Cd2(bz)4(H2O)4(μ2-hmt)]·Hbz·H2O (1), [Cd3(bz)6(H2O)6(μ2-hmt)2]·6H2O (2), [Cd(pa)2(H2O)(μ2-hmt)]n (3), and {[Cd3(ac)6(H2O)3(μ3-hmt)2]·6H2O}n (4), with hexamine (hmt) and the monocarboxylate ions benzoate (bz), phenylacetate (pa), or acetate (ac), have been synthesized and characterized structurally. Structure determinations reveal that 1 is dinuclear, 2 is trinuclear, 3 is a one-dimensional (1D) infinite chain, and 4 is a two-dimensional (2D) polymer with fused hexagonal rings consisting of Cd(II) and hmt. All the Cd(II) atoms in the four complexes (except one Cd(II) in 2) possess seven-coordinate pentagonal-bipyramidal geometry with the various chelating bidentate carboxylate groups in equatorial sites. One of the Cd(II) ions in 2, a complex that contains two monodentate carboxylates, is in a distorted octahedral environment. The bridging mode of hmt is μ2 in complexes 1-3 but μ3 in complex 4. In all complexes there are significant numbers of H-bonds and C-H/π and π-π interactions, which play crucial roles in forming the supramolecular networks. The importance of these noncovalent interactions in terms of energies and geometries has been analyzed using high-level ab initio calculations. The effect of the cadmium coordinated to hmt on the energetic features of the C-H/π interaction is analyzed. Finally, the interplay between C-H/π and π-π interactions observed in the crystal structure of 3 is also studied.

Relevance:

100.00%

Abstract:

Decision theory is the study of models of judgement involved in, and leading to, deliberate and (usually) rational choice. In real estate investment there are normative models for the allocation of assets. These asset allocation models suggest an optimum allocation between the respective asset classes based on the investors' judgements of performance and risk. Real estate, like other assets, is selected on the basis of some criterion, most commonly its marginal contribution to the production of a mean-variance-efficient multi-asset portfolio, subject to the investor's objectives and capital-rationing constraints. However, decisions are made relative to current expectations and current business constraints. Whilst a decision maker may believe in the required optimum exposure levels as dictated by an asset allocation model, the final decision may, and often will, be influenced by factors outside the parameters of the mathematical model. This paper discusses investors' perceptions of and attitudes toward real estate and highlights the important difference between theoretical exposure levels and pragmatic business considerations. It develops a model to identify “soft” parameters in decision making which will influence the optimal allocation for that asset class. This “soft” information may relate to behavioural issues such as the tendency to mirror competitors, a desire to meet weight-of-money objectives, a desire to retain the status quo, and many other non-financial considerations. The paper aims to establish the place of property in multi-asset portfolios in the UK and to examine the asset allocation process in practice, with a view to understanding the decision-making process and examining investors' perceptions, based on an historic analysis of market expectations, a comparison with historic data, and an analysis of actual performance.
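
For reference, the normative criterion the abstract alludes to is the classical mean-variance portfolio problem; in generic notation (not taken from the paper) it can be written as

    \min_{w} \; w^{\top}\Sigma w \quad \text{subject to} \quad w^{\top}\mu \ge r^{*}, \qquad \mathbf{1}^{\top}w = 1,

where w is the vector of portfolio weights (including the real estate allocation), \Sigma the covariance matrix of asset returns, \mu the vector of expected returns, and r^{*} the investor's target return. The paper's argument is that the weights actually chosen deviate from this optimum because of the “soft”, behavioural parameters it identifies.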

Relevance:

100.00%

Abstract:

The dependence of much of Africa on rain-fed agriculture leads to a high vulnerability to fluctuations in rainfall amount. Hence, accurate monitoring of near-real-time rainfall is particularly useful, for example in forewarning of possible crop shortfalls in drought-prone areas. Unfortunately, ground-based observations are often inadequate. Rainfall estimates from satellite-based algorithms and numerical model outputs can fill this data gap; however, rigorous assessment of such estimates is required. In this case, three satellite-based products (NOAA-RFE 2.0, GPCP-1DD and TAMSAT) and two numerical model outputs (ERA-40 and ERA-Interim) have been evaluated for Uganda in East Africa using a network of 27 rain gauges. The study focuses on the years 2001 to 2005 and considers the main rainy season (February to June). All data sets were converted to the same temporal and spatial scales. Kriging was used for the spatial interpolation of the gauge data. All three satellite products showed similar characteristics and had a high level of skill that exceeded that of both model outputs. ERA-Interim had a tendency to overestimate, whilst ERA-40 consistently underestimated, the Ugandan rainfall.
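
As an illustration of the kind of skill assessment described, the sketch below compares one product against a kriged gauge analysis using simple bias, RMSE, and correlation statistics. The arrays are hypothetical example values; this is not the study's actual validation procedure.

    import numpy as np

    # Hypothetical seasonal rainfall totals (mm) at co-located grid points:
    # kriged gauge analysis vs. one satellite or reanalysis product.
    gauge = np.array([310.0, 420.0, 250.0, 380.0, 500.0])
    estimate = np.array([290.0, 450.0, 230.0, 400.0, 470.0])

    bias = np.mean(estimate - gauge)                  # mean error
    rmse = np.sqrt(np.mean((estimate - gauge) ** 2))  # root-mean-square error
    corr = np.corrcoef(gauge, estimate)[0, 1]         # linear correlation

    print(f"bias = {bias:.1f} mm, RMSE = {rmse:.1f} mm, r = {corr:.2f}")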

Relevance:

100.00%

Abstract:

The adsorption of carbon monoxide on the Pt{110} surface at coverages of 0.5 ML and 1.0 ML was investigated using quantitative low-energy electron diffraction (LEED IV) and density-functional theory (DFT). At 0.5 ML CO lifts the reconstruction of the clean surface but does not form an ordered overlayer. At the saturation coverage, 1.0 ML, a well-ordered p(2×1) superstructure with glide-line symmetry is formed. It was confirmed that the CO molecules adsorb on top of the Pt atoms in the topmost substrate layer with the molecular axes tilted by ±22° with respect to the surface normal in alternating directions away from the close-packed rows of Pt atoms. This is accompanied by significant lateral shifts of 0.55 Å away from the atop sites in the same direction as the tilt. The topmost substrate layer relaxes inwards by −4% with respect to the bulk-terminated atom positions, while the consecutive layers only show minor relaxations. Despite the lack of long-range order in the 0.5 ML CO layer it was possible to determine key structural parameters by LEED IV using only the intensities of the integer-order spots. At this coverage CO also adsorbs on atop sites, with the molecular axis closer to the surface normal (<10°). The average substrate relaxations in each layer are similar for both coverages and consistent with DFT calculations performed for a variety of ordered structures with coverages of 1.0 ML and 0.5 ML.

Relevance:

100.00%

Abstract:

This paper approaches the subject of brand equity measurement on and offline. The existing body of research knowledge on brand equity measurement has derived from classical contexts; however, the majority of today's brands prosper simultaneously online and offline. Since branding on the Web needs to address the unique characteristics of computer-mediated environments, it was posited that classical measures of brand equity were inadequate for this category of brands. Aaker's guidelines for building a brand equity measurement system were thus followed and his brand equity ten was employed as a point of departure. The main challenge was complementing traditional measures of brand equity with new measures pertinent to the Web. Following 16 semi-structured interviews with experts, ten additional measures were identified.

Relevance:

100.00%

Abstract:

Background: Medication errors are common in primary care and are associated with considerable risk of patient harm. We tested whether a pharmacist-led, information technology-based intervention was more effective than simple feedback in reducing the number of patients at risk of measures related to hazardous prescribing and inadequate blood-test monitoring of medicines 6 months after the intervention. Methods: In this pragmatic, cluster randomised trial general practices in the UK were stratified by research site and list size, and randomly assigned by a web-based randomisation service in block sizes of two or four to one of two groups. The practices were allocated to either computer-generated simple feedback for at-risk patients (control) or a pharmacist-led information technology intervention (PINCER), composed of feedback, educational outreach, and dedicated support. The allocation was masked to general practices, patients, pharmacists, researchers, and statisticians. Primary outcomes were the proportions of patients at 6 months after the intervention who had had any of three clinically important errors: non-selective non-steroidal anti-inflammatory drugs (NSAIDs) prescribed to those with a history of peptic ulcer without co-prescription of a proton-pump inhibitor; β blockers prescribed to those with a history of asthma; long-term prescription of angiotensin converting enzyme (ACE) inhibitor or loop diuretics to those 75 years or older without assessment of urea and electrolytes in the preceding 15 months. The cost per error avoided was estimated by incremental cost-effectiveness analysis. This study is registered with Controlled-Trials.com, number ISRCTN21785299. Findings: 72 general practices with a combined list size of 480 942 patients were randomised. At 6 months' follow-up, patients in the PINCER group were significantly less likely to have been prescribed a non-selective NSAID if they had a history of peptic ulcer without gastroprotection (OR 0.58, 95% CI 0.38–0.89); a β blocker if they had asthma (0.73, 0.58–0.91); or an ACE inhibitor or loop diuretic without appropriate monitoring (0.51, 0.34–0.78). PINCER has a 95% probability of being cost-effective if the decision-maker's ceiling willingness to pay reaches £75 per error avoided at 6 months. Interpretation: The PINCER intervention is an effective method for reducing a range of medication errors in general practices with computerised clinical records. Funding: Patient Safety Research Portfolio, Department of Health, England.
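
The cost-per-error-avoided figure comes from a standard incremental cost-effectiveness calculation; a minimal sketch is shown below with purely hypothetical numbers (they are not the trial's data) to make the arithmetic explicit.

    # Incremental cost per error avoided (illustrative; all inputs hypothetical).
    cost_intervention = 12000.0   # cost of delivering the intervention (GBP)
    cost_control = 4000.0         # cost of simple feedback (GBP)
    errors_control = 200          # patients with an error at 6 months, control
    errors_intervention = 120     # patients with an error at 6 months, intervention

    icer = (cost_intervention - cost_control) / (errors_control - errors_intervention)
    print(f"incremental cost per error avoided: GBP {icer:.2f}")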

Relevance:

100.00%

Abstract:

Although most researchers recognise that the language repertoire of bilinguals can vary, few studies have tried to address variation in bilingual competence in any detail. This study aims to take a first step towards further understanding the way in which bilingual competencies can vary at the level of syntax by comparing the use of syntactic embeddings among three different groups of Turkish–German bilinguals. The approach of the present paper is new in that different groups of bilinguals are compared with each other, and not only with monolingual speakers, as is common in most studies in the field. The analysis focuses on differences in the use of embeddings in Turkish, which are generally considered to be one of the more complex aspects of Turkish grammar. The study shows that young Turkish–German bilingual adults who were born and raised in Germany use fewer, and less complex, embeddings than Turkish–German bilingual returnees who had lived in Turkey for eight years at the time of recording. The present study provides new insights into the nature of bilingual competence, as well as a new perspective on syntactic change in immigrant Turkish as spoken in Europe.

Relevance:

100.00%

Abstract:

Objective: To describe the training undertaken by pharmacists employed in a pharmacist-led, information technology-based intervention study to reduce medication errors in primary care (PINCER Trial), to evaluate the pharmacists' assessment of the training, and to assess the time implications of undertaking the training. Methods: Six pharmacists received training, which included training on root cause analysis and educational outreach, to enable them to deliver the PINCER Trial intervention. This was evaluated using self-report questionnaires at the end of each training session. The time taken to complete each session was recorded. Data from the evaluation forms were entered onto a Microsoft Excel spreadsheet, independently checked, and the summary of results further verified. Frequencies were calculated for responses to the three-point Likert scale questions. Free-text comments from the evaluation forms and pharmacists' diaries were analysed thematically. Key findings: All six pharmacists received 22 hours of training over five sessions. In four out of the five sessions, the pharmacists who completed an evaluation form (27 out of 30 were completed) stated they were satisfied or very satisfied with the various elements of the training package. Analysis of the free-text comments and the pharmacists' diaries showed that the principles of root cause analysis and educational outreach were viewed as useful tools to help pharmacists conduct pharmaceutical interventions in both the study and the other pharmacy roles that they undertook. The opportunity to undertake role play was a valuable part of the training received. Conclusions: Findings presented in this paper suggest that providing the PINCER pharmacists with training in root cause analysis and educational outreach contributed to the successful delivery of PINCER interventions and could potentially be utilised by other pharmacists based in general practice to deliver pharmaceutical interventions to improve patient safety.

Relevance:

100.00%

Abstract:

In this paper we consider the structure of dynamically evolving networks modelling information and activity moving across a large set of vertices. We adopt the communicability concept, which generalizes the notion of centrality defined for static networks. We define the primary network structure within the whole as comprising the most influential vertices (both as senders and receivers of dynamically sequenced activity). We present a methodology, based on successive vertex knockouts of up to a very small fraction of the whole primary network, that can characterize the nature of the primary network as being either relatively robust and lattice-like (with redundancies built in) or relatively fragile and tree-like (with sensitivities and few redundancies). We apply these ideas to the analysis of evolving networks derived from fMRI scans of resting human brains. We show that the estimation of performance parameters via the structure tests of the corresponding primary networks is subject to less variability than that observed across a very large population of such scans. Hence the differences within the population are significant.
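
One common formulation of dynamic communicability, following Grindrod and Higham, takes the time-ordered product of resolvents of the adjacency matrices and ranks vertices by row sums (influence as senders) and column sums (influence as receivers). The sketch below is illustrative only; the random adjacency sequence and the damping parameter a are hypothetical and not taken from this paper.

    import numpy as np

    # Illustrative dynamic communicability over a sequence of network snapshots.
    rng = np.random.default_rng(0)
    n, T = 20, 5
    A_seq = []
    for _ in range(T):
        A = (rng.random((n, n)) < 0.1).astype(float)
        A = np.maximum(A, A.T)          # undirected snapshot
        np.fill_diagonal(A, 0.0)
        A_seq.append(A)

    # Damping parameter kept below the reciprocal of the largest spectral radius
    # so that every resolvent (I - a*A)^(-1) exists.
    a = 0.9 / max(np.abs(np.linalg.eigvals(A)).max() for A in A_seq)

    Q = np.eye(n)
    for A in A_seq:                     # time-ordered product of resolvents
        Q = Q @ np.linalg.inv(np.eye(n) - a * A)

    broadcast = Q.sum(axis=1)           # dynamic influence as senders
    receive = Q.sum(axis=0)             # dynamic influence as receivers
    primary = np.argsort(-(broadcast + receive))[:5]
    print("candidate primary (most influential) vertices:", primary)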

Relevance:

100.00%

Abstract:

Cloud computing is usually regarded as being energy efficient and thus emitting less greenhouse gases (GHG) than traditional forms of computing. When the energy consumption of Microsoft's cloud-based Office 365 (O365) and traditional Office 2010 (O2010) software suites was tested and modeled, some cloud services were found to consume more energy than the traditional form. The model developed in this research took into consideration the energy consumption at the three main stages of data transmission: data center, network, and end-user device. Comparable products from each suite were selected, and activities were defined for each product to represent a different computing type. Microsoft provided highly confidential data for the data center stage, while the networking and user device stages were measured directly. A new measurement and software apportionment approach was defined and utilized, allowing the power consumption of cloud services to be directly measured at the user device stage. Results indicated that cloud computing is more energy efficient for Excel and Outlook, which consumed less energy and emitted less GHG than their standalone counterparts: the power consumption of the cloud-based Outlook and Excel was 8% and 17% lower, respectively, than that of their traditional counterparts. However, the power consumption of the cloud version of Word was 17% higher than that of its traditional equivalent. A third, mixed access method was also measured for Word, which emitted 5% more GHG than the traditional version. It is evident that cloud computing may not provide a unified way forward to reduce energy consumption and GHG emissions. Direct conversion from the standalone package to the cloud provision platform can now consider energy and GHG emissions at the software development and cloud service design stages, using the methods described in this research.
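
A minimal sketch of the three-stage accounting described above is given below. The coefficients and activity parameters are hypothetical placeholders for illustration; they are not Microsoft's confidential figures or the measurements reported in the paper.

    # Energy per user activity, summed over the three stages of data transmission.
    # All coefficients are hypothetical placeholders.
    def activity_energy_wh(data_gb, device_watts, minutes,
                           network_wh_per_gb=50.0, datacenter_wh_per_gb=30.0):
        device_wh = device_watts * minutes / 60.0        # end-user device stage
        network_wh = network_wh_per_gb * data_gb          # network stage
        datacenter_wh = datacenter_wh_per_gb * data_gb    # data center stage
        return device_wh + network_wh + datacenter_wh

    cloud = activity_energy_wh(data_gb=0.02, device_watts=25.0, minutes=10)
    standalone = activity_energy_wh(data_gb=0.0, device_watts=35.0, minutes=10)
    print(f"cloud: {cloud:.2f} Wh, standalone: {standalone:.2f} Wh")

Converting the resulting energy to GHG then only requires multiplying by a grid emission factor (kg CO2e per kWh).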

Relevance:

100.00%

Abstract:

A process-based fire regime model (SPITFIRE) has been developed, coupled with ecosystem dynamics in the LPJ Dynamic Global Vegetation Model, and used to explore fire regimes and the current impact of fire on the terrestrial carbon cycle and associated emissions of trace atmospheric constituents. The model estimates an average release of 2.24 Pg C yr⁻¹ as CO2 from biomass burning during the 1980s and 1990s. Comparison with observed active fire counts shows that the model reproduces where fire occurs and can mimic broad geographic patterns in the peak fire season, although the predicted peak is 1–2 months late in some regions. Modelled fire season length is generally overestimated by about one month, but shows a realistic pattern of differences among biomes. Comparisons with remotely sensed burnt-area products indicate that the model reproduces broad geographic patterns of annual fractional burnt area over most regions, including the boreal forest, although interannual variability in the boreal zone is underestimated.