74 results for Vehicle routing problems with gains
Abstract:
Aircraft OH and HO2 measurements made over West Africa during the AMMA field campaign in summer 2006 have been investigated using a box model constrained to observations of long-lived species and physical parameters. "Good" agreement was found for HO2 (modelled to observed gradient of 1.23 ± 0.11). However, the model significantly overpredicts OH concentrations. The reasons for this are not clear, but may reflect instrumental instabilities affecting the OH measurements. Within the model, HOx concentrations in West Africa are controlled by relatively simple photochemistry, with production dominated by ozone photolysis and reaction of O(1D) with water vapour, and loss processes dominated by HO2 + HO2 and HO2 + RO2. Isoprene chemistry was found to influence forested regions. In contrast to several recent field studies in very low NOx and high isoprene environments, we do not observe any dependence of model success for HO2 on isoprene and attribute this to efficient recycling of HOx through RO2 + NO reactions under the moderate NOx concentrations (5–300 ppt NO in the boundary layer, median 76 ppt) encountered during AMMA. This suggests that some of the problems with understanding the impact of isoprene on atmospheric composition may be limited to the extreme low range of NOx concentrations.
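To illustrate the simple photochemical control described above, a steady-state balance between the stated production term (ozone photolysis followed by O(1D) + H2O) and the dominant loss terms (HO2 + HO2 and HO2 + RO2) can be solved for HO2. The sketch below uses order-of-magnitude placeholder values for the rate coefficients, the production rate and the RO2 concentration; it is not the constrained box model used in the study.

```python
import math

# Placeholder, order-of-magnitude values (NOT the AMMA campaign values):
P_HOx = 5.0e6        # HOx production from O3 photolysis + O(1D)+H2O, molecule cm^-3 s^-1
RO2 = 1.0e8          # organic peroxy radical concentration, molecule cm^-3
k_ho2_ho2 = 3.0e-12  # effective HO2 + HO2 rate coefficient, cm^3 molecule^-1 s^-1
k_ho2_ro2 = 5.0e-12  # HO2 + RO2 rate coefficient, cm^3 molecule^-1 s^-1

# Steady state: P_HOx = 2*k1*[HO2]^2 + k2*[HO2]*[RO2], a quadratic in [HO2]
# (HO2 + HO2 removes two HOx, HO2 + RO2 removes one).
a, b, c = 2.0 * k_ho2_ho2, k_ho2_ro2 * RO2, -P_HOx
HO2 = (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
print(f"Steady-state HO2 ~ {HO2:.1e} molecule cm^-3")
```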
Abstract:
A new blood clotting response test was used to determine the susceptibility, to coumatetralyl and bromadiolone, of laboratory strains of Norway rat from Germany and the UK (Hampshire), and wild rats trapped on farms in Wales (UK) and Westphalia (Germany). Resistance factors were calculated in relation to the CD strain of Norway rat. An outbred strain of wild rats, raised from rats trapped in Germany, was found to be more susceptible to coumatetralyl by a factor of 0.5-0.6 compared to the CD strain. Homozygous and heterozygous animals of a strain of resistant rats from Westphalia were cross-resistant to coumatetralyl and bromadiolone, with a higher resistance factor for bromadiolone than that found in both UK strains. Our results show that the degree of altered susceptibility and resistance varies between strains of wild rat and between resistance foci. Some wild rat strains may be more susceptible than laboratory rat strains. Even in a well-established resistance area, it may be difficult to find infestations with resistance high enough to suspect control problems with bromadiolone, even after decades of use of this compound.
Abstract:
Graphical tracking is a technique for crop scheduling in which the actual plant state is plotted against an ideal target curve that encapsulates all crop and environmental characteristics. Management decisions are made on the basis of the position of the actual crop relative to the ideal position. Because of the simplicity of the approach, graphical tracks can be developed on site without the need for controlled experimentation. Growth models and graphical tracks are discussed, and an implementation of the Richards curve for graphical tracking is described. In many cases, the more intuitively desirable growth models perform sub-optimally because of problems with the specification of starting conditions, environmental factors outside the scope of the original model, and the introduction of new cultivars. Accurate specification of a biological model requires detailed and usually costly study, and as such is not adaptable to a changing cultivar range and changing cultivation techniques. Fitting a new graphical track for a new cultivar can be conducted on site and improved over subsequent seasons. Graphical tracking emphasises the current position relative to the objective, and as such does not require the time-consuming or system-specific input of an environmental history, although it does require detailed crop measurement. The approach is flexible and could be applied to a variety of specification metrics, with digital imaging providing a route for added value. For decision making about crop manipulation from the observed current state, there is a role for simple short-term predictive modelling to indicate the consequences of that manipulation.
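As an illustration of the kind of target curve mentioned above, the sketch below plots a Richards (generalised logistic) curve as a graphical-tracking target and compares one observed measurement against it. The parameter values, units and the function name are illustrative placeholders, not the implementation described in the abstract.

```python
import numpy as np

def richards(t, A, k, t0, nu):
    """Richards (generalised logistic) growth curve.

    A  : upper asymptote (final size)
    k  : growth-rate parameter
    t0 : location parameter shifting the curve in time
    nu : shape parameter controlling asymmetry (nu = 1 gives the logistic curve)
    """
    return A / (1.0 + nu * np.exp(-k * (t - t0))) ** (1.0 / nu)

# Illustrative target track for a crop cycle, e.g. plant height in cm at weekly checkpoints.
days = np.arange(0, 99, 7)
target = richards(days, A=30.0, k=0.08, t0=50.0, nu=1.0)

# Graphical tracking: compare the observed state with the target at the same time point.
observed_day, observed_height = 49, 12.5
deviation = observed_height - richards(observed_day, A=30.0, k=0.08, t0=50.0, nu=1.0)
print(f"Deviation from target at day {observed_day}: {deviation:+.1f} cm")
```

A target track of this kind would be fitted to measurements from previous seasons and refined as new seasons' data accumulate, as the abstract describes.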
Abstract:
The early eighties saw the introduction of liposomes as skin drug delivery systems, initially promoted primarily for localised effects with minimal systemic delivery. Subsequently, a novel ultradeformable vesicular system (termed "Transfersomes" by the inventors) was reported for transdermal delivery with an efficiency similar to subcutaneous injection. Further research illustrated that the mechanisms of liposome action depended on the application regime and the vesicle composition and morphology. Ethical, health and supply problems with human skin have encouraged researchers to use skin models. Traditional models involved polymer membranes and animal tissue, but whilst of value for release studies, such models are not always good mimics for the complex human skin barrier, particularly with respect to the stratum corneum intercellular lipid domains. These lipids have a multiply bilayered organization, with a composition and organization somewhat similar to liposomes. Consequently, researchers have used vesicles as skin model membranes. Early work first employed phospholipid liposomes and tested their interactions with skin penetration enhancers, typically using thermal analysis and spectroscopic analyses. Another approach probed how incorporation of compounds into liposomes led to the loss of entrapped markers, analogous to "fluidization" of stratum corneum lipids on treatment with a penetration enhancer. Subsequently, scientists employed liposomes formulated with skin lipids in these types of studies. Following a brief description of the nature of the skin barrier to transdermal drug delivery and the use of liposomes in drug delivery through skin, this article critically reviews the relevance of using different types of vesicles as a model for human skin in permeation enhancement studies, concentrating primarily on liposomes after briefly surveying older models. The validity of different types of liposome is considered, and traditional skin models are compared to vesicular model membranes for their precision and accuracy as skin membrane mimics. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
Rats with fornix transection, or with cytotoxic retrohippocampal lesions that removed entorhinal cortex plus ventral subiculum, performed a task that permits incidental learning about either allocentric (Allo) or egocentric (Ego) spatial cues without the need to navigate by them. Rats learned eight visual discriminations among computer-displayed scenes in a Y-maze, using the constant-negative paradigm. Every discrimination problem included two familiar scenes (constants) and many less familiar scenes (variables). On each trial, the rats chose between a constant and a variable scene, with the choice of the variable rewarded. In six problems, the two constant scenes had correlated spatial properties, either Allo (each constant always appeared in the same maze arm) or Ego (each constant always appeared in a fixed direction from the start arm) or both (Allo + Ego). In two No-Cue (NC) problems, the two constants appeared in randomly determined arms and directions. Intact rats learn problems with an added Allo or Ego cue faster than NC problems; this facilitation provides indirect evidence that they learn the associations between scenes and spatial cues, even though that is not required for problem solution. Fornix and retrohippocampal-lesioned groups learned NC problems at a similar rate to sham-operated controls and showed as much facilitation of learning by added spatial cues as did the controls; therefore, both lesion groups must have encoded the spatial cues and have incidentally learned their associations with particular constant scenes. Similar facilitation was seen in subgroups that had short or long prior experience with the apparatus and task. Therefore, neither major hippocampal input-output system is crucial for learning about allocentric or egocentric cues in this paradigm, which does not require rats to control their choices or navigation directly by spatial cues.
Abstract:
Background: Problems with lexical retrieval are common across all types of aphasia but certain word classes are thought to be more vulnerable in some aphasia types. Traditionally, verb retrieval problems have been considered characteristic of non-fluent aphasias but there is growing evidence that verb retrieval problems are also found in fluent aphasia. As verbs are retrieved from the mental lexicon with syntactic as well as phonological and semantic information, it is speculated that an improvement in verb retrieval should enhance communicative abilities in this population as in others. We report on an investigation into the effectiveness of verb treatment for three individuals with fluent aphasia. Methods & Procedures: Multiple pre-treatment baselines were established over 3 months in order to monitor language change before treatment. The three participants then received twice-weekly verb treatment over approximately 4 months. All pre-treatment assessments were administered immediately after treatment and 3 months post-treatment. Outcome & Results: Scores fluctuated in the pre-treatment period. Following treatment, there was a significant improvement in verb retrieval for two of the three participants on the treated items. The increase in scores for the third participant was statistically nonsignificant but post-treatment scores moved from below the normal range to within the normal range. All participants were significantly quicker in the verb retrieval task following treatment. There was an increase in well-formed sentences in the sentence construction test and in some samples of connected speech. Conclusions: Repeated systematic treatment can produce a significant improvement in verb retrieval of practised items and generalise to unpractised items for some participants. An increase in well-formed sentences is seen for some speakers. The theoretical and clinical implications of the results are discussed.
Abstract:
Increasingly, distributed systems are being used to host all manner of applications. While these platforms provide a relatively cheap and effective means of executing applications, so far there has been little work on developing tools and utilities that help application developers understand problems with the supporting software or with the executing applications. To fully understand why an application executing on a distributed system is not behaving as expected, it is important to analyse not only the application but also the underlying middleware and the operating system; otherwise issues could be missed, and overall performance profiling and fault diagnosis would certainly be harder. We believe that one approach to profiling and analysing distributed systems and their associated applications is via the plethora of log files generated at runtime. In this paper we report on a system (Slogger) that utilises various emerging Semantic Web technologies to gather the heterogeneous log files generated by the various layers in a distributed system and to unify them in a common data store. Once unified, the log data can be queried and visualised in order to highlight potential problems or issues that may be occurring in the supporting software or the application itself.
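The abstract does not specify Slogger's data model, so the following is only a generic sketch of the underlying idea: log records from different layers are converted to RDF triples in one store and queried together with SPARQL (using the rdflib library here). The namespace, property names and example records are hypothetical.

```python
# Sketch of the general idea (not Slogger's actual data model or API): represent
# log records from different layers as RDF triples and query them together.
from rdflib import Graph, Literal, Namespace, RDF, URIRef
from rdflib.namespace import XSD

LOG = Namespace("http://example.org/log#")   # hypothetical namespace
g = Graph()

def add_record(record_id, layer, level, timestamp, message):
    """Store one log record, whatever layer produced it, as RDF triples."""
    node = URIRef(f"http://example.org/log/{record_id}")
    g.add((node, RDF.type, LOG.Record))
    g.add((node, LOG.layer, Literal(layer)))
    g.add((node, LOG.level, Literal(level)))
    g.add((node, LOG.timestamp, Literal(timestamp, datatype=XSD.dateTime)))
    g.add((node, LOG.message, Literal(message)))

# Records from application, middleware and OS layers end up in one store.
add_record(1, "application", "ERROR", "2024-01-01T12:00:03Z", "request 42 failed")
add_record(2, "middleware", "WARN", "2024-01-01T12:00:02Z", "queue nearly full")
add_record(3, "os", "INFO", "2024-01-01T12:00:01Z", "disk usage 91%")

# One SPARQL query over all layers, e.g. everything at WARN level or above.
q = """
PREFIX log: <http://example.org/log#>
SELECT ?layer ?ts ?msg WHERE {
    ?r a log:Record ;
       log:layer ?layer ; log:level ?lvl ;
       log:timestamp ?ts ; log:message ?msg .
    FILTER (?lvl IN ("WARN", "ERROR"))
} ORDER BY ?ts
"""
for layer, ts, msg in g.query(q):
    print(layer, ts, msg)
```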
Abstract:
Many scientific and engineering applications involve inverting large matrices or solving systems of linear algebraic equations. Solving these problems with proven direct methods can take a very long time, as their cost depends on the size of the matrix. The computational complexity of stochastic Monte Carlo methods depends only on the number of chains and the length of those chains. The computing power needed by inherently parallel Monte Carlo methods can be satisfied very efficiently by distributed computing technologies such as Grid computing. In this paper we show how a load-balanced Monte Carlo method for computing the inverse of a dense matrix can be constructed, show how the method can be implemented on the Grid, and demonstrate how efficiently the method scales on multiple processors. (C) 2007 Elsevier B.V. All rights reserved.
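The paper's algorithmic details and load-balancing scheme are not given in the abstract; the sketch below shows only the classical Neumann-series (random-walk) Monte Carlo estimator of a matrix inverse in serial form, with an illustrative test matrix and chain settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_inverse(A, n_chains=5000, chain_len=20):
    """Monte Carlo estimate of A^{-1} via the Neumann series of C = I - A.

    Assumes the spectral radius of C is < 1 (A close to the identity, or
    pre-scaled), so that A^{-1} = I + C + C^2 + ...
    Entry (i, j) is estimated from random walks started at state i.
    """
    n = A.shape[0]
    C = np.eye(n) - A
    row_sums = np.abs(C).sum(axis=1)
    safe = np.where(row_sums > 0, row_sums, 1.0)
    P = np.abs(C) / safe[:, None]          # transition probabilities per row

    inv = np.zeros((n, n))
    for i in range(n):
        acc = np.zeros(n)
        for _ in range(n_chains):
            state, weight = i, 1.0
            acc[state] += weight           # k = 0 term of the Neumann series
            for _ in range(chain_len):
                if row_sums[state] == 0:
                    break
                nxt = rng.choice(n, p=P[state])
                weight *= C[state, nxt] / P[state, nxt]
                state = nxt
                acc[state] += weight       # contributes to column `state`
        inv[i] = acc / n_chains
    return inv

# Small, well-conditioned test matrix (illustrative only).
A = np.array([[1.0, 0.2, 0.1],
              [0.1, 1.0, 0.2],
              [0.2, 0.1, 1.0]])
print(np.round(mc_inverse(A), 3))
print(np.round(np.linalg.inv(A), 3))
```

Because individual chains are independent, they can be distributed across processors in whatever portions balance the load, which is the aspect the paper addresses.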
Abstract:
A numerical scheme is presented for the solution of the Euler equations of compressible flow of a real gas in a single spatial coordinate. This includes flow in a duct of variable cross-section, flow with slab, cylindrical or spherical symmetry, and the case of an ideal gas, and can be useful when testing codes for the two-dimensional equations governing compressible flow of a real gas. The resulting scheme requires an average of the flow variables across the interface between cells, and for computational efficiency this average is chosen to be the arithmetic mean, in contrast to the usual “square root” averages found in this type of scheme. The scheme is applied with success to five problems with either slab or cylindrical symmetry and for a number of equations of state. The results compare favourably with the results from other schemes.
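For illustration of the averaging choice mentioned above (a similar choice appears in the next abstract), the sketch below computes an interface state both ways for an ideal gas: the usual "square root" (Roe-type, density-weighted) average and the plain arithmetic mean. The left and right states and the ratio of specific heats are illustrative, and the full wave decomposition and flux evaluation of the scheme are not shown.

```python
import numpy as np

gamma = 1.4  # ideal diatomic gas (illustrative)

def primitives_to_H(rho, u, p):
    """Total specific enthalpy H = (E + p) / rho for an ideal gas."""
    E = p / (gamma - 1.0) + 0.5 * rho * u**2
    return (E + p) / rho

def roe_average(left, right):
    """Classical 'square root' (Roe) average of two states (rho, u, p)."""
    (rl, ul, pl), (rr, ur, pr) = left, right
    wl, wr = np.sqrt(rl), np.sqrt(rr)
    u_hat = (wl * ul + wr * ur) / (wl + wr)
    H_hat = (wl * primitives_to_H(rl, ul, pl) + wr * primitives_to_H(rr, ur, pr)) / (wl + wr)
    a_hat = np.sqrt((gamma - 1.0) * (H_hat - 0.5 * u_hat**2))
    return u_hat, H_hat, a_hat

def arithmetic_average(left, right):
    """Plain arithmetic mean of the interface states."""
    (rl, ul, pl), (rr, ur, pr) = left, right
    r_bar, u_bar, p_bar = 0.5 * (rl + rr), 0.5 * (ul + ur), 0.5 * (pl + pr)
    H_bar = primitives_to_H(r_bar, u_bar, p_bar)
    a_bar = np.sqrt(gamma * p_bar / r_bar)
    return u_bar, H_bar, a_bar

# Illustrative left/right states across a cell interface (rho, u, p).
left_state, right_state = (1.0, 0.75, 1.0), (0.125, 0.0, 0.1)
print("Roe       :", roe_average(left_state, right_state))
print("Arithmetic:", arithmetic_average(left_state, right_state))
```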
Abstract:
A numerical scheme is presented for the solution of the Euler equations of compressible flow of a gas in a single spatial co-ordinate. This includes flow in a duct of variable cross-section as well as flow with slab, cylindrical or spherical symmetry and can prove useful when testing codes for the two-dimensional equations governing compressible flow of a gas. The resulting scheme requires an average of the flow variables across the interface between cells and for computational efficiency this average is chosen to be the arithmetic mean, which is in contrast to the usual ‘square root’ averages found in this type of scheme. The scheme is applied with success to five problems with either slab or cylindrical symmetry and a comparison is made in the cylindrical case with results from a two-dimensional problem with no sources.
Abstract:
Background: This study was carried out as part of a European Union funded project (PharmDIS-e+) to develop and evaluate software aimed at assisting physicians with drug dosing. A drug that causes particular problems with dosing in primary care is digoxin, because of its narrow therapeutic range and low therapeutic index. Objectives: To determine (i) the accuracy of the PharmDIS-e+ software for predicting serum digoxin levels in patients who are taking this drug regularly; (ii) whether there are statistically significant differences between predicted digoxin levels and those measured by a laboratory; and (iii) whether there are differences between doses prescribed by general practitioners and those suggested by the program. Methods: We needed 45 patients to have 95% power to reject the null hypothesis that the mean serum digoxin concentration was within 10% of the mean predicted digoxin concentration. Patients were recruited from two general practices and had been taking digoxin for at least 4 months. Exclusion criteria were dementia, low adherence to digoxin and use of other medications known to interact to a clinically important extent with digoxin. Results: Forty-five patients were recruited. There was a correlation of 0.65 between measured and predicted digoxin concentrations (P < 0.001). The mean difference was 0.12 μg/L (SD 0.26; 95% CI 0.04, 0.19; P = 0.005). Forty-seven per cent of the patients were prescribed the same dose as recommended by the software, 44% a higher dose and 9% a lower dose than recommended. Conclusion: The PharmDIS-e+ software was able to predict serum digoxin levels with acceptable accuracy in most patients.
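The abstract does not describe the PharmDIS-e+ dosing algorithm, so the sketch below shows only a generic one-compartment, steady-state relationship between a regular oral digoxin dose and the average serum level (Css = F x dose / (CL x tau)); the bioavailability and clearance defaults are placeholder values, not those used by the software.

```python
def predicted_css(dose_ug, tau_h=24.0, bioavailability=0.7, clearance_l_per_h=7.0):
    """Average steady-state concentration (ug/L) for regular oral dosing.

    Generic one-compartment estimate: Css = F * dose / (CL * tau).
    The bioavailability and clearance defaults are illustrative placeholders;
    in practice clearance would be individualised (e.g. from renal function).
    """
    return bioavailability * dose_ug / (clearance_l_per_h * tau_h)

# Example: 250 ug digoxin once daily with the placeholder parameters above.
print(f"Predicted steady-state level: {predicted_css(250.0):.2f} ug/L")
```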
Abstract:
This article provides a brief critique of a recent article on biomineralisation and preservation. It gives a summary of the difference between biomineralisation and mineral replacement, and addresses problems with the interpretation of FT-IR data. The lack of contextual information for the samples studied is another problem which is highlighted.
Modelling sediment supply and transport in the River Lugg: strategies for controlling sediment loads
Abstract:
The River Lugg has particular problems with high sediment loads that have resulted in detrimental impacts on ecology and fisheries. A new dynamic, process-based model of hydrology and sediments (INCA-SED) has been developed and applied to the River Lugg system using an extensive data set from 1995–2008. The model simulates sediment sources and sinks throughout the catchment and gives a good representation of the sediment response at 22 reaches along the River Lugg. A key question considered in using the model is the management of sediment sources so that concentrations and bed loads can be reduced in the river system. Altogether, five sediment management scenarios were selected for testing on the River Lugg, including land use change, contour tillage, hedging and buffer strips. Running the model with parameters altered to simulate these five scenarios produced some interesting results. All scenarios achieved some reduction in sediment levels, with the 40% land use change achieving the largest, a 19% reduction. The other scenarios also achieved significant reductions of between 7% and 9%, with buffer strips the most effective of these at close to 9%. The results suggest that if hedge introduction, contour tillage and buffer strips were all applied, sediment reductions would total 24%, considerably improving the current sediment situation. We present a novel cost-effectiveness analysis of our results in which we use the percentage of land removed from production as the cost function. Given the minimal loss of land associated with contour tillage, hedges and buffer strips, we suggest that these management practices are the most cost-effective combination for reducing sediment loads.
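A minimal sketch of the cost-effectiveness ratio described above (sediment reduction per percentage of land removed from production). Only the 19% and ~9% reduction figures are taken from the abstract; the land-removed percentages below are hypothetical placeholders, since the abstract does not report them.

```python
# Reduction figures for the two scenarios quoted in the abstract; the
# land-removed percentages are hypothetical placeholders used only to show the
# calculation (the 40% scenario assumes the changed land counts as removed).
scenarios = {
    "40% land use change": {"reduction_pct": 19.0, "land_removed_pct": 40.0},  # hypothetical cost
    "buffer strips":       {"reduction_pct": 9.0,  "land_removed_pct": 2.0},   # hypothetical cost
}

for name, s in scenarios.items():
    ratio = s["reduction_pct"] / s["land_removed_pct"]
    print(f"{name:20s}: {ratio:.2f}% sediment reduction per % of land removed")
```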
Abstract:
In order to explore the impact of a degraded semantic system on the structure of language production, we analysed transcripts from autobiographical memory interviews to identify naturally-occurring speech errors by eight patients with semantic dementia (SD) and eight age-matched normal speakers. Relative to controls, patients were significantly more likely to (a) substitute and omit open class words, (b) substitute (but not omit) closed class words, (c) substitute incorrect complex morphological forms and (d) produce semantically and/or syntactically anomalous sentences. Phonological errors were scarce in both groups. The study confirms previous evidence of SD patients’ problems with open class content words which are replaced by higher frequency, less specific terms. It presents the first evidence that SD patients have problems with closed class items and make syntactic as well as semantic speech errors, although these grammatical abnormalities are mostly subtle rather than gross. The results can be explained by the semantic deficit which disrupts the representation of a pre-verbal message, lexical retrieval and the early stages of grammatical encoding.
Abstract:
The introduction of non-toxic fluoride compounds as direct replacements for Thorium Fluoride (ThF4) has renewed interest in the use of low index fluoride compounds in high performance infrared filters. This paper reports the results of an investigation into the effects of combining these low index materials, particularly Barium Fluoride (BaF2), with the high index material Lead Telluride (PbTe) in bandpass and edge filters. Infrared filter designs using the conventional and the new material combinations are compared, and infrared filters using these material combinations have been manufactured and shown to suffer problems with residual stress. A possible solution to this problem, utilising Zinc Sulphide (ZnS) layers with compensating compressive stress, is discussed.
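As a rough illustration of the stress-compensation idea (not a result from the paper), the sketch below sums stress times thickness over a multilayer to obtain the net force the coating exerts on its substrate, and shows how a compressive ZnS layer can offset a net tensile coating stress. All stress and thickness values are hypothetical placeholders.

```python
# Sketch of stress compensation in a multilayer coating. The net force per unit
# width on the substrate is the sum of stress x thickness over the layers; a
# layer of opposite-sign (compressive) stress can offset a net tensile stress.
# All values below are illustrative placeholders, not measured film stresses.
def net_coating_stress(layers):
    """layers: list of (stress_MPa, thickness_nm); returns (force per unit width, mean stress)."""
    force = sum(stress * thickness for stress, thickness in layers)   # MPa * nm per unit width
    total_thickness = sum(thickness for _, thickness in layers)
    return force, force / total_thickness

# Hypothetical BaF2/PbTe stack with a net tensile (positive) stress.
stack = [(+150.0, 600.0), (-50.0, 400.0)] * 4          # (BaF2-like, PbTe-like) pairs
print("Without compensation:", net_coating_stress(stack))

# Adding a ZnS layer with compressive (negative) stress reduces the net force.
stack_with_zns = stack + [(-300.0, 800.0)]
print("With ZnS layer      :", net_coating_stress(stack_with_zns))
```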