866 results for Vehicle routing problems with gains
Abstract:
In the U.K., dental students are required to train and practise on real human tissues at a very early stage of their courses. Currently, human tissues, such as decayed teeth, are mounted in a human-head-like physical model. The problems with these models in teaching are: (1) every student operates on a different tooth, since each tooth is unique; (2) the process cannot be recorded for examination purposes; and (3) the same training is not repeatable. The aim of the PHATOM Project is to develop a dental training system using Haptic technology. This paper documents the project background, specification, research and development of the first prototype system. It also discusses the research in visual display, haptic devices and haptic rendering. This includes stereo vision, motion parallax, volumetric modelling and surface remapping algorithms, as well as the analysis and design of the system. A new volumetric-to-surface model transformation algorithm is also introduced. The paper concludes with future work on system development and research.
The impact of deformation strain on the formation of banded clouds in idealized modeling experiments
Abstract:
Experiments are performed using an idealized version of an operational forecast model to determine the impact on banded frontal clouds of the strength of deformational forcing, low-level baroclinicity, and model representation of convection. Line convection is initiated along the front, and slantwise bands extend from the top of the line-convection elements into the cold air. This banding is attributed primarily to M adjustment. The cross-frontal spreading of the cold pool generated by the line convection leads to further triggering of upright convection in the cold air that feeds into these slantwise bands. Secondary low-level bands form later in the simulations; these are attributed to the release of conditional symmetric instability. Enhanced deformation strain leads to earlier onset of convection and more coherent line convection. A stronger cold pool is generated, but its speed is reduced relative to that seen in experiments with weaker deformational strain, because of inhibition by the strain field. Enhanced low-level baroclinicity leads to the generation of more inertial instability by line convection (for a given capping height of convection), and consequently greater strength of the slantwise circulations formed by M adjustment. These conclusions are based on experiments without a convective-parametrization scheme. Experiments using the standard or a modified scheme for this model demonstrate known problems with the use of this scheme at the awkward 4 km grid length used in these simulations. Copyright © 2008 Royal Meteorological Society
Abstract:
Electrospun fibres based on polypyrrole have been prepared. The incorporation of preformed polypyrrole into fibres electrospun from a carrier polymer can only be achieved when the materials are prepared with particulates smaller than the cross-section of the fibre; even so, there are problems, notably substantial loss of material from the electrode tip. As an alternative approach, soluble polypyrroles can be prepared, but these are not of sufficient viscosity to produce electrospun fibres, so once again a carrier polymer must be employed. More effective loadings are gained by coating the outer surface of a pre-spun fibre; in this way electrospun fibres coated with polypyrrole can be prepared. This approach has been adapted to produce silver-coated polymer fibres by the use of copolymers of styrene and 3-vinyl benzaldehyde.
Abstract:
This paper considers the potential contribution of secondary quantitative analyses of large scale surveys to the investigation of 'other' childhoods. Exploring other childhoods involves investigating the experience of young people who are unequally positioned in relation to multiple, embodied, identity locations, such as (dis)ability, 'class', gender, sexuality, ethnicity and race. Despite some possible advantages of utilising extensive databases, the paper outlines a number of methodological problems with existing surveys which tend to reinforce adultist and broader hierarchical social relations. It is contended that scholars of children's geographies could overcome some of these problematic aspects of secondary data sources by endeavouring to transform the research relations of large scale surveys. Such endeavours would present new theoretical, ethical and methodological complexities, which are briefly considered.
Abstract:
Partnerships are complex, diverse and subtle relationships, the nature of which changes with time, but they are vital for the functioning of the development chain. This paper reviews the meaning of partnership between development institutions as well as some of the main approaches taken to analyse the relationships. The latter typically revolve around analyses based on power, discourse, interdependence and functionality. The paper makes the case for taking a multianalytical approach to understanding partnership but points out three problem areas: identifying acceptable/unacceptable trade-offs between characteristics of partnership, the analysis of multicomponent partnerships (where one partner has a number of other partners) and the analysis of long-term partnership. The latter is especially problematic for long-term partnerships between donors and field agencies that share an underlying commitment based on religious beliefs. These problems with current methods of analysing partnership are highlighted by focusing upon the Catholic Church-based development chain, linking donors in the North (Europe) and their field partners in the South (Abuja Ecclesiastical Province, Nigeria). It explores a narrated history of a relationship with a single donor spanning 35 years from the perspective of one partner (the field agency).
Abstract:
Ozone and temperature profiles from the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) have been assimilated, using three-dimensional variational assimilation, into a stratosphere-troposphere version of the Met Office numerical weather-prediction system. Analyses are made for the month of September 2002, when there was an unprecedented split in the southern hemisphere polar vortex. The analyses are validated against independent ozone observations from sondes, limb-occultation and total column ozone satellite instruments. Through most of the stratosphere, precision varies from 5 to 15%, and biases are 15% or less of the analysed field. Problems remain in the vortex and below the 60 hPa level, especially at the tropopause, where the analyses have too much ozone and poor agreement with independent data. Analysis problems are largely a result of the model rather than the data, giving confidence in the MIPAS ozone retrievals, though there may be a small high bias in MIPAS ozone in the lower stratosphere. Model issues include an excessive Brewer-Dobson circulation, which results both from known problems with the tracer transport scheme and from the data assimilation of dynamical variables. The extreme conditions of the vortex split reveal large differences between existing linear ozone photochemistry schemes. Despite these issues, the ozone analyses successfully describe the ozone hole split and compare well with other studies of this event. Recommendations are made for the further development of the ozone assimilation system.
Abstract:
UK commercial property lease structures have come under considerable scrutiny during the past decade since the property crash of the early 1990s. In particular, tenants complained that the system was unfair and that it had blocked business change. Government is committed, through its 2001 election manifesto, to promoting flexibility and choice in the commercial property lettings market, and a new voluntary Commercial Leases Code of Practice was launched in April 2002. This paper investigates whether occupiers are being offered the leases they require or whether there is a mismatch between occupier requirements and actual leases in the market. It draws together the substantial data now available on the actual terms of leases in the UK and surveys of corporate occupiers' attitudes to their occupation requirements. Although the data indicate that UK leases have become shorter and more diverse since 1990, this is still not sufficient to meet the current requirements of many corporate occupiers. It is clear that the inability to manage entry and exit strategies is a major concern to occupiers. Lease length is the primary concern of tenants, and a number of respondents comment on the mismatch between lease length in the UK and business planning horizons. The right to break and other problems with alienation clauses also pose serious difficulties for occupiers, thus reinforcing the mismatch. Other issues include repairing and insuring clauses and the type of review clause. There are differences in opinion between types of occupier. In particular, international corporate occupiers are significantly more concerned about the length of lease and the incidence of break clauses than national occupiers, and private-sector tenants are significantly more concerned about leasing in general than public-sector occupiers. Proposed solutions by tenants are predictable and include shorter leases, more frequent breaks and relaxation of restrictions concerning alienation and other clauses.
A significant number specify that they would pay more for shorter leases and other improved terms. Short leases would make many of the other terms more acceptable and this is why they are the main concern of corporate occupiers. Overall, the evidence suggests that there continues to be a gap between occupiers' lease requirements and those currently offered by the market. There are underlying structural factors that act as an inertial force on landlords and inhibit the changes which occupiers appear to want. Nevertheless, the findings raise future research questions concerning whether UK lease structures are a constraining factor on UK competitiveness.
Abstract:
We have developed a highly parallel design for a simple genetic algorithm using a pipeline of systolic arrays. The systolic design provides high throughput and unidirectional pipelining by exploiting the implicit parallelism in the genetic operators. The design is significant because, unlike other hardware genetic algorithms, it is independent of both the fitness function and the particular chromosome length used in a problem. We have designed and simulated a version of the mutation array using Xilinx FPGA tools to investigate the feasibility of hardware implementation. A simple 5-chromosome mutation array occupies 195 CLBs and is capable of performing more than one million mutations per second.
I. Introduction
Genetic algorithms (GAs) are established search and optimization techniques which have been applied to a range of engineering and applied problems with considerable success [1]. They operate by maintaining a population of trial solutions encoded using a suitable encoding scheme.
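The mutation operator this abstract describes is independent per gene, which is exactly the property a systolic array exploits (one cell per gene, all operating in lockstep, with no dependence on fitness function or chromosome length). A minimal software sketch of that operator, for illustration only (the paper's design is in hardware; names and parameters here are not from the paper):

```python
import random

def mutate(chromosome, p_mut=0.01, rng=random):
    """Flip each bit independently with probability p_mut.

    Because each gene is handled without reference to any other gene,
    the operator maps naturally onto a pipeline of identical cells.
    """
    return [bit ^ 1 if rng.random() < p_mut else bit for bit in chromosome]

# A small population of 16-bit chromosomes, mutated gene by gene.
population = [[random.randint(0, 1) for _ in range(16)] for _ in range(5)]
mutated = [mutate(c, p_mut=0.05) for c in population]
```

Note that neither chromosome length nor any fitness evaluation appears in `mutate`, mirroring the independence claimed for the hardware design.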
Abstract:
Aircraft OH and HO2 measurements made over West Africa during the AMMA field campaign in summer 2006 have been investigated using a box model constrained to observations of long-lived species and physical parameters. "Good" agreement was found for HO2 (modelled to observed gradient of 1.23 ± 0.11). However, the model significantly overpredicts OH concentrations. The reasons for this are not clear, but may reflect instrumental instabilities affecting the OH measurements. Within the model, HOx concentrations in West Africa are controlled by relatively simple photochemistry, with production dominated by ozone photolysis and reaction of O(1D) with water vapour, and loss processes dominated by HO2 + HO2 and HO2 + RO2. Isoprene chemistry was found to influence forested regions. In contrast to several recent field studies in very low NOx and high isoprene environments, we do not observe any dependence of model success for HO2 on isoprene and attribute this to efficient recycling of HOx through RO2 + NO reactions under the moderate NOx concentrations (5–300 ppt NO in the boundary layer, median 76 ppt) encountered during AMMA. This suggests that some of the problems with understanding the impact of isoprene on atmospheric composition may be limited to the extreme low range of NOx concentrations.
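The "relatively simple photochemistry" described above (production via ozone photolysis and O(1D) + H2O, loss dominated by the HO2 self-reaction) lends itself to a back-of-envelope steady-state estimate. The sketch below is illustrative only: all numerical values are round order-of-magnitude placeholders, not AMMA measurements or the paper's rate data:

```python
import math

# Illustrative daytime boundary-layer values (order of magnitude only)
j_o1d  = 3e-5   # O3 -> O(1D) photolysis frequency, s^-1
o3     = 1e12   # O3 number density, molecule cm^-3 (roughly 40 ppb)
f_h2o  = 0.1    # fraction of O(1D) that reacts with H2O (rest quenched)
k_self = 3e-12  # HO2 + HO2 rate coefficient, cm^3 s^-1

# Each O(1D) + H2O reaction yields two OH radicals, i.e. two HOx
p_hox = 2 * j_o1d * o3 * f_h2o            # HOx production, molecule cm^-3 s^-1

# Steady state with loss dominated by the self-reaction: P = 2 k [HO2]^2
ho2_ss = math.sqrt(p_hox / (2 * k_self))  # ~1e9 molecule cm^-3
```

With these placeholder values the estimate lands near 1e9 molecule cm^-3, the right order of magnitude for boundary-layer HO2, showing why a constrained box model with this chemistry can reproduce HO2 reasonably well.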
Abstract:
A new blood clotting response test was used to determine the susceptibility, to coumatetralyl and bromadiolone, of laboratory strains of Norway rat from Germany and the UK (Hampshire), and wild rats trapped on farms in Wales (UK) and Westphalia (Germany). Resistance factors were calculated in relation to the CD strain of Norway rat. An outbred strain of wild rats, raised from rats trapped in Germany, was found to be more susceptible to coumatetralyl by a factor of 0.5-0.6 compared to the CD strain. Homozygous and heterozygous animals of a strain of resistant rats from Westphalia were cross-resistant to coumatetralyl and bromadiolone, with a higher resistance factor for bromadiolone than that found in both UK strains. Our results show that the degree of altered susceptibility and resistance varies between strains of wild rat and between resistance foci. Some wild rat strains may be more susceptible than laboratory rat strains. Even in a well-established resistance area, it may be difficult to find infestations with resistance high enough to suspect control problems with bromadiolone, even after decades of use of this compound.
Abstract:
Graphical tracking is a technique for crop scheduling in which the actual plant state is plotted against an ideal target curve that encapsulates all crop and environmental characteristics. Management decisions are made on the basis of the position of the actual crop relative to the ideal position. Due to the simplicity of the approach, graphical tracks can be developed on site without the requirement for controlled experimentation. Growth models and graphical tracks are discussed, and an implementation of the Richards curve for graphical tracking is described. In many cases, the more intuitively desirable growth models perform sub-optimally due to problems with the specification of starting conditions, environmental factors outside the scope of the original model, and the introduction of new cultivars. Accurate specification of a biological model requires detailed and usually costly study, and as such is not adaptable to a changing cultivar range and changing cultivation techniques. Fitting of a new graphical track for a new cultivar can be conducted on site and improved over subsequent seasons. Graphical tracking emphasises the current position relative to the objective, and as such does not require the time-consuming or system-specific input of an environmental history, although it does require detailed crop measurement. The approach is flexible and could be applied to a variety of specification metrics, with digital imaging providing a route for added value. For decision making regarding crop manipulation from the observed current state, there is a role for simple short-term predictive modelling to indicate the consequences of crop manipulation.
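The Richards curve mentioned above is a generalised logistic growth function. A minimal sketch of how a graphical track built on it might compare a measured plant state against its target follows; the parameterisation and all numeric values are illustrative assumptions, not taken from the paper:

```python
import math

def richards(t, A, k, t_star, nu):
    """Richards growth curve: asymptote A, growth rate k, location
    parameter t_star, shape parameter nu (nu = 1 gives the logistic)."""
    return A * (1.0 + nu * math.exp(-k * (t - t_star))) ** (-1.0 / nu)

def track_deviation(t, measured, params):
    """Graphical tracking in one line: position of the actual crop
    relative to the ideal target curve at time t.
    Positive -> crop ahead of schedule; negative -> behind."""
    return measured - richards(t, *params)

# Illustrative target track: asymptote 100 (e.g. % of saleable size)
params = (100.0, 0.15, 30.0, 1.0)
deviation = track_deviation(35.0, 70.0, params)  # measured state at day 35
```

A grower would refit `params` on site for each new cultivar over successive seasons, which is the adaptability the abstract contrasts with costly mechanistic model specification.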
Abstract:
The early eighties saw the introduction of liposomes as skin drug delivery systems, initially promoted primarily for localised effects with minimal systemic delivery. Subsequently, a novel ultradeformable vesicular system (termed "Transfersomes" by the inventors) was reported for transdermal delivery with an efficiency similar to subcutaneous injection. Further research illustrated that the mechanisms of liposome action depended on the application regime and the vesicle composition and morphology. Ethical, health and supply problems with human skin have encouraged researchers to use skin models. Traditional models involved polymer membranes and animal tissue, but whilst of value for release studies, such models are not always good mimics for the complex human skin barrier, particularly with respect to the stratum corneum intercellular lipid domains. These lipids have a multiply bilayered organization, with a composition and arrangement somewhat similar to those of liposomes. Consequently, researchers have used vesicles as skin model membranes. Early work first employed phospholipid liposomes and tested their interactions with skin penetration enhancers, typically using thermal analysis and spectroscopic analyses. Another approach probed how incorporation of compounds into liposomes led to the loss of entrapped markers, analogous to "fluidization" of stratum corneum lipids on treatment with a penetration enhancer. Subsequently, scientists employed liposomes formulated with skin lipids in these types of studies. Following a brief description of the nature of the skin barrier to transdermal drug delivery and the use of liposomes in drug delivery through skin, this article critically reviews the relevance of using different types of vesicles as a model for human skin in permeation enhancement studies, concentrating primarily on liposomes after briefly surveying older models.
The validity of different types of liposome is considered and traditional skin models are compared to vesicular model membranes for their precision and accuracy as skin membrane mimics. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
Rats with fornix transection, or with cytotoxic retrohippocampal lesions that removed entorhinal cortex plus ventral subiculum, performed a task that permits incidental learning about either allocentric (Allo) or egocentric (Ego) spatial cues without the need to navigate by them. Rats learned eight visual discriminations among computer-displayed scenes in a Y-maze, using the constant-negative paradigm. Every discrimination problem included two familiar scenes (constants) and many less familiar scenes (variables). On each trial, the rats chose between a constant and a variable scene, with the choice of the variable rewarded. In six problems, the two constant scenes had correlated spatial properties, either Allo (each constant always appeared in the same maze arm) or Ego (each constant always appeared in a fixed direction from the start arm) or both (Allo + Ego). In two No-Cue (NC) problems, the two constants appeared in randomly determined arms and directions. Intact rats learn problems with an added Allo or Ego cue faster than NC problems; this facilitation provides indirect evidence that they learn the associations between scenes and spatial cues, even though that is not required for problem solution. Fornix and retrohippocampal-lesioned groups learned NC problems at a similar rate to sham-operated controls and showed as much facilitation of learning by added spatial cues as did the controls; therefore, both lesion groups must have encoded the spatial cues and have incidentally learned their associations with particular constant scenes. Similar facilitation was seen in subgroups that had short or long prior experience with the apparatus and task. Therefore, neither major hippocampal input-output system is crucial for learning about allocentric or egocentric cues in this paradigm, which does not require rats to control their choices or navigation directly by spatial cues.
Abstract:
Background: Problems with lexical retrieval are common across all types of aphasia but certain word classes are thought to be more vulnerable in some aphasia types. Traditionally, verb retrieval problems have been considered characteristic of non-fluent aphasias but there is growing evidence that verb retrieval problems are also found in fluent aphasia. As verbs are retrieved from the mental lexicon with syntactic as well as phonological and semantic information, it is speculated that an improvement in verb retrieval should enhance communicative abilities in this population as in others. We report on an investigation into the effectiveness of verb treatment for three individuals with fluent aphasia. Methods & Procedures: Multiple pre-treatment baselines were established over 3 months in order to monitor language change before treatment. The three participants then received twice-weekly verb treatment over approximately 4 months. All pre-treatment assessments were administered immediately after treatment and 3 months post-treatment. Outcome & Results: Scores fluctuated in the pre-treatment period. Following treatment, there was a significant improvement in verb retrieval for two of the three participants on the treated items. The increase in scores for the third participant was statistically nonsignificant but post-treatment scores moved from below the normal range to within the normal range. All participants were significantly quicker in the verb retrieval task following treatment. There was an increase in well-formed sentences in the sentence construction test and in some samples of connected speech. Conclusions: Repeated systematic treatment can produce a significant improvement in verb retrieval of practised items and generalise to unpractised items for some participants. An increase in well-formed sentences is seen for some speakers. The theoretical and clinical implications of the results are discussed.
Abstract:
Increasingly, distributed systems are being used to host all manner of applications. While these platforms provide a relatively cheap and effective means of executing applications, so far there has been little work in developing tools and utilities that can help application developers understand problems with the supporting software or the executing applications. To fully understand why an application executing on a distributed system is not behaving as expected, it is important that not only the application but also the underlying middleware and the operating system are analysed; otherwise issues could be missed, and overall performance profiling and fault diagnosis would certainly be harder. We believe that one approach to profiling and analysing distributed systems and their applications is via the plethora of log files generated at runtime. In this paper we report on a system (Slogger) that utilises various emerging Semantic Web technologies to gather the heterogeneous log files generated by the various layers in a distributed system and unify them in a common data store. Once unified, the log data can be queried and visualised in order to highlight potential problems or issues that may be occurring in the supporting software or the application itself.
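The abstract does not detail Slogger's internals, so the following is only an illustrative sketch of the general idea it describes: per-layer parsers normalise heterogeneous log lines into one common, queryable store. The log formats, layer names and field names below are hypothetical, and Slogger itself uses Semantic Web technologies (e.g. RDF stores) rather than plain Python records:

```python
import re

# Hypothetical per-layer log formats: each layer in the distributed
# system writes lines in its own shape, parsed into a common record.
PARSERS = {
    "app":        re.compile(r"(?P<ts>\S+ \S+) (?P<level>\w+) (?P<msg>.*)"),
    "middleware": re.compile(r"\[(?P<ts>[^\]]+)\] (?P<level>\w+): (?P<msg>.*)"),
}

def unify(layer, line):
    """Map one raw log line to a common record, tagged with its layer."""
    m = PARSERS[layer].match(line)
    if not m:
        return None
    rec = m.groupdict()
    rec["layer"] = layer
    return rec

# Gather lines from different layers into one store...
store = [r for r in (
    unify("app", "2008-01-01 12:00:01 ERROR job 17 failed"),
    unify("middleware", "[2008-01-01 12:00:02] WARN: queue backlog"),
) if r]

# ...which can then be queried across layers in one pass.
alerts = [r for r in store if r["level"] in ("ERROR", "WARN")]
```

The point of the unification step is that a single query (here the `alerts` filter) can correlate symptoms across application, middleware and operating-system logs that would otherwise have to be inspected separately.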