Abstract:
Previously we described a heterosexual outbreak of HIV-1 subtype B in a town in the north of England (Doncaster) where 11 of 13 infections were shown to be linked by phylogenetic analysis of the env gp120 region. The 11 infections were related to a putative index case, Don1, and further divided into two groups based on the patients' disease status, their viral sequences, and other epidemiological information. Here we describe two further findings. First, we found that viral isolates and gp120 recombinant viruses derived from patients from one group used the CCR5 coreceptor, whereas viruses from the other group could use both the CCR5 and CXCR4 coreceptors. Patients with the X4/R5 dual tropic strains were symptomatic when diagnosed and progressed rapidly, in contrast to the other patient group, which has remained asymptomatic, implying a link between the tropism of the strains and disease outcome. Second, we present additional sequence data derived from the index case, demonstrating the presence of sequences from both clades, with an average interclade distance of 9.56%, providing direct evidence of a genetic link between these two groups. This new study shows that Don1 harbored both strains, implying either that he was dually infected or that intrahost diversification from the R5 to the R5/X4 phenotype occurred over time. These events may account for the spread of two genetically related strains with different pathogenic properties within the same heterosexual community.
Abstract:
This paper presents a queue-based agent architecture for multimodal interfaces. Using a novel approach to intelligently organise both agents and input data, this system has the potential to outperform current state-of-the-art multimodal systems, while at the same time allowing greater levels of interaction and flexibility. This assertion is supported by simulation test results showing that significant improvements can be obtained over normal sequential agent scheduling architectures. For real usage, this translates into faster, more comprehensive systems, without the limited application domain that restricts current implementations.
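The abstract stops short of implementation detail, but the idea of a queue that orders agents and input data by priority rather than strict arrival order can be sketched as below; the task names, urgency scores, and heap-based queue are illustrative assumptions, not the paper's actual architecture.

```python
import heapq

def sequential_schedule(tasks):
    """Baseline: run agent tasks strictly in arrival order."""
    return [name for name, _ in tasks]

def queue_schedule(tasks):
    """Run agent tasks from a priority queue keyed on urgency
    (lower value = more urgent); the arrival index breaks ties stably."""
    heap = [(urgency, i, name) for i, (name, urgency) in enumerate(tasks)]
    heapq.heapify(heap)
    order = []
    while heap:
        _, _, name = heapq.heappop(heap)
        order.append(name)
    return order

# Three multimodal inputs arriving in this order, with urgency scores.
tasks = [("speech", 2), ("gesture", 1), ("gaze", 3)]
```

Under this toy model the gesture input, though second to arrive, is handled first, which is the kind of reordering a sequential agent scheduler cannot do.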
Abstract:
There is growing concern about reducing greenhouse gas emissions all over the world. In its recent Post-Copenhagen Report on Climate Change, the U.K. set emission reduction targets of 34% by 2020 and 80% by 2050, relative to 1990 levels. In practice, Life Cycle Cost (LCC) and Life Cycle Assessment (LCA) tools have been introduced to the construction industry to help achieve these targets. However, there is a clear disconnect between costs and environmental impacts over the life cycle of a built asset when using these two tools. Moreover, changes in Information and Communication Technologies (ICTs) are changing the way information is represented: information is being captured more easily and distributed more quickly to different stakeholders through tools such as Building Information Modelling (BIM), yet with little consideration given to incorporating LCC and LCA and maximising their usage within the BIM environment. The aim of this paper is to propose the development of a model-based LCC and LCA tool that supports sustainable building design decisions for clients, architects and quantity surveyors, so that an optimal investment decision can be made by studying the trade-off between costs and environmental impacts. Finally, an application framework is proposed as future work, showing how the proposed model can be incorporated into the BIM environment in practice.
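As a minimal sketch of the cost/impact trade-off study the paper proposes, one could rank design alternatives by a weighted sum of normalised LCC and LCA scores; the design names, scores, and weighting scheme below are hypothetical, not from the paper.

```python
def best_design(options, weight_env):
    """Pick the design alternative minimising a weighted sum of its
    normalised life cycle cost and life cycle environmental impact.
    `options` maps a design name to a (lcc, lca) pair; `weight_env`
    in [0, 1] expresses how strongly the client weighs environment
    over cost.  All names and scores here are hypothetical."""
    return min(options, key=lambda d: (1.0 - weight_env) * options[d][0]
                                      + weight_env * options[d][1])

# Two hypothetical designs: a cheaper baseline versus a costlier,
# lower-impact envelope upgrade (scores normalised to [0, 1]).
designs = {"baseline": (0.9, 0.8), "efficient_envelope": (1.0, 0.3)}
```

With a client weighting the environment heavily, the costlier low-impact design wins; with a cost-dominated weighting, the baseline does, which is the trade-off the proposed tool would let stakeholders explore.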
Abstract:
The application of automatic segmentation methods in lesion detection is desirable. However, such methods are restricted by intensity similarities between lesioned and healthy brain tissue. Using multi-spectral magnetic resonance imaging (MRI) modalities may overcome this problem, but it is not always practicable. In this article, a lesion detection approach requiring a single MRI modality is presented, which is an improved method based on a recent publication. This new method assumes that a low similarity should be found in the regions of lesions when the likeness between an intensity-based fuzzy segmentation and location-based tissue probabilities is measured. The use of a normalized similarity measurement enables the current method to fine-tune the threshold for lesion detection, thus maximizing the possibility of reaching high detection accuracy. Importantly, an extra cleaning step is included in the current approach, which removes enlarged ventricles from detected lesions. The performance investigation using simulated lesions demonstrated not only that the majority of lesions were well detected but also that normal tissues were identified effectively. Tests on images acquired in stroke patients further confirmed the strength of the method in lesion detection. When compared with the previous version, the current approach showed a higher sensitivity in detecting small lesions and had fewer false positives around the ventricle and the edge of the brain.
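A minimal 1-D sketch of the similarity test described above, assuming toy membership values and a hypothetical threshold (the actual method works on full MRI volumes):

```python
def detect_lesions(fuzzy_seg, tissue_prob, threshold=0.5):
    """Flag voxels where an intensity-based fuzzy membership disagrees
    with a location-based tissue probability: low normalised similarity
    suggests a lesion.  Hypothetical 1-D illustration of the idea."""
    flags = []
    for f, p in zip(fuzzy_seg, tissue_prob):
        # Normalised similarity in [0, 1]; 1 means perfect agreement.
        sim = 1.0 - abs(f - p)
        flags.append(sim < threshold)
    return flags

fuzzy = [0.9, 0.8, 0.1, 0.85]   # intensity-based membership at four voxels
prob  = [0.95, 0.9, 0.9, 0.8]   # atlas-based tissue probability at the same voxels
```

Only the third voxel, where intensity and atlas strongly disagree, is flagged; tuning `threshold` is the fine-tuning step the abstract describes.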
Abstract:
This paper demonstrates that the use of GARCH-type models for the calculation of minimum capital risk requirements (MCRRs) may lead to the production of inaccurate and therefore inefficient capital requirements. We show that this inaccuracy stems from the fact that GARCH models typically overstate the degree of persistence in return volatility. A simple modification to the model is found to improve the accuracy of MCRR estimates in both back- and out-of-sample tests. Given that internal risk management models are currently in widespread use in some parts of the world (most notably the USA), and will soon be permitted for EC banks and investment firms, we believe that our paper should serve as a valuable caution to risk management practitioners who are using, or intend to use, this popular class of models.
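The persistence issue is easy to see in the standard GARCH(1,1) h-step variance forecast, where shocks decay at rate (alpha + beta)^h; the numbers below are toy values, not estimates from the paper.

```python
def garch_var_forecast(omega, alpha, beta, sigma2_now, horizon):
    """h-step-ahead conditional variance forecast for a GARCH(1,1):
    sigma^2_{t+h} = long_run + (alpha + beta)**h * (sigma2_now - long_run),
    where persistence = alpha + beta and long_run = omega / (1 - persistence).
    High persistence makes a volatility shock decay very slowly."""
    persistence = alpha + beta
    long_run = omega / (1.0 - persistence)
    return long_run + persistence ** horizon * (sigma2_now - long_run)

# Same long-run variance (2.0) and same current shock, different persistence.
high_persist = garch_var_forecast(0.02, 0.09, 0.90, sigma2_now=4.0, horizon=10)
low_persist  = garch_var_forecast(0.40, 0.09, 0.71, sigma2_now=4.0, horizon=10)
```

With persistence 0.99 the 10-step forecast stays near the shocked level, feeding an inflated capital requirement; with persistence 0.80 it has largely reverted to the long-run variance. Overstated persistence therefore translates directly into overstated MCRRs.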
Abstract:
Biomass is an important source of energy in Thailand and is currently the main renewable energy source, accounting for 40% of the renewable energy used. The Department of Alternative Energy Development and Efficiency (DEDE), Ministry of Energy, Thailand, has been promoting the use of renewable energy in Thailand for the past decade. The new target for renewable energy usage in the country is set at 25% of the final energy demand in 2021. Thailand is the world’s fourth largest producer of cassava and this results in the production of significant amounts of cassava rhizome, which is a waste product. Cassava rhizome has the potential to be co-fired with coal for the production of heat and power. With suitable co-firing ratios, little modification will be required in the co-firing technology. This review article is concerned with an investigation of the feasibility of co-firing cassava rhizome in a combined heat and power system for a cassava-based bio-ethanol plant in Thailand. Enhanced use of cassava rhizome for heat and power production could potentially contribute to a reduction of greenhouse gas emissions and costs, and would help the country to meet the 2021 renewable energy target.
Abstract:
Scene classification based on latent Dirichlet allocation (LDA) builds on a more general modeling approach known as the bag of visual words, in which the construction of a visual vocabulary is a crucial quantization process for ensuring the success of the classification. A framework is developed using the following new aspects: Gaussian mixture clustering for the quantization process; the use of an integrated visual vocabulary (IVV), which is built as the union of all centroids obtained from the separate quantization process of each class; and the use of additional features, including the edge orientation histogram, CIELab color moments, and the gray-level co-occurrence matrix (GLCM). The experiments are conducted on IKONOS images with six semantic classes (tree, grassland, residential, commercial/industrial, road, and water). The results show that the use of an IVV increases the overall accuracy (OA) by 11 to 12% when implemented on the selected features and by 6% when implemented on all features. The selected features of CIELab color moments and GLCM provide a better OA than the implementation of CIELab color moments or GLCM individually, which increases the OA by only ∼2 to 3%. Moreover, the results show that the OA of LDA outperforms the OA of C4.5 and naive Bayes tree by ∼20%. © 2014 Society of Photo-Optical Instrumentation Engineers (SPIE) [DOI: 10.1117/1.JRS.8.083690]
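A minimal sketch of the IVV construction, with plain per-class centroid lists standing in for the Gaussian mixture components the paper actually uses (all values hypothetical):

```python
def build_ivv(per_class_centroids):
    """Integrated visual vocabulary: the union of the centroids obtained
    from quantising each class separately."""
    ivv = []
    for centroids in per_class_centroids.values():
        ivv.extend(centroids)
    return ivv

def nearest_word(feature, vocabulary):
    """Map a feature vector to the index of its closest visual word
    (squared Euclidean distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(vocabulary)), key=lambda i: dist2(feature, vocabulary[i]))

# Toy 2-D features: one centroid per class from separate quantisation.
vocab = build_ivv({"tree": [(0.1, 0.9)], "water": [(0.9, 0.1)]})
```

Because every class contributes its own centroids, class-specific structure survives in the shared vocabulary, which is the intuition behind the reported accuracy gain.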
Abstract:
Current European Union regulatory risk assessment allows application of pesticides provided that recovery of nontarget arthropods in-crop occurs within a year. Despite the long-established theory of source-sink dynamics, risk assessment ignores depletion of surrounding populations and typical field trials are restricted to plot-scale experiments. In the present study, the authors used agent-based modeling of 2 contrasting invertebrates, a spider and a beetle, to assess how the area of pesticide application and environmental half-life affect the assessment of recovery at the plot scale and impact the population at the landscape scale. Small-scale plot experiments were simulated for pesticides with different application rates and environmental half-lives. The same pesticides were then evaluated at the landscape scale (10 km × 10 km) assuming continuous year-on-year usage. The authors' results show that recovery time estimated from plot experiments is a poor indicator of long-term population impact at the landscape level and that the spatial scale of pesticide application strongly determines population-level impact. This raises serious doubts as to the utility of plot-recovery experiments in pesticide regulatory risk assessment for population-level protection. Predictions from the model are supported by empirical evidence from a series of studies carried out in the decade starting in 1988. The issues raised then can now be addressed using simulation. Prediction of impacts at landscape scales should be more widely used in assessing the risks posed by environmental stressors.
Abstract:
The automatic transformation of sequential programs for efficient execution on parallel computers involves a number of analyses and restructurings of the input. Some of these analyses are based on computing array sections, a compact description of a range of array elements. Array sections describe the set of array elements that are either read or written by program statements. These sections can be compactly represented using shape descriptors such as regular sections, simple sections, or generalized convex regions. However, binary operations such as Union performed on these representations do not satisfy a straightforward closure property, e.g., if the operands to Union are convex, the result may be nonconvex. Approximations are therefore used to satisfy this closure property. These approximations introduce imprecision in the analyses and, furthermore, the imprecisions resulting from successive operations have a cumulative effect. Delayed merging is a technique suggested and used in some of the existing analyses to minimize the effects of approximation. However, this technique does not guarantee an exact solution in a general setting. This article presents a generalized technique to compute Union precisely, which overcomes these imprecisions.
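The closure problem is easy to demonstrate with rectangular (regular) sections: the smallest single section covering a union can contain many elements that are in neither operand, which is exactly the imprecision delayed merging tries to postpone. A toy 2-D sketch:

```python
def bounding_box(a, b):
    """Smallest rectangular section covering both operands: the
    closure-preserving approximation of Union described in the text.
    Sections are tuples of per-dimension (lo, hi) inclusive ranges."""
    return tuple((min(x0, y0), max(x1, y1))
                 for (x0, x1), (y0, y1) in zip(a, b))

def elements(section):
    """Enumerate the array elements a 2-D section describes."""
    (r0, r1), (c0, c1) = section
    return {(r, c) for r in range(r0, r1 + 1) for c in range(c0, c1 + 1)}

a = ((0, 1), (0, 1))                 # top-left 2x2 block
b = ((2, 3), (2, 3))                 # bottom-right 2x2 block
approx = bounding_box(a, b)          # single 4x4 section
exact = elements(a) | elements(b)    # the true union: 8 elements
spurious = elements(approx) - exact  # elements in neither operand
```

The approximate union covers 16 elements while the exact union has only 8; delayed merging would keep the list `[a, b]` instead of merging eagerly, but as the abstract notes this does not guarantee exactness in general.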
Abstract:
An evidence-based review of the potential impact that the introduction of genetically-modified (GM) cereal and oilseed crops could have for the UK was carried out. The inter-disciplinary research project addressed the key research questions using scenarios for the uptake, or not, of GM technologies. This was followed by an extensive literature review, stakeholder consultation and financial modelling. The world area of canola, oilseed rape (OSR) low in both erucic acid in the oil and glucosinolates in the meal, was 34M ha in 2012 of which 27% was GM; Canada is the lead producer but it is also grown in the USA, Australia and Chile. Farm level effects of adopting GM OSR include: lower production costs; higher yields and profits; and ease of farm management. Growing GM OSR instead of conventional OSR reduces both herbicide usage and environmental impact. Some 170M ha of maize was grown in the world in 2011 of which 28% was GM; the main producers are the USA, China and Brazil. Spain is the main EU producer of GM maize although it is also grown widely in Portugal. Insect resistant (IR) and herbicide tolerant (HT) are the GM maize traits currently available commercially. Farm level benefits of adopting GM maize are lower costs of production through reduced use of pesticides and higher profits. GM maize adoption results in less pesticide usage than on conventional counterpart crops leading to less residues in food and animal feed and allowing increasing diversity of bees and other pollinators. In the EU, well-tried coexistence measures for growing GM crops in the proximity of conventional crops have avoided gene flow issues. Scientific evidence so far seems to indicate that there has been no environmental damage from growing GM crops. They may possibly even be beneficial to the environment as they result in less pesticides and herbicides being applied and improved carbon sequestration from less tillage. 
A review of work on GM cereals relevant for the UK found input trait work on: herbicide and pathogen tolerance; abiotic stress such as from drought or salinity; and yield traits under different field conditions. For output traits, work has mainly focussed on modifying the nutritional components of cereals and in connection with various enzymes, diagnostics and vaccines. Scrutiny of applications submitted for field trial testing of GM cereals found around 9000 applications in the USA, 15 in Australia and 10 in the EU since 1996. There have also been many patent applications and granted patents for GM cereals in the USA for both input and output traits; an indication of the scale of such work is the fact that in a 6 week period in the spring of 2013, 12 patents were granted relating to GM cereals. A dynamic financial model has enabled us to better understand and examine the likely performance of Bt maize and HT OSR for the south of the UK, if cultivation is permitted in the future. It was found that for continuous growing of Bt maize and HT OSR, unless there was pest pressure for the former and weed pressure for the latter, the seed premia and likely coexistence costs for a buffer zone between other crops would reduce the financial returns for the GM crops compared with their conventional counterparts. When modelling HT OSR in a four crop rotation, it was found that gross margins increased significantly at the higher levels of such pest or weed pressure, particularly for farm businesses with larger fields where coexistence costs would be scaled down. The impact of the supply of UK-produced GM crops on the wider supply chain was examined through an extensive literature review and widespread stakeholder consultation with the feed supply chain.
The animal feed sector would benefit from cheaper supplies of raw materials if GM crops were grown and, in the future, they might also benefit from crops with enhanced nutritional profile (such as having higher protein levels) becoming available. This would also be beneficial to livestock producers enabling lower production costs and higher margins. Whilst coexistence measures would result in increased costs, it is unlikely that these would cause substantial changes in the feed chain structure. Retailers were not concerned about a future increase in the amount of animal feed coming from GM crops. To conclude, we (the project team) feel that the adoption of currently available and appropriate GM crops in the UK in the years ahead would benefit farmers, consumers and the feed chain without causing environmental damage. Furthermore, unless British farmers are allowed to grow GM crops in the future, the competitiveness of farming in the UK is likely to decline relative to that globally.
Abstract:
Wireless Sensor Networks (WSNs) have been an exciting topic in recent years. The services offered by a WSN can be classified into three major categories: monitoring, alerting, and information on demand. WSNs have been used for a variety of applications related to the environment (agriculture, water and forest fire detection), the military, buildings, health (elderly people and home monitoring), disaster relief, and area or industrial monitoring. In most WSNs, tasks such as processing the sensed data, making decisions and generating emergency messages are carried out by a remote server, hence the need for efficient means of transferring data across the network. Because of the range of applications and types of WSN there is a need for different kinds of MAC and routing protocols in order to guarantee delivery of data from the source nodes to the server (or sink). In order to minimize energy consumption and increase performance in areas such as reliability of data delivery, extensive research has been conducted and documented in the literature on designing energy-efficient protocols for each individual layer. The most common way to conserve energy in WSNs involves using the MAC layer to put the transceiver and the processor of the sensor node into a low-power sleep state when they are not being used. Hence the energy wasted due to collisions, overhearing and idle listening is reduced. As a result of this strategy for saving energy, the routing protocols need new solutions that take into account the sleep state of some nodes, and which also enable the lifetime of the entire network to be increased by distributing energy usage between nodes over time. This means that a combined MAC and routing protocol could significantly improve WSNs, because the interaction between the MAC and network layers lets nodes be active at the same time in order to deal with data transmission.
In the research presented in this thesis, a cross-layer protocol based on MAC and routing protocols was designed in order to improve the capability of WSNs for a range of different applications. Simulation results, based on a range of realistic scenarios, show that these new protocols improve WSNs by reducing their energy consumption as well as enabling them to support mobile nodes, where necessary. A number of conference and journal papers have been published to disseminate these results for a range of applications.
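A back-of-the-envelope sketch of why the MAC-layer sleep states described above dominate energy savings; the battery capacity, power figures, and duty cycle are illustrative assumptions, not measurements from the thesis.

```python
def node_lifetime(battery_joules, active_mw, sleep_mw, duty_cycle):
    """Estimated node lifetime in seconds when the transceiver is awake
    for a fraction `duty_cycle` of the time and asleep otherwise."""
    avg_watts = (duty_cycle * active_mw + (1.0 - duty_cycle) * sleep_mw) / 1000.0
    return battery_joules / avg_watts

# Illustrative figures: 10 kJ battery, 60 mW active, 0.05 mW asleep.
always_on   = node_lifetime(10_000, 60.0, 0.05, duty_cycle=1.0)
duty_cycled = node_lifetime(10_000, 60.0, 0.05, duty_cycle=0.1)
```

Under these toy numbers a 10% duty cycle extends node lifetime nearly tenfold, which is why cross-layer routing must cope with neighbours that are asleep most of the time.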
Abstract:
The evolution of commodity computing led to the possibility of efficient usage of interconnected machines to solve computationally-intensive tasks, which were previously solvable only by using expensive supercomputers. This, however, required new methods for process scheduling and distribution, considering the network latency, communication cost, heterogeneous environments and distributed computing constraints. An efficient distribution of processes over such environments requires an adequate scheduling strategy, as the cost of inefficient process allocation is unacceptably high. Therefore, knowledge and prediction of application behavior is essential to perform effective scheduling. In this paper, we overview the evolution of scheduling approaches, focusing on distributed environments. We also evaluate the current approaches for process behavior extraction and prediction, aiming at selecting an adequate technique for online prediction of application execution. Based on this evaluation, we propose a novel model for application behavior prediction, considering chaotic properties of such behavior and the automatic detection of critical execution points. The proposed model is applied and evaluated for process scheduling in cluster and grid computing environments. The obtained results demonstrate that prediction of the process behavior is essential for efficient scheduling in large-scale and heterogeneous distributed environments, outperforming conventional scheduling policies by a factor of 10, and even more in some cases. Furthermore, the proposed approach proves to be efficient for online predictions due to its low computational cost and good precision. (C) 2009 Elsevier B.V. All rights reserved.
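A minimal sketch of how a behavior prediction feeds the scheduler; the paper's model is chaos-aware with critical-point detection, so the exponential moving average below is only a simple stand-in, and all names and numbers are hypothetical.

```python
def predict_next(bursts, alpha=0.5):
    """Exponentially weighted prediction of the next execution burst.
    A simple stand-in for the paper's chaos-aware prediction model."""
    est = bursts[0]
    for b in bursts[1:]:
        est = alpha * b + (1.0 - alpha) * est
    return est

def pick_node(node_loads, node_speeds, predicted_work):
    """Assign the process to the node expected to finish it earliest,
    given each node's current load (seconds) and relative speed."""
    return min(node_loads,
               key=lambda n: node_loads[n] + predicted_work / node_speeds[n])

predicted = predict_next([4.0, 8.0])            # predicted next burst
chosen = pick_node({"a": 10.0, "b": 2.0},       # current queue lengths
                   {"a": 1.0, "b": 0.5},        # relative node speeds
                   predicted)
```

Even with node "b" running at half speed, its short queue makes it the better placement for the predicted burst; without any prediction the scheduler could not make this comparison at all.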
Abstract:
The main objective of this degree project is to implement an Application Availability Monitoring (AAM) system named Softek EnView for Fujitsu Services. The aim of implementing the AAM system is to proactively identify end user performance problems, such as application and site performance, before the actual end users experience them. No matter how well applications and sites are designed and no matter how well they meet business requirements, they are useless to the end users if the performance is slow and/or unreliable. It is important for the customers to find out whether the end user problems are caused by the network or by application malfunction. Softek EnView comprises the following components: Robot, Monitor, Reporter, Collector and Repository. The implemented system, however, is designed to use only some of these EnView elements: Robot, Reporter and Repository. Robots can be placed at any key user location and are dedicated to customers, which means that as the number of customers increases, the number of Robots increases at the same time. To make the AAM system ideal for the company to use, it was integrated with Fujitsu Services’ centralised monitoring system, BMC PATROL Enterprise Manager (PEM). That was the reason for deciding to drop the EnView Monitor element. After the system was fully implemented, the AAM system was ready for production. Transactions were (and are) written and deployed on Robots to simulate typical end user actions. These transactions are configured to run at certain intervals, which are defined collectively with customers. While they are run against customers’ applications automatically, transactions collect availability data and response time data all the time. In case of a failure in a transaction, the Robot immediately quits the transaction and writes detailed information to a log file about what went wrong and which element failed while going through an application.
Then an alert is generated by a BMC PATROL Agent based on this data and is sent to the BMC PEM. Fujitsu Services’ monitoring room receives the alert and reacts to it according to the incident management process in ITIL, alerting system specialists on critical incidents to resolve problems. As a result of the data gathered by the Robots, weekly reports, which contain detailed statistics and trend analyses of the ongoing quality of IT services, are provided for the customers.
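A minimal sketch of the Robot's transaction loop described above; the step structure and function names are hypothetical, not the EnView API. Each scripted step runs in order, and on the first failure the failing element is reported for the log.

```python
import time

def run_transaction(steps):
    """Drive a scripted end-user transaction step by step, as a Robot
    would.  `steps` is a list of (name, callable) pairs (hypothetical
    structure).  Returns (ok, elapsed_seconds, failed_step); on failure
    the failing step's name identifies which element broke."""
    start = time.monotonic()
    for name, action in steps:
        try:
            action()
        except Exception:
            return (False, time.monotonic() - start, name)
    return (True, time.monotonic() - start, None)

def _fail():
    raise RuntimeError("site down")  # simulated application failure

ok, elapsed, failed = run_transaction([
    ("login", lambda: None),
    ("open_page", _fail),
    ("logout", lambda: None),
])
```

The returned tuple carries exactly the data the text describes: availability (did it succeed), response time, and which element failed, ready to be logged and turned into an alert.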
Abstract:
Text messaging is a new form of writing, brought about by technological development in the last couple of decades. Mobile phone usage has increased rapidly worldwide and texting is now part of many people's everyday communication. A large number of users send or receive texts which include some abbreviations and shortenings, commonly referred to as textspeak. This novel linguistic phenomenon is perceived by some with indifference and by others with aggravation. The following study examines attitudes towards this linguistic change from a gender and age perspective. The comparison between the two groups shows that the most conservative and least positive towards change are young women. The analysis and discussion around this focuses on power, prestige and patterns.
Abstract:
The focus of this thesis is the development and modeling of an interface architecture to be employed for interfacing analog signals in a mixed-signal SOC. We claim that the approach presented here achieves a wide frequency range and covers a large range of applications with constant performance, allied to digital configuration compatibility. Our primary assumptions are to use a fixed analog block and to provide application configurability in the digital domain, which leads to a mixed-signal interface. The use of a fixed analog block avoids the performance loss common to configurable analog blocks. Configurability in the digital domain makes it possible to use all existing tools for high-level design, simulation and synthesis to implement the target application, with very good performance prediction. The proposed approach uses the concept of frequency translation (mixing) of the input signal followed by its conversion to the ΣΔ domain, which makes possible the use of a fairly constant analog block and a uniform treatment of the input signal from DC to high frequencies. The programmability is performed in the ΣΔ digital domain, where performance can be closely matched to the application specification. Theoretical and simulation models of the interface performance are developed for design space exploration and for physical design support. Two prototypes were built and characterized to validate the proposed model and to implement some application examples. The use of this interface as a multi-band parametric ADC and as a two-channel analog multiplier and adder is demonstrated. The multi-channel analog interface architecture is also presented. The characterization measurements support the main advantages of the proposed approach.
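A software sketch of the frequency-translation idea: mixing with a local oscillator moves the band of interest to DC, where a fixed back-end (here a plain average standing in for the ΣΔ conversion and digital filtering) can treat any input band uniformly. The sample rate and frequencies below are illustrative, not from the thesis.

```python
import math

def mix_to_baseband(samples, f_lo, f_s):
    """Frequency translation: multiply the input by a local oscillator
    at f_lo so the band around f_lo lands at DC, then extract the DC
    term with a crude low-pass (the mean).  The fixed averaging back-end
    stands in for the thesis's sigma-delta conversion and digital stage."""
    mixed = [2.0 * s * math.cos(2.0 * math.pi * f_lo * n / f_s)
             for n, s in enumerate(samples)]
    return sum(mixed) / len(mixed)

f_s = 1000.0  # samples per second
tone = [math.cos(2.0 * math.pi * 100.0 * n / f_s) for n in range(1000)]
dc_on  = mix_to_baseband(tone, 100.0, f_s)   # LO tuned to the tone
dc_off = mix_to_baseband(tone, 300.0, f_s)   # LO tuned away from the tone
```

Tuning only the digital oscillator frequency selects which input band reaches the fixed back-end, which is the sense in which one constant analog block can serve inputs from DC to high frequencies.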