914 results for Multiple-scale processing


Relevance: 40.00%

Abstract:

1. There is concern over the possibility of unwanted environmental change following transgene movement from genetically modified (GM) rapeseed Brassica napus to its wild and weedy relatives. 2. The aim of this research was to develop a remote sensing-assisted methodology to help quantify gene flow from crops to their wild relatives over wide areas. Emphasis was placed on locating sites of sympatry, where the frequency of gene flow is likely to be highest, and on measuring the size of rapeseed fields to allow spatially explicit modelling of wind-mediated pollen-dispersal patterns. 3. Remote sensing was used as a tool to locate rapeseed fields, and a variety of image-processing techniques was adopted to facilitate the compilation of a spatially explicit profile of sympatry between the crop and Brassica rapa. 4. Classified satellite images containing rapeseed fields were first used to infer the spatial relationship between donor rapeseed fields and recipient riverside B. rapa populations. Such images also have utility for improving the efficiency of ground surveys by identifying probable sites of sympatry. The same data were then also used for the calculation of mean field size. 5. This paper forms a companion paper to Wilkinson et al. (2003), in which these elements were combined to produce a spatially explicit profile of hybrid formation over the UK. The current paper demonstrates the value of remote sensing and image processing for large-scale studies of gene flow, and describes a generic method that could be applied to a variety of crops in many countries. 6. Synthesis and applications. The decision to approve or prevent the release of a GM cultivar is made at a national rather than regional level. It is highly desirable that data relating to the decision-making process are collected at the same scale, rather than relying on extrapolation from smaller experiments designed at the plot, field or even regional scale.
It would be extremely difficult and labour intensive to attempt to carry out such large-scale investigations without the use of remote-sensing technology. This study used rapeseed in the UK as a model to demonstrate the value of remote sensing in assembling empirical information at a national level.
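The mean-field-size step lends itself to a brief sketch. The abstract does not give the authors' actual processing chain; the following is a minimal illustration, assuming a binary classified raster in which 1 marks rapeseed pixels, connected pixel groups approximate fields, and the function name and pixel area are hypothetical:

```python
import numpy as np
from scipy import ndimage

def mean_field_size(mask: np.ndarray, pixel_area_m2: float) -> float:
    """Label connected rapeseed pixels as fields and return the mean field area."""
    labels, n_fields = ndimage.label(mask)  # 4-connected components by default
    if n_fields == 0:
        return 0.0
    # Count pixels per labelled field (label 0 is background, so it is dropped).
    sizes = np.bincount(labels.ravel())[1:]
    return float(sizes.mean() * pixel_area_m2)

# Toy classified image: two separate "fields" of 4 and 2 pixels.
mask = np.array([
    [1, 1, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 1],
])
print(mean_field_size(mask, pixel_area_m2=625.0))  # mean of 4 and 2 pixels -> 1875.0
```

On a real classified satellite scene the same labelling step also yields field locations, which is the input needed for the spatially explicit pollen-dispersal modelling described above.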

Relevance: 40.00%

Abstract:

Anthropogenic emissions of heat and exhaust gases play an important role in the atmospheric boundary layer, altering air quality, greenhouse gas concentrations and the transport of heat and moisture at various scales. This is particularly evident in urban areas where emission sources are integrated in the highly heterogeneous urban canopy layer and directly linked to human activities which exhibit significant temporal variability. It is common practice to use eddy covariance observations to estimate turbulent surface fluxes of latent heat, sensible heat and carbon dioxide, which can be attributed to a local-scale source area. This study provides a method to assess the influence of micro-scale anthropogenic emissions on heat, moisture and carbon dioxide exchange in a highly urbanized environment for two sites in central London, UK. A new algorithm for the Identification of Micro-scale Anthropogenic Sources (IMAS) is presented, with two aims. Firstly, IMAS filters out the influence of micro-scale emissions and allows for the analysis of the turbulent fluxes representative of the local-scale source area. Secondly, it is used to give a first-order estimate of the anthropogenic heat flux and carbon dioxide flux representative of the building scale. The algorithm is evaluated using directional and temporal analysis. The algorithm is then used at a second site which was not incorporated in its development. The spatial and temporal local-scale patterns, as well as micro-scale fluxes, appear physically reasonable and can be incorporated in the analysis of long-term eddy covariance measurements at the sites in central London. In addition to the new IMAS technique, further steps in quality control and quality assurance used for the flux processing are presented. The methods and results have implications for urban flux measurements in dense urbanised settings with significant sources of heat and greenhouse gases.
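The IMAS algorithm itself is not described in enough detail in the abstract to reproduce. As a rough sketch of the directional-filtering idea only — discarding flux samples whose wind direction points at a known micro-scale source before computing local-scale statistics — assuming simple (wind direction, flux) pairs and hypothetical sector bounds:

```python
def in_sector(wd: float, lo: float, hi: float) -> bool:
    """True if wind direction wd (degrees) lies in sector [lo, hi], wrapping at 360."""
    wd, lo, hi = wd % 360, lo % 360, hi % 360
    return lo <= wd <= hi if lo <= hi else (wd >= lo or wd <= hi)

def filter_local_scale(samples, source_sectors):
    """Keep (wind_dir, flux) samples whose direction lies outside every sector
    flagged as containing a micro-scale anthropogenic source."""
    return [f for wd, f in samples
            if not any(in_sector(wd, lo, hi) for lo, hi in source_sectors)]

samples = [(10.0, 120.0), (45.0, 400.0), (200.0, 95.0), (350.0, 380.0)]
# Sector 330-60 degrees contains a hypothetical building exhaust vent.
print(filter_local_scale(samples, [(330.0, 60.0)]))  # [95.0]
```

The samples removed by such a filter are exactly those that a building-scale estimate (the second aim above) would instead retain.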

Relevance: 40.00%

Abstract:

Large-scale simulations of parts of the brain using detailed neuronal models to improve our understanding of brain functions are becoming a reality with the use of supercomputers and large clusters. However, the high acquisition and maintenance cost of these computers, including the physical space, air conditioning, and electrical power, limits the number of simulations of this kind that scientists can perform. Modern commodity graphics cards, based on the CUDA platform, contain graphical processing units (GPUs) composed of hundreds of processors that can simultaneously execute thousands of threads and thus constitute a low-cost solution for many high-performance computing applications. In this work, we present a CUDA algorithm that enables the execution, on multiple GPUs, of simulations of large-scale networks composed of biologically realistic Hodgkin-Huxley neurons. The algorithm represents each neuron as a CUDA thread, which solves the set of coupled differential equations that model each neuron. Communication among neurons located in different GPUs is coordinated by the CPU. We obtained speedups of 40 for the simulation of 200k neurons that received random external input, and speedups of 9 for a network with 200k neurons and 20M neuronal connections, on a single computer with two graphics boards with two GPUs each, when compared with a modern quad-core CPU. Copyright (C) 2010 John Wiley & Sons, Ltd.
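The per-thread computation can be illustrated without CUDA: each GPU thread integrates one Hodgkin-Huxley system of coupled ODEs. A minimal single-neuron sketch, assuming the standard squid-axon parameters and plain forward-Euler integration (the paper's solver and parameter values may differ):

```python
import math

def hh_step(V, m, h, n, I_ext, dt):
    """One forward-Euler step of the Hodgkin-Huxley equations
    (standard squid-axon parameters; V in mV, t in ms)."""
    aM = 0.1 * (V + 40.0) / (1.0 - math.exp(-(V + 40.0) / 10.0))
    bM = 4.0 * math.exp(-(V + 65.0) / 18.0)
    aH = 0.07 * math.exp(-(V + 65.0) / 20.0)
    bH = 1.0 / (1.0 + math.exp(-(V + 35.0) / 10.0))
    aN = 0.01 * (V + 55.0) / (1.0 - math.exp(-(V + 55.0) / 10.0))
    bN = 0.125 * math.exp(-(V + 65.0) / 80.0)
    # Membrane currents (gNa=120, gK=36, gL=0.3 mS/cm^2; C=1 uF/cm^2).
    I_Na = 120.0 * m**3 * h * (V - 50.0)
    I_K = 36.0 * n**4 * (V + 77.0)
    I_L = 0.3 * (V + 54.4)
    V += dt * (I_ext - I_Na - I_K - I_L)
    m += dt * (aM * (1.0 - m) - bM * m)
    h += dt * (aH * (1.0 - h) - bH * h)
    n += dt * (aN * (1.0 - n) - bN * n)
    return V, m, h, n

# Drive one neuron with a constant 10 uA/cm^2 current for 50 ms (dt = 0.01 ms).
V, m, h, n = -65.0, 0.05, 0.6, 0.32
peak = V
for _ in range(5000):
    V, m, h, n = hh_step(V, m, h, n, 10.0, 0.01)
    peak = max(peak, V)
print(peak > 0.0)  # the neuron spikes under this drive
```

In the multi-GPU setting described above, many such integrations run concurrently, with the CPU exchanging spike/synaptic information between boards each step.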

Relevance: 40.00%

Abstract:

The article proposes a displacement-measurement approach using a single digital camera and explores its feasibility for modal analysis applications. The approach is non-contact and able to measure multiple points simultaneously with a single camera. A modal analysis of a reduced-scale laboratory building structure, based only on the structural responses measured with the camera, is presented. The focus is on the feasibility, and the advantages, of using a simple ordinary camera to perform output-only modal analysis of structures. The modal parameters of the structure are estimated from the camera data and also by conventional experimental modal analysis based on Frequency Response Functions (FRF) obtained with the usual sensors, such as an accelerometer and a force cell. The comparison of the two analyses showed that the technique is a promising non-contact measurement tool, relatively simple and effective for use in structural modal analysis.
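A first output-only estimate of a natural frequency from camera-tracked displacement can be sketched simply. This is not the article's estimation procedure, just a minimal peak-picking illustration on a synthetic signal (frame rate and mode frequency are hypothetical):

```python
import numpy as np

def dominant_frequency(x, fs):
    """Estimate the dominant frequency (Hz) of a response signal
    from the peak of its FFT magnitude spectrum."""
    x = np.asarray(x) - np.mean(x)          # remove the static offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

# Synthetic camera-tracked displacement: a 4 Hz mode sampled at 100 fps for 10 s.
fs = 100.0
t = np.arange(0, 10, 1.0 / fs)
signal = 2.0 * np.sin(2 * np.pi * 4.0 * t)
print(dominant_frequency(signal, fs))  # close to 4.0 Hz
```

With multiple tracked points, the same spectra additionally yield relative amplitudes and phases, from which mode shapes can be assembled.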

Relevance: 40.00%

Abstract:

Many recent survival studies propose modeling data with a cure fraction, i.e., data in which part of the population is not susceptible to the event of interest. This event may occur more than once for the same individual (a recurrent event). We then have a scenario of recurrent event data in the presence of a cure fraction, which may appear in various areas such as oncology, finance and industry, among others. This paper proposes a multiple time scale survival model to analyze recurrent events with a cure fraction. The objective is to analyze, in terms of covariates and censoring, the efficiency of interventions intended to prevent the studied event from happening again. All estimates were obtained using a sampling-based approach, which allows prior information to be incorporated with low computational effort. Simulations were based on a clinical scenario in order to observe some frequentist properties of the estimation procedure for small and moderate sample sizes. An application to a well-known set of real mammary tumor data is provided.
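The data-generating scenario can be sketched by simulation. This is not the paper's model or sampler, merely an illustrative Monte Carlo set-up assuming a Bernoulli cure indicator and exponential gap times between recurrences (all parameter values hypothetical):

```python
import random

def simulate_recurrences(n, cure_prob, rate, follow_up, rng):
    """Simulate recurrent-event counts: cured subjects never experience the
    event; susceptible ones relapse with exponential gap times until the
    end of follow-up (administrative censoring)."""
    counts = []
    for _ in range(n):
        if rng.random() < cure_prob:
            counts.append(0)                 # cured: immune to the event
            continue
        t, events = 0.0, 0
        while True:
            t += rng.expovariate(rate)       # gap time to the next recurrence
            if t > follow_up:
                break                        # censored before the next event
            events += 1
        counts.append(events)
    return counts

rng = random.Random(42)
counts = simulate_recurrences(1000, cure_prob=0.3, rate=0.5, follow_up=4.0, rng=rng)
# Zero-count fraction mixes cured subjects (0.3) with susceptibles that
# happen to show no event during follow-up (roughly exp(-2) of them).
print(sum(c == 0 for c in counts) / len(counts))
```

The inferential challenge the paper addresses is the reverse direction: separating the cured fraction from event-free susceptibles given only such censored counts and covariates.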

Relevance: 40.00%

Abstract:

Network reconfiguration for service restoration (SR) in distribution systems is a complex optimization problem. For large-scale distribution systems, it is computationally hard to find adequate SR plans in real time, since the problem is combinatorial and non-linear, involving several constraints and objectives. Two Multi-Objective Evolutionary Algorithms that use Node-Depth Encoding (NDE) have proved able to efficiently generate adequate SR plans for large distribution systems: (i) the hybridization of the Non-Dominated Sorting Genetic Algorithm-II (NSGA-II) with NDE, named NSGA-N; and (ii) a Multi-Objective Evolutionary Algorithm based on subpopulation tables that uses NDE, named MEAN. Further challenges are now faced: designing SR plans for larger systems that are as good as those for relatively smaller ones, and plans for multiple faults that are as good as those for a single fault. In order to tackle both challenges, this paper proposes a method that combines NSGA-N, MEAN and a new heuristic. This heuristic focuses the application of NDE operators on the network zones in alarm, according to technical constraints. The method generates SR plans of similar quality in distribution systems of significantly different sizes (from 3860 to 30,880 buses). Moreover, the number of switching operations required to implement the SR plans generated by the proposed method increases only moderately with the number of faults.
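The Pareto-dominance machinery at the core of NSGA-II admits a compact sketch. This is not NSGA-N, MEAN or the NDE operators themselves; it only extracts the first non-dominated front from hypothetical SR plans scored on two minimized objectives (e.g. number of switching operations and an aggregate constraint-violation measure):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def first_front(population):
    """Return the non-dominated solutions of a population of objective vectors."""
    return [p for i, p in enumerate(population)
            if not any(dominates(q, p) for j, q in enumerate(population) if j != i)]

# Hypothetical SR plans scored as (switching operations, constraint violation).
plans = [(5, 0.0), (3, 2.0), (4, 1.0), (6, 1.0), (3, 3.0)]
print(first_front(plans))  # [(5, 0.0), (3, 2.0), (4, 1.0)]
```

NSGA-II repeats this sorting over successive fronts and adds crowding-distance selection; the NDE representation then guarantees that the genetic operators only produce radial, connected network configurations.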

Relevance: 40.00%

Abstract:

The wide diffusion of cheap, small, portable sensors integrated into an unprecedentedly large variety of devices, together with the availability of almost ubiquitous Internet connectivity, makes it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if analyzed properly and in a timely manner, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of domains, such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality-of-service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers, and we present Quasit, its prototype implementation, offering a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing, through a large experimental study on the prototype of our novel LAAR dynamic replication technique.
Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.
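The idea behind partial fault tolerance can be sketched abstractly. This is not LAAR's actual mechanism; it is only a toy illustration of differentiated replication, where streams whose application semantics tolerate loss receive fewer replicas and thus cost less (all names and replica counts are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Stream:
    name: str
    loss_tolerant: bool   # a weaker delivery guarantee is acceptable

def replication_plan(streams, replicas_strict=2, replicas_weak=1):
    """Assign replica counts per stream: loss-tolerant streams get fewer
    replicas, trading fault-tolerance guarantees for lower resource cost."""
    return {s.name: (replicas_weak if s.loss_tolerant else replicas_strict)
            for s in streams}

streams = [Stream("health-alerts", False), Stream("ambient-stats", True)]
print(replication_plan(streams))  # {'health-alerts': 2, 'ambient-stats': 1}
```

The thesis's point is precisely that the middleware can only make this kind of per-flow decision when applications declare their quality requirements.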

Relevance: 40.00%

Abstract:

Multiple sclerosis (MS) causes a broad range of neurological symptoms. Most common is poor balance control. However, knowledge of deficient balance control in mildly affected MS patients who are complaining of balance impairment but have normal clinical balance tests (CBT) is limited. This knowledge might provide insights into the normal and pathophysiological mechanisms underlying stance and gait. We analysed differences in trunk sway between mildly disabled MS patients with and without subjective balance impairment (SBI), all with normal CBT. The sway was measured for a battery of stance and gait balance tests (static and dynamic posturography) and compared to that of age- and sex-matched healthy subjects. Eight of 21 patients (38%) with an Expanded Disability Status Scale of 1.0-3.0 complained of SBI during daily activities. For standing on both legs with eyes closed on a normal and on a foam surface, patients in the no SBI group showed significant differences in the range of trunk roll (lateral) sway angle and velocity, compared to normal persons. Patients in the SBI group had significantly greater lateral sway than the no SBI group, and sway was also greater than normal in the pitch (anterior-posterior) direction. Sway for one-legged stance on foam was also greater in the SBI group compared to the no SBI and normal groups. We found a specific laterally directed impairment of balance in all patients, consistent with a deficit in proprioceptive processing, which was greater in the SBI group than in the no SBI group. This finding most likely explains the subjective symptoms of imbalance in patients with MS with normal CBT.

Relevance: 40.00%

Abstract:

Inexpensive, commercially available off-the-shelf (COTS) Global Positioning System (GPS) receivers have a typical accuracy of ±3 m when augmented by the Wide Area Augmentation System (WAAS). Some applications require relative position measurements between two moving targets. The focus of this work is to explore the viability of using clusters of COTS GPS receivers to improve the accuracy of such relative position measurements. An experimental study was performed using two clusters, each with five GPS receivers, with a fixed distance of 4.5 m between the clusters. Although the relative position was fixed, the entire system of ten GPS receivers was mounted on a mobile platform. Data were recorded while moving the system over a rectangular track with a perimeter of 7564 m. The data were post-processed and yielded approximately 1 m accuracy for the relative position vector between the two clusters.
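The averaging idea can be sketched directly. This is a simplified illustration, not the study's post-processing chain: assuming each receiver reports a fix in a common local frame, averaging within each cluster suppresses the uncorrelated part of the per-receiver error before the baseline is formed:

```python
import numpy as np

def relative_position(cluster_a, cluster_b):
    """Relative position vector between two receiver clusters, estimated by
    differencing the mean fix of each cluster."""
    return np.mean(cluster_b, axis=0) - np.mean(cluster_a, axis=0)

# Five simulated fixes per cluster (metres, local ENU frame);
# the true baseline is 4.5 m east, matching the experiment's geometry.
cluster_a = np.array([[0.8, -0.3, 0.1], [-0.5, 0.4, -0.2],
                      [0.2, 0.1, 0.3], [-0.4, -0.2, 0.0], [-0.1, 0.0, -0.2]])
cluster_b = cluster_a + np.array([4.5, 0.0, 0.0])
print(relative_position(cluster_a, cluster_b))  # close to [4.5, 0, 0]
```

With independent per-receiver noise, the error of each cluster mean shrinks roughly as 1/sqrt(n); correlated errors (atmospheric delays, common satellite geometry) largely cancel in the difference, which is why relative accuracy beats absolute accuracy.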

Relevance: 40.00%

Abstract:

CD4(+) T cells play a central role in the pathogenesis of multiple sclerosis (MS). The generation, activation and effector function of these cells crucially depend on their interaction with MHC II-peptide complexes displayed by antigen-presenting cells (APC). The processing and presentation of self antigens by different APC therefore influence the disease course at all stages. Selection by thymic APC leads to the generation of autoreactive T cells, which can be activated by peripheral APC. Reactivation by central nervous system APC leads to the initiation of the inflammatory response resulting in demyelination. In this review we focus on how MHC class II antigenic epitopes are created by different APC from the thymus, the periphery and the brain, and discuss the relevance of the balance between the creation and destruction of such epitopes in the context of MS. A solid understanding of these processes offers the possibility of designing future therapeutic strategies.

Relevance: 40.00%

Abstract:

Animal pollination is essential for the reproductive success of many wild and crop plants. Loss and isolation of (semi-)natural habitats in agricultural landscapes can cause declines of plants and pollinators and endanger pollination services. We investigated the independent effects of these drivers on the pollination of young cherry trees in a landscape-scale experiment. We included (i) the isolation of study trees from other cherry trees (up to 350 m), (ii) the amount of cherry trees in the landscape, (iii) the isolation from other woody habitats (up to 200 m) and (iv) the amount of woody habitats providing nesting and floral resources for pollinators. At the local scale, we considered the effects of (v) cherry flower density and (vi) heterospecific flower density. Pollinators visited flowers more often in landscapes with a high amount of woody habitat and at sites less isolated from the next cherry tree. Fruit set was reduced by isolation from the next cherry tree and by a high local density of heterospecific flowers, but did not directly depend on pollinator visitation. These results reveal the importance of considering the plant's need for conspecific pollen and its pollen competition with co-flowering species, rather than focusing only on pollinators' habitat requirements and flower visitation. It proved important to disentangle habitat isolation from habitat loss, local from landscape-scale effects, and direct effects of pollen availability on fruit set from indirect effects via pollinator visitation, in order to understand the delivery of an agriculturally important ecosystem service.

Relevance: 40.00%

Abstract:

A two-pronged approach for the automatic quantitation of multiple sclerosis (MS) lesions on magnetic resonance (MR) images has been developed. This method includes the design and use of a pulse sequence for improved lesion-to-tissue contrast (LTC) and seeks to identify and minimize the sources of false lesion classifications in segmented images. The new pulse sequence, referred to as AFFIRMATIVE (Attenuation of Fluid by Fast Inversion Recovery with MAgnetization Transfer Imaging with Variable Echoes), improves the LTC, relative to spin-echo images, by combining Fluid-Attenuated Inversion Recovery (FLAIR) and Magnetization Transfer Contrast (MTC). In addition to acquiring fast FLAIR/MTC images, the AFFIRMATIVE sequence simultaneously acquires fast spin-echo (FSE) images for spatial registration of images, which is necessary for accurate lesion quantitation. Flow has been found to be a primary source of false lesion classifications. Therefore, an imaging protocol and reconstruction methods are developed to generate "flow images" which depict both coherent (vascular) and incoherent (CSF) flow. An automatic technique is designed for the removal of extra-meningeal tissues, since these are known to be sources of false lesion classifications. A retrospective, three-dimensional (3D) registration algorithm is implemented to correct for patient movement which may have occurred between AFFIRMATIVE and flow imaging scans. Following application of these pre-processing steps, images are segmented into white matter, gray matter, cerebrospinal fluid, and MS lesions based on AFFIRMATIVE and flow images using an automatic algorithm. All algorithms are seamlessly integrated into a single MR image analysis software package. Lesion quantitation has been performed on images from 15 patient volunteers. The total processing time is less than two hours per patient on a SPARCstation 20. 
The automated nature of this approach should provide an objective means of monitoring the progression, stabilization, and/or regression of MS lesions in large-scale, multi-center clinical trials.
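The final classification step can be caricatured as nearest-centroid labelling in a two-channel intensity space. This is not the dissertation's segmentation algorithm; it is a minimal stand-in, with purely hypothetical centroids for white matter, grey matter, CSF and lesion classes in (AFFIRMATIVE, flow) intensity coordinates:

```python
import numpy as np

def classify_voxels(features, centroids):
    """Assign each voxel's feature vector (e.g. AFFIRMATIVE and flow-image
    intensities) to the index of the nearest class centroid."""
    # Pairwise Euclidean distances: (n_voxels, n_classes).
    d = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
    return np.argmin(d, axis=1)

# Hypothetical centroids: 0 white matter, 1 grey matter, 2 CSF, 3 lesion.
centroids = np.array([[100.0, 5.0], [80.0, 5.0], [30.0, 60.0], [140.0, 5.0]])
voxels = np.array([[98.0, 6.0], [33.0, 55.0], [142.0, 4.0]])
print(classify_voxels(voxels, centroids))  # [0 2 3]
```

The flow channel in this toy example is what lets CSF-like flowing voxels be separated from lesions of similar static intensity, mirroring the role of the flow images in suppressing false lesion classifications.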