827 results for Multiple-scale processing
Abstract:
The article discusses a proposal for displacement measurement using a single digital camera, aiming to explore its feasibility for modal analysis applications. The proposal describes a non-contact measuring approach able to measure multiple points simultaneously with a single digital camera. A modal analysis of a reduced-scale laboratory building structure, based only on the responses of the structure measured with the camera, is presented. The focus is on the feasibility, and the advantages, of using a simple ordinary camera to perform output-only modal analysis of structures. The modal parameters of the structure are estimated from the camera data and also by ordinary experimental modal analysis based on the Frequency Response Function (FRF), obtained with the usual sensors such as an accelerometer and a force cell. The comparison of both analyses showed that the technique is a promising non-contact measuring tool, relatively simple and effective for use in structural modal analysis.
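As a rough illustration of the output-only idea described in this abstract, the sketch below picks natural-frequency candidates from the power spectra of camera-derived displacement signals. The frame rate, signal layout, peak-picking threshold and the simulated two-mode responses are illustrative assumptions, not the authors' processing chain.

```python
# Sketch: estimate natural-frequency candidates from camera-derived
# displacement signals via peak picking on the averaged power spectrum.
# Assumes displacements were already extracted from the video frames.
import numpy as np
from scipy.signal import welch, find_peaks

def natural_frequency_candidates(displacements, fs, n_peaks=3):
    """displacements: (n_points, n_samples) array of measured responses,
    fs: camera frame rate in Hz (illustrative, e.g. 120 fps)."""
    psd_sum = None
    for x in displacements:
        f, pxx = welch(x, fs=fs, nperseg=1024)
        psd_sum = pxx if psd_sum is None else psd_sum + pxx
    peaks, _ = find_peaks(psd_sum, prominence=psd_sum.max() * 0.05)
    order = np.argsort(psd_sum[peaks])[::-1][:n_peaks]  # most prominent first
    return f[peaks][order]

# Example: simulated responses of a 2-mode structure sampled at 120 fps
fs = 120.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
resp = np.vstack([np.sin(2 * np.pi * 2.4 * t) + 0.5 * np.sin(2 * np.pi * 7.1 * t)
                  + 0.1 * rng.standard_normal(t.size) for _ in range(4)])
print(natural_frequency_candidates(resp, fs))  # approximately 2.4 and 7.1 Hz
```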
Abstract:
Many recent survival studies propose modeling data with a cure fraction, i.e., data in which part of the population is not susceptible to the event of interest. This event may occur more than once for the same individual (recurrent event). We then have a scenario of recurrent event data in the presence of a cure fraction, which may appear in various areas such as oncology, finance, and industry, among others. This paper proposes a multiple time scale survival model to analyze recurrent events with a cure fraction. The objective is to analyze the effectiveness of certain interventions in preventing the studied event from happening again, in terms of covariates and censoring. All estimates were obtained using a sampling-based approach, which allows prior information to be incorporated with lower computational effort. Simulations were carried out based on a clinical scenario in order to observe some frequentist properties of the estimation procedure for small and moderate sample sizes. An application to a well-known set of real mammary tumor data is provided.
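For intuition only, the sketch below evaluates the log-likelihood of a standard mixture cure model with an exponential latency distribution, a much simpler stand-in for the multiple time scale model proposed in the abstract; the parameterization and the simulated data are assumptions made for illustration.

```python
# Sketch: log-likelihood of a plain mixture cure model (pi = cure fraction)
# with exponential latency -- a simplified stand-in, not the paper's model.
import numpy as np

def mixture_cure_loglik(params, times, events):
    """times: observed times; events: 1 if the event occurred, 0 if censored.
    params = (pi, rate): cure fraction and exponential hazard rate."""
    pi, rate = params
    # susceptible individuals: density f(t) = rate * exp(-rate * t)
    # population survival: S_pop(t) = pi + (1 - pi) * exp(-rate * t)
    log_f = np.log(1 - pi) + np.log(rate) - rate * times
    log_S = np.log(pi + (1 - pi) * np.exp(-rate * times))
    return np.sum(events * log_f + (1 - events) * log_S)

# Example: evaluate the likelihood on simulated right-censored data
rng = np.random.default_rng(1)
n, true_pi, true_rate = 500, 0.3, 0.5
cured = rng.random(n) < true_pi
latent = rng.exponential(1 / true_rate, n)
censor = rng.exponential(4.0, n)
times = np.where(cured, censor, np.minimum(latent, censor))
events = (~cured) & (latent <= censor)
print(mixture_cure_loglik((0.3, 0.5), times, events.astype(float)))
```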
Abstract:
Network reconfiguration for service restoration (SR) in distribution systems is a complex optimization problem. For large-scale distribution systems, it is computationally hard to find adequate SR plans in real time, since the problem is combinatorial and non-linear, involving several constraints and objectives. Two Multi-Objective Evolutionary Algorithms that use Node-Depth Encoding (NDE) have proved able to efficiently generate adequate SR plans for large distribution systems: (i) the hybridization of the Non-Dominated Sorting Genetic Algorithm-II (NSGA-II) with NDE, named NSGA-N; (ii) a Multi-Objective Evolutionary Algorithm based on subpopulation tables that uses NDE, named MEAN. Further challenges remain, namely designing SR plans for larger systems that are as good as those for relatively smaller ones, and plans for multiple faults that are as good as those for a single fault. In order to tackle both challenges, this paper proposes a method that combines NSGA-N, MEAN and a new heuristic. The heuristic focuses the application of NDE operators on network zones in alarm, according to technical constraints. The method generates SR plans of similar quality for distribution systems of significantly different sizes (from 3,860 to 30,880 buses). Moreover, the number of switching operations required to implement the SR plans generated by the proposed method increases moderately with the number of faults.
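The Pareto machinery at the core of NSGA-II-style methods such as NSGA-N can be sketched as plain non-dominated sorting of objective vectors, as below. The Node-Depth Encoding, the subpopulation tables of MEAN and the SR-specific constraints are not reproduced, and the example objectives (switching operations, out-of-service load) are illustrative assumptions.

```python
# Sketch: Pareto non-dominated sorting of candidate plans by their objective
# vectors (minimization); the encoding and constraints from the paper are omitted.
def dominates(a, b):
    """True if objective vector a dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(objs):
    """Return a list of fronts, each a list of indices into objs."""
    fronts, remaining = [], set(range(len(objs)))
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(objs[j], objs[i]) for j in remaining if j != i)]
        fronts.append(front)
        remaining -= set(front)
    return fronts

# Example: objectives could be (switching operations, out-of-service load in kW)
plans = [(4, 120.0), (6, 80.0), (4, 150.0), (8, 80.0), (5, 100.0)]
print(non_dominated_sort(plans))  # first front holds the non-dominated plans
```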
Abstract:
The wide diffusion of cheap, small, and portable sensors integrated into an unprecedentedly large variety of devices, together with the availability of almost ubiquitous Internet connectivity, makes it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if analyzed properly and in a timely manner, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of cross-cutting domains such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality of service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and present Quasit, its prototype implementation, which offers a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing, through a large experimental study on the prototype of our novel LAAR dynamic replication technique. Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.
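A very reduced sketch of what a partial fault-tolerance policy can mean in practice follows: only operators whose declared QoS class requires it are replicated, trading stronger guarantees against cost. The class names, cost model and replica counts are assumptions for illustration and do not reflect the Quasit or LAAR implementations.

```python
# Sketch: replicate only the operators whose QoS class requires it, and report
# the total relative cost of the resulting deployment (illustrative only).
from dataclasses import dataclass

@dataclass
class Operator:
    name: str
    qos_class: str        # e.g. "critical" or "best_effort" (assumed labels)
    cpu_cost: float       # relative cost of running one replica

def plan_replication(operators, replicas_for_critical=2):
    """Assign replica counts per operator and compute the total relative cost."""
    plan = {op.name: (replicas_for_critical if op.qos_class == "critical" else 1)
            for op in operators}
    cost = sum(op.cpu_cost * plan[op.name] for op in operators)
    return plan, cost

ops = [Operator("parse", "best_effort", 1.0),
       Operator("detect_anomaly", "critical", 2.5),
       Operator("aggregate", "best_effort", 0.5)]
print(plan_replication(ops))
```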
Abstract:
Multiple sclerosis (MS) causes a broad range of neurological symptoms. Most common is poor balance control. However, knowledge of deficient balance control in mildly affected MS patients who are complaining of balance impairment but have normal clinical balance tests (CBT) is limited. This knowledge might provide insights into the normal and pathophysiological mechanisms underlying stance and gait. We analysed differences in trunk sway between mildly disabled MS patients with and without subjective balance impairment (SBI), all with normal CBT. The sway was measured for a battery of stance and gait balance tests (static and dynamic posturography) and compared to that of age- and sex-matched healthy subjects. Eight of 21 patients (38%) with an Expanded Disability Status Scale of 1.0-3.0 complained of SBI during daily activities. For standing on both legs with eyes closed on a normal and on a foam surface, patients in the no SBI group showed significant differences in the range of trunk roll (lateral) sway angle and velocity, compared to normal persons. Patients in the SBI group had significantly greater lateral sway than the no SBI group, and sway was also greater than normal in the pitch (anterior-posterior) direction. Sway for one-legged stance on foam was also greater in the SBI group compared to the no SBI and normal groups. We found a specific laterally directed impairment of balance in all patients, consistent with a deficit in proprioceptive processing, which was greater in the SBI group than in the no SBI group. This finding most likely explains the subjective symptoms of imbalance in patients with MS with normal CBT.
Abstract:
Inexpensive, commercially available off-the-shelf (COTS) Global Positioning System (GPS) receivers have a typical accuracy of ±3 meters when augmented by the Wide Area Augmentation System (WAAS). There exist applications that require position measurements between two moving targets. The focus of this work is to explore the viability of using clusters of COTS GPS receivers for relative position measurements in order to improve their accuracy. An experimental study was performed using two clusters, each with five GPS receivers, with a fixed distance of 4.5 m between the clusters. Although the relative position was fixed, the entire system of ten GPS receivers was on a mobile platform. Data was recorded while moving the system over a rectangular track with a perimeter of 7564 m. The data was post-processed and yielded approximately 1 meter accuracy for the relative position vector between the two clusters.
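A minimal sketch of the cluster-averaging idea follows: the simultaneous fixes of each cluster are averaged and the relative vector is taken between the centroids, so common-mode errors largely cancel in the difference. Positions are assumed to be already projected into a local east/north frame in meters; the receiver count and noise level are illustrative, not the experimental setup.

```python
# Sketch: average each cluster's fixes and form the relative position vector
# between cluster centroids (positions assumed in a local east/north frame, m).
import numpy as np

def relative_vector(cluster_a, cluster_b):
    """cluster_a, cluster_b: (n_receivers, 2) arrays of simultaneous fixes."""
    centroid_a = np.mean(cluster_a, axis=0)
    centroid_b = np.mean(cluster_b, axis=0)
    return centroid_b - centroid_a

# Example: five receivers per cluster, true separation 4.5 m east, ~3 m noise
rng = np.random.default_rng(2)
a = rng.normal([0.0, 0.0], 3.0, size=(5, 2))
b = rng.normal([4.5, 0.0], 3.0, size=(5, 2))
vec = relative_vector(a, b)
print(vec, np.linalg.norm(vec))
```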
Abstract:
CD4(+) T cells play a central role in the pathogenesis of multiple sclerosis (MS). Generation, activation and effector function of these cells crucially depend on their interaction with MHC II-peptide complexes displayed by antigen presenting cells (APC). Processing and presentation of self antigens by different APC therefore influence the disease course at all stages. Selection by thymic APC leads to the generation of autoreactive T cells, which can be activated by peripheral APC. Reactivation by central nervous system APC leads to the initiation of the inflammatory response resulting in demyelination. In this review we will focus on how MHC class II antigenic epitopes are created by different APC in the thymus, the periphery and the brain, and will discuss the relevance of the balance between creation and destruction of such epitopes in the context of MS. A solid understanding of these processes offers the possibility of designing future therapeutic strategies.
Abstract:
Animal pollination is essential for the reproductive success of many wild and crop plants. Loss and isolation of (semi-)natural habitats in agricultural landscapes can cause declines of plants and pollinators and endanger pollination services. We investigated the independent effects of these drivers on pollination of young cherry trees in a landscape-scale experiment. We included (i) isolation of study trees from other cherry trees (up to 350 m), (ii) the amount of cherry trees in the landscape, (iii) the isolation from other woody habitats (up to 200 m) and (iv) the amount of woody habitats providing nesting and floral resources for pollinators. At the local scale, we considered effects of (v) cherry flower density and (vi) heterospecific flower density. Pollinators visited flowers more often in landscapes with a high amount of woody habitat and at sites with lower isolation from the next cherry tree. Fruit set was reduced by isolation from the next cherry tree and by a high local density of heterospecific flowers but did not directly depend on pollinator visitation. These results reveal the importance of considering the plant's need for conspecific pollen and its pollen competition with co-flowering species rather than focusing only on pollinators' habitat requirements and flower visitation. It proved to be important to disentangle habitat isolation from habitat loss, local from landscape-scale effects, and direct effects of pollen availability on fruit set from indirect effects via pollinator visitation to understand the delivery of an agriculturally important ecosystem service.
Abstract:
A two-pronged approach for the automatic quantitation of multiple sclerosis (MS) lesions on magnetic resonance (MR) images has been developed. This method includes the design and use of a pulse sequence for improved lesion-to-tissue contrast (LTC) and seeks to identify and minimize the sources of false lesion classifications in segmented images. The new pulse sequence, referred to as AFFIRMATIVE (Attenuation of Fluid by Fast Inversion Recovery with MAgnetization Transfer Imaging with Variable Echoes), improves the LTC, relative to spin-echo images, by combining Fluid-Attenuated Inversion Recovery (FLAIR) and Magnetization Transfer Contrast (MTC). In addition to acquiring fast FLAIR/MTC images, the AFFIRMATIVE sequence simultaneously acquires fast spin-echo (FSE) images for spatial registration of images, which is necessary for accurate lesion quantitation. Flow has been found to be a primary source of false lesion classifications. Therefore, an imaging protocol and reconstruction methods are developed to generate "flow images" which depict both coherent (vascular) and incoherent (CSF) flow. An automatic technique is designed for the removal of extra-meningeal tissues, since these are known to be sources of false lesion classifications. A retrospective, three-dimensional (3D) registration algorithm is implemented to correct for patient movement which may have occurred between AFFIRMATIVE and flow imaging scans. Following application of these pre-processing steps, images are segmented into white matter, gray matter, cerebrospinal fluid, and MS lesions based on AFFIRMATIVE and flow images using an automatic algorithm. All algorithms are seamlessly integrated into a single MR image analysis software package. Lesion quantitation has been performed on images from 15 patient volunteers. The total processing time is less than two hours per patient on a SPARCstation 20. The automated nature of this approach should provide an objective means of monitoring the progression, stabilization, and/or regression of MS lesions in large-scale, multi-center clinical trials.
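Purely as a schematic of the masking-then-classification step described above, the sketch below excludes voxels flagged by a "flow image" before an intensity-based tissue classification; the percentile cutoff, the k-means classifier and the synthetic volume are assumptions and do not reproduce the AFFIRMATIVE-based algorithm.

```python
# Sketch: mask out flow voxels, then classify the remaining brain voxels by
# intensity. Schematic only; not the actual AFFIRMATIVE segmentation pipeline.
import numpy as np
from scipy.cluster.vq import kmeans2

def classify_brain_voxels(affirmative, flow, brain_mask, n_classes=4):
    """affirmative, flow: 3D intensity arrays; brain_mask: boolean 3D array
    with extra-meningeal tissue already removed."""
    flow_threshold = np.percentile(flow[brain_mask], 95)   # illustrative cutoff
    usable = brain_mask & (flow < flow_threshold)           # drop flow artefacts
    intensities = affirmative[usable].reshape(-1, 1).astype(float)
    _, labels = kmeans2(intensities, n_classes, minit='++', seed=0)
    label_volume = np.full(affirmative.shape, -1)            # -1 = not classified
    label_volume[usable] = labels
    return label_volume

# Example on a small synthetic volume
rng = np.random.default_rng(3)
vol = rng.normal(100, 20, (32, 32, 16))
flow = rng.gamma(2.0, 1.0, vol.shape)
mask = np.ones(vol.shape, dtype=bool)
print(np.unique(classify_brain_voxels(vol, flow, mask)))
```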
Abstract:
Two genes with related functions in RNA biogenesis were recently reported in patients with familial ALS: the FUS/TLS gene at the ALS6 locus and the TARDBP/TDP-43 gene at the ALS10 locus [1, 2]. FUS has been implicated in several steps of gene expression, including transcription regulation [3], RNA splicing [4, 5], mRNA transport in neurons [6] and, interestingly, microRNA (miRNA) processing [7]. The goal of this project is to identify the molecular mechanisms leading to the development of FUS mutation-associated ALS. Specifically, we want to test the hypothesis that these FUS mutations misregulate miRNA levels, which in turn affect the expression of genes critical for motor neuron survival. In addition, we want to test whether misregulation of the miRNA profile is a common feature in ALS. We have performed immunoprecipitations from total extracts of 293T cells expressing FLAG-tagged FUS to characterize its interactome by mass spectrometry. This proteomic study not only revealed a strong interaction of FUS with splicing factors, but also shows that FUS might be involved in many, quite different pathways. To map which parts of the FUS protein contribute to the interaction with splicing factors, we have performed a set of experiments with a series of missense and deletion mutants. With this approach, we will not only gain information on the binding partners of FUS, along with a map of the domains required for the interactions, but also help to unravel whether certain ALS-associated FUS mutations lead to a loss or gain of function due to a gain or loss of interactors. Additionally, we have performed quantitative interactomics using SILAC to identify interactome differences of ALS-associated FUS mutants. To this end we have performed immunoprecipitations of total extract from 293T cells stably transduced with constructs expressing wild-type FUS-FLAG as well as three different ALS-associated mutants (G156E, R244C, P525L). Initial results indicate striking differences in the interactome with certain RNA binding proteins. We are now validating these candidates in order to reveal the importance of these differential interactions in the context of ALS.
Abstract:
To date, big data applications have focused on the store-and-process paradigm. In this paper we describe an initiative to deal with big data applications for continuous streams of events. In many emerging applications, the volume of data being streamed is so large that the traditional ‘store-then-process' paradigm is either not suitable or too inefficient. Moreover, soft real-time requirements may severely constrain the engineering solutions. Many scenarios fit this description. In network security for cloud data centres, for instance, very high volumes of IP packets and of events from sensors at firewalls, network switches, routers and servers need to be analyzed in minimal time in order to detect attacks and limit the effect of the malicious activity on the IT infrastructure. Similarly, in the fraud department of a credit card company, payment requests should be processed online, as quickly as possible, in order to provide meaningful results in real time. An ideal system would detect fraud during the authorization process, which lasts hundreds of milliseconds, and deny the payment authorization, minimizing the damage to the user and the credit card company.
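As a toy example of the kind of per-card check a streaming fraud system might apply inside the authorization window, the sketch below keeps a sliding window of recent payments per card; the thresholds, event fields and single-process design are assumptions, not a description of any real deployment.

```python
# Sketch: a toy per-card velocity rule applied during authorization.
# Thresholds and fields are illustrative assumptions only.
import time
from collections import defaultdict, deque

WINDOW_S = 60          # look-back window in seconds
MAX_TX = 5             # max authorizations per card per window
MAX_AMOUNT = 3000.0    # max summed amount per card per window

recent = defaultdict(deque)   # card_id -> deque of (timestamp, amount)

def authorize(card_id, amount, now=None):
    """Return True to approve, False to flag/deny."""
    now = time.time() if now is None else now
    window = recent[card_id]
    while window and now - window[0][0] > WINDOW_S:
        window.popleft()                       # evict events outside the window
    too_many = len(window) + 1 > MAX_TX
    too_much = sum(a for _, a in window) + amount > MAX_AMOUNT
    window.append((now, amount))
    return not (too_many or too_much)

# Example stream of authorization requests
print(authorize("card-42", 120.0, now=0.0))    # True
print(authorize("card-42", 2950.0, now=10.0))  # False (amount over window cap)
```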
Abstract:
The paper analyses whether a properly designed multiple-choice test can discriminate, with a high level of accuracy, whether a student in our context has reached a B2 level according to the CEFRL.
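One standard way to quantify how well individual items separate stronger from weaker test takers is the point-biserial item discrimination index, sketched below; this is a generic statistic and not necessarily the analysis used in the paper.

```python
# Sketch: point-biserial correlation between one item's 0/1 scores and the
# rest-of-test score -- a common item discrimination statistic (illustrative).
import numpy as np

def point_biserial(item, total_without_item):
    """item: 0/1 responses for one question; total_without_item: remaining scores."""
    item = np.asarray(item, dtype=float)
    rest = np.asarray(total_without_item, dtype=float)
    return np.corrcoef(item, rest)[0, 1]

# Example: an item answered correctly mostly by high scorers discriminates well
item = [1, 1, 1, 0, 1, 0, 0, 0]
rest = [38, 35, 33, 30, 29, 22, 20, 15]
print(round(point_biserial(item, rest), 2))
```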
Abstract:
A real-time, large-scale part-to-part video matching algorithm, based on the cross-correlation of the intensity of motion curves, is proposed with a view to originality recognition, video database cleansing, copyright enforcement, video tagging or video result re-ranking. Moreover, it is suggested how the most representative hashes and distance functions - strada, discrete cosine transform (DCT), Marr-Hildreth and radial - should be integrated so that the matching algorithm is invariant against blur, compression and rotation distortions: (R, σ) ∈ [1, 20] × [1, 8], from 512×512 down to 32×32 pixels², and from 10° to 180°, respectively. The DCT hash is invariant against blur and against compression down to 64×64 pixels². Nevertheless, although its performance against rotation is the best, with a success rate of up to 70%, it should be combined with the Marr-Hildreth distance function. With the latter, the image selected by the DCT hash should be at a distance lower than 1.15 times the Marr-Hildreth minimum distance.
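To make the DCT hash concrete, the sketch below computes a pHash-style DCT hash and a Hamming distance between two frames; the block-mean resize, hash size and threshold are illustrative choices and not the exact hashes or distance functions evaluated in the work.

```python
# Sketch: a pHash-style DCT hash and Hamming distance (illustrative parameters).
import numpy as np
from scipy.fft import dctn

def dct_hash(gray, hash_size=8, highfreq_factor=4):
    """gray: 2D float array. Returns a boolean hash of hash_size**2 - 1 bits."""
    size = hash_size * highfreq_factor                  # e.g. a 32x32 working image
    h, w = gray.shape
    cropped = gray[:h - h % size, :w - w % size]        # make both sides divisible
    small = cropped.reshape(size, cropped.shape[0] // size,
                            size, cropped.shape[1] // size).mean(axis=(1, 3))
    coeffs = dctn(small, norm='ortho')[:hash_size, :hash_size]
    flat = coeffs.flatten()[1:]                         # drop the DC coefficient
    return flat > np.median(flat)

def hamming(h1, h2):
    return int(np.count_nonzero(h1 != h2))

# Example: a frame and a mildly blurred copy should be close in hash space
rng = np.random.default_rng(4)
frame = rng.random((256, 256))
blurred = (frame + np.roll(frame, 1, axis=0) + np.roll(frame, 1, axis=1)) / 3
print(hamming(dct_hash(frame), dct_hash(blurred)))
```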