788 results for statistical modelling, wind effects, signal propagation, wireless sensor networks


Relevance:

100.00%

Publisher:

Abstract:

We investigate the directional distribution of heavy neutral atoms in the heliosphere using heavy neutral maps generated with the IBEX-Lo instrument over three years, from 2009 to 2011. The interstellar neutral (ISN) O&Ne gas flow was found in the first-year heavy neutral map at 0.601 keV, and its flow direction and temperature were studied. However, due to the low counting statistics, researchers have not treated the full sky maps in detail. The main goal of this study is to evaluate the statistical significance of each pixel in the heavy neutral maps, to obtain a better understanding of the directional distribution of heavy neutral atoms in the heliosphere. Here, we examine three statistical analysis methods: the signal-to-noise filter, the confidence limit method, and the cluster analysis method. These methods allow us to exclude background from areas where the heavy neutral signal is statistically significant, and they support the consistent detection of heavy neutral atom structures. The main emission feature expands toward lower longitude and higher latitude from the observational peak of the ISN O&Ne gas flow. We call this emission the extended tail. It may be an imprint of secondary oxygen atoms generated by charge exchange between ISN hydrogen atoms and oxygen ions in the outer heliosheath.
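The signal-to-noise filter named above can be illustrated with a minimal sketch (illustrative only, not the actual IBEX-Lo pipeline): a map pixel is kept when its count excess over the estimated background exceeds a chosen number of Poisson standard deviations.

```python
import math

def snr_filter(counts, background, threshold=3.0):
    """Flag map pixels whose counts exceed the estimated background
    by more than `threshold` standard deviations (Poisson statistics).
    Returns a parallel list of booleans."""
    flags = []
    for n, b in zip(counts, background):
        sigma = math.sqrt(b) if b > 0 else 1.0  # Poisson noise on the background
        flags.append((n - b) / sigma >= threshold)
    return flags

# Toy 1-D strip of sky-map pixels: one bright pixel over a flat background.
counts = [4, 5, 25, 6, 3]
background = [5.0, 5.0, 5.0, 5.0, 5.0]
significant = snr_filter(counts, background)
```

Only the pixel with 25 counts clears the 3-sigma threshold; its neighbours are consistent with background fluctuations.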

Abstract:

To analyze the characteristics and predict the dynamic behaviors of complex systems over time, research into systems that can intelligently adapt to evolving conditions and infer new knowledge with algorithms that are not predesigned is crucially needed. This dissertation research studies the integration of techniques and methodologies from the fields of pattern recognition, intelligent agents, artificial immune systems, and distributed computing platforms to create technologies that can more accurately describe and control the dynamics of real-world complex systems. The need for such technologies is emerging in manufacturing, transportation, hazard mitigation, weather and climate prediction, homeland security, and emergency response. Motivated by the ability of mobile agents to dynamically incorporate additional computational and control algorithms into executing applications, mobile agent technology is employed in this research for adaptive sensing and monitoring in a wireless sensor network. Mobile agents are software components that can travel from one computing platform to another in a network, carrying the programs and data states needed to perform their assigned tasks. To support the generation, migration, communication, and management of mobile monitoring agents, an embeddable mobile agent system (Mobile-C) is integrated with the sensor nodes. Mobile monitoring agents visit distributed sensor nodes, read real-time sensor data, and perform anomaly detection using the pattern recognition algorithms they carry. The optimal control of agents is achieved by mimicking the adaptive immune response and applying multi-objective optimization algorithms. The mobile agent approach has the potential to reduce the communication load and energy consumption in monitoring networks.
The major research work of this dissertation project includes: (1) studying effective feature extraction methods for time series measurement data; (2) investigating the impact of feature extraction methods and dissimilarity measures on the performance of pattern recognition; (3) researching the effects of environmental factors on the performance of pattern recognition; (4) integrating an embeddable mobile agent system with wireless sensor nodes; (5) optimizing agent generation and distribution using artificial immune system concepts and multi-objective algorithms; (6) applying mobile agent technology and pattern recognition algorithms to adaptive structural health monitoring and driving cycle pattern recognition; and (7) developing a web-based monitoring network that enables remote visualization and analysis of real-time sensor data. The techniques and algorithms developed in this dissertation project will contribute to research advances in networked distributed systems operating under changing environments.
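Items (1) and (2), feature extraction and dissimilarity measures for anomaly detection, can be sketched as follows; the window data, feature set, and decision logic are hypothetical illustrations, not the dissertation's actual algorithms.

```python
import statistics

def extract_features(series):
    """Simple statistical features of one time-series window."""
    return [statistics.mean(series),
            statistics.pstdev(series),
            max(series) - min(series)]  # peak-to-peak range

def euclidean(a, b):
    """Euclidean dissimilarity between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Baseline (healthy) window vs. two new measurement windows.
baseline = [0.0, 0.1, -0.1, 0.05, -0.05, 0.0]
anomalous = [0.0, 0.9, -0.8, 0.7, -0.9, 0.8]
normal_new = [0.02, 0.08, -0.12, 0.04, -0.03, 0.01]

d_anom = euclidean(extract_features(baseline), extract_features(anomalous))
d_norm = euclidean(extract_features(baseline), extract_features(normal_new))
is_anomaly = d_anom > d_norm  # the larger dissimilarity flags the anomaly
```

The anomalous window's inflated spread and range dominate its dissimilarity to the baseline, which is the property a threshold-based detector would exploit.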

Abstract:

Buildings and other infrastructure located in the coastal regions of the US have a higher level of wind vulnerability. Reducing the increasing property losses and casualties associated with severe windstorms has been the central research focus of the wind engineering community. The present wind engineering toolbox consists of building codes and standards, laboratory experiments, and field measurements. The American Society of Civil Engineers (ASCE) 7 standard provides wind loads only for buildings with common shapes; for complex cases it refers to physical modeling. Although this option can be economically viable for large projects, it is not cost-effective for low-rise residential houses. To circumvent these limitations, a numerical approach based on the techniques of Computational Fluid Dynamics (CFD) has been developed. Recent advances in computing technology and significant developments in turbulence modeling are making numerical evaluation of wind effects a more affordable approach. The present study targeted those cases that are not addressed by the standards: wind loads on complex roofs of low-rise buildings, the aerodynamics of tall buildings, and the effects of complex surrounding buildings. Among all the turbulence models investigated, the large eddy simulation (LES) model performed best in predicting wind loads. The application of a spatially evolving, time-dependent wind velocity field with the relevant turbulence structures at the inlet boundaries was found to be essential. All results were compared with and validated against experimental data. The study also revealed CFD's unique flow visualization and aerodynamic data generation capabilities, along with a better understanding of the complex three-dimensional aerodynamics of wind-structure interactions. With proper modeling that realistically represents the actual turbulent atmospheric boundary layer flow, CFD can offer an economical alternative to the existing wind engineering tools.
CFD’s easy accessibility is expected to transform the practice of structural design for wind, resulting in more wind-resilient systems by encouraging optimal aerodynamic and sustainable structural/building design. Thus, this method will help ensure public safety and reduce economic losses due to wind perils.
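As a minimal illustration of the kind of quantity such CFD studies validate against experiments, the sketch below converts a hypothetical surface-pressure time series into mean and peak pressure coefficients; the air density, reference velocity, and tap pressures are assumed values, not results from the study.

```python
def pressure_coefficients(p_series, p_ref, rho=1.225, u_ref=10.0):
    """Convert a surface-pressure time series (Pa) into pressure
    coefficients Cp = (p - p_ref) / (0.5 * rho * U_ref^2) and report
    the mean and peak values used in design-load comparisons."""
    q = 0.5 * rho * u_ref ** 2          # reference dynamic pressure
    cp = [(p - p_ref) / q for p in p_series]
    cp_mean = sum(cp) / len(cp)
    return cp_mean, min(cp), max(cp)    # mean, peak suction, peak pressure

# Toy pressure-tap record (Pa) around an assumed static reference.
taps = [101325 + x for x in (-40.0, -55.0, -30.0, -70.0, -45.0)]
cp_mean, cp_min, cp_max = pressure_coefficients(taps, p_ref=101325.0)
```

All Cp values here are negative, i.e. the tap sits in a suction region such as a roof corner, and the peak suction (cp_min) is the value a cladding design would check.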

Abstract:

Allostery is a phenomenon of fundamental importance in biology, allowing regulation of function and dynamic adaptability of enzymes and proteins. Although the allosteric effect was first observed more than a century ago, allostery remains a biophysical enigma, described as the “second secret of life”. The challenge is mainly associated with the rather complex nature of allosteric mechanisms, which manifest themselves as the alteration of the biological function of a protein or enzyme (e.g. ligand/substrate binding at the active site) by the binding of an “other object” (“allos stereos” in Greek) at a site distant (> 1 nanometer) from the active site, namely the effector site. Thus, at the heart of allostery is signal propagation from the effector site to the active site through a dense protein matrix, and a fundamental challenge is the elucidation of the physico-chemical interactions between amino acid residues that allow communication between the two binding sites, i.e. the “allosteric pathways”. Here, we propose a multidisciplinary approach that combines computational chemistry, involving molecular dynamics simulations of protein motions; (bio)physical analysis of allosteric systems, including multiple sequence alignments of known allosteric systems; and mathematical tools based on graph theory and machine learning that can greatly help in understanding the complexity of the dynamical interactions involved in different allosteric systems. The project aims at developing robust and fast tools to identify unknown allosteric pathways. The characterization and prediction of such allosteric spots could elucidate and fully exploit the power of allosteric modulation in enzymes and DNA-protein complexes, with great potential applications in enzyme engineering and drug discovery.
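As a toy illustration of the graph-theoretic side of such an approach (the residue-contact graph and the site indices below are hypothetical), an "allosteric pathway" can be approximated as the shortest chain of contacting residues linking the effector site to the active site:

```python
from collections import deque

# Hypothetical residue-contact graph: nodes are residue indices; edges
# connect residues within a contact cutoff in the simulated ensemble.
contacts = {
    1: [2, 5], 2: [1, 3], 3: [2, 4], 4: [3, 8],
    5: [1, 6], 6: [5, 7], 7: [6, 8], 8: [4, 7],
}

def shortest_pathway(graph, effector, active):
    """Breadth-first search for the shortest residue chain between the
    effector and active sites -- a crude 'allosteric pathway'."""
    queue = deque([[effector]])
    seen = {effector}
    while queue:
        path = queue.popleft()
        if path[-1] == active:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

pathway = shortest_pathway(contacts, effector=1, active=8)
```

Real pipelines weight the edges by interaction strength or dynamical correlation rather than treating all contacts equally; unweighted BFS is the simplest instance of the idea.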

Abstract:

Using Computational Wind Engineering (CWE) to solve wind-related problems is still a challenging task today, mainly due to the high computational cost required to obtain trustworthy simulations. In particular, Large Eddy Simulation (LES) has been widely used for evaluating wind loads on buildings. The present thesis assesses the capability of LES as a design tool for wind loading predictions through three cases. The first case uses LES to simulate the wind field around a ground-mounted rectangular prism in Atmospheric Boundary Layer (ABL) flow. The numerical results are validated against experimental results for seven wind attack angles, giving a global understanding of the model performance. The case with the worst model behaviour is investigated further, including the spatial distribution of the pressure coefficients and their discrepancies with respect to the experimental results. The effects of several numerical parameters are investigated for this case to understand how effectively they modify the numerical results. The second case uses LES to investigate the wind effects on a real high-rise building, aiming to validate the performance of LES as a design tool in practical applications. The numerical results are validated against the experimental results in terms of the distribution of the pressure statistics and the global forces. Mesh sensitivity and computational cost are discussed. The third case uses LES to study the wind effects on the new large-span roof over the Bologna stadium. The dynamic responses are analyzed and design envelopes for the structure are obtained. Although this simulation precedes the traditional wind tunnel tests, i.e. the numerical results are not validated, the preliminary evaluations can effectively inform later investigations and give the final design process greater confidence in the absence of potentially unexpected behaviours.
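The design envelopes mentioned for the stadium roof can be illustrated with a small sketch: for each roof panel, take the worst peak suction over all simulated wind directions. The coefficient values below are invented for illustration, not results from the thesis.

```python
# Hypothetical peak pressure coefficients per roof panel, one row per
# simulated wind attack angle (e.g. 0, 45, 90 degrees).
peak_cp_by_angle = [
    [-0.8, -1.2, -0.6],
    [-1.5, -0.9, -0.7],
    [-0.9, -1.0, -1.4],
]

def design_envelope(cp_rows):
    """Per-panel worst-case (most negative) peak suction over all
    simulated wind directions -- the design envelope for that panel."""
    return [min(col) for col in zip(*cp_rows)]

envelope = design_envelope(peak_cp_by_angle)
```

Note that each panel's governing value can come from a different wind direction, which is why the envelope, not any single simulation, drives the design.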

Abstract:

Over the last couple of decades, many methods for synchronizing chaotic systems have been proposed with communications applications in view. Yet their performance has proved disappointing in the face of the nonideal character of the usual channels linking transmitter and receiver, that is, due to both noise and signal propagation distortion. Here we consider a discrete-time master-slave system that synchronizes despite channel bandwidth limitations, and an associated communication system. Synchronization is achieved by introducing a digital filter that limits the spectral content of the feedback loop responsible for producing the transmitted signal. Copyright (C) 2009 Marcio Eisencraft et al.
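A minimal sketch of discrete-time master-slave synchronization, using a tent map and an ideal channel (the paper's bandwidth-limiting digital filter is omitted here): the slave is driven by the transmitted signal with coupling gain k, and since the tent map has Lipschitz constant 2, the synchronization error contracts whenever 2·|1 − k| < 1.

```python
def tent(x):
    """Tent map, a standard discrete-time chaotic system on [0, 1]."""
    return 2 * x if x < 0.5 else 2 * (1 - x)

k = 0.8                 # coupling gain: 2 * |1 - k| = 0.4 < 1, so errors shrink
x, y = 0.3, 0.9         # master and slave start from mismatched states
for _ in range(60):
    s = x               # transmitted signal (ideal, undistorted channel)
    x = tent(x)                              # master free-runs
    y = (1 - k) * tent(y) + k * tent(s)      # slave driven by received signal
sync_error = abs(x - y)
```

Each step multiplies the error by at most 0.4, so after 60 steps the slave has locked onto the master to within numerical precision; the paper's contribution is keeping this property when the feedback signal is spectrally constrained.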

Abstract:

The Community Climate Model (CCM3) from the National Center for Atmospheric Research (NCAR) is used to investigate the effect of South Atlantic sea surface temperature (SST) anomalies on interannual to decadal variability of South American precipitation. Two ensembles composed of multidecadal simulations forced with monthly SST data from the Hadley Centre for the period 1949 to 2001 are analysed. A statistical treatment based on the signal-to-noise ratio and Empirical Orthogonal Functions (EOF) is applied to the ensembles in order to reduce the internal variability among the integrations. The ensemble treatment shows a spatial and temporal dependence of reproducibility. A high degree of reproducibility is found in the tropics, while the extratropics are apparently less reproducible. Austral autumn (MAM) and spring (SON) precipitation appears to be more reproducible over the South America-South Atlantic region than the summer (DJF) and winter (JJA) rainfall. While the Inter-tropical Convergence Zone (ITCZ) region is dominated by external variance, the South Atlantic Convergence Zone (SACZ) over South America is predominantly determined by internal variance, which makes it a difficult phenomenon to predict. The SACZ over the western South Atlantic, by contrast, appears to be more sensitive to the subtropical SST anomalies than its continental part. An attempt is made to separate the atmospheric response forced by the South Atlantic SST anomalies from that associated with the El Niño - Southern Oscillation (ENSO). Results show that both the South Atlantic and Pacific SSTs modulate the intensity and position of the SACZ during DJF. In particular, the subtropical South Atlantic SSTs are more important than ENSO in determining the position of the SACZ over the southeast Brazilian coast during DJF. On the other hand, the ENSO signal seems to influence the intensity of the SACZ not only in DJF but especially in its oceanic branch during MAM.
Both local and remote influences, however, are confounded by the large internal variance in the region. During MAM and JJA, the South Atlantic SST anomalies affect the magnitude and the meridional displacement of the ITCZ. In JJA, the ENSO has relatively little influence on the interannual variability of the simulated rainfall. During SON, however, the ENSO seems to counteract the effect of the subtropical South Atlantic SST variations on convection over South America.
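The signal-to-noise part of such an ensemble treatment can be sketched with toy numbers: the "external" (SST-forced) variance is the variance of the ensemble-mean series, while the "internal" variance is the average spread of the members about that mean. The EOF step and the data below are not reproduced from the study.

```python
import statistics

# Toy ensemble: 3 members x 10 "years" of a simulated rainfall index.
ensemble = [
    [2.0, 2.5, 1.8, 3.0, 2.2, 2.8, 1.9, 2.4, 2.6, 2.1],
    [2.1, 2.4, 1.9, 2.9, 2.3, 2.7, 2.0, 2.5, 2.5, 2.2],
    [1.9, 2.6, 1.7, 3.1, 2.1, 2.9, 1.8, 2.3, 2.7, 2.0],
]

# External (forced) variance: variance of the ensemble-mean time series.
ens_mean = [statistics.mean(vals) for vals in zip(*ensemble)]
external = statistics.pvariance(ens_mean)

# Internal variance: average spread of the members about the ensemble mean.
internal = statistics.mean(
    statistics.pvariance([m[t] for m in ensemble]) for t in range(len(ens_mean))
)
snr = external / internal
```

A high ratio, as in this toy tropical-like case where all members track the forcing, marks a reproducible (predictable) region; internal variance dominating would correspond to the SACZ situation described above.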

Abstract:

We study the problem of distributed estimation based on the affine projection algorithm (APA), which is derived from Newton's method for minimizing a cost function. The proposed solution is formulated to ameliorate the limited convergence properties of least-mean-square (LMS) type distributed adaptive filters with colored inputs. The transient and steady-state performance at each individual node within the network is analyzed using a weighted spatial-temporal energy conservation relation and confirmed by computer simulations. The simulation results also verify that the proposed algorithm provides not only a faster convergence rate but also improved steady-state performance compared with an LMS-based scheme. In addition, the new approach attains acceptable misadjustment with lower computational and memory cost than a distributed recursive-least-squares (RLS) based method, provided the number of regressor vectors and the filter length are appropriately chosen.
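A minimal single-node sketch of an APA update with projection order K = 2 (the paper's distributed, network-wide formulation is not reproduced; the system taps, step size, and regularization are assumed values). Each iteration reuses the two most recent regressor vectors and solves a small Gram system, which is what gives APA its advantage over LMS for colored inputs.

```python
import random

random.seed(1)
w_true = [0.5, -0.3]          # unknown two-tap FIR system
w = [0.0, 0.0]                # adaptive filter taps
mu, eps = 0.5, 1e-6           # step size, regularization

x_hist = [0.0, 0.0, 0.0]      # input samples x[n], x[n-1], x[n-2]
for n in range(400):
    x_hist = [random.gauss(0, 1)] + x_hist[:2]
    u0, u1 = x_hist[0:2], x_hist[1:3]        # two most recent regressors
    # Desired responses and a-priori errors for both regressors.
    e0 = (w_true[0] - w[0]) * u0[0] + (w_true[1] - w[1]) * u0[1]
    e1 = (w_true[0] - w[0]) * u1[0] + (w_true[1] - w[1]) * u1[1]
    # Solve (U U^T + eps I) g = e for the 2x2 Gram matrix explicitly.
    a = u0[0] ** 2 + u0[1] ** 2 + eps
    b = u0[0] * u1[0] + u0[1] * u1[1]
    d = u1[0] ** 2 + u1[1] ** 2 + eps
    det = a * d - b * b
    g0 = (d * e0 - b * e1) / det
    g1 = (a * e1 - b * e0) / det
    # APA update: w <- w + mu * U^T g
    w[0] += mu * (u0[0] * g0 + u1[0] * g1)
    w[1] += mu * (u0[1] * g0 + u1[1] * g1)

mis = ((w[0] - 0.5) ** 2 + (w[1] + 0.3) ** 2) ** 0.5   # residual weight error
```

In this noise-free toy the weight error shrinks geometrically, so the taps converge to the true system; with noisy measurements the regularization and step size trade convergence speed against misadjustment, as the abstract discusses.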

Abstract:

Objectives: To compare the population modelling programs NONMEM and P-PHARM in an investigation of the pharmacokinetics of tacrolimus in paediatric liver-transplant recipients. Methods: Population pharmacokinetic analysis was performed using NONMEM and P-PHARM on retrospective data from 35 paediatric liver-transplant patients receiving tacrolimus therapy. The same data were presented to both programs. Maximum likelihood estimates were sought for apparent clearance (CL/F) and apparent volume of distribution (V/F). Covariates screened for influence on these parameters were weight, age, gender, post-operative day, days of tacrolimus therapy, transplant type, biliary reconstructive procedure, liver function tests, creatinine clearance, haematocrit, corticosteroid dose, and potential interacting drugs. Results: A satisfactory model was developed in both programs with a single categorical covariate - transplant type - providing stable parameter estimates and small, normally distributed (weighted) residuals. In NONMEM, the continuous covariates - age and liver function tests - improved the model further. Mean parameter estimates were CL/F (whole liver) = 16.3 l/h, CL/F (cut-down liver) = 8.5 l/h and V/F = 565 l in NONMEM, and CL/F = 8.3 l/h and V/F = 155 l in P-PHARM. Individual Bayesian parameter estimates were CL/F (whole liver) = 17.9 +/- 8.8 l/h, CL/F (cut-down liver) = 11.6 +/- 18.8 l/h and V/F = 712 +/- 792 l in NONMEM, and CL/F (whole liver) = 12.8 +/- 3.5 l/h, CL/F (cut-down liver) = 8.2 +/- 3.4 l/h and V/F = 221 +/- 164 l in P-PHARM. Marked interindividual kinetic variability (38-108%) and residual random error (approximately 3 ng/ml) were observed. P-PHARM was more user friendly and readily provided informative graphical presentation of results. NONMEM allowed a wider choice of error models for statistical modelling and coped better with complex covariate data sets.
Conclusion: Results from parametric modelling programs can vary due to different algorithms employed to estimate parameters, alternative methods of covariate analysis and variations and limitations in the software itself.
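To illustrate what the estimated CL/F and V/F mean in practice, the sketch below predicts a concentration with a standard one-compartment, first-order-absorption model. The dose and the absorption rate constant ka are assumed values for illustration, not parameters from the study; only CL/F and V/F are taken from the NONMEM whole-liver estimates above.

```python
import math

def concentration(t, dose, cl_f, v_f, ka):
    """Plasma concentration for a one-compartment model with first-order
    absorption, parameterized by apparent clearance CL/F (l/h) and
    apparent volume V/F (l) -- the quantities the population analysis
    estimates. Dose in micrograms gives concentration in ug/l (~ng/ml)."""
    ke = cl_f / v_f                      # elimination rate constant (1/h)
    return (dose / v_f) * (ka / (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

# Hypothetical 5 mg oral dose, CL/F = 16.3 l/h, V/F = 565 l, assumed ka = 4.5 1/h.
c_2h = concentration(2.0, dose=5000.0, cl_f=16.3, v_f=565.0, ka=4.5)
```

The large V/F relative to CL/F gives a small elimination rate constant (about 0.029 1/h), i.e. a long apparent half-life, which is consistent with the slow kinetics implied by the parameter estimates.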

Abstract:

This paper presents experimental results from the communication performance evaluation of a prototype ZigBee-based patient monitoring system commissioned in an in-patient floor of a Portuguese hospital (HPG – Hospital Privado de Guimarães). It also revisits relevant problems that affect the performance of nonbeacon-enabled ZigBee networks. First, the presence of hidden nodes and the impact of sensor node mobility are discussed. It was observed, for instance, that the message delivery ratio in a star network consisting of six wireless electrocardiogram sensor devices may decrease from 100% when no hidden nodes are present to 83.96% when half of the sensor devices are unable to detect the transmissions made by the other half. An additional aspect that affects communication reliability is a deadlock condition that can occur if routers are unable to process incoming packets during the backoff part of the CSMA-CA mechanism. A simple approach to increase the message delivery ratio in this case is proposed and its effectiveness is verified. The discussion and results presented in this paper aim to contribute to the design of efficient networks, and are valid for scenarios and environments other than hospitals.
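The hidden-node effect on the message delivery ratio can be caricatured with a tiny Monte Carlo model (purely illustrative; the slot count and message numbers are assumptions, not the paper's experimental setup): two devices that cannot hear each other pick random backoff slots, and frames are lost when the slots coincide.

```python
import random

def hidden_pair_delivery_ratio(n_slots, n_msgs, seed=7):
    """Toy slotted model of the hidden-node problem: two sensor devices
    that cannot carrier-sense each other each pick a random backoff
    slot; picking the same slot collides and loses both frames."""
    rng = random.Random(seed)
    delivered = sent = 0
    for _ in range(n_msgs):
        sent += 2
        if rng.randrange(n_slots) != rng.randrange(n_slots):
            delivered += 2
    return delivered / sent

dr_no_hidden = 1.0   # with working carrier sensing these collisions vanish
dr_hidden = hidden_pair_delivery_ratio(n_slots=16, n_msgs=5000)
```

With 16 slots the expected delivery ratio is 15/16 ≈ 0.94 for a single hidden pair; heavier traffic or more hidden pairs drives it further down, in the spirit of the 100% to 83.96% drop reported above.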

Abstract:

AIM: This work presents detailed experimental performance results from tests executed in the hospital environment for Health Monitoring for All (HM4All), a remote vital signs monitoring system based on a ZigBee® (ZigBee Alliance, San Ramon, CA) body sensor network (BSN). MATERIALS AND METHODS: Tests involved the use of six electrocardiogram (ECG) sensors operating in two different modes: the ECG mode involved the transmission of ECG waveform data and heart rate (HR) values to the ZigBee coordinator, whereas the HR mode included only the transmission of HR values. In the absence of hidden nodes, a non-beacon-enabled star network composed of sensing devices working on ECG mode kept the delivery ratio (DR) at 100%. RESULTS: When the network topology was changed to a 2-hop tree, the performance degraded slightly, resulting in an average DR of 98.56%. Although these performance outcomes may seem satisfactory, further investigation demonstrated that individual sensing devices went through transitory periods with low DR. Other tests have shown that ZigBee BSNs are highly susceptible to collisions owing to hidden nodes. Nevertheless, these tests have also shown that these networks can achieve high reliability if the amount of traffic is kept low. Contrary to what is typically shown in scientific articles and in manufacturers' documentation, the test outcomes presented in this article include temporal graphs of the DR achieved by each wireless sensor device. CONCLUSIONS: The test procedure and the approach used to represent its outcomes, which allow the identification of undesirable transitory periods of low reliability due to contention between devices, constitute the main contribution of this work.
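The temporal delivery-ratio graphs advocated above can be computed from a per-message success log as a windowed ratio; a single aggregate DR hides exactly the transitory drop that the windowed view exposes. The log below is invented for illustration.

```python
def windowed_delivery_ratio(log, window):
    """Delivery ratio over consecutive windows of a per-message success
    log (True = delivered), exposing transitory low-reliability periods
    that a single aggregate figure would hide."""
    return [sum(log[i:i + window]) / window
            for i in range(0, len(log) - window + 1, window)]

# Hypothetical log: steady delivery with one transitory bad period.
log = [True] * 20 + [True, False, False, True, False] * 2 + [True] * 20
per_window = windowed_delivery_ratio(log, window=10)
overall = sum(log) / len(log)
```

The aggregate DR of 0.88 looks acceptable, yet one window dropped to 0.4, which is the kind of contention episode the article's temporal graphs are designed to reveal.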

Abstract:

Video coding technologies have played a major role in the explosion of large-market digital video applications and services. In this context, the very popular MPEG-x and H.26x video coding standards adopted a predictive coding paradigm, where complex encoders exploit data redundancy and irrelevancy to 'control' much simpler decoders. This codec paradigm fits well applications and services such as digital television and video storage, where decoder complexity is critical, but does not match the requirements of emerging applications such as visual sensor networks, where encoder complexity is more critical. The Slepian-Wolf and Wyner-Ziv theorems brought the possibility to develop so-called Wyner-Ziv video codecs, following a different coding paradigm where it is the task of the decoder, and no longer of the encoder, to (fully or partly) exploit the video redundancy. Theoretically, Wyner-Ziv video coding does not incur any compression performance penalty with respect to the more traditional predictive coding paradigm (at least under certain conditions). In the context of Wyner-Ziv video codecs, the so-called side information, which is a decoder estimate of the original frame to code, plays a critical role in the overall compression performance. For this reason, much research effort has been invested in the past decade to develop increasingly more efficient side information creation methods. The main objective of this paper is to review and evaluate the available side information methods after proposing a classification taxonomy to guide the review, allowing more solid conclusions to be reached and the next relevant research challenges to be better identified.
After classifying the side information creation methods into four classes, notably guess, try, hint and learn, the review of the most important techniques in each class and the evaluation of some of them lead to the important conclusion that the side information creation methods provide better rate-distortion (RD) performance depending on the amount of temporal correlation in each video sequence. It also became clear that the best available Wyner-Ziv video coding solutions are almost systematically based on the learn approach. The best solutions are already able to systematically outperform H.264/AVC Intra, and also the H.264/AVC zero-motion standard solutions for specific types of content. (C) 2013 Elsevier B.V. All rights reserved.
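A minimal example of a "guess"-class side information method, the simplest baseline rather than one of the codecs evaluated in the paper: estimate the Wyner-Ziv frame as the pixel-wise average of the two neighbouring key frames, with no motion compensation.

```python
def average_interpolation(prev_frame, next_frame):
    """Simplest 'guess'-class side information: pixel-wise average of
    the two neighbouring key frames (no motion compensation)."""
    return [[(a + b) // 2 for a, b in zip(row_p, row_n)]
            for row_p, row_n in zip(prev_frame, next_frame)]

# Tiny 2x2 luma blocks from two hypothetical key frames.
prev_frame = [[100, 100], [50, 50]]
next_frame = [[110, 90], [70, 30]]
side_info = average_interpolation(prev_frame, next_frame)
```

The better the side information approximates the true frame, the fewer Wyner-Ziv bits the decoder requests; this is why motion-compensated interpolation and the learn-class methods discussed above outperform this naive average when motion is significant.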

Abstract:

The tongue is the most important and dynamic articulator in speech formation, because of its anatomic aspects (particularly the large volume of this muscular organ compared to the surrounding organs of the vocal tract) and also due to the wide range of movements and the flexibility involved. In speech communication research, a variety of techniques have been used for measuring three-dimensional vocal tract shapes. More recently, magnetic resonance imaging (MRI) has become common, mainly because this technique allows the collection of a set of static and dynamic images that can represent the entire vocal tract along any orientation. Over the years, different anatomical organs of the vocal tract have been modelled, namely 2D and 3D tongue models, using parametric or statistical modelling procedures. Our aim is to present and describe some 3D models reconstructed from MRI data for one subject uttering sustained articulations of some typical Portuguese sounds. Thus, we present a 3D database of the tongue obtained by stack combinations, with the subject articulating Portuguese vowels. This 3D knowledge of the speech organs could be very important, especially for clinical purposes (for example, for the assessment of articulatory impairments followed by tongue surgery in speech rehabilitation), and also for a better understanding of the acoustic theory of speech formation.

Abstract:

Future industrial control/multimedia applications will increasingly impose or benefit from wireless and mobile communications. Therefore, there is great interest in extending currently available industrial communication networks with wireless and mobility capabilities. The RFieldbus European project is just one example, in which a PROFIBUS-based hybrid (wired/wireless) architecture was specified and implemented. In the RFieldbus architecture, interoperability between wired and wireless components is achieved by the use of specific intermediate networking systems operating at the physical layer level, i.e. operating as repeaters. In this paper we focus instead on a bridge-based approach, which presents several advantages. This concept was introduced in (Ferreira, et al., 2002), where a bridge-based approach was briefly outlined. A specific Inter-Domain Protocol (IDP) was then proposed to handle inter-domain transactions in such a bridge-based approach (Ferreira, et al., 2003a). The major contribution of this paper is to extend these previous works by describing the protocol extensions needed to support inter-cell mobility in such bridge-based hybrid wired/wireless PROFIBUS networks.

Abstract:

The IEEE 802.15.4/ZigBee protocols are gaining increasing interest in both the research and industrial communities as candidate technologies for Wireless Sensor Network (WSN) applications. In this paper, we present an open-source implementation of the IEEE 802.15.4/ZigBee protocol stack under the TinyOS operating system for the MICAz motes. This work has been driven by the need for an open-source implementation of the IEEE 802.15.4/ZigBee protocols, filling a gap between some newly released complex C implementations and black-box implementations from different manufacturers. In addition, we share our experience of the challenging problems that we faced during the implementation of the protocol stack on the MICAz motes. We strongly believe that this open-source implementation will stimulate research on the IEEE 802.15.4/ZigBee protocols by allowing their demonstration and validation through experimentation.