938 results for multi-path and multi-link communications


Relevance: 100.00%

Abstract:

Although several clinical tests have been developed to qualitatively describe complex motor tasks through functional testing, these methods often depend on the clinician's interpretation, experience and training, which makes the assessment results inconsistent and lacking the precision required to objectively assess the effect of a rehabilitative intervention. A more detailed characterization is required to fully capture the various aspects of motor control and performance during complex movements of the lower and upper limbs. Cost-effective and clinically applicable instrumented tests would enable quantitative assessment of performance on a subject-specific basis, overcoming the lack of objectivity inherent in individual judgment and possibly disclosing subtle alterations that are not clearly visible to the observer. Postural motion measurements at additional locations, such as the lower and upper limbs and the trunk, may be necessary in order to obtain information about inter-segmental coordination during the different functional tests used in clinical practice. With these considerations in mind, this Thesis aims: i) to propose a novel quantitative assessment tool for the kinematic and dynamic evaluation of a multi-link kinematic chain during several functional motor tasks (i.e. squat, sit-to-stand, postural sway), using one single-axis accelerometer per segment; ii) to present a novel quantitative technique for upper limb joint kinematics estimation, considering a 3-link kinematic chain during the Fugl-Meyer Motor Assessment and using one inertial measurement unit per segment. The suggested methods could bring several benefits to clinical practice.
The use of objective biomechanical measurements, provided by inertial sensor-based techniques, may help clinicians to: i) objectively track changes in motor ability, ii) provide timely feedback about the effectiveness of administered rehabilitation interventions, iii) enable intervention strategies to be modified or changed if found to be ineffective, and iv) speed up experimental sessions when several subjects are asked to perform different functional tests.
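The one-single-axis-accelerometer-per-segment idea can be illustrated with a minimal sketch (a simplified quasi-static model, not the thesis's actual algorithm): when the segment moves slowly, the sensor reads mainly the gravity component along its axis, from which the segment's inclination, and hence the posture of a planar multi-link chain, can be recovered.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def segment_inclination(a_axial, g=G):
    """Estimate a segment's inclination (rad) from a single-axis accelerometer
    reading, assuming quasi-static motion so the sensor sees only gravity."""
    # Clamp to [-1, 1] to guard against noise pushing |a|/g slightly above 1.
    s = max(-1.0, min(1.0, a_axial / g))
    return math.asin(s)

def chain_endpoint(lengths, readings):
    """Planar forward kinematics of a multi-link chain: each link's absolute
    inclination comes from its own accelerometer reading."""
    x = y = 0.0
    for length, a in zip(lengths, readings):
        theta = segment_inclination(a)
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y
```

Note that fast movements add dynamic acceleration on top of gravity, which this quasi-static sketch ignores; that is precisely the regime where more elaborate estimation (as in the thesis) is needed.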

Relevance: 100.00%

Abstract:

This paper describes the design of a modular multi-finger haptic device for virtual object manipulation. The mechanical structure is based on one module per finger and can be scaled up to three fingers. The mechanical configurations for two and three fingers are based on the use of one and two redundant axes, respectively. As demonstrated, the redundant axes significantly increase the workspace and prevent link collisions, which is their main asset with respect to other multi-finger haptic devices. The location of the redundant axes and the link dimensions have been optimized in order to guarantee a proper workspace, manipulability, force capability, and inertia for the device. The mechanical design of the haptic device, together with a thimble adaptable to different finger sizes, has also been developed for virtual object manipulation.

Relevance: 100.00%

Abstract:

Cloud-aerosol interaction is a key issue in the climate system, affecting the water cycle, the weather, and the total energy balance, including the spatial and temporal distribution of latent heat release. Information on the vertical distribution of cloud droplet microphysics and thermodynamic phase as a function of temperature or height can be correlated with details of the aerosol field to provide insight into how these particles affect cloud properties and the consequences for cloud lifetime, precipitation, the water cycle, and the general energy balance. Unfortunately, today's experimental methods still lack the observational tools that can characterize the true evolution of the cloud's microphysical, spatial and temporal structure at the cloud-droplet scale and then link these characteristics to environmental factors and to the properties of the cloud condensation nuclei. Here we propose and demonstrate a new experimental approach (the cloud scanner instrument) that provides the microphysical information missing from current experiments and remote sensing options. Cloud scanner measurements can be performed from aircraft, the ground, or a satellite by scanning the side of the clouds from base to top, providing the unique opportunity of obtaining snapshots of the cloud droplet microphysical and thermodynamic states as a function of height and brightness temperature in clouds at several development stages. The brightness temperature profile of the cloud side can be directly associated with the thermodynamic phase of the droplets to provide information on the glaciation temperature as a function of different ambient conditions and of aerosol concentration and type. An aircraft prototype of the cloud scanner was built and flown in a field campaign in Brazil.
The CLAIM-3D (3-Dimensional Cloud Aerosol Interaction Mission) satellite concept proposed here combines several techniques to simultaneously measure the vertical profile of cloud microphysics, thermodynamic phase, brightness temperature, and the aerosol amount and type in the neighborhood of the clouds. The wide wavelength range and the multi-angle polarization measurements proposed for this mission allow us to estimate the availability and characteristics of aerosol particles acting as cloud condensation nuclei, and their effects on the cloud microphysical structure. These results can provide unprecedented detail on the response of cloud droplet microphysics to natural and anthropogenic aerosols at the size scale where the interaction actually happens.

Relevance: 100.00%

Abstract:

The proportion of the population living in or around cities is larger than ever. Urban sprawl and car dependence have taken over from the pedestrian-friendly compact city. Environmental problems like air pollution, land waste and noise, as well as health problems, are the result of this still-ongoing process. Urban planners have to find solutions to these complex problems while at the same time ensuring the economic performance of the city and its surroundings. Meanwhile, an increasing quantity of socio-economic and environmental data is being acquired. In order to get a better understanding of the processes and phenomena taking place in the complex urban environment, these data should be analysed. Numerous methods for modelling and simulating such a system exist, are still under development, and can be exploited by urban geographers to improve our understanding of the urban metabolism. Modern and innovative visualisation techniques help in communicating the results of such models and simulations. This thesis covers several methods for the analysis, modelling, simulation and visualisation of problems related to urban geography. The analysis of high-dimensional socio-economic data using artificial neural network techniques, especially self-organising maps, is shown using two examples at different scales. The problem of spatio-temporal modelling and data representation is treated and some possible solutions are shown. The simulation of urban dynamics, and more specifically of the traffic due to commuting to work, is illustrated using multi-agent micro-simulation techniques. A section on visualisation methods presents cartograms for transforming the geographic space into a feature space, and the distance circle map, a centre-based map representation particularly useful for urban agglomerations. Some issues concerning the importance of scale in urban analysis and in the clustering of urban phenomena are also discussed.
A new approach to defining urban areas at different scales is developed, and the link with percolation theory is established. Fractal statistics, especially the lacunarity measure, and scaling laws are used for characterising urban clusters. In a last section, population evolution is modelled using a model close to the well-established gravity model. The work covers quite a wide range of methods useful in urban geography. These methods should be developed further and, at the same time, find their way into the daily work and decision processes of urban planners.
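The percolation-style definition of urban areas can be sketched as follows (an illustrative reconstruction, not the thesis's implementation): threshold a population-density grid and take 4-connected components of above-threshold cells as urban clusters, with the threshold playing the role of scale.

```python
from collections import deque

def urban_clusters(grid, threshold):
    """Label 4-connected clusters of grid cells whose density exceeds
    `threshold`. Returns a list of clusters, each a list of (row, col) cells.
    Varying the threshold (the 'scale') merges or splits clusters, mimicking
    a percolation-style definition of urban areas."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    clusters = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > threshold and not seen[r][c]:
                comp, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:  # breadth-first flood fill of one cluster
                    i, j = queue.popleft()
                    comp.append((i, j))
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and grid[ni][nj] > threshold
                                and not seen[ni][nj]):
                            seen[ni][nj] = True
                            queue.append((ni, nj))
                clusters.append(comp)
    return clusters
```

Lacunarity or scaling-law statistics, as used in the thesis, would then be computed over the cell sets returned for each threshold.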

Relevance: 100.00%

Abstract:

The objective of this work is to describe the design and implementation of an experiment to study the dynamics and active control of a slewing multi-link flexible structure. The experimental apparatus was designed to be representative of a flexible space structure such as a satellite with multiple flexible appendages. In this study we describe the design procedures, the analog and digital instrumentation, and the analytical modeling, together with model-validation studies carried out through experimental modal testing and parametric system identification in the frequency domain. Preliminary results of a simple positional control scheme, in which the sensor and the actuator are physically collocated, are also described.
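The collocated positional control mentioned above can be illustrated with a minimal sketch (a rigid one-degree-of-freedom approximation; the real apparatus has flexible modes that this toy model ignores): a PD loop acting on the same coordinate it measures.

```python
def simulate_collocated_pd(setpoint, kp=4.0, kd=4.0, inertia=1.0,
                           dt=0.001, t_end=20.0):
    """Simulate a PD position loop on a rigid single-axis plant
    (inertia * theta'' = torque). Sensor and actuator are collocated:
    the measured angle is the same coordinate the torque acts on.
    Returns the final angle after t_end seconds of Euler integration."""
    theta, omega = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        torque = kp * (setpoint - theta) - kd * omega  # PD law
        omega += (torque / inertia) * dt
        theta += omega * dt
    return theta
```

With kp = kd = 4 and unit inertia the closed loop is critically damped (poles at -2), so the angle settles essentially exactly on the setpoint. Collocation is what makes such simple gains robustly stabilizing; non-collocated flexible-structure control, as studied in the paper, is considerably harder.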

Relevance: 100.00%

Abstract:

Doctoral thesis carried out under joint supervision (cotutelle) between the Département d'administration de la santé, Faculté de médecine, Université de Montréal, and the École doctorale Biologie-Santé, Faculté de médecine, Université de Nantes, France.

Relevance: 100.00%

Abstract:

Nucleolin is a multi-functional protein that localizes to the nucleolus. In tissue culture cells, the stability of nucleolin is related to the proliferation status of the cell. During development, rat cardiomyocytes proliferate actively, with increases in the mass of the heart being due to both hyperplasia and hypertrophy. The shift in myocyte phenotype, from one capable of undergoing hyperplasia to one that can grow only by hypertrophy, occurs within 4 days of post-natal development. Thus, cardiomyocytes are an ideal model system in which to study the regulation of nucleolin during growth in vivo. Using Western blotting and quantitative RT-PCR (TaqMan), we found that the amount of nucleolin is regulated at the levels of both transcription and translation during cardiomyocyte development. However, in cells that had exited the cell cycle and were subsequently given a hypertrophic stimulus, nucleolin was regulated post-transcriptionally.

Relevance: 100.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 100.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 100.00%

Abstract:

Purpose: To evaluate the effect of mechanical cycling and cementation strategies on the push-out bond strength between fiber posts and root dentin, and the polymerization stresses produced, using three resin cements. Materials and Methods: Eighty bovine mandibular teeth were sectioned to a length of 16 mm, prepared to 12 mm, and embedded in self-curing acrylic resin. The specimens were then distributed into 8 groups (n = 10): Gr1 - Scotchbond Multi Purpose + RelyX ARC; Gr2 - Scotchbond Multi Purpose + RelyX ARC + mechanical cycling; Gr3 - AdheSE + Multilink Automix; Gr4 - AdheSE + Multilink Automix + mechanical cycling; Gr5 - phosphoric acid + RelyX U100 (self-adhesive cement); Gr6 - phosphoric acid + RelyX U100 + mechanical cycling; Gr7 - RelyX U100; Gr8 - RelyX U100 + mechanical cycling. The values obtained from the push-out bond strength test were submitted to two-way ANOVA and Tukey's test (α = 0.05), while the values obtained from the polymerization stress test were subjected to one-way ANOVA and Tukey's test (α = 0.05). Results: Mechanical cycling did not affect the bond strength values (p = 0.236), while the cementation strategies affected the push-out bond strength (p < 0.001). Luting with RelyX U100 and Scotchbond Multi Purpose + RelyX ARC yielded higher push-out bond strength values. The polymerization stress results were affected by the factor "cement" (p = 0.0104): the self-adhesive cement RelyX U100 exhibited the lowest values, RelyX ARC resulted in the highest values, while Multilink Automix presented values statistically similar to the other two cements. Conclusion: The self-adhesive cement appears to be a good alternative for luting fiber posts due to its high push-out bond strengths and lower polymerization stress values.
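The one-way ANOVA used for the polymerization-stress comparison reduces to a ratio of between-group to within-group mean squares. A minimal sketch (illustrative numbers, not the study's data):

```python
def one_way_anova_f(groups):
    """Compute the one-way ANOVA F statistic for a list of samples:
    F = (between-group mean square) / (within-group mean square).
    Large F indicates group means differ more than within-group scatter
    would explain; the p-value then comes from the F distribution."""
    k = len(groups)                               # number of groups
    n = sum(len(g) for g in groups)               # total observations
    grand = sum(sum(g) for g in groups) / n       # grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within
```

Identical group means give F = 0, while clearly separated groups give a large F; Tukey's test, as used in the study, then identifies which pairs of groups differ.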

Relevance: 100.00%

Abstract:

Sedimentary sequences in ancient or long-lived lakes can reach several thousands of meters in thickness and often provide an unrivalled perspective of the lake's regional climatic, environmental, and biological history. Over the last few years, deep-drilling projects in ancient lakes have become increasingly multi- and interdisciplinary, as seismological, sedimentological, biogeochemical, climatic, environmental, paleontological, and evolutionary information, among others, can be obtained from sediment cores. However, these multi- and interdisciplinary projects pose several challenges. The scientists involved typically approach problems from different scientific perspectives and backgrounds, and setting up the program requires clear communication and the alignment of interests. One of the most challenging tasks, besides the actual drilling operation, is to link diverse datasets with varying resolution, data quality, and age uncertainties to answer interdisciplinary questions synthetically and coherently. These problems are especially relevant when secondary data, i.e., datasets obtained independently of the drilling operation, are incorporated in analyses. Nonetheless, the inclusion of secondary information, such as isotopic data from fossils found in outcrops or genetic data from extant species, may help to achieve synthetic answers. Recent technological and methodological advances in paleolimnology are likely to increase the possibilities of integrating secondary information. Some of the new approaches have started to revolutionize scientific drilling in ancient lakes, but at the same time, they also add a new layer of complexity to the generation and analysis of sediment-core data. The enhanced opportunities presented by new scientific approaches to study the paleolimnological history of these lakes, therefore, come at the expense of higher logistic, communication, and analytical efforts.
Here we review types of data that can be obtained in ancient lake drilling projects and the analytical approaches that can be applied to empirically and statistically link diverse datasets to create an integrative perspective on geological and biological data. In doing so, we highlight strengths and potential weaknesses of new methods and analyses, and provide recommendations for future interdisciplinary deep-drilling projects.

Relevance: 100.00%

Abstract:

The proliferation of wireless sensor networks and the variety of envisioned applications associated with them have motivated the development of distributed algorithms for collaborative processing over networked systems. One application that has attracted the attention of researchers is target localization, where the nodes of the network try to estimate the position of an unknown target that lies within the network's coverage area. Particularly challenging is the problem of estimating the target's position from the received signal strength indicator (RSSI), due to the nonlinear relationship between the measured signal and the true position of the target. Many of the existing approaches suffer either from high computational complexity (e.g., particle filters) or from a lack of accuracy. Further, many of the proposed solutions are centralized, which makes their application to a sensor network questionable. Depending on the application at hand, and from a practical perspective, it can be convenient to find a balance between localization accuracy and complexity. In this direction, we approach the maximum likelihood location estimation problem by solving a suboptimal (and more tractable) problem. One of the main advantages of the proposed scheme is that it allows for a decentralized implementation using distributed processing tools (e.g., consensus and convex optimization) and is therefore very suitable for implementation in real sensor networks. If further accuracy is needed, an additional refinement step can be performed around the found solution. Under the assumption of independent noise among the nodes, such a local search can be done in a fully distributed way using a distributed version of the Gauss-Newton method based on consensus. Regardless of the underlying application or function of the sensor network, it is always necessary to have a mechanism for data reporting.
While some approaches use a special kind of node (called a sink node) for data harvesting and forwarding to the outside world, there are some scenarios where such an approach is impractical or even impossible to deploy. Further, such sink nodes become a bottleneck in terms of traffic flow and power consumption. To overcome these issues, instead of using sink nodes for data reporting one can use collaborative beamforming techniques to forward the generated data directly to a base station or gateway to the outside world. In a distributed environment like a sensor network, nodes cooperate in order to form a virtual antenna array that can exploit the benefits of multi-antenna communications. In collaborative beamforming, nodes synchronize their phases so that their transmissions add constructively at the receiver. One of the inconveniences associated with collaborative beamforming techniques is that there is no control over the radiation pattern, since it is treated as a random quantity. This may cause interference to other coexisting systems and fast battery depletion at the nodes. Since energy efficiency is a major design issue, we consider the development of a distributed collaborative beamforming scheme that maximizes the network lifetime while meeting a quality of service (QoS) requirement at the receiver side. Using local information about battery status and channel conditions, we find distributed algorithms that converge to the optimal centralized beamformer. While in the first part we consider only battery depletion due to communications beamforming, we extend the model to account for more realistic scenarios by introducing an additional random energy consumption. It is shown how the new problem generalizes the original one and under which conditions it is easily solvable. By formulating the problem from the energy-efficiency perspective, the network's lifetime is significantly improved.
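The refinement step can be illustrated by a Gauss-Newton least-squares iteration on the log-distance path-loss model. This is a centralized sketch with assumed parameters (reference power p0 and path-loss exponent n are made up here); the thesis's contribution is running an equivalent search distributedly via consensus, which is not shown.

```python
import numpy as np

def rssi_model(x, anchors, p0=-40.0, n=2.0):
    """Log-distance path-loss model: predicted RSSI (dB) at each anchor
    for a target at position x."""
    d = np.linalg.norm(anchors - x, axis=1)
    return p0 - 10.0 * n * np.log10(d)

def gauss_newton_locate(rssi, anchors, x0, p0=-40.0, n=2.0, iters=50):
    """Refine a target-position estimate from RSSI measurements by
    Gauss-Newton on the nonlinear least-squares residuals."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        diff = x - anchors                          # (m, 2) offsets to anchors
        d2 = np.sum(diff ** 2, axis=1)              # squared distances
        # 10*log10(d) = 5*log10(d2), avoiding a square root
        r = rssi - (p0 - 5.0 * n * np.log10(d2))    # residuals
        J = (10.0 * n / np.log(10.0)) * diff / d2[:, None]  # d r / d x
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)       # GN step
        x = x + step
    return x
```

With noiseless synthetic measurements the iteration recovers the true position; with noisy RSSI it converges to the (local) least-squares fit, which is why the thesis pairs it with a good suboptimal initial estimate.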

Relevance: 100.00%

Abstract:

Recently, wireless network technology has grown at such a pace that research results become practical reality in a very short time span. One mobile system that features high data rates and an open network architecture is 4G. Currently, the research community and industry in the field of wireless networks are evaluating candidate solutions for the 4G system. One of the most important characteristics of future 4G mobile systems is the ability to guarantee reliable communications at high data rates, in addition to high efficiency in spectrum usage. In mobile wireless communication networks, an important factor is the coverage of large geographical areas. In 4G systems, a hybrid satellite/terrestrial network is crucial to providing users with coverage wherever needed. Subscribers thus require a reliable satellite link to access their services when they are in remote locations where a terrestrial infrastructure is unavailable. The results show that good modulation and access techniques are also required in order to transmit high data rates over satellite links to mobile users. The dissertation proposes the use of OFDM (Orthogonal Frequency-Division Multiplexing) for the satellite link, increasing the time diversity. This technique will allow for an increase of the data rate, as primarily required by multimedia applications, and will also make optimal use of the available bandwidth. In addition, this dissertation addresses the use of cooperative satellite communications for hybrid satellite/terrestrial networks. By using this technique, the satellite coverage can be extended to areas where there is no direct link to the satellite. The issue of cooperative satellite communications is solved through a new algorithm that forwards the received data from the fixed node to the mobile node. This algorithm is efficient because it avoids unnecessary transmissions and is based on signal-to-noise ratio (SNR) measurements.
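The basic OFDM transmit/receive chain proposed for the satellite link can be sketched as an inverse FFT plus a cyclic prefix (a textbook sketch; the 64 subcarriers and length-8 prefix below are arbitrary choices, not the dissertation's parameters):

```python
import numpy as np

def ofdm_modulate(symbols, cp_len):
    """Map one block of complex subcarrier symbols to a time-domain OFDM
    symbol: inverse FFT, then a cyclic prefix (the last cp_len time-domain
    samples copied to the front to absorb multipath delay spread)."""
    time = np.fft.ifft(symbols)
    return np.concatenate([time[-cp_len:], time])

def ofdm_demodulate(signal, n_sub, cp_len):
    """Strip the cyclic prefix and FFT back to the subcarrier symbols."""
    return np.fft.fft(signal[cp_len:cp_len + n_sub])
```

Over an ideal channel the chain is a perfect round trip; over a multipath channel shorter than the prefix, each subcarrier sees only a flat complex gain, which is what makes high-rate transmission with simple per-subcarrier equalization possible.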

Relevance: 100.00%

Abstract:

Ageing of the population is a worldwide phenomenon. Numerous ICT-based solutions have been developed for elderly care, but mainly connected to the physiological and nursing aspects of services for the elderly. Social work is a profession that should pay attention to the comprehensive wellbeing and social needs of the elderly. Many people experience loneliness and depression in old age, either as a result of living alone or due to a lack of close family ties and reduced connections with their culture of origin, which results in an inability to participate actively in community activities (Singh & Misra, 2009). Participation in society would enhance their quality of life. With the development of information technology, the use of technology in social work practice has risen dramatically. The aim of this literature review is to map out the state of the art of knowledge about the usage of ICT in elderly care and to identify research-based knowledge about the usability of ICT for the prevention of loneliness and social isolation among elderly people. The data for the current research come from the core collection of the Web of Science, and the search was performed using Boolean operators. The search resulted in 216 published English-language articles. After going through the topics and abstracts, 34 articles were selected for a data analysis based on a multi-approach framework. The analysis of the research approaches is categorized according to aspects of the use of ICT by older adults, from the adoption of ICT to the impact of its usage, and the social services for them. This literature review focused on the function of communication, excluding applications that mainly relate to physical nursing. The results show that the so-called 'digital divide' still exists, but older adults have the willingness to learn and utilise ICT in daily life, especially for communication.
The data show that the usage of ICT can prevent the loneliness and social isolation of older adults, and that they are eager for technical support in using ICT. The results of the data analysis of theoretical frames and concepts show that this research field applies different theoretical frames from various scientific fields, while a social work approach is lacking. A synergic frame of applied theories is therefore suggested from the perspective of social work.

Relevance: 100.00%

Abstract:

This paper presents a multi-robot formation manoeuvre and its collision avoidance strategy. The direction-priority sequential selection algorithm is employed to obtain the raw path, and a new algorithm is then proposed to calculate the turning-compliant waypoints supporting the multi-robot formation manoeuvre. The collision avoidance strategy, based on formation control, translates the collision avoidance problem into a stability problem for the formation. The extension-decomposition-aggregation scheme is then applied to solve the formation control problem and thereby achieve collision avoidance during the formation manoeuvre. A simulation study finally shows that the collision avoidance problem can be conveniently solved provided that the stability of the constructed formation, including unidentified objects, can be guaranteed.