925 results for advantages of networking


Relevance: 90.00%

Abstract:

Among the many advantages of the recently proposed ion beam shepherd (IBS) debris removal technique is the capability to deal with multiple targets in a single mission. A preliminary analysis is conducted here to estimate the cost, in terms of spacecraft mass and total mission time, of removing multiple large upper stages of the Zenit family. Zenit-2 upper stages are clustered at 71 degrees inclination around 850 km altitude in low Earth orbit. It is found that removing two targets per year is feasible with a modest-size spacecraft. The most favorable combinations of targets are outlined.
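The following is a minimal sketch of the kind of delta-v bookkeeping such a cost estimate rests on, using a plain two-impulse Hohmann transfer between circular orbits; the altitudes and the transfer model are illustrative assumptions, not the mission profile analysed in the paper.

    import math

    MU = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
    R_EARTH = 6_378_137.0    # Earth's equatorial radius, m

    def hohmann_dv(alt1_km, alt2_km):
        """Total delta-v (m/s) of a two-impulse Hohmann transfer between
        circular orbits at the given altitudes."""
        r1 = R_EARTH + alt1_km * 1e3
        r2 = R_EARTH + alt2_km * 1e3
        dv1 = math.sqrt(MU / r1) * (math.sqrt(2 * r2 / (r1 + r2)) - 1)
        dv2 = math.sqrt(MU / r2) * (1 - math.sqrt(2 * r1 / (r1 + r2)))
        return abs(dv1) + abs(dv2)

    if __name__ == "__main__":
        # Re-positioning between two debris objects parked 30 km apart near 850 km.
        print(f"{hohmann_dv(835, 865):.1f} m/s")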

Relevance: 90.00%

Abstract:

It is essential to remotely and continuously monitor the movements of individuals in many social areas, for example, care of aging people, physical therapy, and athletic training. Many methods have been used, such as video recording, motion analysis or sensor-based approaches. Due to limitations in remote communication, power consumption, portability and so on, most of them are not able to fulfill the requirements. The development of wearable technology and cloud computing provides a new, efficient way to achieve this goal. This paper presents an intelligent human movement monitoring system based on a smartwatch, an Android smartphone and a distributed data management engine. The system offers the advantages of wide adaptability, remote and long-term monitoring capacity, and high portability and flexibility. The structure of the system and its operating principle are introduced. Four experiments are designed to prove the feasibility of the system. The results of the experiments demonstrate that the system is able to detect different actions of individuals with adequate accuracy.
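A minimal sketch (not the authors' system) of how coarse activity levels could be derived from the magnitude of wrist-accelerometer samples, the kind of signal a smartwatch would stream to the phone; the window statistic and the thresholds are illustrative assumptions.

    import math

    def activity_level(samples, g=9.81):
        """samples: list of (ax, ay, az) accelerations in m/s^2 over a short window."""
        # Mean absolute deviation of the acceleration magnitude from gravity.
        deviations = [abs(math.sqrt(ax ** 2 + ay ** 2 + az ** 2) - g)
                      for ax, ay, az in samples]
        score = sum(deviations) / len(deviations)
        if score < 0.5:
            return "still"
        if score < 3.0:
            return "walking"
        return "vigorous"

    if __name__ == "__main__":
        resting = [(0.1, 0.0, 9.8)] * 50
        moving = [(6.0 * (-1) ** i, 3.0, 9.8) for i in range(50)]
        print(activity_level(resting), activity_level(moving))   # still walking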

Relevance: 90.00%

Abstract:

The success of an aquaculture breeding program critically depends on the way in which the base population of breeders is constructed since all the genetic variability for the traits included originally in the breeding goal as well as those to be included in the future is contained in the initial founders. Traditionally, base populations were created from a number of wild strains by sampling equal numbers from each strain. However, for some aquaculture species improved strains are already available and, therefore, mean phenotypic values for economically important traits can be used as a criterion to optimize the sampling when creating base populations. Also, the increasing availability of genome-wide genotype information in aquaculture species could help to refine the estimation of relationships within and between candidate strains and, thus, to optimize the percentage of individuals to be sampled from each strain. This study explores the advantages of using phenotypic and genome-wide information when constructing base populations for aquaculture breeding programs in terms of initial and subsequent trait performance and genetic diversity level. Results show that a compromise solution between diversity and performance can be found when creating base populations. Up to 6% higher levels of phenotypic performance can be achieved at the same level of global diversity in the base population by optimizing the selection of breeders instead of sampling equal numbers from each strain. The higher performance observed in the base population persisted during 10 generations of phenotypic selection applied in the subsequent breeding program.
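A minimal sketch, not the authors' method, of the kind of trade-off involved: choosing how many founders to sample from each candidate strain by weighing expected mean phenotype against average coancestry (a rough proxy for the diversity lost). The strain means, coancestry matrix and founder number below are invented for illustration.

    from itertools import product

    STRAIN_MEANS = [10.0, 12.5, 9.0, 11.0]       # mean phenotype of each strain
    COANCESTRY = [[0.50, 0.05, 0.02, 0.04],      # coancestry within/between strains
                  [0.05, 0.55, 0.03, 0.06],
                  [0.02, 0.03, 0.45, 0.02],
                  [0.04, 0.06, 0.02, 0.60]]
    N_FOUNDERS = 20

    def score(counts, weight):
        """weight * mean phenotype - (1 - weight) * mean coancestry of the mix."""
        c = [n / N_FOUNDERS for n in counts]
        phenotype = sum(ci * m for ci, m in zip(c, STRAIN_MEANS))
        coancestry = sum(ci * cj * COANCESTRY[i][j]
                         for i, ci in enumerate(c) for j, cj in enumerate(c))
        return weight * phenotype - (1.0 - weight) * coancestry

    def best_allocation(weight, step=2):
        candidates = [counts
                      for counts in product(range(0, N_FOUNDERS + 1, step),
                                            repeat=len(STRAIN_MEANS))
                      if sum(counts) == N_FOUNDERS]
        return max(candidates, key=lambda counts: score(counts, weight))

    if __name__ == "__main__":
        # Phenotype and coancestry sit on different scales, so even small weights
        # already pull the allocation towards the best-performing strains.
        for w in (0.0, 0.05, 1.0):
            print(f"weight {w:.2f}: founders per strain = {best_allocation(w)}")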

Relevance: 90.00%

Abstract:

The aim of this contribution is to study the modifications of Cascade, comparing them with the original protocol on the basis of a full set of parameters, so that the effect of these modifications can be fairly assessed. A number of simulations were performed to study not only the efficiency but also other characteristics of the protocol that are important for its practical application, such as the number of communications and the failure probability. Note that, although it is generally believed that the only price to pay for improved efficiency is increased interactivity, a different picture emerges when all the significant magnitudes are considered, showing, for instance, that the failure probability eliminates some of the supposed advantages of these improvements.
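For context on the efficiency/interactivity trade-off discussed above, the following is a simplified sketch of the BINARY bisection primitive on which Cascade is built (the modified protocols studied in the paper add further machinery on top of it): a block whose parities disagree is halved repeatedly, at the cost of one parity message per halving, until a single erroneous bit is located and flipped.

    def parity(bits, lo, hi):
        return sum(bits[lo:hi]) % 2

    def binary_correct(alice, bob, lo, hi, messages):
        """Locate and flip one error in bob[lo:hi], counting parity messages."""
        while hi - lo > 1:
            mid = (lo + hi) // 2
            messages[0] += 1                       # one parity sent over the public channel
            if parity(alice, lo, mid) != parity(bob, lo, mid):
                hi = mid                           # the error lies in the left half
            else:
                lo = mid                           # otherwise it lies in the right half
        bob[lo] ^= 1                               # flip the located bit
        return lo

    if __name__ == "__main__":
        alice = [0, 1, 1, 0, 1, 0, 0, 1]
        bob = alice.copy()
        bob[5] ^= 1                                # introduce a single error
        msgs = [0]
        pos = binary_correct(alice, bob, 0, len(bob), msgs)
        print("corrected bit", pos, "using", msgs[0], "parity messages;",
              "keys match:", alice == bob)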

Relevance: 90.00%

Abstract:

Aim of study: This paper presents a novel index, the Riparian Forest Evaluation (RFV) index, for assessing the ecological condition of riparian forests. The status of riparian ecosystems has global importance due to the ecological and social benefits and services they provide. The European Water Framework Directive (2000/60/CE) requires the assessment of the hydromorphological quality of natural channels and describes riparian forests as one of the fundamental components that determine the structure of riverine areas. The RFV index was developed to meet the aim of the Directive and to complement existing methodologies for the evaluation of riparian forests. Area of study: The RFV index was applied to a wide range of streams and rivers (170 water bodies) in Spain. Materials and methods: The calculation of the RFV index is based on the assessment of both the spatial continuity of the forest (in its three core dimensions: longitudinal, transversal and vertical) and the regeneration capacity of the forest, in a sampling area related to the river's hydromorphological pattern. The index enables an evaluation of the quality and degree of alteration of riparian forests. In addition, it helps to determine the scenarios that are necessary to improve the status of riparian forests and to develop processes for restoring their structure and composition. Main results: The results were compared with previous tools for the assessment of riparian vegetation. The RFV index obtained the highest average scores in the basins of northern Spain, which are subject to less human influence, while forests along central and southern rivers scored lower. The largest differences from other tools were found in complex and partially altered streams and rivers. Research highlights: The study showed the index's applicability under diverse hydromorphological and ecological conditions and the main advantages of its application. The index allows a better understanding of the status of riparian forests and supports improvements in the conservation and management of riparian areas.
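As a purely illustrative sketch, and not the actual RFV scoring rules defined in the paper, the four components named above (longitudinal, transversal and vertical continuity, plus regeneration capacity) could be aggregated into a single 0-1 score and mapped onto quality classes as follows; all component ratings and cut-offs are hypothetical.

    COMPONENTS = ("longitudinal", "transversal", "vertical", "regeneration")

    def rfv_like_score(ratings):
        """ratings: dict mapping each component to a value in [0, 1]."""
        missing = [c for c in COMPONENTS if c not in ratings]
        if missing:
            raise ValueError(f"missing component ratings: {missing}")
        return sum(ratings[c] for c in COMPONENTS) / len(COMPONENTS)

    def quality_class(score):
        # Five illustrative quality classes with arbitrary cut-offs.
        bands = [(0.8, "very good"), (0.6, "good"), (0.4, "moderate"), (0.2, "poor")]
        return next((label for cut, label in bands if score >= cut), "bad")

    if __name__ == "__main__":
        sample = {"longitudinal": 0.9, "transversal": 0.7,
                  "vertical": 0.8, "regeneration": 0.5}
        s = rfv_like_score(sample)
        print(round(s, 2), quality_class(s))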

Relevance: 90.00%

Abstract:

The computational advantages of using different approaches (numerical and analytical) for the analysis of different parts of the same shell structure are discussed. Examples are described of large problems that can be reduced to a size more suitable for a personal computer by means of axisymmetric finite elements, local non-axisymmetric shells, geometrically quasi-regular shells, infinite elements and homogenization techniques.

Relevance: 90.00%

Abstract:

Using fixed-point arithmetic is one of the most common design choices for systems where area, power or throughput are heavily constrained. To produce implementations where cost is minimized without negatively impacting the accuracy of the results, a careful assignment of word-lengths is required. Finding the optimal combination of fixed-point word-lengths for a given system is a combinatorial NP-hard problem to which developers devote between 25 and 50% of the design-cycle time. Reconfigurable hardware platforms such as FPGAs also benefit from the advantages of fixed-point arithmetic, as it compensates for the slower clock frequencies and less efficient area utilization of these platforms with respect to ASICs. As FPGAs become commonly used for scientific computation, designs grow larger and more complex, up to the point where they cannot be handled efficiently by current signal and quantization-noise modelling and word-length optimization methodologies. This Ph.D. thesis explores different aspects of the quantization problem and presents new methodologies for each of them.

Techniques based on extensions of intervals have made it possible to obtain accurate models of signal and quantization-noise propagation in systems with non-linear operations. We take this approach a step further by introducing elements of Multi-Element Generalized Polynomial Chaos (ME-gPC) and combining them with a state-of-the-art statistical Modified Affine Arithmetic (MAA) methodology in order to model systems that contain control-flow structures. Our methodology generates the different execution paths automatically, determines the regions of the input domain that exercise each of them, and extracts the statistical moments of the system from these partial results. We use this technique to estimate both the dynamic range and the round-off noise in systems with such control-flow structures, and we show the accuracy of our approach, which in some case studies with non-linear operators deviates by only 0.04% from simulation-based reference values. A known drawback of techniques based on extensions of intervals is the combinatorial explosion of terms as the size of the targeted system grows, which leads to scalability problems. To address this issue we present a clustered noise-injection technique that groups the signals of the system, introduces the noise sources of each group independently, and then combines the results of all groups. In this way the number of noise sources active at any given time is kept under control and the combinatorial explosion is minimized. We also present a multi-way partitioning algorithm aimed at minimizing the deviation of the results caused by the loss of correlation between noise terms, in order to keep the results as accurate as possible.

The thesis also covers the development of word-length optimization methodologies based on Monte-Carlo simulations that run in reasonable times. We present two novel techniques that reduce the execution time from different angles. First, the interpolative method applies a simple but precise interpolator to estimate the sensitivity of each signal, which is then used to guide the optimization. Second, the incremental method builds on the fact that, although a given confidence level must be guaranteed for the final results of the search, more relaxed confidence levels, and therefore considerably fewer simulation runs, suffice in the initial stages of the search, when we are still far from the optimized solution. Through these two approaches we demonstrate that the execution time of classical greedy search algorithms can be reduced by factors of up to ×240 for small and medium-sized problems.

Finally, this book introduces HOPLITE, an automated, flexible and modular quantization framework that implements the previous techniques and is publicly available. Its aim is to offer developers and researchers a common ground for easily prototyping and verifying new methodologies for system modelling and word-length optimization. We describe its workflow, justify the design decisions taken, explain its public API and give a step-by-step demonstration of its operation. We also show, through a simple example, how new extensions can be connected to the existing interfaces in order to expand and improve the capabilities of HOPLITE.
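A minimal sketch (not the HOPLITE API nor the thesis methodology itself) of the basic Monte-Carlo ingredient on which such word-length optimization rests: estimating, for a toy datapath y = c1*x1 + c2*x2, the round-off noise introduced by a candidate fixed-point word-length assignment. The datapath, coefficients and word-lengths are illustrative assumptions.

    import math
    import random

    def quantize(value, frac_bits):
        """Round value to a fixed-point grid with frac_bits fractional bits."""
        scale = 1 << frac_bits
        return round(value * scale) / scale

    def roundoff_noise_power(frac_x1, frac_x2, frac_out, n_samples=50_000, seed=0):
        """Mean squared error between the double-precision and fixed-point
        versions of the toy datapath, estimated by Monte-Carlo simulation."""
        rng = random.Random(seed)
        c1, c2 = 0.75, -1.25                     # exactly representable coefficients
        acc = 0.0
        for _ in range(n_samples):
            x1 = rng.uniform(-1.0, 1.0)
            x2 = rng.uniform(-1.0, 1.0)
            ref = c1 * x1 + c2 * x2              # double-precision reference
            fx = quantize(c1 * quantize(x1, frac_x1) +
                          c2 * quantize(x2, frac_x2), frac_out)
            acc += (ref - fx) ** 2
        return acc / n_samples

    if __name__ == "__main__":
        # A greedy word-length search would call this estimator repeatedly,
        # trading fractional bits against the measured noise power.
        for bits in (6, 8, 10, 12):
            p = roundoff_noise_power(bits, bits, bits)
            print(f"{bits:2d} fractional bits -> noise power {p:.3e} "
                  f"({10 * math.log10(p):.1f} dB)")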

Relevance: 90.00%

Abstract:

Once audio networking technology has been introduced (data networks, current protocols, etc.), the design of the audio installation is carried out. Its starting point is the creative side of the activity to be hosted there: a game in which auditory communication is fundamental. The installation consists of a central room, three group rooms, three actor-booth rooms and eight passage rooms. This particular activity calls for special configurations, equipment and ways of working which, by means of audio-over-network technology and the auxiliary equipment attached to the network, allow the activity to be carried out optimally while meeting all of its goals, both technical and game-related.

The work is divided into two parts. The first part explains what data networks are and the basic concepts needed to understand them from a practical point of view: what Ethernet is, the components of a network, and so on. Once the network terminology has been covered, the protocols currently used to transmit professional audio are presented. The second part begins by presenting the activity to be carried out in the installation: a role-playing game. The existing signal flow is then described, and the material from the first part is put into practice in the design of an audiovisual installation based on audio networking. A system of these characteristics needs conventional audio equipment in addition to networked devices, and the very specific requirements of the installation made it necessary to devise special subsystems to support the activity for which it is intended. The goals of this project are to set out the points that a system integrator should take into account when designing an audio networking system for an audiovisual installation, and then to put this knowledge into practice with the design of an installation hosting a recreational and learning activity in which optimal real-time transmission of the audio signal is essential.
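A back-of-the-envelope figure that any such design has to start from is the network bandwidth taken by uncompressed audio channels; the sketch below assumes generic sample-rate, bit-depth and channel-count values rather than the figures of the actual installation, and ignores packet overhead.

    def audio_payload_mbps(channels, sample_rate_hz=48_000, bit_depth=24):
        """Raw audio payload in Mbit/s, excluding packet and protocol overhead."""
        return channels * sample_rate_hz * bit_depth / 1e6

    if __name__ == "__main__":
        for n in (8, 32, 64):
            print(f"{n:3d} channels -> {audio_payload_mbps(n):7.2f} Mbit/s of payload")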

Relevance: 90.00%

Abstract:

During the process of designing and developing an autonomous multi-UAV system, two main problems appear. The first is the difficulty of designing all the modules and behaviors of the aerial multi-robot system. The second is the difficulty of having an autonomous prototype of the system that allows developers to test the performance of each module even at an early stage of the project. These two problems motivate this paper. A multipurpose system architecture for autonomous multi-UAV platforms is presented. This versatile system architecture can be used by system designers as a template when developing their own systems. The proposed system architecture is general enough to be used in a wide range of applications, as demonstrated in the paper, and aims to be a reference for all designers. Additionally, to allow for the fast prototyping of autonomous multi-aerial systems, an open-source framework based on the previously defined system architecture is introduced. It gives developers a flight-proven multi-aerial system ready to use, so that they can test their algorithms even at an early stage of the project. The implementation of this framework, introduced in the paper under the name "CVG Quadrotor Swarm", which also has the advantages of being modular and compatible with different aerial platforms, can be found at https://github.com/Vision4UAV/cvg_quadrotor_swarm together with a consistent catalog of available modules. The good performance of the framework is demonstrated by choosing a basic instance of it and carrying out simulation and experimental tests, whose results are summarized and discussed in the paper.
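A minimal sketch of the kind of modular decomposition such an architecture encourages; the class and module names below are illustrative assumptions and are not the API of the CVG Quadrotor Swarm framework.

    from abc import ABC, abstractmethod

    class Module(ABC):
        """A swappable processing module with a uniform interface."""
        @abstractmethod
        def step(self, state: dict) -> dict:
            """Consume the shared state and return updates for it."""

    class WaypointPlanner(Module):
        def __init__(self, waypoints):
            self.waypoints = list(waypoints)
        def step(self, state):
            # Hand out the next commanded pose until the list is exhausted.
            return {"commanded_pose": self.waypoints.pop(0)} if self.waypoints else {}

    class DummyLocalizer(Module):
        def step(self, state):
            # Stand-in localizer: echo the commanded pose as the estimate, so
            # higher-level modules can be tested before real estimation exists.
            return {"pose_estimate": state.get("commanded_pose", (0.0, 0.0, 1.0))}

    def run_pipeline(modules, steps=3):
        state = {}
        for _ in range(steps):
            for module in modules:               # each module only sees the shared state
                state.update(module.step(state))
            print(state)

    if __name__ == "__main__":
        run_pipeline([WaypointPlanner([(0, 0, 1), (1, 0, 1), (1, 1, 1)]),
                      DummyLocalizer()])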

Relevance: 90.00%

Abstract:

We describe a novel plant transformation technique, termed “agrolistic,” that combines the advantages of the Agrobacterium transformation system with the high efficiency of biolistic DNA delivery. Agrolistic transformation allows integration of the gene of interest without undesired vector sequence. The virulence genes virD1 and virD2 from Agrobacterium tumefaciens that are required in bacteria for excision of T-strands from the tumor-inducing plasmid were placed under the control of the CaMV35S promoter and codelivered with a target plasmid containing border sequences flanking the gene of interest. Transient expression assays in tobacco and in maize cells indicated that vir gene products caused strand-specific nicking in planta at the right border sequence, similar to VirD1/VirD2-catalyzed T-strand excision observed in Agrobacterium. Agrolistically transformed tobacco calli were obtained after codelivery of virD1 and virD2 genes together with a selectable marker flanked by border sequences. Some inserts exhibited right junctions with plant DNA that corresponded precisely to the sequence expected for T-DNA (portion of the tumor-inducing plasmid that is transferred to plant cells) insertion events. We designate these as “agrolistic” inserts, as distinguished from “biolistic” inserts. Both types of inserts were found in some transformed lines. The frequency of agrolistic inserts was 20% that of biolistic inserts.

Relevance: 90.00%

Abstract:

We report a general method for screening, in solution, the impact of deviations from canonical Watson-Crick composition on the thermodynamic stability of nucleic acid duplexes. We demonstrate how fluorescence resonance energy transfer (FRET) can be used to detect directly free energy differences between an initially formed “reference” duplex (usually a Watson-Crick duplex) and a related “test” duplex containing a lesion/alteration of interest (e.g., a mismatch, a modified, a deleted, or a bulged base, etc.). In one application, one titrates into a solution containing a fluorescently labeled, FRET-active, reference duplex, an unlabeled, single-stranded nucleic acid (test strand), which may or may not compete successfully to form a new duplex. When a new duplex forms by strand displacement, it will not exhibit FRET. The resultant titration curve (normalized fluorescence intensity vs. logarithm of test strand concentration) yields a value for the difference in stability (free energy) between the newly formed, test strand-containing duplex and the initial reference duplex. The use of competitive equilibria in this assay allows the measurement of equilibrium association constants that far exceed the magnitudes accessible by conventional titrimetric techniques. Additionally, because of the sensitivity of fluorescence, the method requires several orders of magnitude less material than most other solution methods. We discuss the advantages of this method for detecting and characterizing any modification that alters duplex stability, including, but not limited to, mutagenic lesions. We underscore the wide range of accessible free energy values that can be defined by this method, the applicability of the method in probing for a myriad of nucleic acid variations, such as single nucleotide polymorphisms, and the potential of the method for high throughput screening.
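A minimal sketch of the final arithmetic step, assuming the standard relation ΔG = -RT ln K: once the competition titration yields the ratio of association constants of the test and reference duplexes, the stability difference follows directly. The numerical example is hypothetical.

    import math

    R = 1.987204e-3          # gas constant, kcal / (mol K)

    def delta_delta_g(k_assoc_test, k_assoc_ref, temperature_k=298.15):
        """Free-energy difference (kcal/mol) between test and reference duplexes.
        Negative values mean the test duplex is the more stable of the two."""
        return -R * temperature_k * math.log(k_assoc_test / k_assoc_ref)

    if __name__ == "__main__":
        # Hypothetical case: a mismatch-containing duplex whose association
        # constant is 30-fold lower than that of the Watson-Crick reference.
        print(round(delta_delta_g(1.0e7, 3.0e8), 2), "kcal/mol")   # about +2 kcal/mol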

Relevance: 90.00%

Abstract:

The worldwide threat of tuberculosis to human health emphasizes the need to develop novel approaches to a global epidemiological surveillance. The current standard for Mycobacterium tuberculosis typing based on IS6110 restriction fragment length polymorphism (RFLP) suffers from the difficulty of comparing data between independent laboratories. Here, we propose a high-resolution typing method based on variable number tandem repeats (VNTRs) of genetic elements named mycobacterial interspersed repetitive units (MIRUs) in 12 human minisatellite-like regions of the M. tuberculosis genome. MIRU-VNTR profiles of 72 different M. tuberculosis isolates were established by PCR analysis of all 12 loci. From 2 to 8 MIRU-VNTR alleles were identified in the 12 regions in these strains, which corresponds to a potential of over 16 million different combinations, yielding a resolution power close to that of IS6110-RFLP. All epidemiologically related isolates tested were perfectly clustered by MIRU-VNTR typing, indicating that the stability of these MIRU-VNTRs is adequate to track outbreak episodes. The correlation between genetic relationships inferred from MIRU-VNTR and IS6110-RFLP typing was highly significant. Compared with IS6110-RFLP, high-resolution MIRU-VNTR typing has the considerable advantages of being fast, appropriate for all M. tuberculosis isolates, including strains that have a few IS6110 copies, and permitting easy and rapid comparison of results from independent laboratories. This typing method opens the way to the construction of digital global databases for molecular epidemiology studies of M. tuberculosis.
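A back-of-the-envelope check of the combinatorial resolution quoted above: the theoretical number of distinct 12-locus MIRU-VNTR profiles is simply the product of the allele counts observed at each locus. The per-locus counts below are hypothetical values between 2 and 8, chosen only to show that the product lands in the tens of millions, the same order of magnitude as the figure cited.

    import math

    # Hypothetical allele counts at the 12 MIRU loci (each between 2 and 8).
    alleles_per_locus = [3, 3, 5, 3, 8, 4, 3, 4, 5, 4, 5, 4]

    combinations = math.prod(alleles_per_locus)   # product over the 12 loci
    print(f"{combinations:,} possible MIRU-VNTR profiles")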

Relevance: 90.00%

Abstract:

A librarian/clinician partnership was fostered in one hospital through the formation of the Evidence-based Practice Committee, with the ultimate goal of facilitating the transfer of evidence into practice. The paper will describe barriers to evidence-based practice and outline the committee's strategies for overcoming these barriers, including the development and promotion of a Web-based guide to evidence-based practice designed specifically for clinicians (health professionals). Educational strategies for use of the Web-based guide will also be addressed. The advantages of this partnership are that the skills of librarians in meeting the needs of clinicians are maximized, the evidence-based practice skills of clinicians are honed, and librarians make a valuable contribution to the knowledge base of the clinical staff. The knowledge acquired through the partnership by both clinicians and librarians will increase the sophistication of the dialogue between the two groups and, in turn, expedite the transfer of evidence into practice.

Relevance: 90.00%

Abstract:

Theoretical advantages of nonparametric logarithm of odds to map polygenic diseases are supported by tests of the beta model that depends on a single logistic parameter and is the only model under which paternal and maternal transmissions to sibs of specified phenotypes are independent. Although it does not precisely describe recurrence risks in monozygous twins, the beta model has greater power to detect family resemblance or linkage than the more general delta model which describes the probability of 0, 1, or 2 alleles identical by descent (ibd) with two parameters. Available data on ibd in sibs are consistent with the beta model, but not with the equally parsimonious but less powerful gamma model that assumes a fixed probability of 1/2 for 1 allele ibd. Additivity of loci on the liability scale is not disproven. A simple equivalence extends the beta model to multipoint analysis.

Relevance: 90.00%

Abstract:

A genetic approach has been established that combines the advantages of blastocyst complementation with the experimental attributes of the developing lens for the functional analysis of genes governing cellular proliferation, terminal differentiation, and apoptosis. This lens complementation system (LCS) makes use of a mutant mouse strain, aphakia (ak), homozygotes of which fail to develop an ocular lens. We demonstrate that microinjection of wild-type embryonic stem (ES) cells into ak/ak blastocysts produces chimeras with normal ES-cell-derived lenses and that microinjection of Rb-/- ES cells generates an aberrant lens phenotype identical to that obtained through conventional gene targeting methodology. Our determination that a cell autonomous defect underlies the aphakia condition assures that lenses generated through LCS are necessarily ES-cell-derived. LCS provides for the rapid phenotypic analysis of loss-of-function mutations, circumvents the need for germ-line transmission of null alleles, and, most significantly, facilitates the study of essential genes whose inactivation is associated with early lethal phenotypes.