822 results for Data transmission systems.


Relevance: 40.00%

Publisher:

Abstract:

This paper presents an Advanced Traveler Information System (ATIS) developed on the Android platform, which is open source and free. The application's main objective is to provide free Vehicle-to-Infrastructure (V2I) communication through the wireless network access points available in urban centers. In addition to sending the information required by an Intelligent Transportation System (ITS) to a central server, the application also receives traffic data for the area around the vehicle. Once this traffic information is obtained, the application displays it to the driver in a clear and efficient way, allowing the user to make real-time decisions about the route. The application was tested in a real environment and the results are presented in the article. In conclusion, we present the benefits of this application. © 2012 IEEE.

Abstract:

Modal analysis is widely used in the classic theory of power system modelling. The technique is also applied to model multiconductor transmission lines and their self and mutual electrical parameters. However, this methodology has some particularities and inaccuracies for specific applications which are not clearly described in the technical literature. This study provides a brief review of modal decoupling applied to transmission line digital models, and thereafter a novel, simplified computational routine is proposed to overcome the possible errors introduced by the modal decoupling in the simulation/modelling algorithm. © The Institution of Engineering and Technology 2013.
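For a perfectly transposed line the decoupling matrix is known in closed form. The sketch below is a minimal illustration of modal decoupling, not the routine proposed in the paper: it applies the Clarke transformation to a balanced 3x3 series-impedance matrix (values hypothetical) and recovers the diagonal modal impedances.

```python
import math

def matmul(A, B):
    # Plain nested-list matrix product (3x3 is all we need here)
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

# Hypothetical balanced (fully transposed) line: self impedance Zs,
# mutual impedance Zm, in ohm/km (real parts only, for illustration)
Zs, Zm = 0.35, 0.12
Z = [[Zs, Zm, Zm],
     [Zm, Zs, Zm],
     [Zm, Zm, Zs]]

# Clarke transformation matrix (orthogonal, so its inverse is its transpose)
a, b, c = 1 / math.sqrt(3), 1 / math.sqrt(6), 1 / math.sqrt(2)
T = [[a, 2 * b, 0.0],
     [a, -b, c],
     [a, -b, -c]]

# Modal-domain impedance: T^T Z T is diagonal for a balanced matrix,
# with ground mode Zs + 2*Zm and two identical aerial modes Zs - Zm
Zmode = matmul(transpose(T), matmul(Z, T))
```

The inaccuracies the paper targets arise precisely when the line is untransposed and a single real transformation such as this no longer diagonalizes the impedance matrix exactly.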

Abstract:

In soil surveys, several sampling systems can be used to define the most representative sites for sample collection and description of soil profiles. In recent years, the conditioned Latin hypercube sampling system has gained prominence for soil surveys. In Brazil, most soil maps are at small scales and in paper format, which hinders their refinement. The objectives of this work were: (i) to compare two sampling systems based on the conditioned Latin hypercube to map soil classes and soil properties; (ii) to retrieve information from a detailed-scale soil map of a pilot watershed for its refinement, comparing two data mining tools, and to validate the new soil map; and (iii) to create and validate a soil map of a much larger, similar area by extrapolating the information extracted from the existing soil map. Two sampling schemes were created, one by the conditioned Latin hypercube and one by the cost-constrained conditioned Latin hypercube. At each prospection site, soil classification and measurement of the A horizon thickness were performed. Maps were generated and validated for each sampling system, comparing the efficiency of the methods. The conditioned Latin hypercube captured greater variability of soils and properties than the cost-constrained variant, although it involved greater difficulty in the field work. Thus the conditioned Latin hypercube can capture greater soil variability, while the cost-constrained conditioned Latin hypercube presents great potential for use in soil surveys, especially in areas of difficult access. From an existing detailed-scale soil map of a pilot watershed, topographical information for each soil class was extracted from a Digital Elevation Model and its derivatives using two data mining tools. Maps were generated with each tool, and the more accurate tool was used to extrapolate soil information to a much larger, similar area; the generated map was then validated. It was possible to retrieve the existing soil map information and apply it to a larger area with similar soil-forming factors at much lower financial cost. The KnowledgeMiner data mining tool and ArcSIE, used to create the soil map, gave the better results and enabled the use of an existing soil map to extract soil information and apply it to similar, larger areas at reduced cost, which is especially important in developing countries with limited financial resources for such activities, such as Brazil.
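The conditioned Latin hypercube builds on ordinary Latin hypercube stratification, which guarantees exactly one sample per equal-probability stratum in every covariate. The sketch below shows only that underlying stratification idea, not the cLHS algorithm itself (which additionally conditions the sample on the empirical covariate distribution and, in the cost-constrained form, on an access-cost layer):

```python
import random

def latin_hypercube(n, d, seed=42):
    """Draw n points in the d-dimensional unit cube so that every
    dimension has exactly one point in each of its n equal strata."""
    rng = random.Random(seed)
    cols = []
    for _ in range(d):
        strata = list(range(n))
        rng.shuffle(strata)  # random stratum order per dimension
        cols.append([(s + rng.random()) / n for s in strata])
    return [list(point) for point in zip(*cols)]

# 10 candidate sampling sites over 2 scaled covariates,
# e.g. normalized elevation and slope (hypothetical setup)
sites = latin_hypercube(10, 2)
```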

Abstract:

Wireless sensor networks (WSNs) are generally used to monitor hazardous events in inaccessible areas. On the one hand, it is therefore preferable to use the minimum transmission power in order to extend the WSN lifetime as much as possible; on the other hand, it is crucial to guarantee that the transmitted data are correctly received by the other nodes. Trading off power optimization against reliability assurance has thus become one of the most important concerns when dealing with modern WSN-based systems. In this context, we present a transmission power self-optimization (TPSO) technique for WSNs. The TPSO technique consists of an algorithm able to guarantee connectivity as well as an equally high quality of service (QoS), concentrating on the WSN's efficiency (Ef), while optimizing the transmission power necessary for data communication. The main idea behind the proposed approach is thus to trade off the WSN's Ef against energy consumption in an environment with inherent noise. Experimental results with different types of noise and electromagnetic interference (EMI) demonstrate the effectiveness of the TPSO technique.
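The abstract does not list the TPSO algorithm, but the trade-off it describes can be sketched as a simple feedback loop: raise transmission power while the measured quality metric sits below the QoS target, and lower it when there is comfortable margin. All names, thresholds and the link model below are hypothetical illustrations, not the paper's algorithm:

```python
def tpso_step(power, prr, target=0.95, step=1, p_min=0, p_max=31):
    """One iteration of a hypothetical power-control loop: raise power
    while the packet reception rate (prr) is below the QoS target,
    lower it when the rate comfortably exceeds the target."""
    if prr < target:
        power += step
    elif prr > target + 0.03:
        power -= step
    return max(p_min, min(p_max, power))

# Simulated run: reception rate improves as power rises (toy link model)
power = 10
for _ in range(20):
    prr = min(1.0, 0.5 + power * 0.02)  # hypothetical channel response
    power = tpso_step(power, prr)
```

Under this toy model the loop settles at the lowest power level whose reception rate clears the target, which is the energy/reliability balance the TPSO technique aims for.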

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Abstract:

Objectives. The null hypothesis was that the mechanical testing systems used to determine polymerization stress (sigma(pol)) would rank a series of composites similarly. Methods. Two series of composites were tested in the following systems: a universal testing machine (UTM) using glass rods as the bonding substrate, a UTM with acrylic rods, a "low compliance device", and a single cantilever device ("Bioman"). One series comprised five experimental composites containing BisGMA:TEGDMA in equimolar concentrations and 60, 65, 70, 75 or 80 wt% filler. The other series comprised five commercial composites: Filtek Z250 (3M ESPE), Filtek A110 (3M ESPE), Tetric Ceram (Ivoclar), Heliomolar (Ivoclar) and Point 4 (Kerr). Specimen geometry, dimensions and curing conditions were similar in all systems. sigma(pol) was monitored for 10 min. Volumetric shrinkage (VS) was measured in a mercury dilatometer and elastic modulus (E) was determined by three-point bending. Shrinkage rate was used as a measure of reaction kinetics. An ANOVA/Tukey test was performed for each variable, separately for each series. Results. For the experimental composites, sigma(pol) decreased with filler content in all systems, following the variation in VS. For the commercial materials, sigma(pol) did not vary in the UTM/acrylic system and showed very few similarities in rankings across the other test systems. Also, no clear relationships were observed between sigma(pol) and VS or E. Significance. The testing systems showed good agreement for the experimental composites, but very few similarities for the commercial composites. Therefore, comparisons of polymerization stress results from different devices must be made carefully. © 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
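For reference, the elastic modulus of a rectangular bar in three-point bending follows from beam theory as E = FL^3 / (4bh^3 d), where F is the load, L the support span, b and h the specimen width and height, and d the mid-span deflection. The dimensions below are hypothetical, not those used in the study:

```python
def flexural_modulus(force_n, span_mm, width_mm, height_mm, deflection_mm):
    """Elastic modulus of a rectangular bar in three-point bending:
    E = F * L**3 / (4 * b * h**3 * d). With N and mm the result is MPa."""
    return (force_n * span_mm**3) / (4 * width_mm * height_mm**3 * deflection_mm)

# Hypothetical composite bar: 2 x 2 mm cross-section, 16 mm span,
# 20 N load producing 0.1 mm mid-span deflection
E_mpa = flexural_modulus(20, 16, 2, 2, 0.1)  # 12800 MPa = 12.8 GPa
```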

Abstract:

Current scientific applications produce large amounts of data. The processing, handling and analysis of such data require large-scale computing infrastructures such as clusters and grids. In this area, studies aim at improving the performance of data-intensive applications by optimizing data accesses. To achieve this goal, distributed storage systems have adopted techniques of data replication, migration, distribution, and access parallelism. However, the main drawback of those studies is that they do not take application behavior into account when performing data access optimization. This limitation motivated this paper, which applies strategies to support the online prediction of application behavior in order to optimize data access operations on distributed systems, without requiring any information on past executions. To accomplish this, the approach organizes application behaviors as time series and then analyzes and classifies those series according to their properties. Knowing these properties, the approach selects modeling techniques to represent the series and perform predictions, which are later used to optimize data access operations. This new approach was implemented and evaluated using the OptorSim simulator, sponsored by the LHC-CERN project and widely employed by the scientific community. Experiments confirm the new approach reduces application execution time by about 50 percent, especially when handling large amounts of data.
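As a toy illustration of the idea of classifying a behavior series by its properties and then picking a matching model, the sketch below chooses between linear extrapolation (a trending series, e.g. sequentially growing read offsets) and a recent mean (a stationary series). It is a hypothetical stand-in, not the paper's classifier or models:

```python
def predict_next(series):
    """Behaviour-aware prediction sketch: classify the series by a simple
    property (strictly monotonic trend or not) and pick a matching model:
    linear extrapolation for trending access patterns, a short moving
    average for stationary ones."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    if all(d > 0 for d in diffs) or all(d < 0 for d in diffs):
        slope = sum(diffs) / len(diffs)  # average step of the trend
        return series[-1] + slope
    window = series[-4:]                 # stationary: recent mean
    return sum(window) / len(window)

sequential = predict_next([10, 20, 30, 40])     # trending read offsets
alternating = predict_next([5, 7, 5, 7, 5, 7])  # oscillating access pattern
```

A prediction of the next access offset or size can then drive prefetching or data placement before the request actually arrives.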

Abstract:

A new method for analysis of scattering data from lamellar bilayer systems is presented. The method employs a form-free description of the cross-section structure of the bilayer and the fit is performed directly to the scattering data, introducing also a structure factor when required. The cross-section structure (electron density profile in the case of X-ray scattering) is described by a set of Gaussian functions and the technique is termed Gaussian deconvolution. The coefficients of the Gaussians are optimized using a constrained least-squares routine that induces smoothness of the electron density profile. The optimization is coupled with the point-of-inflection method for determining the optimal weight of the smoothness. With the new approach, it is possible to optimize simultaneously the form factor, structure factor and several other parameters in the model. The applicability of this method is demonstrated by using it in a study of a multilamellar system composed of lecithin bilayers, where the form factor and structure factor are obtained simultaneously, and the obtained results provided new insight into this very well known system.
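One convenience of the Gaussian description is that the bilayer form factor follows analytically: a Gaussian of amplitude A, centre x and width s in a symmetric profile contributes A * s * sqrt(2*pi) * exp(-(s*q)**2/2) * cos(q*x). The sketch below evaluates this sum for a hypothetical lecithin-like profile; the parameters are illustrative, not fitted values from the study:

```python
import math

def form_factor(q, gaussians):
    """Form factor of a symmetric bilayer profile built from Gaussians
    (amplitude A, centre x, width s): each term contributes analytically
    A * s * sqrt(2*pi) * exp(-(s*q)**2 / 2) * cos(q * x)."""
    return sum(A * s * math.sqrt(2 * math.pi)
               * math.exp(-(s * q) ** 2 / 2) * math.cos(q * x)
               for A, x, s in gaussians)

# Hypothetical lecithin-like profile (electron density relative to water):
# headgroup peaks at +/-19 A and a terminal-methyl trough at the centre
profile = [(0.30, 19.0, 3.0),
           (-0.16, 0.0, 4.0)]
F0 = form_factor(0.0, profile)  # q -> 0 limit
```

In the actual method the Gaussian coefficients are the free parameters of a smoothness-constrained least-squares fit to the measured intensity, with the structure factor refined simultaneously.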

Abstract:

This work makes a theoretical–experimental contribution to the study of ester and alkane solutions. Experimental data on isobaric vapor–liquid equilibria (VLE) at 101.3 kPa are presented for binary systems of methyl ethanoate with six alkanes (from C5 to C10), together with excess volumes and mixing enthalpies, vE and hE.

Abstract:

Salmonella and Campylobacter are common causes of human gastroenteritis. Their epidemiology is complex, and a multi-tiered approach to control is needed, taking into account the different reservoirs, pathways and risk factors. In this thesis, trends in human gastroenteritis and food-borne outbreak notifications in Italy were explored. Moreover, the improved sensitivity of two recently implemented regional surveillance systems, in Lombardy and Piedmont, was demonstrated, providing a basis for improving notification at the national level. Trends in human Salmonella serovars were explored: serovars Enteritidis and Infantis decreased, Typhimurium remained stable, and 4,[5],12:i:-, Derby and Napoli increased, suggesting that sources of infection have changed over time. Attribution analysis identified pigs as the main source of human salmonellosis in Italy, accounting for 43–60% of infections, followed by Gallus gallus (18–34%). Attributions to pigs and Gallus gallus showed increasing and decreasing trends, respectively. Potential bias and sampling issues related to the use of non-local/non-recent multilocus sequence typing (MLST) data in Campylobacter jejuni/coli source attribution using the Asymmetric Island (AI) model were investigated. As MLST data become increasingly dissimilar with increasing geographical/temporal distance, attributions to sources not sampled close to human cases can be underestimated. A combined case-control and source attribution analysis was developed to investigate risk factors for human Campylobacter jejuni/coli infection of chicken, ruminant, environmental, pet and exotic origin in The Netherlands. Most infections (~87%) were attributed to chicken and cattle.
Individuals infected from different reservoirs had different associated risk factors: chicken consumption increased the risk for chicken-attributed infections; animal contact, barbecuing, tripe consumption, and never/seldom chicken consumption increased that for ruminant-attributed infections; game consumption and attending swimming pools increased that for environment-attributed infections; and dog ownership increased that for environment- and pet-attributed infections. Person-to-person contacts around holiday periods were risk factors for infections with exotic strains, putatively introduced by returning travellers.

Abstract:

User interfaces are key properties of Business-to-Consumer (B2C) systems, and Web-based reservation systems are an important class of B2C systems. In this paper we show that these systems use a surprisingly broad spectrum of different approaches to handling temporal data in their Web interfaces. Based on these observations and on a literature analysis we develop a Morphological Box to present the main options for handling temporal data and give examples. The results indicate that the present state of developing and maintaining B2C systems has not been much influenced by modern Web Engineering concepts and that there is considerable potential for improvement.

Abstract:

Chlamydia trachomatis is the most common bacterial sexually transmitted infection (STI) in many developed countries. The highest prevalence rates are found among young adults who have frequent partner change rates. Three published individual-based models have incorporated a detailed description of age-specific sexual behaviour in order to quantify the transmission of C. trachomatis in the population and to assess the impact of screening interventions. Owing to varying assumptions about sexual partnership formation and dissolution and the great uncertainty about critical parameters, such models show conflicting results about the impact of preventive interventions. Here, we perform a detailed evaluation of these models by comparing the partnership formation and dissolution dynamics with data from Natsal 2000, a population-based probability sample survey of sexual attitudes and lifestyles in Britain. The data also allow us to describe the dispersion of C. trachomatis infections as a function of sexual behaviour, using the Gini coefficient. We suggest that the Gini coefficient is a useful measure for calibrating infectious disease models that include risk structure and highlight the need to estimate this measure for other STIs.
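The Gini coefficient itself is straightforward to compute from the distribution of infections across behavior groups; a minimal sketch with illustrative (hypothetical) case counts:

```python
def gini(values):
    """Gini coefficient of a non-negative sample: 0 when infections are
    evenly spread over behaviour groups, approaching 1 when concentrated
    in very few groups (sorted-rank identity for the mean difference)."""
    xs = sorted(values)
    n, total = len(xs), sum(xs)
    if total == 0:
        return 0.0
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n

# Illustrative case counts per sexual-behaviour group (hypothetical data)
g_even = gini([10, 10, 10, 10])  # uniform spread
g_skew = gini([0, 0, 2, 38])     # concentrated in high-activity groups
```

Comparing such a coefficient between a model's simulated infections and the survey data gives a single calibration target for how strongly infection risk is concentrated in high-activity groups.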