916 results for clouds


Relevance: 20.00%

Abstract:

This paper proposes a new methodology for implementing cost-effective architectures on Cloud Computing systems. Using this methodology, the paper presents some disadvantages of systems based on single-Cloud architectures and offers advice to take into account when developing hybrid systems. The work also includes a validation of these ideas, implemented in a complete videoconference service developed by our research group. This service allows a large number of users per conference and multiple simultaneous conferences, supports different client software (requiring transcoding of audio and video flows), and provides services such as automatic recording. Furthermore, it offers different kinds of connectivity, including SIP clients and a client based on Web 2.0. The ideas proposed in this article are intended to be a useful resource for any researcher or developer who wants to implement cost-effective systems on several Clouds.

Relevance: 20.00%

Abstract:

Real-time tritium concentrations in air released from an ITER-like reactor source were obtained by coupling the European Centre for Medium-Range Weather Forecasts (ECMWF) numerical model with the Lagrangian atmospheric dispersion model FLEXPART. This ECMWF/FLEXPART tool was analysed under normal operating conditions in the Western Mediterranean Basin over 45 days in summer 2010. Compared with NORMTRI plumes over the Western Mediterranean Basin, the real-time results showed an overestimation of the corresponding climatological sequence of tritium concentrations in air at several distances from the reactor. For this purpose, two cloud development patterns were established: the first followed a cyclonic circulation over the Mediterranean Sea, and the second was based on the cloud delivered over the interior of the Iberian Peninsula by another stabilized circulation corresponding to a high-pressure system. One of the important remaining activities defined at that point was the qualification of the tool. The aim of this paper is to present the ECMWF/FLEXPART products confronted with tritium-concentration-in-air data. For this purpose, a database to develop and validate ECMWF/FLEXPART tritium in both assessments has been selected from a NORMTRI run. Similarities and differences, underestimation and overestimation with respect to NORMTRI, will allow for the refinement of some features of ECMWF/FLEXPART.

Relevance: 20.00%

Abstract:

Cloud computing is one of the most relevant computing paradigms available nowadays. Its adoption has increased in recent years due to the large investment and research from business enterprises and academic institutions. Among all the services cloud providers usually offer, Infrastructure as a Service (IaaS) has gained momentum for solving HPC problems in a more dynamic way, without the need for expensive investments. The integration of a large number of providers is a major goal, as it enables improving the quality of the selected resources in terms of pricing, speed, redundancy, etc. In this paper, we propose a system architecture, based on semantic solutions, to build an interoperable scheduler for federated clouds that works with several IaaS providers in a uniform way. Based on this architecture, we implement a proof-of-concept prototype and test it with two different cloud solutions to provide some experimental results on the viability of our approach.

Relevance: 20.00%

Abstract:

The paper considers short-term releases of tritium (mainly but not only tritium hydride, HT) to the atmosphere from a potential ITER-like fusion reactor located in the Mediterranean Basin and explores whether the short-range legal exposure limits are exceeded, both locally and downwind. For this, a coupled Lagrangian ECMWF/FLEXPART model has been used to follow real-time releases of tritium. This tool was analyzed for nominal tritium operational conditions under selected incidental conditions, together with hourly observations of wind, to determine the resulting local and Western Mediterranean effects and to provide a short-range approximation of tritium cloud behavior. Since our results cannot be compared with radiological station measurements of tritium in air, we use the NORMTRI Gaussian model. We demonstrate an overestimation of the sequence of tritium concentrations in the atmosphere close to the reactor estimated with this model, when compared with ECMWF/FLEXPART results. A Gaussian “mesoscale” qualification tool has been used to validate ECMWF/FLEXPART for winter 2010/spring 2011 against a database of HT plumes. It is considered that NORMTRI allows the evaluation of tritium-in-air plume patterns and their contribution to doses.

Relevance: 20.00%

Abstract:

This paper tackles the optimization of applications in multi-provider hybrid cloud scenarios from an economic point of view. In these scenarios, the great majority of solutions offer automatic allocation of resources across different cloud providers based on their current prices. Our approach, however, introduces a novel solution by making maximum use of the divide-and-conquer principle. This paper describes a methodology for creating cost-aware cloud applications that can be broken down into the three most important components of cloud infrastructures: computation, network and storage. A real videoconference system has been modified in order to evaluate this idea with both theoretical and empirical experiments. This system has become a widely used tool in several national and European projects for e-learning and collaboration purposes.
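The decomposition idea above can be sketched in a few lines: price each of the three components (computation, network, storage) under every provider, then place each component wherever it is cheapest. The provider names and prices below are hypothetical placeholders, not figures from the paper.

```python
# Hypothetical price sheets (USD); illustrative numbers only.
PROVIDERS = {
    "cloud_a": {"compute_hr": 0.10, "gb_transfer": 0.09, "gb_month": 0.023},
    "cloud_b": {"compute_hr": 0.12, "gb_transfer": 0.05, "gb_month": 0.020},
}

def component_costs(usage, prices):
    """Cost of each application component under one provider's price sheet."""
    return {
        "computation": usage["cpu_hours"] * prices["compute_hr"],
        "network": usage["gb_out"] * prices["gb_transfer"],
        "storage": usage["gb_stored"] * prices["gb_month"],
    }

def cheapest_split(usage):
    """Divide and conquer: assign each component to its cheapest provider."""
    plan = {}
    for comp in ("computation", "network", "storage"):
        best = min(PROVIDERS,
                   key=lambda p: component_costs(usage, PROVIDERS[p])[comp])
        plan[comp] = (best, component_costs(usage, PROVIDERS[best])[comp])
    return plan
```

With these example prices a workload can end up split across providers, which is exactly the situation single-provider allocation misses.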

Relevance: 20.00%

Abstract:

Cloud computing and, more particularly, private IaaS, is seen as a mature technology with a myriad of solutions to choose from. However, this disparity of solutions and products has instilled in potential adopters the fear of vendor and data lock-in, and several competing and incompatible interfaces and management styles have increased these fears even more. On top of this, cloud users might want to work with several solutions at the same time, an integration that is difficult to achieve in practice. In this Master's Thesis I propose a management architecture that addresses these problems: it provides a generalized control mechanism for several cloud infrastructures, and an interface that can meet the requirements of the users. This management architecture is designed in a modular way, using a generic information model. I have validated the approach through the implementation of the components needed for this architecture to support a sample private IaaS solution: OpenStack.
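The modular, backend-agnostic design described in the abstract can be illustrated with a driver pattern: one abstract interface, one concrete implementation per IaaS backend. The class and method names below are hypothetical stand-ins, not the thesis's actual interface; a real OpenStack driver would sit behind the same contract.

```python
from abc import ABC, abstractmethod

class CloudDriver(ABC):
    """Hypothetical generic control interface: every IaaS backend
    (e.g. an OpenStack driver) implements the same operations."""

    @abstractmethod
    def start_instance(self, image_id: str, flavor: str) -> int: ...

    @abstractmethod
    def stop_instance(self, instance_id: int) -> None: ...

    @abstractmethod
    def list_instances(self) -> list: ...

class InMemoryDriver(CloudDriver):
    """Toy backend used only to exercise the interface."""

    def __init__(self):
        self._instances = {}
        self._next_id = 0

    def start_instance(self, image_id, flavor):
        self._next_id += 1
        self._instances[self._next_id] = (image_id, flavor)
        return self._next_id

    def stop_instance(self, instance_id):
        self._instances.pop(instance_id)

    def list_instances(self):
        return sorted(self._instances)
```

Code written against `CloudDriver` never touches a provider-specific API directly, which is what keeps the architecture free of vendor lock-in.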

Relevance: 20.00%

Abstract:

Vector reconstruction of objects from an unstructured point cloud obtained with a LiDAR-based system (light detection and ranging) is one of the most promising methods to build three-dimensional models of orchards. The cylinder-fitting method for woody structure reconstruction of leafless trees from point clouds obtained with a mobile terrestrial laser scanner (MTLS) has been analysed. The advantage of this method is that it performs the reconstruction in a single step. The most time-consuming part of the algorithm is the generation of the cylinder direction, which must be recalculated each time a point is included in the cylinder. The tree skeleton is obtained at the same time as the cluster of cylinders is formed. The method does not guarantee a unique convergence, and the reconstruction parameter values must be chosen carefully. A balanced processing of clusters has also been defined, which has proven very efficient in terms of processing time by following the hierarchy of branches, predecessors and successors. The algorithm was applied to simulated MTLS data of virtual orchard models and to MTLS data of real orchards. The constraints applied in the method have been reviewed to ensure better convergence and simpler use of parameters. The results obtained show a correct reconstruction of the woody structure of the trees, and the algorithm runs in log-linear time.

Relevance: 20.00%

Abstract:

3D crop reconstruction with a high temporal resolution and the use of non-destructive measuring technologies can support the automation of plant phenotyping processes. The availability of such 3D data can give valuable information about plant development and the interaction of the plant genotype with the environment. This article presents a new methodology for georeferenced 3D reconstruction of maize plant structure. For this purpose, a total station, an IMU, and several 2D LiDARs with different orientations were mounted on an autonomous vehicle. The multistep methodology presented, based on the application of the ICP (Iterative Closest Point) algorithm for point cloud fusion, made it possible to overlap the georeferenced point clouds. The overlapping point cloud algorithm showed that the aerial points (corresponding mainly to plant parts) were reduced to 1.5%–9% of the total registered data; the remainder were redundant or ground points. Through the inclusion of different LiDAR points of view of the scene, a more realistic representation of the surroundings is obtained through the incorporation of new useful information, but also of noise. Georeferenced 3D maize plant reconstruction at different growth stages, combined with the accuracy of the total station, could be highly useful for precision agriculture at the level of the individual crop plant.
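The fusion step above relies on ICP. As a minimal sketch of the generic technique (not the authors' multistep pipeline), the snippet below alternates brute-force nearest-neighbour matching with the Kabsch/SVD solution for the best rigid transform:

```python
import numpy as np

def icp(src, dst, iters=20):
    """Minimal point-to-point ICP: match each source point to its nearest
    destination point, then solve the best rigid transform (Kabsch/SVD)."""
    src = src.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iters):
        # brute-force nearest neighbour in dst for every src point
        d = np.linalg.norm(src[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[d.argmin(axis=1)]
        # Kabsch: optimal rotation between the centred point sets
        mu_s, mu_m = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_m)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:      # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_m - R @ mu_s
        src = src @ R.T + t
        # accumulate the total transform p -> R p + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

This converges only from a good initial alignment, which is why georeferencing the clouds first (as the article does) matters.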

Relevance: 20.00%

Abstract:

The consolidation of large Distributed Computing infrastructures has resulted in a High-Throughput Computing platform that is ready for heavy workloads, the best exponents of this process being the current grid federations. Cloud Computing, on the other hand, promises to be more flexible, usable, available and simple than Grid Computing, while also covering many more computational needs than those required to carry out distributed calculations. In any case, due to the dynamism and heterogeneity present in grids and clouds, finding the ideal assignment of computational tasks to the available resources is, by definition, an NP-complete problem, and only suboptimal solutions can be found for these environments. However, the characterization of the resources in both types of infrastructure is deficient: the available information systems do not provide reliable data on the state of the resources, which prevents the advanced planning required by the different types of distributed applications. Over the last decade this issue has not been resolved for Grid Computing, and the recently established cloud infrastructures present the same problem. In this setting, schedulers (brokers) can only improve the throughput of long executions, but do not provide any estimate of their duration. Complex planning has traditionally been addressed by other tools such as workflow managers, self-schedulers, or the production management systems belonging to certain research communities. However, the low performance obtained with these early-binding assignment mechanisms is notorious.
Moreover, the diversity of cloud providers and the lack of support from planning tools and of standardized programming interfaces for distributing the workload hinder the massive portability of legacy applications to cloud environments...

Relevance: 20.00%

Abstract:

A new method for fitting a series of Zernike polynomials to point clouds defined over connected domains of arbitrary shape within the unit circle is presented in this work. The method is based on the application of machine-learning fitting techniques, constructing an extended training set in order to ensure the smooth variation of local curvature over the whole domain. This technique is therefore best suited for fitting points belonging to ophthalmic lens surfaces, particularly progressive power ones, over non-regular domains. We have tested our method by fitting numerical and real surfaces, reaching an accuracy of 1 micron in elevation and 0.1 D in local curvature, in agreement with the customary tolerances of the ophthalmic manufacturing industry.
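The underlying fitting problem can be illustrated with a plain least-squares baseline over scattered points on an arbitrary sub-domain of the unit disk (the paper's actual method augments this with a machine-learning fit and an extended training set, which is not reproduced here). The polynomial list and ordering below are illustrative and unnormalised:

```python
import numpy as np

def zernike_basis(x, y):
    """First few (unnormalised) Zernike polynomials on the unit disk:
    piston, tilts, defocus and the two astigmatism terms."""
    r2 = x**2 + y**2
    return np.column_stack([
        np.ones_like(x),   # piston
        x,                 # x tilt
        y,                 # y tilt
        2 * r2 - 1,        # defocus
        x**2 - y**2,       # astigmatism 0 deg
        2 * x * y,         # astigmatism 45 deg
    ])

def fit_surface(x, y, z):
    """Least-squares Zernike coefficients for scattered (x, y, z) samples;
    works on any connected sub-domain of the unit circle."""
    A = zernike_basis(x, y)
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs
```

Because the sample points need not cover the full disk, the basis loses its orthogonality on non-regular domains; least squares still applies, but conditioning worsens as the domain shrinks, which motivates the paper's more careful approach.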

Relevance: 20.00%

Abstract:

A cold atomic cloud is a versatile object, because it offers many handles to control and tune its properties. This facilitates studies of its behavior under various circumstances, such as sample temperature, size and density, composition, dimensionality and coherence time; the range of possible experiments is constrained by the specifications of the atomic species used. This thesis presents the work done on the experiment for laser cooling of strontium atoms, focusing on its stability, which should provide cold and ultracold samples for the study of collective effects in light scattering. Starting from the initial apparatus, numerous changes were made. The vacuum system was improved and now reaches a lower ultra-high vacuum, thanks to the pre-baking of its parts and the addition of a titanium-sublimation stage. The quadrupole trap was improved by the design and construction of a new pair of coils. The stability of the blue, green and red laser systems was improved and losses of laser light were prevented, giving rise to a robust apparatus. Another important point is the development of homemade devices, both to reduce costs and to monitor different parts of a cold-atom experiment. With these homemade devices, we could demonstrate a dramatic linewidth narrowing by injection locking of a low-cost 461 nm diode laser and its application to our strontium experiment. In the end, this improved experimental apparatus made possible the study of a new scattering effect: mirror-assisted coherent back-scattering (mCBS).

Relevance: 20.00%

Abstract:

Humans' desire for knowledge regarding animal species and their interactions with the natural world has spurred centuries of studies. The relatively new development of remote sensing systems using satellite or aircraft-borne sensors has opened up a wide field of research, which unfortunately remains largely dependent on coarse-scale image spatial resolution, particularly for habitat modeling. For habitat-specialized species, such data may not be sufficient to successfully capture the nuances of their preferred areas. Of particular concern are those species for which topographic feature attributes are a main limiting factor for habitat use: coarse spatial resolution data can smooth over details that may be essential for habitat characterization. Three studies focusing on sea turtle nesting beaches were completed to serve as an example of how topography can be a main deciding factor for certain species. Light Detection and Ranging (LiDAR) data were used to illustrate that fine-spatial-scale data can provide information not readily captured by either field work or coarser-scale sources. The variables extracted from the LiDAR data could successfully model nesting density for loggerhead (Caretta caretta), green (Chelonia mydas), and leatherback (Dermochelys coriacea) sea turtle species using morphological beach characteristics, highlight beach changes over time and their correlations with nesting success, and provide comparisons for nesting density models across large geographic areas. Comparisons between the LiDAR dataset and other digital elevation models (DEMs) confirmed that fine-spatial-scale data sources provide more similar habitat information than those with coarser spatial scales. Although these studies focused solely on sea turtles, the underlying principles are applicable to many other wildlife species whose range and behavior may be influenced by topographic features.

Relevance: 20.00%

Abstract:

Aims. We study the optical and near-infrared colour excesses produced by circumstellar emission in a sample of Be/X-ray binaries. Our main goals are exploring whether previously published relations, valid for isolated Be stars, are applicable to Be/X-ray binaries and computing the distance to these systems after correcting for the effects of the circumstellar contamination. Methods. Simultaneous UBVRI photometry and spectra in the 3500−7000 Å spectral range were obtained for 11 optical counterparts to Be/X-ray binaries in the LMC, 5 in the SMC and 12 in the Milky Way. As a measure of the amount of circumstellar emission we used the Hα equivalent width corrected for photospheric absorption. Results. We find a linear relationship between the strength of the Hα emission line and the component of E(B − V) originating from the circumstellar disk. This relationship is valid for stars with emission lines weaker than EW ≈ −15 Å. Beyond this point, the circumstellar contribution to E(B − V) saturates at a value of ≈0.17 mag. A similar relationship is found for the (V − I) near-infrared colour excess, albeit with a steeper slope and saturation level. The circumstellar excess in (B − V) is found to be about five times higher for Be/X-ray binaries than for isolated Be stars with the same equivalent width EW(Hα), implying significant differences in the physical properties of their circumstellar envelopes. The distance to Be/X-ray binaries (with non-shell Be star companions) can only be correctly estimated by taking into account the excess emission in the V band produced by free-free and free-bound transitions in the circumstellar envelope. We provide a simple method to determine the distances that includes this effect.
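As an illustrative sketch only (the paper's actual prescription may differ), the distance estimate implied by the last sentences can be written as a corrected distance modulus, where ΔV_cs is the circumstellar excess emission in the V band and E(B − V)_cs is the circumstellar part of the colour excess (which the abstract finds saturates near 0.17 mag):

```latex
% Illustrative form only: \Delta V_{\mathrm{cs}} and E(B-V)_{\mathrm{cs}}
% stand for the circumstellar corrections discussed in the abstract;
% R_V \approx 3.1 for a standard interstellar extinction law.
5\log_{10} d - 5
  = \bigl(V - \Delta V_{\mathrm{cs}}\bigr) - M_V
    - R_V\,\bigl[E(B-V) - E(B-V)_{\mathrm{cs}}\bigr]
```

Ignoring ΔV_cs makes the star look brighter than its photosphere really is, biasing d low, which is the effect the authors' method corrects for.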

Relevance: 20.00%

Abstract:

Rock mass characterization requires a deep geometric understanding of the discontinuity sets affecting rock exposures. Recent advances in Light Detection and Ranging (LiDAR) instrumentation currently allow quick and accurate 3D data acquisition, leading to the development of new methodologies for the automatic characterization of rock mass discontinuities. This paper presents a methodology for the identification and analysis of flat surfaces outcropping in a rocky slope using the 3D data obtained with LiDAR. This method identifies and defines the algebraic equations of the different planes of the rock slope surface by applying a coplanarity test to neighbouring points, finding principal orientations by Kernel Density Estimation, and identifying clusters with the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm. Different sources of information — synthetic and 3D scanned data — were employed, and a complete sensitivity analysis of the parameters was performed in order to identify the optimal values of the variables of the proposed method. In addition, the raw source files and the obtained results are freely provided, to allow a more straightforward comparison of methods and more reproducible research.
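The coplanarity test on neighbouring points can be sketched as a PCA of the local covariance: the eigenvector of least variance gives the plane normal, and the relative size of the smallest eigenvalue scores how flat the neighbourhood is. This is a generic sketch; the neighbourhood selection and thresholds used by the authors are not reproduced here.

```python
import numpy as np

def fit_plane(points):
    """Best-fit plane through a point neighbourhood via PCA.
    Returns the unit normal and a coplanarity score (~0 for a
    perfect plane, larger for non-planar neighbourhoods)."""
    centred = points - points.mean(axis=0)
    cov = centred.T @ centred / len(points)
    evals, evecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    normal = evecs[:, 0]                  # direction of least variance
    coplanarity = evals[0] / evals.sum()
    return normal, coplanarity
```

In a full pipeline, each normal that passes the coplanarity threshold would feed the Kernel Density Estimation step to find principal orientations, with DBSCAN then grouping points into individual planes.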

Relevance: 20.00%

Abstract:

Aims. In this study we conduct a pilot program aimed at the red supergiant population of the Magellanic Clouds. We intend to extend the currently known sample to the unexplored low end of the brightness distribution of these stars, building a more representative dataset from which to extrapolate their behaviour to other Galactic and extragalactic environments. Methods. We select candidates using only near-infrared photometry and, with medium-resolution multi-object spectroscopy, perform spectral classification and derive their line-of-sight velocities, confirming the nature of the candidates and their membership in the Clouds. Results. Around two hundred new red supergiants have been detected, hinting at a large population yet to be observed. Using near- and mid-infrared photometry, we study the brightness distribution of these stars, the onset of mass loss, and the effect of dust in their atmospheres. Based on this sample, new a priori classification criteria are investigated, combining mid- and near-infrared photometry to improve the observational efficiency of similar programs.