982 results for cloud point


Relevance: 40.00%

Abstract:

The use of 3D data in mobile robotics applications provides valuable information about the robot's environment, but the sheer volume of 3D information usually exceeds the robot's storage and computing capabilities. Data compression is therefore necessary to store and manage this information while preserving as much of it as possible. In this paper, we propose a 3D lossy compression system based on plane extraction, which represents the points of each scene plane as a Delaunay triangulation plus a set of point/area information. The compression system can be customized to achieve different compression or accuracy ratios. It also supports a color segmentation stage that preserves the original scene color information and provides a realistic scene reconstruction. The design of the method provides fast scene reconstruction, useful for further visualization or processing tasks.
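
The core idea of representing a detected plane by its parameters plus boundary information, rather than raw points, can be sketched as below. This is a simplified illustration, not the paper's system: it fits a plane by SVD and keeps only a convex-hull boundary (standing in for the Delaunay triangulation), and all names are hypothetical.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit via SVD: returns the centroid, two in-plane
    axes, and the plane normal (the smallest right-singular vector)."""
    c = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - c)
    return c, vt[0], vt[1], vt[2]

def convex_hull(pts):
    """Andrew's monotone-chain convex hull of 2D points."""
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    hull = []
    for seq in (pts, reversed(pts)):
        chain = []
        for p in seq:
            while len(chain) >= 2 and cross(chain[-2], chain[-1], p) <= 0:
                chain.pop()
            chain.append(p)
        hull += chain[:-1]          # drop the endpoint shared with the other chain
    return hull

def compress_planar_patch(points):
    """Store a planar patch as plane parameters plus a boundary polygon
    instead of its raw points."""
    c, u, v, n = fit_plane(points)
    coords = [(float((p - c) @ u), float((p - c) @ v)) for p in points]
    return {"centroid": c, "normal": n, "boundary": convex_hull(coords)}

# 100 coplanar points (a 10x10 grid on z = 0) shrink to a few boundary vertices.
grid = np.array([[x, y, 0.0] for x in range(10) for y in range(10)])
patch = compress_planar_patch(grid)
```

Storing the plane parameters and boundary instead of every point is what makes the scheme lossy but tunable: keeping more interior samples raises accuracy at the cost of the compression ratio.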

Relevance: 30.00%

Abstract:

A composite SaaS (Software as a Service) is software comprising several software components and data components. The composite SaaS placement problem is to determine where each of these components should be deployed in a cloud computing environment such that the performance of the composite SaaS is optimal. From a computational point of view, the composite SaaS placement problem is a large-scale combinatorial optimization problem, and an Iterative Cooperative Co-evolutionary Genetic Algorithm (ICCGA) was previously proposed for it. The ICCGA finds solutions of reasonable quality, but its computation time is noticeably slow. Aiming to improve the computation time, we propose an unsynchronized Parallel Cooperative Co-evolutionary Genetic Algorithm (PCCGA) in this paper. Experimental results show that the PCCGA is not only faster but also generates better-quality solutions than the ICCGA.
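
A sequential sketch of the cooperative co-evolutionary idea (the paper's contribution is an unsynchronized parallel version) is given below. The cost function, population sizes, and operators are hypothetical toys: each software component is simply assumed to prefer the server hosting its matching data component, and each subpopulation is scored by pairing its members with the best individual of the other.

```python
import random

random.seed(42)
N_SERVERS, N_SOFT, N_DATA = 4, 3, 3

def cost(soft, data):
    # Toy cost: software component i is cheap only when co-located
    # with data component i (a stand-in for network traffic cost).
    return sum(soft[i] != data[i] for i in range(N_SOFT))

def random_placement(n):
    return [random.randrange(N_SERVERS) for _ in range(n)]

def mutate(ind):
    child = ind[:]
    child[random.randrange(len(child))] = random.randrange(N_SERVERS)
    return child

def step(pop, partner_best, score):
    # Rank against the best individual of the cooperating population,
    # keep the better half, refill with mutants of the survivors.
    ranked = sorted(pop, key=lambda ind: score(ind, partner_best))
    elite = ranked[: len(pop) // 2]
    return elite + [mutate(random.choice(elite)) for _ in elite]

soft_pop = [random_placement(N_SOFT) for _ in range(10)]
data_pop = [random_placement(N_DATA) for _ in range(10)]
initial_best = min(cost(s, d) for s in soft_pop for d in data_pop)

best_cost = initial_best
for _ in range(50):
    best_data = min(data_pop, key=lambda d: min(cost(s, d) for s in soft_pop))
    soft_pop = step(soft_pop, best_data, cost)
    best_soft = min(soft_pop, key=lambda s: min(cost(s, d) for d in data_pop))
    data_pop = step(data_pop, best_soft, lambda d, s: cost(s, d))
    best_cost = min(best_cost, min(cost(s, d)
                                   for s in soft_pop for d in data_pop))
```

Decomposing the placement into one subpopulation per component type is what makes the search tractable at scale; the parallel variant evolves the subpopulations on separate processors without waiting for each other.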

Relevance: 30.00%

Abstract:

The purpose of this paper is to empirically examine the state of cloud computing adoption in Australia. I focus specifically on the drivers, risks, and benefits of cloud computing from the perspective of IT experts and forensic accountants, and use thematic analysis of interview data to answer the research questions of the study. The findings suggest that cloud computing is increasingly gaining a foothold in many sectors because of advantages such as flexibility and speed of deployment; however, security remains an issue, so adoption is likely to be selective and phased. Of particular concern is the involvement of third parties and foreign jurisdictions, which in the event of damage may complicate litigation and forensic investigations. This is one of the first empirical studies to report on cloud computing adoption and experiences in Australia.

Relevance: 30.00%

Abstract:

The placement of the mappers and reducers on the machines directly affects the performance and cost of a MapReduce computation in cloud computing. From a computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new heuristic algorithm for the mappers/reducers placement problem in cloud computing and evaluate it against several other heuristics on solution quality and computation time over a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the others. We also verify its effectiveness by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional mapper/reducer placement; the comparison shows that the computation using our placement is much cheaper while still satisfying the computation deadline.
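
The bin-packing view of the problem can be illustrated with the classical first-fit-decreasing heuristic; this is a representative bin-packing heuristic, not the algorithm proposed in the paper, and the task demands and machine capacity below are hypothetical.

```python
def first_fit_decreasing(task_sizes, capacity):
    """First-fit-decreasing bin packing: sort tasks by decreasing resource
    demand, then place each task on the first machine where it fits."""
    machines = []                       # each machine is a list of task sizes
    for size in sorted(task_sizes, reverse=True):
        for m in machines:
            if sum(m) + size <= capacity:
                m.append(size)
                break
        else:
            machines.append([size])     # no machine fits: open a new one
    return machines

# Hypothetical mapper/reducer resource demands on machines of capacity 10.
demands = [6, 5, 4, 3, 2, 2, 2]
placement = first_fit_decreasing(demands, 10)   # packs into 3 machines
```

Since the total demand is 24 and each machine holds 10, three machines is the best possible here; in general FFD is only an approximation, which is why the NP-complete placement problem invites custom heuristics.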

Relevance: 30.00%

Abstract:

MapReduce is a computation model for processing large data sets in parallel on large clusters of machines in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation. From a computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new grouping genetic algorithm for the mappers/reducers placement problem in cloud computing. Compared with the original grouping genetic algorithm, ours uses an innovative coding scheme and eliminates the inversion operator, an essential operator in the original. The new grouping genetic algorithm is evaluated by experiments, and the results show that it is much more efficient than four popular algorithms for the problem, including the original grouping genetic algorithm.
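
A Falkenauer-style grouping encoding, in which the genes are the groups (machines) rather than item positions, can be sketched as follows. The group-injection crossover with repair shown here is the standard grouping-GA device that keeps related items together without an inversion operator; it is not the paper's own coding scheme, and the items, demands, and fitness are hypothetical.

```python
import random

random.seed(1)
CAPACITY = 10
items = {"m1": 6, "m2": 5, "r1": 4, "r2": 3, "r3": 2}   # hypothetical demands

# Grouping encoding: a chromosome is a list of groups (machines), each a
# set of item names -- the groups themselves are the genes.
def group_crossover(parent_a, parent_b):
    """Inject one whole group from parent_b into parent_a, then repair by
    removing the injected items from parent_a's groups."""
    injected = set(random.choice(parent_b))
    child = [g - injected for g in parent_a]
    child = [g for g in child if g]      # drop groups emptied by the repair
    child.append(injected)
    return child

def fitness(chrom):
    """Fewer machines is better; over-capacity machines are penalized."""
    penalty = sum(1 for g in chrom if sum(items[i] for i in g) > CAPACITY)
    return len(chrom) + 100 * penalty

a = [{"m1", "r1"}, {"m2", "r2", "r3"}]
b = [{"m1", "m2"}, {"r1", "r2", "r3"}]
child = group_crossover(a, b)            # every item still placed exactly once
```

Because crossover exchanges whole groups and the repair restores feasibility, good machine assignments survive recombination intact, which is the point of group-oriented encodings for bin-packing-like problems.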

Relevance: 30.00%

Abstract:

1 PDF document (8 pp., English). Contributed to VSMM'08: 14th International Conference on Virtual Systems and Multimedia (Limassol, Cyprus, Oct 20–25, 2008).

Relevance: 30.00%

Abstract:

STEEL, the Caltech-created nonlinear large-displacement analysis software, is currently used by a large number of researchers at Caltech. However, due to its complexity and its lack of visualization tools (such as pre- and post-processing capabilities), rapid creation and analysis of models with this software was difficult. SteelConverter was created to facilitate model creation through the industry-standard finite element solver ETABS. It allows users to create models in ETABS and intelligently convert model information such as geometry, loading, releases, and fixity into a format that STEEL understands. Models that would take several days to create and verify now take several hours or less, and both the researcher's productivity and the level of confidence in the model being analyzed are greatly increased.

It has always been a major goal of Caltech to spread the knowledge created here to other universities. However, due to the complexity of STEEL it was difficult for researchers or engineers from other universities to conduct analyses. While SteelConverter did help researchers at Caltech improve their research, sending SteelConverter and its documentation to other universities was less than ideal. Issues of version control, individual computer requirements, and the difficulty of releasing updates made a more centralized solution preferred. This is where the idea for Caltech VirtualShaker was born. Through the creation of a centralized website where users could log in, submit, analyze, and process models in the cloud, all of the major concerns associated with the utilization of SteelConverter were eliminated. Caltech VirtualShaker allows users to create profiles where defaults associated with their most commonly run models are saved, and allows them to submit multiple jobs to an online virtual server to be analyzed and post-processed. The creation of this website not only allowed for more rapid distribution of this tool, but also created a means for engineers and researchers with no access to powerful computer clusters to run computationally intensive analyses without the excessive cost of building and maintaining a computer cluster.

In order to increase confidence in the use of STEEL as an analysis system, as well as to verify the conversion tools, a series of comparisons was made between STEEL and ETABS. Six models of increasing complexity, ranging from a cantilever column to a twenty-story moment frame, were analyzed to determine the ability of STEEL to accurately calculate basic model properties, such as elastic stiffness and damping through a free vibration analysis, as well as more complex structural properties, such as overall structural capacity through a pushover analysis. These analyses showed very strong agreement between the two programs on every aspect of each analysis. They also showed the ability of the STEEL analysis algorithm to converge at significantly larger drifts than ETABS when using the more computationally expensive but structurally realistic fiber hinges. Following the ETABS analysis, the comparisons were repeated in Perform, a program more capable of highly nonlinear analysis. These analyses again showed very strong agreement between the two programs in every aspect of each analysis through instability, although, due to some limitations in Perform, free vibration analyses for the three-story one-bay chevron-brace frame, the two-bay chevron-brace frame, and the twenty-story moment frame could not be conducted. With the current trend towards ultimate-capacity analysis, the ability to use fiber-based models allows engineers to gain a better understanding of a building's behavior under these extreme load scenarios.
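
One standard way to extract a damping ratio from a free-vibration record is the logarithmic decrement; the sketch below illustrates the kind of property being compared, not necessarily the procedure STEEL or ETABS uses internally, and the synthetic peak data are made up.

```python
import math

def damping_from_free_vibration(peaks):
    """Damping ratio from successive free-vibration peak amplitudes via the
    logarithmic decrement: delta = ln(x_i / x_{i+1}),
    zeta = delta / sqrt(4*pi**2 + delta**2)."""
    deltas = [math.log(a / b) for a, b in zip(peaks, peaks[1:])]
    delta = sum(deltas) / len(deltas)        # average over available cycles
    return delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)

# Synthetic record: an oscillator at 5% of critical damping decays by a
# factor exp(2*pi*zeta / sqrt(1 - zeta**2)) per cycle.
zeta = 0.05
decay = math.exp(2 * math.pi * zeta / math.sqrt(1 - zeta ** 2))
peaks = [1.0 / decay ** n for n in range(6)]
estimate = damping_from_free_vibration(peaks)   # recovers zeta = 0.05
```

On real analysis output the successive peaks would come from the computed displacement history, and agreement of the recovered stiffness and damping between programs is exactly the kind of check described above.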

Following this, a final study was done on Hall's U20 structure [1], which was analyzed in all three programs and the results compared. The pushover curves from each program were compared and the differences caused by variations in software implementation explained. From this, conclusions can be drawn about the effectiveness of each analysis tool when attempting to analyze structures through the point of geometric instability. The analyses show that while ETABS was capable of accurately determining the elastic stiffness of the model, the tool failed to converge after the onset of inelastic behavior. However, for the small number of time steps over which the ETABS analysis did converge, its results exactly matched those of STEEL, leading to the conclusion that ETABS is not an appropriate analysis package for analyzing a structure through the point of collapse when fiber elements are used throughout the model. The analyses also showed that while Perform was capable of calculating the response of the structure accurately, restrictions in its material model resulted in a pushover curve that did not exactly match STEEL's, particularly post-collapse; such problems could be alleviated by choosing a simpler material model.

Relevance: 30.00%

Abstract:

The increasing penetration of feature-rich mobile devices such as smartphones and tablets has resulted in a large number of applications and services being created or modified to support them. Mobile cloud computing is a proposed paradigm for addressing the resource scarcity of mobile devices in the face of demand for more computationally intensive tasks. Several approaches have been proposed to confront the challenges of mobile cloud computing, but none has used the user experience as its primary focus. In this paper we evaluate these approaches with respect to the user experience, propose what future research directions in this area must provide for this crucial aspect, and introduce our own solution.

Relevance: 30.00%

Abstract:

We present Westerbork Synthesis Radio Telescope H I images, Lovell telescope multibeam H I wide-field mapping, William Herschel Telescope long-slit echelle Ca II observations, Wisconsin Hα Mapper (WHAM) facility images, and IRAS ISSA 60- and 100-μm co-added images towards the intermediate-velocity cloud (IVC) at +70 km s⁻¹, located in the general direction of the M15 globular cluster. When combined with previously published Arecibo data, the H I gas in the IVC is found to be clumpy, with a peak H I column density of ∼1.5 × 10²⁰ cm⁻², an inferred volume density (assuming spherical symmetry) of ∼24 cm⁻³/D (kpc), and a maximum brightness temperature, at a resolution of 81 × 14 arcsec², of 14 K. The major axis of this part of the IVC lies approximately parallel to the Galactic plane, as do the low-velocity H I gas and the IRAS emission. The H I gas in the cloud is warm, with a minimum full width at half-maximum velocity width of 5 km s⁻¹, corresponding to a kinetic temperature, in the absence of turbulence, of ∼540 K. The H I data show indications of two-component velocity structure. Similarly, the Ca II spectra, of resolution 7 km s⁻¹, show tentative evidence of velocity structure, perhaps indicative of cloudlets. Assuming that there are no unresolved narrow-velocity components, the mean values of log₁₀[N(Ca II K)/cm⁻²] ∼ 12.0 and Ca II/H I ∼ 2.5 × 10⁻⁸ are typical of observations of high Galactic latitude clouds. This compares with a value of Ca II/H I > 10⁻⁶ for IVC absorption towards HD 203664, a halo star at a distance of 3 kpc, some 3°.1 from the main M15 IVC condensation. The main IVC condensation is detected by WHAM in Hα with central local-standard-of-rest velocities of ∼60–70 km s⁻¹ and intensities, uncorrected for Galactic extinction, of up to 1.3 R, indicating that the gas is partially ionized.

The FWHM values of the Hα IVC component, at a resolution of 1°, exceed 30 km s⁻¹. This is some 10 km s⁻¹ larger than the corresponding H I value at a similar resolution, indicating that the two components may not be mixed; however, the spatial and velocity coincidence of the Hα and H I emission peaks towards the main IVC component is qualitatively good. If the Hα emission is caused solely by photoionization, the Lyman continuum flux towards the main IVC condensation is ∼2.7 × 10⁶ photon cm⁻² s⁻¹. There is no corresponding IVC Hα detection towards the halo star HD 203664 at velocities exceeding ∼60 km s⁻¹. Finally, both the 60- and 100-μm IRAS images show spatial coincidence, over a 0.675 × 0.625 deg² field, with both low- and intermediate-velocity H I gas (previously observed with the Arecibo telescope), indicating that the IVC may contain dust. Both the Hα and the tentative IRAS detections discriminate this IVC from high-velocity clouds, although the H I properties do not. Combined with the H I and optical results, these data point to a Galactic origin for at least parts of this IVC.
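
The quoted kinetic temperature follows from the standard thermal-linewidth relation T = m_H Δv² / (8 ln 2 · k_B); a quick check with the 5 km s⁻¹ minimum FWHM reproduces the ∼540 K figure (within rounding).

```python
import math

K_B = 1.380649e-23       # Boltzmann constant [J/K]
M_H = 1.6735575e-27      # mass of a hydrogen atom [kg]

def kinetic_temperature(fwhm_km_s):
    """Kinetic temperature implied by a purely thermal H I line of the
    given FWHM velocity width: T = m_H * dv**2 / (8 * ln(2) * k_B)."""
    dv = fwhm_km_s * 1e3                        # km/s -> m/s
    return M_H * dv ** 2 / (8 * math.log(2) * K_B)

# Minimum FWHM quoted for the IVC: 5 km/s.
t_k = kinetic_temperature(5.0)
```

Because turbulence broadens the line further, this is an upper limit on the gas temperature, which is why the abstract notes "in the absence of turbulence".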

Relevance: 30.00%

Abstract:

Infrastructure providers are increasingly supplying the cloud marketplace with storage and on-demand compute resources to host cloud applications. From an application user's point of view, it is desirable to identify the most appropriate set of available resources on which to execute an application. Resource choice can be complex: it may involve comparing available hardware specifications, operating systems, value-added services such as network configuration or data replication, and operating costs such as hosting cost and data throughput. Providers' cost models often change, and new commodity cost models, such as spot pricing, have been introduced to offer significant savings. In this paper, a software abstraction layer is used to discover infrastructure resources for a particular application, across multiple providers, using a two-phase constraints-based approach. In the first phase, a set of possible infrastructure resources is identified for a given application; in the second phase, a heuristic is used to select the most appropriate resources from that set. For some applications a cost-based heuristic is most appropriate; for others a performance-based heuristic may be used. A financial services application and a high-performance computing application are used to illustrate the execution of the proposed resource discovery mechanism. The experimental results show that the proposed model can dynamically select an appropriate set of resources matching the application's requirements.
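
The two-phase approach can be sketched as a hard-constraint filter followed by a heuristic ranking; the providers, offer attributes, and constraints below are hypothetical stand-ins for real provider catalogues.

```python
# Hypothetical resource offers from multiple providers.
offers = [
    {"provider": "A", "cores": 4,  "mem_gb": 16, "cost": 0.20, "throughput": 50},
    {"provider": "A", "cores": 8,  "mem_gb": 32, "cost": 0.45, "throughput": 90},
    {"provider": "B", "cores": 8,  "mem_gb": 16, "cost": 0.30, "throughput": 70},
    {"provider": "C", "cores": 16, "mem_gb": 64, "cost": 0.90, "throughput": 160},
]

def discover(offers, constraints, heuristic):
    """Phase 1: keep offers satisfying all hard minimum constraints.
    Phase 2: rank the survivors with an application-specific heuristic."""
    feasible = [o for o in offers
                if all(o[k] >= v for k, v in constraints.items())]
    return sorted(feasible, key=heuristic)

# A cost-sensitive application: at least 8 cores and 16 GB, cheapest first.
picks = discover(offers, {"cores": 8, "mem_gb": 16}, lambda o: o["cost"])
```

Swapping the key function, e.g. `lambda o: -o["throughput"]`, turns the same discovery layer into the performance-based variant mentioned in the text.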

Relevance: 30.00%

Abstract:

The identification, tracking, and statistical analysis of tropical convective complexes using satellite imagery is explored in the context of identifying feature points suitable for tracking. The feature points are determined from the shape of the complexes using the distance transform technique. This approach has been applied to the determination of feature points for tropical convective complexes identified in a time series of global cloud imagery. The feature points are used to track the complexes, and from the tracks statistical diagnostic fields are computed. This approach allows the nature and distribution of organized deep convection in the Tropics to be explored.
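
The distance-transform idea, taking as feature point the cell deepest inside a detected complex, can be sketched with a brute-force transform on a small binary mask; production code would use a fast implementation such as `scipy.ndimage.distance_transform_edt`, and the mask below is a made-up toy.

```python
def distance_transform(mask):
    """Brute-force Euclidean distance transform of a binary mask:
    for each foreground cell, distance to the nearest background cell."""
    h, w = len(mask), len(mask[0])
    background = [(i, j) for i in range(h) for j in range(w) if not mask[i][j]]
    dist = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            if mask[i][j]:
                dist[i][j] = min(((i - a) ** 2 + (j - b) ** 2) ** 0.5
                                 for a, b in background)
    return dist

def feature_point(mask):
    """Feature point = the cell deepest inside the complex,
    i.e. the maximum of the distance transform."""
    dist = distance_transform(mask)
    cells = ((i, j) for i in range(len(mask)) for j in range(len(mask[0])))
    return max(cells, key=lambda p: dist[p[0]][p[1]])

# A 7x7 toy 'convective complex': a filled 5x5 square inside a background rim.
mask = [[1 if 1 <= i <= 5 and 1 <= j <= 5 else 0 for j in range(7)]
        for i in range(7)]
fp = feature_point(mask)
```

Because the maximum of the distance transform sits at the shape's interior core rather than on its ragged boundary, it gives a point that moves smoothly between frames and is therefore well suited to tracking.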

Relevance: 30.00%

Abstract:

Magnetic clouds (MCs) are a subset of interplanetary coronal mass ejections (ICMEs) which exhibit signatures consistent with a magnetic flux rope structure. Techniques for reconstructing flux rope orientation from single-point in situ observations typically assume the flux rope is locally cylindrical, e.g., minimum variance analysis (MVA) and force-free flux rope (FFFR) fitting. In this study, we outline a non-cylindrical magnetic flux rope model, in which the flux rope radius and axial curvature can both vary along the length of the axis. This model is not necessarily intended to represent the global structure of MCs, but it can be used to quantify the error in MC reconstruction resulting from the cylindrical approximation. When the local flux rope axis is approximately perpendicular to the heliocentric radial direction, which is also the effective spacecraft trajectory through a magnetic cloud, the error in using cylindrical reconstruction methods is relatively small (≈10°). However, as the local axis orientation becomes increasingly aligned with the radial direction, the spacecraft trajectory may pass close to the axis at two separate locations. This results in a magnetic field time series which deviates significantly from encounters with a force-free flux rope, and consequently the error in the axis orientation derived from cylindrical reconstructions can be as much as 90°. Such two-axis encounters can result in an apparent ‘double flux rope’ signature in the magnetic field time series, sometimes observed in spacecraft data. Analysing each axis encounter independently produces reasonably accurate axis orientations with MVA, but larger errors with FFFR fitting.
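
MVA itself reduces to an eigen-decomposition of the magnetic field covariance matrix; a minimal sketch on synthetic data is below (the field series and its variances are made up for illustration, and real MC analysis would of course use measured spacecraft data).

```python
import numpy as np

def minimum_variance_axes(B):
    """Minimum variance analysis: eigen-decompose the field covariance
    matrix M_ij = <B_i B_j> - <B_i><B_j>. The eigenvectors give the maximum,
    intermediate, and minimum variance directions; for a locally cylindrical
    flux rope the intermediate-variance direction estimates the axis."""
    M = np.cov(B.T)                      # 3x3 covariance of the components
    vals, vecs = np.linalg.eigh(M)       # eigh returns ascending eigenvalues
    return vals[::-1], vecs[:, ::-1]     # reorder: max, intermediate, min

# Synthetic series: large variance along x, medium along y, small along z.
rng = np.random.default_rng(0)
n = 500
B = np.column_stack([3.0 * rng.standard_normal(n),
                     1.0 * rng.standard_normal(n),
                     0.1 * rng.standard_normal(n)])
vals, vecs = minimum_variance_axes(B)
```

The study's point is precisely that this recipe inherits the cylindrical assumption: when the trajectory grazes the axis twice, the covariance mixes the two encounters and the recovered directions can be badly wrong unless each encounter is analysed separately.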

Relevance: 30.00%

Abstract:

Galactic cosmic rays are one of the major sources of ion production in the troposphere and stratosphere. Recent studies have shown that ions form electrically charged clusters which may grow to become cloud droplets. Aerosol particles become charged by the attachment of ions and electrons, and the collision efficiency between a particle and a water droplet increases if the particle is electrically charged, so aerosol-cloud interactions can be enhanced. Because these microphysical processes may change the radiative properties of clouds and impact Earth's climate, it is important to evaluate their quantitative effects. Five independently developed models have been coupled to investigate this. The first model estimates cloud height from the dew point temperature and the temperature profile. The second simulates cloud droplet growth from aerosol particles using the cloud parcel concept. The third calculates the scavenging rate of the aerosol particles using the collision efficiency between charged particles and droplets. The fourth calculates the electric field and the charge distribution on water droplets and aerosols within the cloud. The fifth simulates the global electric circuit (GEC), computing the conductivity and ionic concentration in the atmosphere over the altitude range 0–45 km. The first four models are coupled to calculate the cloud height and cloud boundary conditions, followed by droplet growth, the charge distribution on aerosols and cloud droplets, and finally scavenging; these are then incorporated with the GEC model. The simulations are verified against experimental data on charged aerosols at various altitudes. Our calculations show an effect of aerosol charging on the CCN concentration within the cloud, since charging increases the scavenging of particles in the 0.1–1 µm size range.
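
The chaining of the models can be sketched as a pipeline of functions passing results downstream; every function below is a hypothetical stand-in with made-up coefficients, intended only to show the coupling order described above, not the physics of the real models.

```python
def cloud_height(dew_point_c, surface_temp_c, lapse_km_per_c=1.0 / 8.0):
    """Model 1 (stand-in): cloud-base height (km) estimated from the spread
    between surface temperature and dew point."""
    return max(0.0, (surface_temp_c - dew_point_c) * lapse_km_per_c)

def droplet_growth(base_km, aerosol_count):
    """Model 2 (stand-in): parcel-style growth; more aerosols -> more droplets."""
    return {"base_km": base_km, "droplets_per_cc": 0.8 * aerosol_count}

def charge_distribution(cloud, ion_pair_rate):
    """Model 4 (stand-in): mean particle charge scales with ion production."""
    cloud["mean_charge_e"] = 2.0 * ion_pair_rate
    return cloud

def scavenging_rate(cloud):
    """Model 3 (stand-in): charged particles are collected more efficiently."""
    return 0.01 * (1.0 + 0.1 * cloud["mean_charge_e"])

# Chained in the order described in the text:
# height -> droplet growth -> charging -> scavenging.
cloud = droplet_growth(cloud_height(dew_point_c=12.0, surface_temp_c=28.0),
                       aerosol_count=500)
cloud = charge_distribution(cloud, ion_pair_rate=5.0)
rate = scavenging_rate(cloud)
```

The design point is the interface, not the numbers: each model consumes the previous model's output, so the GEC model can later be attached upstream by supplying the ion production rate that drives the charging step.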