969 results for super-aggregates
On Implementing Joins, Aggregates and Universal Quantifier in Temporal Databases using SQL Standards
Abstract:
A feasible way of implementing a temporal database is to map the temporal data model onto a conventional data model supported by a commercial database management system. Although extensions to standard SQL have been proposed to support temporal databases, these proposals have not yet passed through standardization. This paper implements database operators such as aggregates and the universal quantifier for temporal databases built on top of relational database systems, using currently available SQL standards.
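As a concrete illustration of the kind of query the paper targets (a sketch under assumptions, not the paper's implementation: the works_on/project tables, their column names and the closed-open validity-interval convention are hypothetical), a temporal aggregate and a temporal universal quantifier can both be written in standard SQL, here driven from Python's built-in sqlite3 module:

```python
# Sketch only: a temporal aggregate and a temporal universal quantifier over
# interval-timestamped rows, using plain standard SQL via sqlite3.
# Tables, columns and data are hypothetical examples, not the paper's schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE works_on (
    emp_id     INTEGER,
    project_id INTEGER,
    valid_from INTEGER,   -- inclusive start of validity interval
    valid_to   INTEGER    -- exclusive end of validity interval
);
CREATE TABLE project (project_id INTEGER PRIMARY KEY);
INSERT INTO project VALUES (1), (2);
INSERT INTO works_on VALUES (100, 1, 0, 10), (100, 2, 0, 10), (200, 1, 5, 10);
""")

t = 7  # time instant of interest

# Temporal aggregate: how many employees work on each project at instant t.
agg = conn.execute("""
    SELECT project_id, COUNT(DISTINCT emp_id)
    FROM works_on
    WHERE valid_from <= ? AND ? < valid_to
    GROUP BY project_id
""", (t, t)).fetchall()

# Temporal universal quantifier (relational division) at instant t:
# employees who work on EVERY project, via the double NOT EXISTS idiom.
univ = conn.execute("""
    SELECT DISTINCT w.emp_id
    FROM works_on AS w
    WHERE NOT EXISTS (
        SELECT 1 FROM project AS p
        WHERE NOT EXISTS (
            SELECT 1 FROM works_on AS w2
            WHERE w2.emp_id = w.emp_id
              AND w2.project_id = p.project_id
              AND w2.valid_from <= ? AND ? < w2.valid_to))
""", (t, t)).fetchall()

print(agg)   # e.g. [(1, 2), (2, 1)]
print(univ)  # e.g. [(100,)]
```

The universal quantifier is expressed with the usual double NOT EXISTS (relational division) idiom, restricted to rows whose validity interval contains the chosen instant; only standard SQL constructs are used.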
Abstract:
The elastic properties of a sodium-doped lithium potassium sulphate crystal, LiK0.9Na0.1SO4, have been studied by the ultrasonic pulse echo overlap (PEO) technique and are reported for the first time. The controversy regarding the type of crystal obtained when growth is performed at 35 °C with equimolar fractions of Li2SO4·H2O, K2SO4 and Na2SO4 has been resolved by studying the elastic properties. The importance of this crystal is that it exhibits pyroelectric, ferroelectric and electro-optic properties; it is simultaneously ferroelastic and superionic. The elastic properties of the LiK0.9Na0.1SO4 crystal are characterized by measuring the ultrasonic velocity along specified crystallographic directions and evaluating the elastic stiffness constants, compliance constants and Poisson's ratios. The anisotropy in the elastic properties of the crystal is illustrated by surface plots of the phase velocity, slowness and linear compressibility in the a-b and a-c planes.
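For orientation only (these are the standard textbook relations of crystal acoustics, not results from the study), the stiffness constants follow from the measured phase velocities through the Christoffel equation, the compliances are the inverse of the stiffness matrix, and the Poisson's ratios are ratios of compliances:

\[ \det\left(C_{ijkl}\, n_j n_l - \rho v^2 \delta_{ik}\right) = 0, \qquad [S] = [C]^{-1}, \qquad \nu_{ij} = -\,\frac{S_{ij}}{S_{ii}} \quad (i \neq j), \]

while the linear compressibility plotted in such surface representations is, for a unit propagation direction \(\mathbf{n}\), \( \beta(\mathbf{n}) = S_{ijkk}\, n_i n_j \).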
Abstract:
Aggregates of oxygen vacancies (F centers) represent a particular form of point defects in ionic crystals. In this study we have considered the combination of two oxygen vacancies, the M center, in the bulk and on the surface of MgO by means of cluster model calculations. Both neutral and charged forms of the defect M and M+ have been taken into account. The ground state of the M center is characterized by the presence of two doubly occupied impurity levels in the gap of the material; in M+ centers the highest level is singly occupied. For the ground-state properties we used a gradient corrected density functional theory approach. The dipole-allowed singlet-to-singlet and doublet-to-doublet electronic transitions have been determined by means of explicitly correlated multireference second-order perturbation theory calculations. These have been compared with optical transitions determined with the time-dependent density functional theory formalism. The results show that bulk M and M+ centers give rise to intense absorptions at about 4.4 and 4.0 eV, respectively. Another less intense transition at 1.3 eV has also been found for the M+ center. On the surface the transitions occur at 1.6 eV (M+) and 2 eV (M). The results are compared with recently reported electron energy loss spectroscopy spectra on MgO thin films.
Abstract:
Oceans play a vital role in the global climate system. They absorb the incoming solar energy and redistribute it through horizontal and vertical transports. In this context it is important to investigate the variation of the heat budget components during the formation of a low-pressure system. In 2007 the monsoon onset was on 28 May. A well-marked low-pressure area formed in the eastern Arabian Sea after the onset and further developed into a cyclone. We have analysed the heat budget components during different stages of the cyclone. The data used for the computation are the Objectively Analyzed air-sea flux data obtained from the WHOI (Woods Hole Oceanographic Institution) project, with a horizontal resolution of 1° × 1°. Over the low-pressure area the latent heat flux was 180 Wm−2. It increased to a maximum of about 210 Wm−2 on 1 June 2007, the day on which the system intensified into a cyclone (Gonu), with latent heat flux values ranging from 200 to 250 Wm−2, and it decreased sharply after the passage of the cyclone. The high latent heat flux is attributed to the latent heat released through cloud formation in the cyclone. The longwave radiation flux decreased sharply from 100 Wm−2 to 30 Wm−2 when the low-pressure system intensified into a cyclone; this decrease is due to the presence of clouds. The net heat flux also decreased sharply, to −200 Wm−2, on 1 June 2007; after the passage it returned to its normal value (150 Wm−2) within one day. A sharp increase in the sensible heat flux (to 20 Wm−2) was observed on 1 June 2007, followed by a decrease. The shortwave radiation flux decreased from 300 Wm−2 to 90 Wm−2 during the intensification on 1 June 2007 and increased sharply to higher values soon after the passage of the cyclone.
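For reference (a standard bookkeeping identity, not taken from the study), the net surface heat flux quoted above is the balance of the four components, with fluxes into the ocean counted positive:

\[ Q_{\mathrm{net}} = Q_{SW} - Q_{LW} - Q_{LH} - Q_{SH}. \]

With the values quoted for 1 June 2007 this gives roughly 90 − 30 − (200 to 250) − 20 ≈ −160 to −210 Wm−2, consistent with the net heat flux of about −200 Wm−2 reported above.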
Abstract:
An improved color video super-resolution technique using kernel regression and fuzzy enhancement is presented in this paper. A high-resolution frame is computed from a set of low-resolution video frames by kernel regression using an adaptive Gaussian kernel. A fuzzy smoothing filter is proposed to enhance the regression output. The proposed technique is a low-cost software solution for resolution enhancement of color video in multimedia applications. The performance of the proposed technique is evaluated on several color videos and is found to be better than that of other techniques in producing high-quality, high-resolution color videos.
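The following sketch illustrates only the general kernel-regression idea behind such methods: it uses a fixed (non-adaptive) Gaussian kernel, omits the fuzzy enhancement stage, and assumes the sub-pixel registration of the low-resolution frames onto the high-resolution grid is already known, so it should not be read as the authors' method.

```python
# Sketch: zeroth-order (Nadaraya-Watson) kernel regression fusing registered
# low-resolution samples onto a high-resolution grid with a fixed Gaussian
# kernel. Registration is assumed given as sub-pixel sample coordinates.
import numpy as np

def kernel_regression_sr(sample_xy, sample_val, hr_shape, sigma=0.7, radius=2):
    """sample_xy : (N, 2) sample positions in HR pixel coordinates (row, col).
    sample_val  : (N,) intensities of the LR samples (one colour channel).
    hr_shape    : (H, W) shape of the desired high-resolution frame."""
    H, W = hr_shape
    num = np.zeros(hr_shape)   # weighted sum of sample values per HR pixel
    den = np.zeros(hr_shape)   # sum of weights per HR pixel
    for (r, c), v in zip(sample_xy, sample_val):
        r0, c0 = int(round(r)), int(round(c))
        for i in range(max(0, r0 - radius), min(H, r0 + radius + 1)):
            for j in range(max(0, c0 - radius), min(W, c0 + radius + 1)):
                w = np.exp(-((i - r) ** 2 + (j - c) ** 2) / (2.0 * sigma ** 2))
                num[i, j] += w * v
                den[i, j] += w
    return num / np.maximum(den, 1e-12)
```

One common way to make the kernel adaptive is to steer its bandwidth and orientation by the local gradient structure; whether that matches the adaptation used in this paper is not stated in the abstract.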
Abstract:
In this paper, a new directionally adaptive, learning-based, single-image super-resolution method using a multiple-direction wavelet transform, called the directionlet transform, is presented. The method uses directionlets to capture directional features and to extract edge information along different directions from a set of available high-resolution images. This information is used as the training set for super-resolving a low-resolution input image: the directionlet coefficients at finer scales of its high-resolution version are learned locally from this training set, and the inverse directionlet transform recovers the super-resolved high-resolution image. The simulation results show that the proposed approach outperforms standard interpolation techniques such as cubic spline interpolation, as well as standard wavelet-based learning, both visually and in terms of mean squared error (MSE). The method also gives good results for aliased images.
Abstract:
Super-resolution is an inverse problem: it refers to the process of producing a high-resolution (HR) image from one or more low-resolution (LR) observations. It includes upsampling the image, thereby increasing the maximum spatial frequency, and removing the degradations that arise during image capture, namely aliasing and blurring. The work presented in this thesis is based on learning-based single-image super-resolution. In learning-based super-resolution algorithms, a training set or database of available HR images is used to construct the HR image of an image captured with an LR camera. In the training set, images are stored as patches or as coefficients of feature representations such as the wavelet transform or the DCT. Single-frame image super-resolution can be used in applications where a database of HR images is available; the advantage of this approach is that by skilfully creating a database of suitable training images, one can improve the quality of the super-resolved image. A new super-resolution method based on the wavelet transform is developed and shown to be better than conventional wavelet-transform-based methods and standard interpolation methods. Super-resolution techniques based on the skewed anisotropic transform called the directionlet transform are developed to convert a small low-resolution image into a large high-resolution image. The super-resolution algorithm not only increases the size but also reduces the degradations introduced while capturing the image. This method outperforms the standard interpolation methods and the wavelet methods, both visually and in terms of SNR values, and artifacts such as aliasing and ringing are also eliminated. The super-resolution methods are implemented using both critically sampled and oversampled directionlets. The conventional directionlet transform is computationally complex, so a lifting scheme is used for its implementation; the new single-image super-resolution method based on the lifting scheme reduces computational complexity and thereby reduces computation time. The quality of the super-resolved image depends on the type of wavelet basis used, and a study is conducted to find the effect of different wavelets on the single-image super-resolution method. Finally, the new method, implemented on grey-scale images, is extended to colour images and noisy images.
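As a side note on the lifting scheme mentioned above, the sketch below shows lifting in its simplest form, the (unnormalised) Haar wavelet on a 1-D signal. It is only an illustration of why lifting is cheap and trivially invertible (each step is undone by negating it), not the thesis's directionlet implementation.

```python
# Sketch: one level of the Haar wavelet via lifting (split, predict, update).
import numpy as np

def haar_lift_forward(x):
    x = np.asarray(x, dtype=float)
    even, odd = x[0::2].copy(), x[1::2].copy()
    odd  -= even            # predict step: detail = odd sample - its prediction
    even += odd / 2.0       # update step: approximation = pairwise average
    return even, odd        # approximation, detail coefficients

def haar_lift_inverse(even, odd):
    even = even - odd / 2.0 # undo the update step
    odd  = odd + even       # undo the predict step
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd
    return x

x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 7.0])
a, d = haar_lift_forward(x)
assert np.allclose(haar_lift_inverse(a, d), x)   # perfect reconstruction
```

Because every lifting step is an in-place add or subtract on half of the samples, the transform needs fewer operations than a direct filter-bank implementation, which is the computational advantage the thesis exploits.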
Abstract:
As a result of the drive towards a low-waste world and the conservation of non-renewable materials, recycling construction and demolition materials has become essential. At present, reuse of recycled concrete aggregate coarser than 4 mm in producing new concrete is allowed, but only with natural sand as the fine aggregate, while the sand fraction, which represents about 30% to 60% of the crushed demolition materials, is disposed of. For this research, a recycled concrete sand was produced in the laboratory, while nine recycled sands produced from construction and demolition materials and two sands from natural crushed limestone were supplied by three plants. Ten concrete mix designs representing the concrete exposure classes XC1, XC2, XF3 and XF4 according to European standard EN 206 were produced with partial and full replacement of natural sand by the different recycled sands. Bituminous mixtures meeting the requirements for base courses according to German standards, and for both base and binder courses according to Egyptian standards, were produced with the recycled sands as a substitute for the natural sands. The mechanical properties and durability of the concrete produced with the different recycled sands were investigated and analyzed. Volumetric analysis and the Marshall test were also performed on the hot bituminous mixtures produced with the recycled sands. Based on the effect of replacing the natural sand with the different recycled sands on concrete compressive strength and durability, the recycled sands were classified into three groups, and the maximum allowable recycled sand content in each concrete exposure class was determined for each group. For the asphalt concrete mixes, all the investigated recycled sands can be used in mixes for base and binder courses at up to 21% of the total aggregate mass.
Abstract:
Mesh generation is an important step in many numerical methods. We present the “Hierarchical Graph Meshing” (HGM) method as a novel approach to mesh generation based on algebraic graph theory. The HGM method can be used to systematically construct configurations exhibiting multiple hierarchies and complex symmetry characteristics. The hierarchical description of structures provided by the HGM method can be exploited to increase the efficiency of multiscale and multigrid methods. In this paper, the HGM method is employed for the systematic construction of super carbon nanotubes of arbitrary order, which are a pertinent example of structurally and geometrically complex, yet highly regular, structures. The HGM algorithm is computationally efficient and exhibits good scaling characteristics; in particular, it scales linearly for super carbon nanotube structures and works much faster than geometry-based methods employing neighborhood search algorithms. Its modular character makes it conducive to automation. For the generation of a mesh, information about the geometry of the structure in a given configuration is added in a way that relates geometric symmetries to structural symmetries. The intrinsically hierarchic description of the resulting mesh greatly reduces the effort of determining mesh hierarchies for multigrid and multiscale applications and helps to exploit symmetry-related methods in the mechanical analysis of complex structures.
Abstract:
Fine-grained parallel machines have the potential for very high-speed computation. To program massively concurrent MIMD machines, programmers need tools for managing complexity, and these tools should not restrict program concurrency. Concurrent Aggregates (CA) provides multiple-access data abstraction tools, aggregates, which can be used to implement abstractions with virtually unlimited potential for concurrency. Such tools allow programmers to modularize programs without reducing concurrency. I describe the design, motivation, implementation and evaluation of Concurrent Aggregates. CA has been used to construct a number of application programs, and multi-access data abstractions are found to be useful in constructing highly concurrent programs.
Abstract:
Recently, researchers have introduced the notion of super-peers to improve signaling efficiency as well as lookup performance of peer-to-peer (P2P) systems. In a separate development, recent works on applications of mobile ad hoc networks (MANET) have seen several proposals on utilizing mobile fleets such as city buses to deploy a mobile backbone infrastructure for communication and Internet access in a metropolitan environment. This paper further explores the possibility of deploying P2P applications such as content sharing and distributed computing, over this mobile backbone infrastructure. Specifically, we study how city buses may be deployed as a mobile system of super-peers. We discuss the main motivations behind our proposal, and outline in detail the design of a super-peer based structured P2P system using a fleet of city buses.
Abstract:
TCP flows from applications such as the web or FTP are well supported by a Guaranteed Minimum Throughput Service (GMTS), which provides a minimum network throughput to the flow and, if possible, extra throughput. We propose a scheme for a GMTS using Admission Control (AC) that is able to provide different minimum throughputs to different users and that is suitable for "standard" TCP flows. Moreover, we consider a multidomain scenario in which the scheme is used in one of the domains, and we propose mechanisms for the interconnection with neighbor domains. The whole scheme uses a small set of packet classes in a core-stateless network, where each class is assigned a different discarding priority in the queues. The AC method involves only edge nodes and uses a special probing packet flow (marked with the highest discarding priority class) that is sent continuously from ingress to egress along a path. The available throughput on the path is obtained at the egress from measurements of flow aggregates and is then sent back to the ingress. At the ingress each flow is detected implicitly and then admission controlled: if it is accepted, it receives the GMTS and its packets are marked with the lowest discarding priority classes; otherwise, it receives a best-effort service. The scheme is evaluated through simulation in a simple "bottleneck" topology using different traffic loads consisting of "standard" TCP flows that carry files of varying sizes.
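A minimal sketch of the ingress-side admission decision described above, under stated assumptions (the class names, the bookkeeping of admitted minimums between egress reports, and the units are illustrative, not the paper's exact algorithm):

```python
# Sketch: ingress admission control driven by available-throughput reports
# measured at the egress from the probe/best-effort aggregate on a path.
from dataclasses import dataclass

@dataclass
class PathState:
    available_bps: float = 0.0   # last available-throughput report from egress
    reserved_bps: float = 0.0    # sum of minimums already guaranteed (assumed
                                 # bookkeeping until the next egress report)

class IngressAdmissionControl:
    def __init__(self):
        self.paths = {}

    def report_from_egress(self, path_id, measured_available_bps):
        self.paths.setdefault(path_id, PathState()).available_bps = measured_available_bps

    def admit(self, path_id, requested_min_bps):
        st = self.paths.setdefault(path_id, PathState())
        if requested_min_bps <= st.available_bps - st.reserved_bps:
            st.reserved_bps += requested_min_bps
            return True          # flow gets the GMTS; packets marked low drop priority
        return False             # flow falls back to the best-effort service

ac = IngressAdmissionControl()
ac.report_from_egress("pathA", 5e6)   # egress measured 5 Mb/s available
print(ac.admit("pathA", 2e6))         # True  -> admitted with a 2 Mb/s minimum
print(ac.admit("pathA", 4e6))         # False -> served best-effort
```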
Abstract:
In this session we introduce inheritance, one of the cornerstone concepts of object-oriented programming. We look at how to define super- and sub-classes, how to maintain encapsulation using the super() constructor, and why it is useful to use substitution to hold references to sub-class instances in references typed as their super-class.
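A minimal sketch of the three ideas in this session, written here in Python (the session itself may use a different language); the Account/SavingsAccount classes are invented for illustration:

```python
# Sketch: super-/sub-classes, delegating to the super-class constructor via
# super(), and substitution (sub-class instances held as the super-class type).
class Account:                      # super-class
    def __init__(self, owner, balance):
        self._owner = owner         # encapsulated state, initialised in one place
        self._balance = balance

    def describe(self):
        return f"{self._owner}: {self._balance}"

class SavingsAccount(Account):      # sub-class
    def __init__(self, owner, balance, rate):
        super().__init__(owner, balance)   # reuse the super-class constructor
        self._rate = rate

    def describe(self):             # overrides, but still reuses, the super-class method
        return super().describe() + f" at {self._rate:.1%}"

accounts: list[Account] = [Account("Ann", 100), SavingsAccount("Bob", 200, 0.03)]
for acct in accounts:               # substitution: both are handled as Account
    print(acct.describe())
```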
Abstract:
The work has not been published. The annexes include handicraft worksheets for pupils. Abstract based on a record prepared by the authors or the person responsible.