948 results for soft computing methods
Abstract:
One of the main challenges in Software Engineering is to cope with the transition from an industry based on software as a product to software as a service. The field of Software Engineering should provide the necessary methods and tools to develop and deploy new cost-efficient and scalable digital services. In this thesis, we focus on deployment platforms that ensure cost-efficient scalability of multi-tier web applications and of an on-demand video transcoding service under different types of load conditions. Infrastructure as a Service (IaaS) clouds provide Virtual Machines (VMs) under the pay-per-use business model. Dynamically provisioning VMs on demand allows service providers to cope with fluctuations in the number of service users. However, VM provisioning must be done carefully, because over-provisioning results in an increased operational cost, while under-provisioning leads to a subpar service. Therefore, our main focus in this thesis is on cost-efficient VM provisioning for multi-tier web applications and on-demand video transcoding. Moreover, to prevent provisioned VMs from becoming overloaded, we augment VM provisioning with an admission control mechanism. Similarly, to ensure efficient use of provisioned VMs, web applications on under-utilized VMs are consolidated periodically. Thus, the main problem that we address is cost-efficient VM provisioning augmented with server consolidation and admission control on the provisioned VMs. We seek solutions for two types of applications: multi-tier web applications that follow the request-response paradigm and on-demand video transcoding that is based on video streams with soft real-time constraints. Our first contribution is a cost-efficient VM provisioning approach for multi-tier web applications. The proposed approach comprises two sub-approaches: a reactive VM provisioning approach called ARVUE and a hybrid reactive-proactive VM provisioning approach called Cost-efficient Resource Allocation for Multiple web applications with Proactive scaling. Our second contribution is a prediction-based VM provisioning approach for on-demand video transcoding in the cloud. Moreover, to prevent virtualized servers from becoming overloaded, the proposed VM provisioning approaches are augmented with admission control approaches. Therefore, our third contribution is a session-based admission control approach for multi-tier web applications called adaptive Admission Control for Virtualized Application Servers. Similarly, the fourth contribution in this thesis is a stream-based admission control and scheduling approach for on-demand video transcoding called Stream-Based Admission Control and Scheduling. Our fifth contribution is a computation and storage trade-off strategy for cost-efficient video transcoding in cloud computing. Finally, the sixth and last contribution is a web application consolidation approach, which uses Ant Colony System to minimize the under-utilization of virtualized application servers.
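For illustration only, the sketch below shows a reactive, threshold-based scale-out/scale-in rule combined with a simple session-based admission check. The thresholds, session limit, and function names are assumptions made for this example; they do not reproduce the ARVUE or other algorithms proposed in the thesis.

```python
# Minimal sketch of reactive, threshold-based VM provisioning with a simple
# session-based admission check. All thresholds and names are illustrative
# assumptions, not the algorithms described in the abstract above.

def provision(vm_count, avg_cpu_utilization, scale_out=0.80, scale_in=0.30):
    """Return the new VM count for a tier given its average CPU utilization."""
    if avg_cpu_utilization > scale_out:
        return vm_count + 1        # tier is overloaded: add a VM (reactive scale-out)
    if avg_cpu_utilization < scale_in and vm_count > 1:
        return vm_count - 1        # tier is under-utilized: release a VM (consolidation step)
    return vm_count                # utilization is within the target band: keep current size

def admit(active_sessions, vm_count, max_sessions_per_vm=100):
    """Reject new sessions once the provisioned capacity is saturated."""
    return active_sessions < vm_count * max_sessions_per_vm
```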
Abstract:
Methods for reliable evaluation of spinal cord (SC) injury in rats at short periods (2 and 24 h) after lesion were tested to characterize the mechanisms implicated in primary SC damage. We measured the physiological changes occurring after several procedures for producing SC injury, with particular emphasis on sensorimotor functions. Segmental and suprasegmental reflexes were tested in 39 male Wistar rats weighing 250-300 g, divided into three control groups that were subjected to a) anesthesia, b) dissection of soft prevertebral tissue, and c) laminectomy of the vertebral segments between T10 and L1. In the lesion group, the SC was completely transected, hemisected, or subjected to vertebral compression. All animals were evaluated 2 and 24 h after the experimental procedure by the hind limb motility index, Bohlman motor score, open-field, hot-plate, tail flick, and paw compression tests. The locomotion scale proved to be less sensitive than the sensorimotor tests. A reduction in exploratory movements was detected in the animals 24 h after the procedures. The hot-plate was the most sensitive test for detecting sensorimotor deficiencies following light, moderate, or severe SC injury, and was also the simplest test of reflex function. The hemisection model produced reproducible moderate SC injury, which allowed us to quantify the resulting behavior and analyze the evolution of the lesion and its consequences during the first 24 h after injury. We conclude that hemisection permitted the quantitation of behavioral responses for evaluation of the development of deficits after lesions. Hind limb evaluation scores and spontaneous exploration events provided a sensitive index of immediate injury effects after SC lesion at 2 and 24 h. Taken together, locomotion scales, open-field, and hot-plate tests represent reproducible, quantitatively sensitive methods for detecting functional deficiencies within short periods of time, indicating their potential for the study of cellular mechanisms of primary injury and repair after traumatic SC injury.
Abstract:
Rice cooking quality is usually evaluated by texture and stickiness characteristics using many different methods. Gelatinization temperature, amylose content, viscosity (Brookfield viscometer and Rapid Visco Analyzer), and sensory analyses were determined to characterize the culinary quality of rice grains produced under two cropping systems and submitted to different technologies. All samples from the upland cropping system and two from the irrigated cropping system presented intermediate amylose content. Regarding stickiness, BRS Primavera, BRS Sertaneja, and BRS Tropical showed loose cooked grains. Irrigated cultivars presented lower viscosity and were softer than upland cultivars. Upland grain samples had similar viscoamylographic profiles, while among the irrigated cropping system samples the highest viscosity peaks were observed for BRS Alvorada, IRGA 417, and SCS BRS Piracema. In general, distinct grain characteristics were observed between upland and irrigated samples by cluster analysis. The majority of the upland cultivars showed soft and loose grains with adequate cooking quality, as confirmed by sensory tests. Most of the irrigated cultivars, however, presented soft and sticky grains. Together, the different methodologies improved the characterization of the culinary profile of the varieties studied.
Abstract:
This work investigates mathematical details and computational aspects of Metropolis-Hastings reptation quantum Monte Carlo and its variants, in addition to the Bounce method and its variants. The issues that concern us include the sensitivity of these algorithms' target densities to the position of the trial electron density along the reptile, the time-reversal symmetry of the propagators, and the length of the reptile. We calculate the ground-state energy and one-electron properties of LiH at its equilibrium geometry for all these algorithms. Importance sampling is performed with a single determinant built from a large Slater-type orbital (STO) basis set. The computer codes were written to exploit the efficiencies engineered into modern, high-performance computing software. Using the Bounce method to calculate non-energy-related properties, i.e., those represented by operators that do not commute with the Hamiltonian, is a novel contribution of this work. We found that the unmodified Bounce method gives a good ground-state energy and very good one-electron properties. We attribute this to the favourable time-reversal symmetry of the Green's functions in its target density. Breaking this symmetry gives poorer results. Use of a short reptile in the Bounce method does not alter the quality of the results. This suggests that in future applications one can use a shorter reptile to cut down the computational time dramatically.
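As background, the acceptance rule shared by Metropolis-Hastings-type algorithms, including the reptation variants studied here, can be written as below; the notation (target density π, proposal density q, configurations s) is generic rather than the reptile-specific notation of the thesis.

```latex
A(s \to s') \;=\; \min\!\left(1,\;
  \frac{\pi(s')\, q(s \mid s')}{\pi(s)\, q(s' \mid s)}\right)
```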
Abstract:
Variations in different types of genomes have been found to be responsible for a large degree of physical diversity, such as appearance and susceptibility to disease. Identification of genomic variations is difficult and can be facilitated through computational analysis of DNA sequences. Newly available technologies are able to sequence billions of DNA base pairs relatively quickly. These sequences can be used to identify variations within the genome they come from, but must first be mapped to a reference sequence. In order to align these sequences to a reference sequence, we require mapping algorithms that make use of approximate string matching and string indexing methods. To date, few mapping algorithms have been tailored to handle the massive amounts of output generated by newly available sequencing technologies. In order to handle this large amount of data, we modified the popular mapping software BWA to run in parallel using OpenMPI. Parallel BWA matches the efficiency of multithreaded BWA functions while providing efficient parallelism for BWA functions that do not currently support multithreading. Parallel BWA shows significant wall time speedup in comparison to multithreaded BWA on high-performance computing clusters, and will thus facilitate the analysis of genome sequencing data.
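The sketch below illustrates one simple way read batches can be partitioned across MPI ranks using mpi4py. It is not the modified BWA code itself; align_batch() and the input file name are hypothetical placeholders, and each rank reads the whole file only to keep the example short.

```python
# Illustrative sketch of distributing read alignment across MPI ranks with mpi4py.
# This is not Parallel BWA; align_batch() and "reads.fastq" are placeholders.
from mpi4py import MPI

def align_batch(reads):
    # Placeholder for invoking an aligner on a batch of FASTQ records.
    return [f"mapped:{r[:20]}" for r in reads]

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

with open("reads.fastq") as fh:
    lines = fh.readlines()
reads = ["".join(lines[i:i + 4]) for i in range(0, len(lines), 4)]  # 4 lines per FASTQ record

local = reads[rank::size]                      # round-robin partition of the reads
local_results = align_batch(local)
results = comm.gather(local_results, root=0)   # collect per-rank alignments on rank 0
if rank == 0:
    print(sum(len(r) for r in results), "reads mapped")
```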
Abstract:
The accelerated development of communication, data capture, and information processing technologies over recent decades has opened the way to new means of social control. According to the author, Gary Marx, these are non-coercive in nature and allow private or public actors to obtain personal information about individuals without their consent, or even without their awareness. These means of social control rely on certain social values that are likely to modify individuals' behaviour, such as patriotism, the notion of the good citizen, or voluntarism. Like coercive means, they lead individuals to adopt certain behaviours and to disclose specific information. However, these means rest either on the consent of individuals, a consent that is often artificial and imposed, or on individuals' lack of awareness of the control process. The author thus illustrates how private and public organizations obtain privileged information about the population without the population being truly aware of it. Proponents of such means emphasize their importance for security and the public good. The discourse justifying their use maintains that they constitute necessary and acceptable limits on individual rights. The use of such methods is justified by the concept of the public interest while minimizing their impact on individual rights. These methods are thus more easily accepted and less likely to be challenged. However, the author stresses the importance of recognizing that a method of control always encroaches on individual rights. These means of control are progressively integrated into the culture and into patterns of behaviour. As a result, they are more easily justified, and some groups even promote them. This reality makes it even more difficult to regulate them in order to protect individual rights. The author concludes by highlighting the significant moral gap behind the use of these non-coercive methods of social control and argues that only the informed consent of individuals can justify their use. On this subject, he makes several proposals to regulate, and make more transparent, the use of these means of social control.
Abstract:
Trawling, though an efficient method of fishing, is known to be one of the most non-selective methods of fish capture. The bulk of the wild-caught penaeid shrimps landed in India are caught by trawling. In addition to shrimps, the trawler fleet also catches a considerable amount of non-shrimp resources. The term bycatch refers to the portion of the catch, other than the target species, taken while fishing, which is either retained or discarded. Bycatch discards are a serious problem, leading to the depletion of resources and negative impacts on biodiversity. In order to minimize this problem, trawling has to be made more selective by incorporating Bycatch Reduction Devices (BRDs). There are several advantages to using BRDs in shrimp trawling. BRDs reduce the negative impacts of shrimp trawling on the marine community. Fishers could benefit economically from higher catch value due to improved catch quality, shorter sorting time, lower fuel costs, and longer tow duration. Adoption of BRDs by fishers would also forestall criticism of trawling by conservation groups.
Abstract:
Speech signals are one of the most important means of communication among human beings. In this paper, a comparative study of two feature extraction techniques is carried out for recognizing speaker-independent spoken isolated words. The first is a hybrid approach combining Linear Predictive Coding (LPC) and Artificial Neural Networks (ANN), and the second uses a combination of Wavelet Packet Decomposition (WPD) and Artificial Neural Networks. Voice signals are sampled directly from the microphone and then processed using these two techniques to extract the features. Words from Malayalam, one of the four major Dravidian languages of southern India, are chosen for recognition. Training, testing, and pattern recognition are performed using Artificial Neural Networks, with the back-propagation method used to train the ANN. The proposed method is implemented for 50 speakers uttering 20 isolated words each. Both methods produce good recognition accuracy, but Wavelet Packet Decomposition is found to be more suitable for recognizing speech because of its multi-resolution characteristics and efficient time-frequency localization.
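As an illustration of the wavelet-packet approach, the sketch below computes per-band log-energy features with PyWavelets. The wavelet family, decomposition depth, and sampling rate are assumptions for this example and not necessarily the configuration used by the authors.

```python
# Minimal sketch of wavelet packet energy features for one isolated-word sample,
# using PyWavelets. Wavelet, level, and sampling rate are illustrative assumptions.
import numpy as np
import pywt

def wpd_energy_features(signal, wavelet="db4", level=3):
    """Return the log-energy of each terminal wavelet packet node."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="freq")           # 2**level frequency bands
    energies = np.array([np.sum(np.square(n.data)) for n in nodes])
    return np.log(energies + 1e-12)                     # log-compress for the ANN input

# Example: features for one second of 8 kHz audio (random data as a stand-in).
features = wpd_energy_features(np.random.randn(8000))
print(features.shape)   # (8,) for level 3
```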
Abstract:
Two lectures that introduce the idea of modelling in the large and contrast hard-system and soft-system modelling. The second lecture goes into detail on a number of specific methods for analysing a system (CATWOE and CSH) and for modelling a system (Systems Diagrams and Personas).
Abstract:
In this paper we consider the scattering of a plane acoustic or electromagnetic wave by a one-dimensional, periodic rough surface. We restrict the discussion to the case when the boundary is sound-soft in the acoustic case, or perfectly reflecting with TE polarization in the EM case, so that the total field vanishes on the boundary. We propose a uniquely solvable first-kind integral equation formulation of the problem, which amounts to a requirement that the normal derivative of the Green's representation formula for the total field vanish on a horizontal line below the scattering surface. We then discuss the numerical solution by Galerkin's method of this (ill-posed) integral equation. We point out that, with two particular choices of the trial and test spaces, we recover the so-called SC (spectral-coordinate) and SS (spectral-spectral) numerical schemes of DeSanto et al., Waves Random Media, 8, 315-414, 1998. We next propose a new Galerkin scheme, a modification of the SS method that we term the SS* method, which is an instance of the well-known dual least squares Galerkin method. We show that the SS* method is always well-defined and is optimally convergent as the size of the approximation space increases. Moreover, we make a connection with the classical least squares method, in which the coefficients in the Rayleigh expansion of the solution are determined by enforcing the boundary condition in a least squares sense, pointing out that the linear system to be solved in the SS* method is identical to that in the least squares method. Using this connection we show that (reflecting the ill-posed nature of the integral equation solved) the condition number of the linear system in the SS* and least squares methods approaches infinity as the approximation space increases in size. We also provide theoretical error bounds on the condition number and on the errors induced in the numerical solution as a result of ill-conditioning. Numerical results confirm the convergence of the SS* method and illustrate the ill-conditioning that arises.
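For reference, the Rayleigh expansion and the classical least squares formulation mentioned above can be written, in commonly used notation (not necessarily that of the paper), as:

```latex
u^{s}(x_1,x_2) \;=\; \sum_{n \in \mathbb{Z}} a_n\, e^{\,i\alpha_n x_1 + i\beta_n x_2},
\qquad \alpha_n = \alpha + \frac{2\pi n}{L}, \qquad
\beta_n = \sqrt{k^{2} - \alpha_n^{2}},
\\[6pt]
\min_{(a_n)_{|n| \le N}} \;\bigl\lVert\, u^{i} + u^{s} \,\bigr\rVert_{L^{2}(\Gamma)}^{2}
```

Here the minimization enforces the vanishing of the total field on the scattering surface Γ in a least squares sense, and the branch of the square root is taken with non-negative imaginary part.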
Abstract:
Ellipsometry and atomic force microscopy (AFM) were used to study the film thickness and surface roughness of both 'soft' and solid thin films. 'Soft' polymer thin films of polystyrene and poly(styrene-ethylene/butylene-styrene) block copolymer were prepared by spin-coating onto planar silicon wafers. Ellipsometric parameters were fitted by the Cauchy approach using a two-layer model with planar boundaries between the layers. The smooth surfaces of the prepared polymer films were confirmed by AFM. There is good agreement between AFM and ellipsometry in the 80-130 nm thickness range. Semiconductor surfaces (Si) obtained by anisotropic chemical etching were investigated as an example of a randomly rough surface. To determine roughness parameters by ellipsometry, the top rough layers were treated as thin films according to the Bruggeman effective medium approximation (BEMA). Surface roughness values measured by AFM and ellipsometry show the same tendency of increasing roughness with increased etching time, although the AFM results depend on the scan window size used. The combined use of both methods appears to offer the most comprehensive route to quantitative surface roughness characterisation of solid films. Copyright (c) 2007 John Wiley & Sons, Ltd.
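For context, the Cauchy dispersion relation and the Bruggeman effective medium condition used in such fits take the standard forms below; the symbols are the conventional ones rather than notation specific to this study.

```latex
n(\lambda) \;=\; A + \frac{B}{\lambda^{2}} + \frac{C}{\lambda^{4}},
\qquad
\sum_{i} f_i \,\frac{\varepsilon_i - \varepsilon_{\mathrm{eff}}}{\varepsilon_i + 2\,\varepsilon_{\mathrm{eff}}} \;=\; 0
```

Here A, B, C are fitted Cauchy coefficients, f_i are the volume fractions of the constituents of the rough top layer (e.g. material and void), and ε_eff is the effective dielectric function of that layer.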
Abstract:
The General Packet Radio Service (GPRS) has been developed for the mobile radio environment to allow the migration from the traditional circuit-switched connection to a more efficient packet-based communication link, particularly for data transfer. GPRS requires the addition of not only the GPRS software protocol stack, but also more baseband functionality for the mobile, as new coding schemes have been defined, along with uplink status flag detection, multislot operation, and dynamic coding scheme detection. This paper concentrates on evaluating the performance of the GPRS coding scheme detection methods in the presence of a multipath fading channel with a single co-channel interferer as a function of various soft-bit data widths. It has been found that compressing the soft-bit data widths at the output of the equalizer to save memory can influence the likelihood decision of the coding scheme detection function and hence contribute to the overall performance loss of the system. Coding scheme detection errors can therefore force the channel decoder either to select the incorrect decoding scheme or to have no clear decision on which coding scheme to use, resulting in the decoded radio block failing the block check sequence and contributing to the block error rate. For correct performance simulation, the performance of the full coding scheme detection must be taken into account.
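To illustrate the effect being studied, the sketch below shows a generic saturating quantizer that maps real-valued soft bits to a given word width; the bit widths, clipping level, and scaling are illustrative assumptions, not the equalizer output format of the system evaluated in the paper.

```python
# Illustrative saturating quantizer for soft-bit (reliability) values, showing how
# compressing the word width coarsens the information available to a
# likelihood-based coding scheme detector. Parameters are assumptions.
import numpy as np

def quantize_soft_bits(soft, width_bits=4, clip=8.0):
    """Map real-valued soft bits to signed integers of the given width."""
    levels = 2 ** (width_bits - 1) - 1            # e.g. 7 for 4-bit signed values
    scaled = np.clip(soft / clip, -1.0, 1.0) * levels
    return np.round(scaled).astype(int)

soft = np.array([-9.3, -2.1, 0.4, 3.8, 12.0])
print(quantize_soft_bits(soft, width_bits=4))     # coarse values after compression
print(quantize_soft_bits(soft, width_bits=8))     # wider words retain more reliability detail
```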
Abstract:
Dual Carrier Modulation (DCM) was chosen as the higher data rate modulation scheme for MB-OFDM (Multiband Orthogonal Frequency Division Multiplexing) in the UWB (Ultra-Wide Band) radio platform ECMA-368. ECMA-368 has been chosen as the physical implementation for high data rate Wireless USB (W-USB) and Bluetooth 3.0. In this paper, different demapping methods for the DCM demapper are presented, namely Soft Bit, Maximum Likelihood (ML) Soft Bit, and Log Likelihood Ratio (LLR). Frequency diversity and Channel State Information (CSI) are further techniques used to enhance these demapping methods. The system performance of these DCM demapping methods, simulated in realistic multi-path environments, is presented and compared.
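For reference, LLR demapping of the kind named above is typically computed per coded bit as below; y is the received sample, h the channel estimate supplying the CSI, σ² the noise variance, and S_i^b the set of constellation points whose i-th bit equals b. The notation is generic rather than taken from the ECMA-368 specification.

```latex
\mathrm{LLR}(b_i) \;=\;
\ln \frac{\displaystyle\sum_{s \in \mathcal{S}_i^{1}} \exp\!\left(-\,\frac{\lvert y - h\,s \rvert^{2}}{\sigma^{2}}\right)}
         {\displaystyle\sum_{s \in \mathcal{S}_i^{0}} \exp\!\left(-\,\frac{\lvert y - h\,s \rvert^{2}}{\sigma^{2}}\right)}
```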
Abstract:
Many scientific and engineering applications involve inverting large matrices or solving systems of linear algebraic equations. Solving these problems with proven direct-method algorithms can take a very long time, as their cost depends on the size of the matrix. The computational complexity of stochastic Monte Carlo methods depends only on the number of chains and the length of those chains. The computing power needed by inherently parallel Monte Carlo methods can be satisfied very efficiently by distributed computing technologies such as Grid computing. In this paper we show how a load-balanced Monte Carlo method for computing the inverse of a dense matrix can be constructed, show how the method can be implemented on the Grid, and demonstrate how efficiently the method scales on multiple processors. (C) 2007 Elsevier B.V. All rights reserved.
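As background, Monte Carlo matrix inversion schemes of this kind are commonly built on the Neumann series, written here in generic notation (not necessarily that of the paper):

```latex
A^{-1} \;=\; \sum_{k=0}^{\infty} (I - A)^{k},
\qquad \text{provided } \lVert I - A \rVert < 1
```

Random walks (Markov chains) sample terms of the truncated series, so the accuracy is governed by the number of chains and their length rather than by the cost of a direct factorization.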