943 results for Methods of Compression
Abstract:
Traditional methods for studying the magnetic shape memory (MSM) alloys Ni-Mn-Ga include subjecting the entire sample to a uniform magnetic field or actuating the entire sample mechanically. These methods have produced significant results in characterizing the MSM effect and the properties of Ni-Mn-Ga, and have pioneered the development of applications for this material. Twin boundaries and their configuration within a Ni-Mn-Ga sample are a key component of the magnetic shape memory effect. Developing applications requires an understanding of twin boundary characteristics and, more importantly, the ability to control them predictably. Twins play such a critical role that the twinning stress of a Ni-Mn-Ga crystal is the defining characteristic of its quality, and significant research has been conducted to minimize this property. This dissertation reports a decrease in the twinning stress, predictable control of the twin configuration, and a characterization of twin boundary dynamics. A reduction of the twinning stress is demonstrated by the discovery of Type II twins within Ni-Mn-Ga, which have as little as 10% of the twinning stress of traditional Type I twins. Furthermore, new methods of actuating a Ni-Mn-Ga element using localized unidirectional or bidirectional magnetic fields were developed that can predictably control the twin configuration in a localized area of the element. This method of controlling the local twin configuration was used in the characterization of twin boundary dynamics. Using a localized magnetic pulse, the velocity and acceleration of a single twin boundary were measured to be 82.5 m/s and 2.9 × 10⁷ m/s², respectively, and the time needed for the twin boundary to nucleate and begin moving was less than 2.8 μs. Using a bidirectional magnetic field from a diametrically magnetized cylindrical magnet, a highly reproducible and controllable local twin configuration was created in a Ni-Mn-Ga element; this is the fundamental pumping mechanism in the MSM micropump that was co-invented and extensively characterized by the author.
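As a quick consistency check on the reported kinematics: at the measured acceleration, the time needed to reach the measured velocity works out to the same sub-2.8 μs window quoted for nucleation and onset of motion. A minimal sketch using only the numbers from the abstract:

```python
# Consistency check on the reported twin-boundary kinematics.
v = 82.5   # measured twin-boundary velocity, m/s
a = 2.9e7  # measured twin-boundary acceleration, m/s^2

t = v / a  # time to reach v at constant acceleration a
print(f"time to reach {v} m/s: {t * 1e6:.1f} us")  # ~2.8 us, matching the reported bound
```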
Abstract:
Cubiu (Solanum sessiliflorum Dunal) is a fruit native to the Amazon Basin. Its importance comes from its high pectin content. Currently, processing technologies are needed to replace the traditional system (small crops and small-scale processing) with a larger-scale system, thereby increasing the use of biodiversity and promoting the implementation of Local Productive Arrangements of agribusiness in the Amazon. This research aims to evaluate methods of peeling cubiu. Ripe fruits were divided into lots (150 each) and subjected to the following treatments: immersion in boiling 2.5% NaOH solution for 5 minutes, exposure to water vapor, and immersion in water at 96 ºC for 5, 10, 15 and 20 minutes. The peel was loosened during heat treatment and immediately removed under running tap water. In the control treatment, the fruits were peeled manually (unheated) with a stainless steel knife. The treatments were evaluated for completeness and ease of peeling, tissue integrity, texture, and peroxidase activity. Immersion in boiling 2.5% NaOH solution (5 minutes) stood out as the best treatment: it inhibited enzymatic browning, intensified the natural yellow color of the cubiu fruit, and peeled the whole fruit easily, completely, and more rapidly without damaging its tissues. This treatment was chosen as the most advantageous because it promotes simultaneous peeling and bleaching. It is therefore recommended for industrial processing of cubiu.
Abstract:
Decaffeinated coffee accounts for 10 percent of coffee sales in the world; it is preferred by consumers who wish to avoid, or are sensitive to, the effects of caffeine. This article presents an analytical comparison of capillary electrophoresis (CE) and high performance liquid chromatography (HPLC) methods for residual caffeine quantification in decaffeinated coffee in terms of validation parameters, costs, analysis time, composition and treatment of the residues generated, and caffeine quantification in 20 commercial samples. Both methods showed suitable validation parameters, and caffeine content did not differ statistically between the two methods of analysis. The main advantage of the HPLC method was a 42-fold lower detection limit. Nevertheless, the CE detection limit was still 115-fold lower than the limit allowed by Brazilian law, and the CE analyses were 30% faster, with 76.5-fold lower reagent costs and a 33-fold lower volume of residues generated. The CE method therefore proved to be a valuable analytical tool for this type of analysis.
Abstract:
Methods of measuring the specific heats of small samples were studied. Three automated methods were explored, two of which have shown promising results. The adiabatic continuous-heating method has provided smooth, well-behaved data, but further work is presently underway to improve on the results obtained so far. The decay method has been successfully implemented, demonstrating reasonable agreement with accepted data for a copper test sample.
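The decay method's analysis step follows the standard relaxation-calorimetry relation C = K·τ, where K is the thermal conductance of the link to the bath and τ is the fitted temperature-decay time constant. A minimal sketch assuming an exponential decay; the conductance and sample values are illustrative, not taken from the work above:

```python
import numpy as np
from scipy.optimize import curve_fit

# Relaxation ("decay") method: after a heat pulse, the sample temperature
# relaxes to the bath as dT(t) = dT0 * exp(-t / tau), and C = K * tau.
def decay(t, dT0, tau):
    return dT0 * np.exp(-t / tau)

K = 2.0e-4                   # thermal conductance of the weak link, W/K (illustrative)
t = np.linspace(0, 50, 500)  # s
temp = decay(t, 0.5, 12.0) + np.random.normal(0, 0.002, t.size)  # synthetic data

(dT0, tau), _ = curve_fit(decay, t, temp, p0=(0.4, 10.0))
C = K * tau                  # heat capacity of the sample, J/K
print(f"tau = {tau:.2f} s -> C = {C:.3e} J/K")
```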
Abstract:
The thesis explores the area of still image compression. Image compression techniques can be broadly classified into lossless and lossy compression. The most common lossy compression techniques are based on transform coding, vector quantization and fractals. Transform coding is the simplest of these and generally employs reversible transforms like the DCT, DWT, etc. The Mapped Real Transform (MRT) is an evolving integer transform based on real additions alone. The present research work aims at developing new image compression techniques based on the MRT. Most transform coding techniques employ fixed block size image segmentation, usually 8×8. Hence, a fixed block size transform coder is implemented using the MRT, and its merits and demerits are analyzed for both 8×8 and 4×4 blocks. The N² unique MRT coefficients for each block are computed using templates. Considering the merits and demerits of fixed block size transform coding, a hybrid form of these techniques is implemented to improve compression performance; the hybrid coder is found to perform better than the fixed block size coders. Thus, if the block size is made adaptive, the performance can be improved further. In adaptive block size coding, the block size may vary from the size of the image down to 2×2, so computing the MRT using templates is impractical due to memory requirements. An adaptive transform coder based on the Unique MRT (UMRT), a compact form of the MRT, is therefore implemented to get better performance in terms of PSNR and HVS measures. The suitability of the MRT for vector quantization of images is then examined, and a UMRT-based Classified Vector Quantization (CVQ) is implemented, in which the edges in the images are identified and classified using a UMRT-based criterion. Based on the above experiments, a new technique named “MRT based Adaptive Transform Coder with Classified Vector Quantization (MATC-CVQ)” is developed. Its performance is evaluated and compared against existing techniques: a comparison with standard JPEG and Shapiro’s well-known Embedded Zerotree Wavelet (EZW) coder shows that the proposed technique gives better performance for the majority of images.
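Since the MRT templates themselves are not reproduced in this abstract, the fixed block size pipeline it describes can still be sketched with the DCT standing in for the block transform; the 8×8 segmentation, coefficient selection, and PSNR evaluation are the generic parts. A minimal, illustrative sketch:

```python
import numpy as np
from scipy.fft import dctn, idctn

# Fixed 8x8 block transform coder in the spirit described above. The DCT
# stands in for the MRT, whose templates are not given here; the block
# segmentation / coefficient-thresholding / PSNR pipeline is the same.
def compress_blocks(img, block=8, keep=10):
    out = np.zeros_like(img, dtype=float)
    for i in range(0, img.shape[0], block):
        for j in range(0, img.shape[1], block):
            b = img[i:i + block, j:j + block].astype(float)
            c = dctn(b, norm="ortho")
            # keep only the `keep` largest-magnitude coefficients per block
            thresh = np.sort(np.abs(c).ravel())[-keep]
            c[np.abs(c) < thresh] = 0.0
            out[i:i + block, j:j + block] = idctn(c, norm="ortho")
    return out

def psnr(orig, rec):
    mse = np.mean((orig.astype(float) - rec) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

img = np.random.randint(0, 256, (64, 64))  # stand-in image
rec = compress_blocks(img, block=8, keep=10)
print(f"PSNR: {psnr(img, rec):.1f} dB")
```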
Abstract:
The traditional method used by farmers to control Imperata brasiliensis grasslands in the Peruvian Amazon is burning the grass. The objective of this study was to compare different methods of short-term control: biological, mechanical, chemical and traditional. Herbicide spraying and manual weeding proved very effective in reducing above- and below-ground biomass growth in the first 45 days after slashing the grass, with effects persisting in the longer term, but both are expensive methods. Shading seems to be less effective in the short term, whereas it influences Imperata growth in the longer term. After one year, shading, glyphosate application and weeding had significantly reduced aboveground biomass by 94, 67 and 53%, and belowground biomass by 76, 65 and 58%, respectively, compared to the control. We also found a significant decrease of Imperata rhizomes in the soil over time under shading. Burning proved to have no significant effect on Imperata growth. The use of shade trees in a form of agroforestry system could be a suitable method for small farmers to control Imperata grasslands.
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way include correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made in which the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving deeper insight into the similarities and differences between these methods.
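One concrete parametrization of this kind is the Box-Cox power family, which connects an analysis of the raw data (power 1) to a logratio-style analysis (power → 0). A minimal sketch of the "movie" idea, recomputing a two-dimensional map frame by frame as the power parameter varies; the compositional data and step count are illustrative:

```python
import numpy as np

def power_transform(X, alpha):
    # Box-Cox power family: (x^alpha - 1)/alpha -> log(x) as alpha -> 0
    return np.log(X) if alpha == 0 else (X ** alpha - 1.0) / alpha

def pca_map(Y, dims=2):
    Yc = Y - Y.mean(axis=0)                # column-centre
    U, s, Vt = np.linalg.svd(Yc, full_matrices=False)
    return U[:, :dims] * s[:dims]          # row coordinates of the map

rng = np.random.default_rng(0)
X = rng.dirichlet(np.ones(5), size=40)     # compositional data: 40 rows, 5 parts

# "Movie": vary the linking parameter in small steps and recompute each frame.
for alpha in np.linspace(1.0, 0.0, 11):
    coords = pca_map(power_transform(X, alpha))
    print(f"alpha={alpha:.1f}  first row at {coords[0].round(3)}")
```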
Abstract:
Two common methods of accounting for electric-field-induced perturbations to molecular vibration are analyzed and compared. The first method is based on a perturbation-theoretic treatment and the second on a finite-field treatment. The relationship between the two, which is not immediately apparent, is established by developing an algebraic formalism for the latter. Some of the higher-order terms in this development are documented here for the first time. As well as considering vibrational dipole polarizabilities and hyperpolarizabilities, we also make mention of the vibrational Stark effect.
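The finite-field treatment amounts to numerical differentiation of the field-dependent energy, e.g. μ = −dE/dF and α = −d²E/dF² from central differences. A minimal one-dimensional sketch, with a toy quadratic energy standing in for an electronic-structure calculation:

```python
# Finite-field extraction of dipole and polarizability from E(F):
#   E(F) = E0 - mu*F - (1/2)*alpha*F^2 - ...   (one field component, for clarity)
def energy(F, E0=-1.0, mu=0.3, alpha=1.2):
    return E0 - mu * F - 0.5 * alpha * F**2  # toy stand-in for a quantum-chemistry run

h = 1e-3  # field step (a.u.); small enough to suppress higher-order terms
mu_ff = -(energy(h) - energy(-h)) / (2 * h)                    # -dE/dF
alpha_ff = -(energy(h) - 2 * energy(0) + energy(-h)) / h**2    # -d^2E/dF^2

print(f"mu = {mu_ff:.4f}, alpha = {alpha_ff:.4f}")  # recovers 0.3 and 1.2
```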
Abstract:
In a sequential clinical trial, accrual of data on patients often continues after the stopping criterion for the study has been met. This is termed “overrunning.” Overrunning occurs mainly when the primary response from each patient is measured after some extended observation period. The objective of this article is to compare two methods of allowing for overrunning. In particular, simulation studies are reported that assess the two procedures in terms of how well they maintain the intended type I error rate. The effect on power resulting from the incorporation of “overrunning data” using the two procedures is evaluated.
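The abstract does not specify the two procedures, but the shape of such a simulation study can be sketched: generate trials under the null hypothesis with a one-interim-look design, let overrun responses arrive after an early stop, and record how often a (here, naive pooled) final analysis rejects. All design constants below are illustrative:

```python
import numpy as np

# Monte Carlo sketch of type I error under overrunning: a trial may stop at an
# interim look, but extra (overrun) responses arrive and are naively pooled.
rng = np.random.default_rng(1)
n_interim, n_final, n_overrun = 50, 100, 20
z_interim, z_final = 2.54, 1.96  # illustrative two-look boundaries

n_sim, rejections = 20000, 0
for _ in range(n_sim):
    x = rng.normal(0.0, 1.0, n_final + n_overrun)  # null: no treatment effect
    z1 = x[:n_interim].sum() / np.sqrt(n_interim)
    if abs(z1) >= z_interim:
        # stopped early; the naive analysis simply pools the overrun data
        n = n_interim + n_overrun
        z = x[:n].sum() / np.sqrt(n)
    else:
        z = x[:n_final].sum() / np.sqrt(n_final)
    rejections += abs(z) >= z_final

print(f"estimated overall type I error: {rejections / n_sim:.4f}")  # compare with 0.05
```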
Abstract:
Background: Meta-analyses based on individual patient data (IPD) are regarded as the gold standard for systematic reviews. However, the methods used for analysing and presenting results from IPD meta-analyses have received little discussion. Methods: We review 44 IPD meta-analyses published during the years 1999–2001. We summarize whether they obtained all the data they sought, what types of approaches were used in the analysis, including assumptions of common or random effects, and how they examined the effects of covariates. Results: Twenty-four of the 44 analyses focused on time-to-event outcomes, and most analyses (28) estimated treatment effects within each trial and then combined the results assuming a common treatment effect across trials. Three analyses failed to stratify by trial, analysing the data as if they came from a single mega-trial. Only nine analyses used random effects methods. Covariate-treatment interactions were generally investigated by subgrouping patients. Seven of the meta-analyses included data from less than 80% of the randomized patients sought but did not address the resulting potential biases. Conclusions: Although IPD meta-analyses have many advantages in assessing the effects of health care, several aspects could be further developed to make fuller use of these time-consuming projects. In particular, IPD could be used to investigate more fully the influence of covariates on heterogeneity of treatment effects, both within and between trials. The impact of heterogeneity, or the use of random effects, is seldom discussed. There is thus considerable scope for enhancing the methods of analysis and presentation of IPD meta-analysis.
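The most common approach found in the review (estimate within each trial, then combine assuming a common effect) is the standard two-stage, inverse-variance fixed-effect model. A minimal sketch of the combination step, with made-up per-trial estimates:

```python
import numpy as np

# Two-stage IPD meta-analysis, stage 2: combine per-trial treatment-effect
# estimates with inverse-variance weights under a common-effect assumption.
effects = np.array([-0.21, -0.35, -0.10, -0.28])  # per-trial estimates (illustrative)
ses = np.array([0.10, 0.15, 0.12, 0.09])          # their standard errors

w = 1.0 / ses**2                        # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))

print(f"pooled effect = {pooled:.3f} (SE {pooled_se:.3f})")
print(f"95% CI: [{pooled - 1.96 * pooled_se:.3f}, {pooled + 1.96 * pooled_se:.3f}]")
```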
Abstract:
The commercial process in construction projects is an expensive and highly variable overhead. Collaborative working practices carry many benefits, which are widely disseminated, but little information is available about their costs. Transaction Cost Economics is a theoretical framework that seeks explanations for why firms exist and how their boundaries are defined through the "make-or-buy" decision; it is not, however, a framework that explains the relative costs of procuring construction projects in different ways. The idea that different methods of procurement have characteristically different costs is tested by way of a survey. The relevance of transaction cost economics to the study of commercial costs in procurement is doubtful. The survey shows that collaborative working methods cost neither more nor less than traditional methods, but the benefits of collaboration mean that there is a great deal of enthusiasm for collaborative rather than competitive approaches.
Abstract:
The assessment of cellular effects by the aqueous phase of human feces (fecal water, FW) is a useful biomarker approach to study cancer risks and the protective activities of food. In order to refine and develop the biomarker, different protocols for preparing FW were compared. Fecal waters were prepared by 3 methods: (A) direct centrifugation; (B) extraction of feces in PBS before centrifugation; and (C) centrifugation of lyophilized and reconstituted feces. Genotoxicity was determined in colon cells using the Comet assay, and selected samples were investigated for additional parameters related to carcinogenesis. Two of 7 FWs obtained by methods A and B were similarly genotoxic. Method B, however, yielded higher volumes of FW, allowing sterile filtration for long-term culture experiments. Four of 7 samples were non-genotoxic when prepared according to all 3 methods. FW from lyophilized feces and from fresh samples were equally genotoxic. FWs modulated cytotoxicity, paracellular permeability, and invasion, independent of their genotoxicity. All 3 methods of FW preparation can thus be used to assess genotoxicity. The higher volumes of FW obtained by preparation method B greatly enhance the prospects of measuring different types of biological parameters and using these to disclose activities related to cancer development.
Abstract:
Physiological evidence using infrared video microscopy during the uncaging of glutamate has proven the existence of excitable calcium ion channels in spine heads, highlighting the need for reliable models of spines. In this study we compare the three main methods of simulating excitable spines: Baer & Rinzel's continuum (B&R) model, Coombes' Spike-Diffuse-Spike (SDS) model, and paired cable and ion channel equations (Cable model). Tests are done to determine how well the models approximate each other in terms of the speeds and heights of travelling waves. Significant quantitative differences are found between the models: travelling waves in the SDS model in particular are found to travel at much lower speeds and sometimes much higher voltages than in the Cable or B&R models. Qualitative differences are also found between the B&R and SDS models over realistic parameter ranges. The causes of these differences are investigated and potential solutions proposed.
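All three models are built around the cable equation for the dendritic shaft, differing in how spine-head currents couple to it. A minimal sketch of that shared backbone, an explicit finite-difference step for the dimensionless passive cable with an injected current; the grid, parameters and boundary handling are illustrative only:

```python
import numpy as np

# Explicit finite-difference integration of the passive cable equation
#   dV/dt = d^2V/dx^2 - V + I(x, t)   (dimensionless units),
# the shared backbone onto which B&R, SDS and Cable models graft spine currents.
nx, nt = 200, 4000
dx, dt = 0.1, 0.004       # dt < dx^2 / 2 for stability of the explicit scheme
V = np.zeros(nx)
I = np.zeros(nx)
I[5] = 5.0                # sustained stimulus near the left end

for step in range(nt):
    Vxx = (np.roll(V, -1) - 2 * V + np.roll(V, 1)) / dx**2
    Vxx[0] = Vxx[-1] = 0.0  # crude sealed-end boundaries
    V = V + dt * (Vxx - V + I)

print(f"peak voltage {V.max():.3f} at x index {V.argmax()}")
```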
Abstract:
We discuss several methods of calculating the DIS structure function F₂(x, Q²) based on BFKL-type small-x resummations. Taking into account new HERA data ranging down to small x and low Q², the pure leading order BFKL-based approach is excluded. Other methods based on high energy factorization are closer to conventional renormalization group equations. Despite several difficulties and ambiguities in combining the renormalization group equations with small-x resummed terms, we find that a fit to the current data is hardly feasible, since the data in the low Q² region are not as steep as the BFKL formalism predicts. Thus we conclude that deviations from the (successful) renormalization group approach towards summing up logarithms in 1/x are disfavoured by experiment.
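For orientation, the steep small-x rise being tested here is the textbook leading-order BFKL ("hard pomeron") prediction; the following is standard background, not a formula quoted from the paper:

```latex
% LO BFKL ("hard pomeron") small-x behaviour of the structure function:
F_2(x, Q^2) \;\propto\; x^{-\lambda},
\qquad
\lambda_{\text{BFKL}} = \frac{4 N_c \ln 2}{\pi}\,\alpha_s \approx 0.5
\quad \text{for } N_c = 3,\ \alpha_s \simeq 0.2 .
```

The conclusion above is that the measured rise at low Q² is shallower than such a power-like growth.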
Abstract:
Approximate Bayesian computation (ABC) methods make use of comparisons between simulated and observed summary statistics to overcome the problem of computationally intractable likelihood functions. As the practical implementation of ABC requires computations based on vectors of summary statistics rather than full data sets, a central question is how to derive low-dimensional summary statistics from the observed data with minimal loss of information. In this article we provide a comprehensive review and comparison of the performance of the principal methods of dimension reduction proposed in the ABC literature. The methods are split into three non-mutually exclusive classes consisting of best subset selection methods, projection techniques and regularization. In addition, we introduce two new methods of dimension reduction: the first is a best subset selection method based on the Akaike and Bayesian information criteria, and the second uses ridge regression as a regularization procedure. We illustrate the performance of these dimension reduction techniques through the analysis of three challenging models and data sets.
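To make the second new method concrete, ridge-regularized dimension reduction can be sketched in the usual projection style: regress the parameter on high-dimensional raw summaries over pilot simulations, then use the fitted linear combination as a one-dimensional summary inside a rejection sampler. The model, prior, and raw summaries below are illustrative placeholders, not those of the article:

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(theta, n=50):
    # toy model: n Gaussian draws with unknown mean theta
    return rng.normal(theta, 1.0, n)

def raw_summaries(x):
    # deliberately redundant, higher-dimensional raw summaries
    return np.array([x.mean(), np.median(x), x.std(), x.min(), x.max(), (x**2).mean()])

# Pilot simulations from a uniform prior
thetas = rng.uniform(-3, 3, 2000)
S = np.array([raw_summaries(simulate(t)) for t in thetas])

# Ridge regression of theta on the raw summaries (closed form), on centred data:
# beta = (S'S + lam*I)^(-1) S' theta
S_mean = S.mean(axis=0)
Sc = S - S_mean
lam = 1.0
beta = np.linalg.solve(Sc.T @ Sc + lam * np.eye(S.shape[1]), Sc.T @ (thetas - thetas.mean()))

def project(s):
    # learned one-dimensional summary statistic
    return (s - S_mean) @ beta

# Rejection ABC on the projected summary, with pseudo-observed data at theta = 1.5
s_obs = project(raw_summaries(simulate(1.5)))
dist = np.abs(np.array([project(s) for s in S]) - s_obs)
accepted = thetas[dist <= np.quantile(dist, 0.01)]
print(f"ABC posterior mean ~ {accepted.mean():.2f} (true theta = 1.5)")
```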