23 results for Applied cryptography
at the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
Vegetable oils can be extracted using ethanol as a solvent. The main goal of this work was to evaluate the performance of ethanol in the extraction of rice bran oil. The influence of the process variables solvent hydration and temperature was evaluated using response surface methodology, aiming to maximise the transfer of soluble substances and gamma-oryzanol while minimising free fatty acid extraction and the liquid content in the underflow solid. Oil solubility in ethanol was highly affected by the water content, and free fatty acid extraction improved with increasing moisture content in the solvent. Gamma-oryzanol extraction was affected by temperature when little water was added to the ethanol; conversely, the influence of temperature was minimised at high water levels in the ethanol.
Abstract:
Our aim was to investigate the immediate effects of bilateral, 830 nm, low-level laser therapy (LLLT) on high-intensity exercise and biochemical markers of skeletal muscle recovery, in a randomised, double-blind, placebo-controlled, crossover trial set in a sports physiotherapy clinic. Twenty male athletes (nine professional volleyball players and eleven adolescent soccer players) participated. Active LLLT (830 nm wavelength, 100 mW, spot size 0.0028 cm², 3-4 J per point) or an identical placebo LLLT was delivered to five points in the rectus femoris muscle (bilaterally). The main outcome measures were the work performed in the Wingate test (30 s of maximum cycling with a load of 7.5% of body weight) and the blood lactate (BL) and creatine kinase (CK) levels measured before and after exercise. There was no significant difference in the work performed during the Wingate test (P > 0.05) between subjects given active LLLT and those given placebo LLLT. For volleyball athletes, the change in CK levels from before to after the exercise test was significantly lower (P = 0.0133) for those given active LLLT (2.52 ± 7.04 U l⁻¹) than for those given placebo LLLT (28.49 ± 22.62 U l⁻¹). For the soccer athletes, the change in blood lactate levels from before exercise to 15 min after exercise was significantly lower (P < 0.01) in the group subjected to active LLLT (8.55 ± 2.14 mmol l⁻¹) than in the group subjected to placebo LLLT (10.52 ± 1.82 mmol l⁻¹). LLLT irradiation before the Wingate test seemed to inhibit the expected post-exercise increase in CK level and to accelerate post-exercise lactate removal without affecting test performance. These findings suggest that LLLT may be of benefit in accelerating post-exercise recovery.
Abstract:
A large amount of biological data has been produced in recent years. Important knowledge can be extracted from these data by the use of data analysis techniques. Clustering plays an important role in data analysis by organizing similar objects from a dataset into meaningful groups. Several clustering algorithms have been proposed in the literature. However, each algorithm has its own bias, being more adequate for particular datasets. This paper presents a mathematical formulation to support the creation of consistent clusters for biological data. Moreover, it presents a clustering algorithm that solves this formulation using GRASP (Greedy Randomized Adaptive Search Procedure). We compared the proposed algorithm with three other well-known algorithms. The proposed algorithm presented the best clustering results, confirmed statistically. (C) 2009 Elsevier Ltd. All rights reserved.
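The GRASP metaheuristic mentioned in this abstract alternates a greedy-randomized construction phase with local search. A minimal one-dimensional sketch of that scheme, applied to picking k cluster centers (the point set, cost function and parameters here are illustrative, not the paper's formulation):

```python
import random

def grasp_kcenters(points, k, iters=20, alpha=0.3, seed=0):
    """Minimal GRASP sketch: greedy-randomized construction of k centers
    plus a swap-based local search, minimizing the sum of distances from
    each point to its nearest center."""
    rng = random.Random(seed)

    def cost(centers):
        return sum(min(abs(p - c) for c in centers) for p in points)

    best = None
    for _ in range(iters):
        # Construction: repeatedly pick a center from a restricted
        # candidate list (RCL) of the best alpha-fraction of candidates.
        centers = []
        while len(centers) < k:
            cands = [p for p in points if p not in centers]
            cands.sort(key=lambda p: cost(centers + [p]))
            rcl = cands[: max(1, int(alpha * len(cands)))]
            centers.append(rng.choice(rcl))
        # Local search: try swapping each center for a non-center point.
        improved = True
        while improved:
            improved = False
            for i in range(k):
                for p in points:
                    if p in centers:
                        continue
                    trial = centers[:i] + [p] + centers[i + 1:]
                    if cost(trial) < cost(centers):
                        centers, improved = trial, True
        if best is None or cost(centers) < cost(best):
            best = centers
    return sorted(best)
```

On a point set with three well-separated groups, the randomized restarts plus local search reliably place one center per group.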
Abstract:
We describe in this article the application of a high-density gas aggregation nanoparticle gun to the production and characterization of high-anisotropy SmCo nanoparticles. We give a detailed description of the simple but efficient experimental apparatus, with a focus on the microscopic processes of the gas aggregation technique. Using high values of gas flux (~45 sccm), we are able to operate in regimes of high collimation of material. In this regime, as we explain in terms of a phenomenological model, the power applied to the sputtering target becomes the main variable for changing the size of the clusters. Also presented are the morphological, structural, and magnetic characterizations of SmCo nanoparticles produced using 10 and 50 W of sputtering power. These values resulted in mean sizes of ~12 and ~20 nm. Significant differences are seen in the structural and magnetic properties of the samples, with the 50 W sample showing a largely enhanced crystalline structure and magnetic anisotropy.
Abstract:
An evaluation was made of the influence of calcination temperature on the structure, morphology and electromagnetic properties of Ni-Zn ferrite powders. To this end, Ni0.5Zn0.5Fe2O4 ferrite powders were prepared by combustion reaction and calcined at 800, 1000 and 1200 °C for 2 h. The resulting powders were characterized by XRD, SEM and reflectivity measurements in the 8-12 GHz frequency band. The results demonstrated that raising the calcination temperature increased the particle sizes of the powders of all the systems in question, improving the reflectivity of the materials. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
In this work we use magnetic resonant x-ray diffraction to study the magnetic properties of a 1.5 μm EuTe film and an EuTe/PbTe superlattice (SL). The samples were grown by molecular beam epitaxy on (111)-oriented BaF₂ substrates. The measurements were made at the Eu L₂ absorption edge, taking advantage of the resonant enhancement of more than two orders of magnitude in the magnetically diffracted intensity. At resonance, high counting rates above 11000 cps were obtained for the 1.5 μm EuTe film, allowing us to confirm the type II antiferromagnetic order of EuTe. An equal population of the three possible in-plane magnetic domains was found. The EuTe/PbTe SL magnetic peak showed a satellite structure, indicating the presence of magnetic correlations among the 5 ML (monolayer) EuTe layers across the 15 ML PbTe non-magnetic spacers. The temperature dependence of the integrated intensities of the film and the SL yielded different Néel temperatures T_N. The lower T_N for the SL is explained by the greater influence of the surface atoms, with partial bonds lost.
Abstract:
The protective shielding design of a mammography facility requires knowledge of the radiation scattered by the patient and image receptor components. The shape and intensity of secondary x-ray beams depend on the kVp applied to the x-ray tube, the target/filter combination, the primary x-ray field size, and the scattering angle. Currently, shielding calculations for mammography facilities are performed based on scatter fraction data for a Mo/Mo target/filter, even though modern mammography equipment is designed with different anode/filter combinations. In this work we present scatter fraction data evaluated from the x-ray spectra produced by Mo/Mo, Mo/Rh and W/Rh target/filter combinations, for 25, 30 and 35 kV tube voltages and scattering angles between 30 and 165 degrees. Three mammography phantoms were irradiated and the scattered radiation was measured with a CdZnTe detector. The primary x-ray spectra were computed with a semiempirical model based on the air kerma and HVL measured with an ionization chamber. The results show that the scatter fraction values are higher for W/Rh than for Mo/Mo and Mo/Rh, although the primary and scattered air kerma are lower for W/Rh than for the Mo/Mo and Mo/Rh target/filter combinations. The scatter fractions computed in this work were applied in a shielding design calculation in order to evaluate shielding requirements for each of these target/filter combinations. In addition, shielding requirements were evaluated by converting the scattered air kerma from mGy/week to mSv/week, adopting first a conversion coefficient from air kerma to effective dose of 1 Sv/Gy and then a mean conversion coefficient specific to the x-ray beam considered. Results show that the thickest barrier should be provided for the Mo/Mo target/filter combination. They also point out that using a conversion coefficient from air kerma to effective dose of 1 Sv/Gy is conservatively high in the mammography energy range and overestimates the barrier thickness.
(c) 2008 American Association of Physicists in Medicine.
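The shielding calculation described above reduces to converting the weekly scattered air kerma to an effective dose and sizing the barrier from the required transmission. A toy sketch of that arithmetic, in which the design limit and the tenth-value layer (TVL) are purely illustrative numbers, not values from the paper:

```python
import math

def barrier_thickness_mm(dose_msv_week, limit_msv_week=0.02, tvl_mm=0.06):
    """Required barrier thickness from the transmission B = limit/dose,
    assuming simple exponential attenuation with a single tenth-value
    layer: t = TVL * log10(1/B). All numbers are illustrative."""
    B = min(1.0, limit_msv_week / dose_msv_week)
    return tvl_mm * math.log10(1.0 / B)
```

For example, reducing 2.0 mSv/week behind the barrier to a 0.02 mSv/week limit needs a transmission of 0.01, i.e. two tenth-value layers of material.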
Abstract:
Phototherapy improves cellular activation, which is an important factor in the treatment of cellulite. The objective of this research was to develop and evaluate the effects of a new (noninvasive and nonpharmacological) clinical procedure to improve body aesthetics: infrared LED (850 nm) plus treadmill training. Twenty women (25-55 years old) participated in this study. They were separated into two groups: the control group, which carried out only the treadmill training (n = 10), and the LED group, which received phototherapy during the treadmill training (n = 10). The training was performed for 45 minutes twice a week over 3 months at intensities between 85% and 90% of maximal heart rate (HRmax). The irradiation parameters were 39 mW/cm² and a fluence of 106 J/cm². The treatment was evaluated by interpreting body composition parameters, photographs and thermography. The primary result was a treatment for cellulite, with a reduction of saddlebag and thigh circumference. At the same time, the treadmill training prevented an increase in body fat as well as the loss of lean mass. Moreover, thermal images of the temperature modification of the thighs are presented. These positive effects suggest that infrared LED together with treadmill training can further improve body aesthetics.
Abstract:
A novel cryptography method based on the Lorenz attractor chaotic system is presented. The proposed algorithm is secure and fast, making it practical for general use. We introduce the chaotic operation mode, which provides an interaction among the password, the message and a chaotic system. It ensures that the algorithm yields a secure codification even if the nature of the chaotic system is known. The algorithm has been implemented in two versions: one sequential and slow, the other parallel and fast. Our algorithm ensures the integrity of the ciphertext (we know if it has been altered, which is not guaranteed by traditional algorithms) and consequently its authenticity. Numerical experiments are presented and discussed, showing the behavior of the method in terms of security and performance. The fast version of the algorithm has performance comparable to AES, a popular cryptographic algorithm in widespread commercial use, but is more secure, which makes it immediately suitable for general-purpose cryptography applications. An internet page has been set up, which enables readers to test the algorithm and also to try to break the cipher.
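The general idea of a chaos-based stream cipher can be illustrated by deriving a keystream from a numerically integrated Lorenz system. This sketch is not the authors' chaotic operation mode: the key derivation, the Euler integration step, and the byte-extraction rule are all assumptions made for illustration.

```python
import hashlib

def _lorenz_stream(key: bytes, n: int) -> bytes:
    """Derive an initial condition from the key, iterate the Lorenz
    system with a fixed-step Euler scheme, and emit one keystream byte
    per step from the x coordinate of the trajectory."""
    h = hashlib.sha256(key).digest()
    # Key-dependent initial conditions (illustrative choice).
    x = 1.0 + h[0] / 256.0
    y = 1.0 + h[1] / 256.0
    z = 20.0 + h[2] / 256.0
    sigma, rho, beta, dt = 10.0, 28.0, 8.0 / 3.0, 0.01

    def step(x, y, z):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        return x + dt * dx, y + dt * dy, z + dt * dz

    for _ in range(200):          # discard the initial transient
        x, y, z = step(x, y, z)
    out = []
    for _ in range(n):
        x, y, z = step(x, y, z)
        out.append(int(abs(x) * 1e6) % 256)
    return bytes(out)

def chaos_xor(key: bytes, data: bytes) -> bytes:
    """XOR the data with the chaotic keystream; applying it twice
    with the same key recovers the plaintext."""
    ks = _lorenz_stream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))
```

Note that a plain XOR stream like this provides confidentiality only; the integrity and authenticity guarantees claimed in the abstract would require an additional mechanism.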
Abstract:
This article discusses methods to identify plants by analyzing leaf complexity based on estimates of their fractal dimension. Leaves were analyzed according to the complexity of their internal and external shapes. A computational program was developed to process, analyze and extract the features of leaf images, thereby allowing for automatic plant identification. Results are presented from two experiments: the first identifies plant species from the Brazilian Atlantic forest and Brazilian Cerrado scrublands, using fifty leaf samples from ten different species, and the second identifies four species from the genus Passiflora, using twenty leaf samples per class. Two methods of estimating fractal dimension (box-counting and multiscale Minkowski) are compared. The results are discussed to determine the best approach to analyzing shape complexity, based on the performance of each technique in estimating fractal dimension and identifying plants. (C) 2008 Elsevier Inc. All rights reserved.
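The box-counting estimator compared in this abstract overlays grids of increasing resolution on the shape, counts occupied boxes N(s) at each grid scale s, and takes the fractal dimension as the slope of log N(s) versus log s. A sketch on 2-D point sets (the grid scales and least-squares fit are illustrative choices):

```python
import math

def box_counting_dimension(points, scales=(1, 2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a 2-D point set by box
    counting: count occupied grid boxes N(s) at several resolutions s,
    then fit the slope of log N(s) against log s by least squares."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x0, y0 = min(xs), min(ys)
    span = max(max(xs) - x0, max(ys) - y0) or 1.0
    logs, logn = [], []
    for s in scales:
        # The 0.999999 factor keeps boundary points inside the last box.
        boxes = {(int((x - x0) / span * s * 0.999999),
                  int((y - y0) / span * s * 0.999999)) for x, y in points}
        logs.append(math.log(s))
        logn.append(math.log(len(boxes)))
    n = len(scales)
    mx, my = sum(logs) / n, sum(logn) / n
    return (sum((a - mx) * (b - my) for a, b in zip(logs, logn))
            / sum((a - mx) ** 2 for a in logs))
```

As a sanity check, points sampled densely along a line give a dimension near 1, and a filled grid of points gives a dimension near 2.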
Abstract:
Nowadays, noninvasive methods of diagnosis have increased, driven by the demands of a population that requires fast, simple and painless exams. These methods have become possible because of advances in technology that provide the necessary means of collecting and processing signals. New methods of analysis, such as nonlinear dynamics, have been developed to understand the complexity of voice signals and to explore their dynamic nature. The purpose of this paper is to characterize healthy and pathological voice signals with the aid of relative entropy measures. The phase-space reconstruction technique is also used as a way to select interesting regions of the signals. Three groups of samples were used: one from healthy individuals and the other two from people with vocal fold nodules and Reinke's edema. All of them are recordings of the sustained vowel /a/ in Brazilian Portuguese. The paper shows that nonlinear dynamical methods seem to be a suitable technique for voice signal analysis, owing to the chaotic component of the human voice. Relative entropy is well suited because of its sensitivity to uncertainties, since the pathologies are characterized by an increase in signal complexity and unpredictability. The results showed that the pathological groups had higher entropy values, in accordance with the other vocal acoustic parameters presented. This suggests that these techniques may improve and complement the voice analysis methods currently available to clinicians. (C) 2008 Elsevier Inc. All rights reserved.
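A relative entropy (Kullback-Leibler divergence) between two signals can be estimated from their normalized amplitude histograms, as in this sketch; the bin count and epsilon smoothing are illustrative choices, not the paper's procedure:

```python
import math

def relative_entropy(p_samples, q_samples, bins=16):
    """Kullback-Leibler divergence D(P||Q) between two signals,
    estimated from normalized histograms over their common range.
    A small epsilon keeps every bin positive and avoids log(0)."""
    lo = min(min(p_samples), min(q_samples))
    hi = max(max(p_samples), max(q_samples))
    width = (hi - lo) / bins or 1.0

    def hist(samples):
        h = [0.0] * bins
        for v in samples:
            h[min(bins - 1, int((v - lo) / width))] += 1
        total = sum(h)
        eps = 1e-12
        return [(c + eps) / (total + bins * eps) for c in h]

    P, Q = hist(p_samples), hist(q_samples)
    return sum(p * math.log(p / q) for p, q in zip(P, Q))
```

The divergence is zero when the two amplitude distributions coincide and grows as they differ, which is the property the paper uses to separate healthy and pathological groups.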
Abstract:
This paper presents the formulation of a combinatorial optimization problem with the following characteristics: (i) the search space is the power set of a finite set, structured as a Boolean lattice; (ii) the cost function forms a U-shaped curve when applied to any lattice chain. This formulation applies to feature selection in the context of pattern recognition. The known approaches to this problem are branch-and-bound algorithms and heuristics that explore the search space only partially. Branch-and-bound algorithms are equivalent to a full search, while heuristics are not. This paper presents a branch-and-bound algorithm that differs from the others by exploring the lattice structure and the U-shaped chain curves of the search space. The main contribution of this paper is the architecture of this algorithm, which is based on the representation and exploration of the search space via new lattice properties proven here. Several experiments with well-known public data indicate the superiority of the proposed method over sequential floating forward selection (SFFS), a popular heuristic that gives good results in very short computational time. In all experiments, the proposed method obtained better or equal results in similar or even smaller computational time. (C) 2009 Elsevier Ltd. All rights reserved.
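The pruning opportunity created by the U-shaped chain property can be illustrated on a single chain of nested feature sets: once the cost starts rising along the chain, no later element of that chain can be better, so the rest can be discarded. A toy sketch of that idea only, not the paper's full lattice algorithm:

```python
def min_on_chain(chain, cost):
    """Walk a chain of nested feature sets whose cost is U-shaped along
    the chain, and stop at the first strict increase: by the U-shape
    assumption, every later element of the chain is at least as costly."""
    best_set, best_cost = chain[0], cost(chain[0])
    for s in chain[1:]:
        c = cost(s)
        if c > best_cost:
            break  # U-shape: cost only rises from here on this chain
        best_set, best_cost = s, c
    return best_set, best_cost
```

For example, with a cost that penalizes distance from some ideal subset size, the walk stops right after passing the minimum instead of evaluating the whole chain.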
Abstract:
In chemical analyses performed by laboratories, one faces the problem of determining the concentration of a chemical element in a sample. In practice, this problem is handled using the so-called linear calibration model, which assumes that the errors associated with the independent variables are negligible compared with those of the response variable. In this work, a new linear calibration model is proposed, assuming that the independent variables are subject to heteroscedastic measurement errors. A simulation study is carried out to verify some properties of the estimators derived for the new model, and the usual calibration model is also considered for comparison with the new approach. Three applications are considered to verify the performance of the new approach. Copyright (C) 2010 John Wiley & Sons, Ltd.
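When observations carry heteroscedastic errors, a natural baseline is to weight each observation by the inverse of its error variance. A generic weighted-least-squares sketch for a straight-line calibration curve (this is a baseline illustration, not the estimators derived in the paper):

```python
def weighted_least_squares(x, y, var):
    """Straight-line fit y = a + b*x with per-observation error
    variances, weighting each observation by 1/variance so that noisy
    points influence the fit less. Returns the intercept a and slope b."""
    w = [1.0 / v for v in var]
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b = (sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
         / sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x)))
    a = my - b * mx
    return a, b
```

On data lying exactly on a line, the fit recovers the line regardless of the weights; on noisy data, the weights shift the fit toward the more precise measurements.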
Abstract:
The generalized Birnbaum-Saunders (GBS) distribution is a new class of positively skewed models with lighter and heavier tails than the traditional Birnbaum-Saunders (BS) distribution, which is widely applied to study lifetimes. However, the theoretical arguments and interesting properties of the GBS model have made its application possible beyond lifetime analysis. The aim of this paper is to present the GBS distribution as a useful model for describing pollution data and to derive its positive and negative moments. Based on these moments, we develop estimation and goodness-of-fit methods. Some properties of the proposed estimators, useful for developing asymptotic inference, are also presented. Finally, an application with real data from the environmental sciences is given to illustrate the methodology. This example shows that the empirical fit of the GBS distribution to the data is very good. Thus, the GBS model is appropriate for describing air pollutant concentration data, and it produces better results than the lognormal model when an administrative target is set for abating air pollution. Copyright (c) 2007 John Wiley & Sons, Ltd.
Abstract:
There are several versions of the lognormal distribution in the statistical literature, one of which is based on the exponential transformation of the generalized normal (GN) distribution. This paper presents a Bayesian analysis of the generalized lognormal distribution (logGN), considering independent non-informative Jeffreys priors for the parameters, as well as the procedure for implementing the Gibbs sampler to obtain the posterior distributions of the parameters. The results are used to analyze failure-time models with right-censored and uncensored data. The proposed method is illustrated using real failure-time data from computers.
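A Gibbs sampler alternates draws from each parameter's full conditional distribution. Its mechanics can be illustrated on a standard bivariate normal with correlation rho, where both conditionals are normal in closed form; the target here is purely illustrative, whereas the paper's sampler targets the logGN posterior:

```python
import math
import random

def gibbs_bivariate_normal(rho, n=5000, burn=500, seed=1):
    """Gibbs sampler for a standard bivariate normal with correlation
    rho: alternately draw x | y ~ N(rho*y, 1-rho^2) and y | x likewise,
    discarding an initial burn-in. Returns the sampled x values, whose
    stationary marginal is N(0, 1)."""
    rng = random.Random(seed)
    x = y = 0.0
    sd = math.sqrt(1.0 - rho * rho)  # conditional standard deviation
    xs = []
    for i in range(n + burn):
        x = rng.gauss(rho * y, sd)
        y = rng.gauss(rho * x, sd)
        if i >= burn:
            xs.append(x)
    return xs
```

After burn-in, the retained draws have sample mean near 0 and variance near 1, matching the known marginal; in the paper's setting, the same alternation runs over the logGN parameters instead.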