972 results for pseudo-random number generator
Abstract:
Environmental monitoring plays an important role in occupational exposure assessment. However, due to several factors it is performed with insufficient frequency and usually does not provide the information needed to choose the most adequate safety measures to avoid or control exposure. Identifying all the tasks performed in each workplace and conducting a task-based exposure assessment help to refine the exposure characterization and reduce assessment errors. A task-based assessment can also provide a better evaluation of exposure variability than assessing personal exposures with continuous 8-hour time-weighted average measurements. Health effects related to particle exposure have mainly been investigated with mass-measuring instruments or gravimetric analysis. More recently, however, some studies suggest that size distribution and particle number concentration may have advantages over particle mass concentration for assessing the health effects of airborne particles. Several exposure assessments were performed in different occupational settings (bakery, grill house, cork industry and horse stable), combining two approaches: task-based exposure assessment and particle number concentration by size. The task-based approach made it possible to identify the tasks with the highest exposure to the smallest particles (0.3 μm) in the different occupational settings. The data obtained allow a more concrete and effective risk assessment and the identification of priorities for safety investments.
Abstract:
Myocardial perfusion gated single-photon emission computed tomography (gated-SPECT) imaging is used for the combined evaluation of myocardial perfusion and left ventricular (LV) function. The aim of this study was to analyze the influence of counts/pixel and, concomitantly, of the total counts in the myocardium on the calculation of myocardial functional parameters. Material and methods: Gated-SPECT studies were simulated using the Monte Carlo GATE package and the NCAT phantom. The simulations used 99mTc-labeled tracers with administered activities of 250, 350, 450 and 680 MBq for standard patient types, corresponding to myocardial activities of 3, 4.2, 5.4 and 8.2 MBq, respectively. All studies were simulated using 15 and 30 s/projection. The simulated data were reconstructed and processed by quantitative gated-SPECT software, and the functional parameters in the gated-SPECT images were analyzed using the Bland-Altman and Mann-Whitney-Wilcoxon tests. Results: In the studies simulated with different times (15 and 30 s/projection), statistically significant differences in motility and thickness were noted for the whole-body activities of 250 and 350 MBq. For left ventricular ejection fraction (LVEF) and end-systolic volume (ESV), differences occurred only for 250 MBq, and for end-diastolic volume (EDV) only for 350 MBq, whereas the studies simulated with 450 and 680 MBq showed no statistically significant differences for the global functional parameters LVEF, EDV and ESV. Conclusion: The number of counts/pixel and, concomitantly, the total counts per simulation do not significantly interfere with the determination of gated-SPECT functional parameters when using an administered average activity of 450 MBq, corresponding to 5.4 MBq in the myocardium, for standard patient types.
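The Bland-Altman limits-of-agreement analysis used in this study is simple to reproduce. Below is a minimal sketch with hypothetical paired LVEF values; the numbers are illustrative only, not data from the study:

```python
import numpy as np

def bland_altman(a, b):
    # Bland-Altman agreement analysis: bias (mean difference)
    # and 95% limits of agreement (bias ± 1.96 SD of differences).
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired LVEF estimates (%) from two acquisition times.
lvef_15s = [55, 60, 48, 62, 57, 53, 59, 61]
lvef_30s = [56, 59, 50, 61, 58, 52, 60, 60]
bias, (lo, hi) = bland_altman(lvef_15s, lvef_30s)
```

If most paired differences fall inside the (lo, hi) band and the bias is close to zero, the two acquisition settings agree in the Bland-Altman sense.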
Abstract:
Epidemiological studies have shown the effect of diet on the incidence of chronic diseases; however, proper planning, design, and statistical modeling are necessary to obtain precise and accurate food consumption data. Evaluation methods used for short-term assessment of the food consumption of a population, such as 24-hour dietary recalls or food diaries, can be affected by random errors or biases inherent to the method. Statistical modeling is used to handle random errors, whereas proper design and sampling are essential for controlling biases. The present study aimed to analyze potential biases and random errors and determine how they affect the results. We also aimed to identify ways to prevent them and/or to handle them with statistical approaches in epidemiological studies involving dietary assessments.
Abstract:
This paper presents the pseudo phase plane (PPP) method for detecting the existence of a nanofilm on the nitroazobenzene-modified glassy carbon electrode (NAB-GC) system. The modified electrode systems and the nitroazobenzene nanofilm were prepared by the electrochemical reduction of the diazonium salt of NAB at glassy carbon electrodes (GCE) in nonaqueous media. The IR spectra of the bare glassy carbon electrodes, the NAB-GC electrode system and the organic NAB film were recorded. The IR data of the bare GC, NAB-GC and NAB film were categorized into four series: FILM1, GC-NAB1, GC1; FILM2, GC-NAB2, GC2; FILM3, GC-NAB3, GC3; and FILM4, GC-NAB4, GC4. The PPP approach was applied to each group of data from the unmodified and modified electrode systems with the nanofilm. The results provided by the PPP method show the existence of the NAB film on the modified GC electrode.
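As a rough illustration of the pseudo phase plane idea — a time series plotted against a delayed copy of itself — here is a minimal delay-embedding sketch; the sine signal and the delay value are arbitrary choices for illustration, not the paper's IR data:

```python
import numpy as np

def pseudo_phase_plane(signal, tau):
    # Delay embedding: pair each sample s(t) with its value s(t + tau).
    # Plotting the two returned arrays against each other gives the PPP.
    s = np.asarray(signal, float)
    return s[:-tau], s[tau:]

t = np.linspace(0, 4 * np.pi, 400)
x, y = pseudo_phase_plane(np.sin(t), tau=25)
```

For a periodic signal the (x, y) points trace a closed curve; changes in the curve's shape between data sets are what the PPP comparison exploits.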
Abstract:
A 10 kJ electromagnetic forming (EMF) modulator with energy recovery was successfully developed and tested. It is based on two resonant power modules working in parallel, each containing a 4.5 kV/30 kA silicon controlled rectifier, a 1.11 mF capacitor bank and an energy recovery circuit, allowing a maximum actuator discharge current amplitude of 50 kA and a rate of 2 kA/μs. It can be plugged into a standard single-phase 230 V/16 A mains socket, and the circuit is able to recover up to 32% of its initial energy, reducing the charging time of conventional EMF systems by up to 68%.
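The headline figures can be sanity-checked with the capacitor energy formula E = ½CV²; the total capacitance below assumes the two 1.11 mF module banks act in parallel, which is an assumption for illustration, not a detail stated in the abstract:

```python
from math import sqrt

# Stored energy in a capacitor bank: E = (1/2) C V^2, so V = sqrt(2 E / C).
E = 10_000.0        # J, the modulator's rated 10 kJ
C = 2 * 1.11e-3     # F, two 1.11 mF banks assumed in parallel
V = sqrt(2 * E / C) # required charge voltage, roughly 3 kV

# Energy recovered per shot at the reported 32% recovery ratio.
E_recovered = 0.32 * E
```

The resulting charge voltage of about 3 kV sits comfortably below the 4.5 kV rating of the silicon controlled rectifiers, which is consistent with the design described.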
Abstract:
OBJECTIVE: To evaluate the validity and reliability of an instrument that evaluates the structure of primary health care units for the treatment of tuberculosis.
METHODS: This cross-sectional study used simple random sampling and evaluated 1,037 health care professionals from five Brazilian municipalities (Natal, state of Rio Grande do Norte; Cabedelo, state of Paraíba; Foz do Iguaçu, state of Paraná; São José do Rio Preto, state of São Paulo; and Uberaba, state of Minas Gerais) in 2011. Structural indicators were identified and validated, considering different methods of organization of the health care system in municipalities of different population sizes. Each structure represented the organization of health care services and contained the resources available for the execution of those services: physical resources (equipment, consumables, and facilities); human resources (number and qualification); and resources for maintenance of the existing infrastructure and technology (deemed the organization of health care services). The statistical analyses used in the validation process included reliability analysis, exploratory factor analysis, and confirmatory factor analysis.
RESULTS: The validation process indicated the retention of five factors, with 85.9% of the total variance explained, internal consistency between 0.6460 and 0.7802, and a confirmatory factor analysis quality of fit of 0.995 by the goodness-of-fit index. The retained factors comprised five structural indicators: professionals involved in the care of tuberculosis patients, training, access to recording instruments, availability of supplies, and coordination of health care services with other levels of care. Availability of supplies had the best performance and the lowest coefficient of variation among the services evaluated. The indicators of assessment of human resources and coordination with other levels of care had satisfactory performance, but the latter showed the highest coefficient of variation. The performance of the indicators "training" and "access to recording instruments" was inferior to that of the other indicators.
CONCLUSIONS: The instrument showed feasibility of application and the potential to assess the structure of primary health care units for the treatment of tuberculosis.
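Internal consistency figures in the range reported here (0.6460 to 0.7802) are typically Cronbach's alpha values. A minimal sketch of that computation on synthetic item scores follows; the data are simulated, not the study's:

```python
import numpy as np

def cronbach_alpha(items):
    # items: (respondents x items) score matrix.
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    X = np.asarray(items, float)
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Simulated respondents: four items loading on one latent trait plus noise.
rng = np.random.default_rng(1)
latent = rng.normal(size=(50, 1))
scores = latent + 0.5 * rng.normal(size=(50, 4))
alpha = cronbach_alpha(scores)
```

Values above roughly 0.6-0.7 are conventionally read as acceptable internal consistency, which matches how the instrument's factors are interpreted above.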
Abstract:
To study a flavour model with a non-minimal Higgs sector one must first define the symmetries of the fields, then identify what types of vacua exist and how they may break the symmetries, and finally determine whether the remnant symmetries are compatible with the experimental data. Here we address all these issues in the context of flavour models with any number of Higgs doublets. We stress the importance of analysing the Higgs vacuum expectation values that are pseudo-invariant under the generators of all subgroups. It is shown that the only way to obtain a physical CKM mixing matrix and, simultaneously, non-degenerate and non-zero quark masses is to require the vacuum expectation values of the Higgs fields to break the full flavour group completely, except possibly for some symmetry belonging to baryon number. The application of this technique to some illustrative examples, such as the flavour groups Δ(27), A4 and S3, is also presented.
Abstract:
This paper presents a biased random-key genetic algorithm for the resource constrained project scheduling problem. The chromosome representation of the problem is based on random keys. Active schedules are constructed using a priority-rule heuristic in which the priorities of the activities are defined by the genetic algorithm. A forward-backward improvement procedure is applied to all solutions. The chromosomes supplied by the genetic algorithm are adjusted to reflect the solutions obtained by the improvement procedure. The heuristic is tested on a set of standard problems taken from the literature and compared with other approaches. The computational results validate the effectiveness of the proposed algorithm.
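The two ingredients that make the algorithm "biased random-key" can be sketched compactly: a decoder that turns a chromosome of random keys into an activity priority order, and a crossover biased toward the elite parent. This is a generic illustration; rho and the chromosome length are assumed values, not the paper's parameters:

```python
import random

rng = random.Random(42)

def decode(chromosome):
    # Sort activity indices by their random-key values:
    # a lower key means a higher scheduling priority.
    return sorted(range(len(chromosome)), key=lambda i: chromosome[i])

def biased_crossover(elite, non_elite, rho=0.7):
    # Each gene is inherited from the elite parent with probability rho,
    # which biases offspring toward the better solutions.
    return [e if rng.random() < rho else n for e, n in zip(elite, non_elite)]

elite = [rng.random() for _ in range(6)]
other = [rng.random() for _ in range(6)]
child = biased_crossover(elite, other)
priority_order = decode(child)
```

Because any vector of keys in [0, 1) decodes to a valid priority list, crossover can never produce an infeasible chromosome, which is the main appeal of the random-key representation.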
Abstract:
Invariant integrals are derived for nematic liquid crystals and applied to materials with small Ericksen number and topological defects. The nematic material is confined between two infinite plates located at y = -h and y = h (h ∈ ℝ⁺), with a semi-infinite plate at y = 0 and x < 0. Planar and homeotropic strong anchoring boundary conditions for the director field are assumed at the two infinite plates and the semi-infinite plate, respectively. Thus a line disclination appears in the system, coinciding with the z-axis. Analytical solutions for the director field in the neighbourhood of the singularity are obtained. However, these solutions depend on an arbitrary parameter. The nematic elastic force is therefore evaluated from an invariant integral of the energy-momentum tensor over a closed surface which does not contain the singularity. This allows one to determine the parameter as a function of the nematic cell thickness and the strength of the disclination. Analytical solutions are also deduced for the director field in the whole region using the conformal mapping method. (C) 2013 Elsevier Ltd. All rights reserved.
Abstract:
Work presented within the scope of the Mestrado em Engenharia Informática (Master's in Computer Engineering), as a partial requirement for obtaining the degree of Master in Computer Engineering.
Abstract:
Applied Mathematical Modelling, Vol.33
Abstract:
European Transactions on Telecommunications, vol. 18
Abstract:
Many learning problems require handling high dimensional datasets with a relatively small number of instances. Learning algorithms are thus confronted with the curse of dimensionality, and need to address it in order to be effective. Examples of these types of data include the bag-of-words representation in text classification problems and gene expression data for tumor detection/classification. Usually, among the high number of features characterizing the instances, many may be irrelevant (or even detrimental) for the learning tasks. It is thus clear that there is a need for adequate techniques for feature representation, reduction, and selection, to improve both the classification accuracy and the memory requirements. In this paper, we propose combined unsupervised feature discretization and feature selection techniques, suitable for medium and high-dimensional datasets. The experimental results on several standard datasets, with both sparse and dense features, show the efficiency of the proposed techniques as well as improvements over previous related techniques.
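A minimal unsupervised pipeline in the spirit described above — equal-frequency discretization followed by a simple dispersion-based feature selection — might look like the following sketch. The specific criteria here (quantile binning, variance ranking) are illustrative stand-ins, not the paper's exact techniques:

```python
import numpy as np

def equal_frequency_discretize(X, n_bins=4):
    # Replace each feature with the index of its quantile bin (unsupervised:
    # no class labels are used, only the empirical distribution of each column).
    Xd = np.empty_like(X, dtype=int)
    for j in range(X.shape[1]):
        edges = np.quantile(X[:, j], np.linspace(0, 1, n_bins + 1)[1:-1])
        Xd[:, j] = np.searchsorted(edges, X[:, j])
    return Xd

def select_by_dispersion(Xd, k):
    # Keep the k features whose discretized values vary the most;
    # constant or near-constant features carry no information and are dropped.
    order = np.argsort(Xd.var(axis=0))[::-1]
    return np.sort(order[:k])

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
X[:, 3] *= 0.0  # simulate an irrelevant, constant feature
Xd = equal_frequency_discretize(X)
keep = select_by_dispersion(Xd, k=5)
```

Discretizing before selecting also shrinks the memory footprint of sparse, high-dimensional data, which is one of the motivations the abstract cites.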
Abstract:
Swarm intelligence (SI) is the property of a system whereby the collective behaviors of (unsophisticated) agents interacting locally with their environment cause coherent functional global patterns to emerge. Particle swarm optimization (PSO) is a form of SI and a population-based search algorithm that is initialized with a population of random solutions, called particles. These particles fly through hyperspace and have two essential reasoning capabilities: memory of their own best position and knowledge of the swarm's best position. In a PSO scheme each particle flies through the search space with a velocity that is dynamically adjusted according to its historical behavior. Therefore, the particles tend to fly towards the best search area over the course of the search. This work proposes a PSO-based algorithm for logic circuit synthesis. The results show the statistical characteristics of this algorithm with respect to the number of generations required to reach a solution. A comparison with two other evolutionary algorithms, namely genetic and memetic algorithms, is also presented.
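The velocity-update rule described above — inertia plus a cognitive pull toward a particle's own best position and a social pull toward the swarm's best — can be sketched as a minimal PSO minimizing a toy sphere function. All parameter values are common defaults for illustration, not the settings used in the paper:

```python
import random

def pso(f, dim, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, bound=5.0, seed=0):
    rng = random.Random(seed)
    xs = [[rng.uniform(-bound, bound) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]          # each particle's best position so far
    pbest_f = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]  # swarm's best position so far
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive (own memory) + social (swarm memory)
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            fx = f(xs[i])
            if fx < pbest_f[i]:
                pbest[i], pbest_f[i] = xs[i][:], fx
                if fx < gbest_f:
                    gbest, gbest_f = xs[i][:], fx
    return gbest, gbest_f

best, best_f = pso(lambda x: sum(v * v for v in x), dim=2)
```

For circuit synthesis the continuous positions would be mapped to discrete gate choices, but the update rule itself is unchanged.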
Abstract:
A nationwide seroepidemiologic survey of human T. cruzi infection was carried out in Brazil from 1975 to 1980 as a joint programme of the Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq) and the Superintendência de Campanhas (SUCAM), Ministry of Health, Brazil. Due to the marked heterogeneity of urban populations as a result of wide migratory movements in the country, and since triatomine transmission of the disease occurs mostly in rural areas, the survey was limited to rural populations. The survey was based on a large cluster sampling of complete households from randomly selected localities comprising 10 to 500 houses, or up to 200 houses in the Amazon region. Random selection of localities and houses was made possible by a detailed mapping of every locality in the country, performed and continuously updated by SUCAM. In the selected houses, duplicate samples on filter paper were collected from every resident one year of age or older. Samples were tested in one of 14 laboratories scattered across the country by the indirect anti-IgG immunofluorescence test, with reagents produced and standardized by a central laboratory located at the Instituto de Medicina Tropical de São Paulo. Continuous quality control was performed at this laboratory, which retested duplicates of 10% to 15% of all samples examined by the collaborating laboratories. Data regarding the number of sera collected and each patient's age, sex, place of residence, place of birth and test result were computerized at the Department of Preventive Medicine, Medical School, University of São Paulo, São Paulo, Brazil. Serologic prevalence indices were estimated for each municipality and mapped according to states and territories in Brazil. Since data were already available for the State of São Paulo and the Federal District, these units were not included in the survey.
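A two-stage cluster design of this kind — randomly select localities from a mapped frame, then select households within each — can be sketched generically as follows; the frame sizes and sample counts are hypothetical, not those of the survey:

```python
import random

rng = random.Random(7)

# Hypothetical sampling frame: locality name -> number of mapped houses.
localities = {f"loc{i}": rng.randint(10, 500) for i in range(200)}

def sample_clusters(frame, n_localities, houses_per_locality):
    # Stage 1: simple random sample of localities from the frame.
    # Stage 2: random sample of house indices within each chosen locality.
    chosen = rng.sample(sorted(frame), n_localities)
    plan = {}
    for loc in chosen:
        n = min(houses_per_locality, frame[loc])
        plan[loc] = sorted(rng.sample(range(frame[loc]), n))
    return plan

plan = sample_clusters(localities, n_localities=10, houses_per_locality=20)
```

In the survey itself, complete households within selected localities were enumerated rather than subsampled, but the locality-level randomization follows the same logic.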