887 results for Full scale testing


Relevance:

30.00%

Publisher:

Abstract:

The small-scale fisheries of the middle River Tocantins and their dynamics over a full annual cycle were described using questionnaires administered to fishermen at the Imperatriz market. The landed fish exhibited a seasonal pattern related to the hydrological cycle, fishing effort and species diversity. Curimata, Prochilodus nigricans Agassiz, is the most important commercial fish. The fishing gears used in the area are cast nets, gillnets, longlines and beach seines. Thirty-six percent of the fish landed were caught exclusively by beach seine, which mainly targets curimata. Alterations in the physical and biological characteristics of the middle River Tocantins caused by the building of the Tucurui dam (2850 km² total area) allowed the mapara, Hypophthalmus marginatus Valenciennes, to colonise this area. During the closed season for nets, Siluriformes are the target of the fishery, caught exclusively by longlining.

Relevance:

30.00%

Publisher:

Abstract:

We review the basic hypotheses that motivate the statistical framework used to analyze the cosmic microwave background, and how that framework can be enlarged as those hypotheses are relaxed. In particular, we try to separate as much as possible the questions of gaussianity, homogeneity, and isotropy from each other. We focus both on isotropic estimators of non-gaussianity and on statistically anisotropic estimators of gaussianity, placing particular emphasis on their signatures and on the enhanced cosmic variances that become increasingly important as our putative Universe becomes less symmetric. After reviewing the formalism behind some simple model-independent tests, we discuss how these tests can be applied to CMB data when searching for large-scale anomalies. Copyright © 2010 L. Raul Abramo and Thiago S. Pereira.
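As a toy illustration of the kind of simple, model-independent, isotropic statistics the review discusses (not the specific estimators analyzed in the paper), the sketch below checks the one-point skewness and excess kurtosis of a mock Gaussian temperature map; under the Gaussian hypothesis both should be consistent with zero. All names and numbers are illustrative assumptions.

```python
import numpy as np

# Toy gaussianity check via one-point statistics: for a Gaussian map the skewness
# and excess kurtosis should be consistent with zero (up to sampling and cosmic
# variance). The pixel values below are mock data, not CMB measurements.
rng = np.random.default_rng(0)
mock_map = rng.normal(0.0, 100e-6, size=500_000)   # mock temperature pixels [K]

def skewness(x):
    d = x - x.mean()
    return np.mean(d**3) / np.std(x)**3

def excess_kurtosis(x):
    d = x - x.mean()
    return np.mean(d**4) / np.std(x)**4 - 3.0

print(skewness(mock_map), excess_kurtosis(mock_map))   # both ~0 for a Gaussian map
```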

Relevance:

30.00%

Publisher:

Abstract:

The growing use of sensitive loads in the electric power system, especially in industrial applications, considerably increases production losses related to voltage sags, stimulating demand for power-electronics-based solutions to mitigate such problems. This paper presents the implementation and some industrial certification tests of a power equipment prototype designed to correct sags and swells, a dynamic voltage restorer, which is one of the many possible solutions to voltage sag and swell problems. Experimental results of a 75 kVA prototype are shown, obtained both in the laboratory and under full-load conditions at a certification institution (IEE-USP). © 2011 IEEE.
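To make the compensation principle concrete, the sketch below (illustrative only, not the control scheme of the prototype described above) shows how a dynamic voltage restorer conceptually injects, in series with the supply, the difference between the reference voltage and the sagged grid voltage so that the load keeps seeing its nominal waveform. The sag depth, duration and frequency are hypothetical.

```python
import numpy as np

# Series compensation principle of a dynamic voltage restorer (DVR), illustrative only.
f = 60.0                                                  # nominal frequency [Hz]
t = np.linspace(0.0, 0.1, 5000)                           # 100 ms window
v_ref = np.sin(2 * np.pi * f * t)                         # reference voltage, 1.0 pu
sag = np.where((t > 0.03) & (t < 0.07), 0.6, 1.0)         # hypothetical 40% sag for 40 ms
v_grid = sag * v_ref                                      # sagged supply voltage
v_inj = v_ref - v_grid                                    # voltage the DVR must inject in series
v_load = v_grid + v_inj                                   # load voltage restored to 1.0 pu
print(float(np.max(np.abs(v_load - v_ref))))              # ~0 for ideal compensation
```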

Relevance:

30.00%

Publisher:

Abstract:

In this work, experimental results are reported for a small-scale cogeneration plant for power and refrigeration purposes. The plant includes a natural gas microturbine and an ammonia/water absorption chiller fired by steam. The system was tested under different turbine loads, steam pressures and chiller outlet temperatures. An evaluation based on the first and second laws of thermodynamics was also performed. For an ambient temperature of around 24 °C and the microturbine at full load, the plant is able to provide 19 kW of saturated steam at 5.3 bar (161 °C), corresponding to 9.2 kW of refrigeration at -5 °C (COP = 0.44). From a second-law point of view, it was found that there is an optimal chiller outlet temperature that maximizes the chiller exergetic efficiency. As expected, the microturbine presented the highest irreversibilities, followed by the absorption chiller and the heat recovery steam generator (HRSG). In order to reduce the plant exergy destruction, a new design for the HRSG and new insulation for the exhaust pipe are recommended. © 2013 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Applications of piezoelectric array transducers are becoming common in ultrasonic non-destructive testing. However, a large number of elements can increase system complexity, owing to the multichannel circuitry required and to the large amount of data to be processed. Synthetic aperture techniques, in which only one or a few transmission and reception channels are necessary and the data are post-processed, can be used to reduce the system complexity. Another possibility is to use sparse arrays instead of a fully populated array. In sparse arrays, there are fewer elements and the inter-element spacing is larger than half a wavelength. In this work, results of ultrasonic inspection of an aluminum plate with artificial defects using guided acoustic waves and sparse arrays are presented. Synthetic aperture techniques are used to obtain a set of images that are then processed with an image compounding technique, previously evaluated only with fully populated arrays, in order to increase the resolution and contrast of the images. The results with sparse arrays are equivalent to those obtained with fully populated arrays in terms of resolution. Although there is an 8 dB contrast reduction when using sparse arrays, defect detection is preserved and there is the advantage of a reduction in the number of transducer elements and in data volume. © 2013 Brazilian Society for Automatics - SBA.
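For context, a generic synthetic aperture reconstruction (delay-and-sum over full-matrix-capture data, often called the total focusing method) can be sketched as below. This is a common baseline rather than the specific guided-wave compounding algorithm evaluated in the paper, and the array geometry, sampling and single-velocity propagation model are simplifying assumptions.

```python
import numpy as np

def tfm_image(fmc, elem_x, c, fs, grid_x, grid_z):
    """Delay-and-sum (total focusing method) image from full-matrix-capture data.

    fmc    : (n_elem, n_elem, n_samples) A-scans for every transmit/receive pair
    elem_x : (n_elem,) element x-positions [m]; elements assumed at z = 0
    c      : wave speed [m/s];  fs : sampling frequency [Hz]
    grid_x, grid_z : 1-D arrays defining the image grid [m]
    """
    n_elem, _, n_s = fmc.shape
    rx = np.arange(n_elem)
    img = np.zeros((grid_z.size, grid_x.size))
    for iz, z in enumerate(grid_z):
        for ix, x in enumerate(grid_x):
            tof = np.hypot(elem_x - x, z) / c                     # one-way time, element -> pixel
            for tx in range(n_elem):
                s = np.round((tof[tx] + tof) * fs).astype(int)    # round-trip sample indices
                ok = s < n_s
                img[iz, ix] += fmc[tx, rx[ok], s[ok]].sum()
    return np.abs(img)

# For a sparse array, one would simply pass the A-scans and positions of the active
# subset of elements; the imaging code itself is unchanged.
```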

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

The best irrigation management depends on accurate estimation of reference evapotranspiration (ET0) followed by selection of the appropriate crop coefficient for each phenological stage. However, the evaluation of water productivity on a large scale can be done using actual evapotranspiration (ETa), determined by coupling agrometeorological and remote sensing data. This paper describes methodologies used for estimating ETa for 20 center pivots using three different approaches: the traditional FAO crop coefficient (Kc) method and two remote sensing algorithms, one called SEBAL and the other named TEIXEIRA. The methods were applied to one Landsat 5 Thematic Mapper image acquired in July 2010 over the northwestern portion of Sao Paulo State, Brazil. The corn, bean and sugar cane crops are grown under center pivot sprinkler irrigation. ET0 was calculated by the Penman-Monteith method with data from an automated weather station close to the study site. The results showed that, for the crops at effective full cover, the SEBAL and TEIXEIRA methods agreed well with the traditional method. However, both remote sensing methods overestimated ETa according to the degree of exposed soil, with the TEIXEIRA method yielding ETa values closer to those obtained with the traditional FAO Kc method. This study showed that remote sensing algorithms can be useful tools for monitoring and establishing realistic Kc values to further determine ETa on a large scale. However, several images during the growing seasons must be used to establish the necessary adjustments to the traditional FAO crop coefficient method.
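The traditional FAO crop-coefficient approach referred to above amounts to scaling the Penman-Monteith reference evapotranspiration by a stage-dependent coefficient, ETa = Kc × ET0. A minimal sketch follows; the Kc values are illustrative placeholders, not the values used in the study.

```python
# Minimal sketch of the FAO crop-coefficient method: ETa = Kc * ET0, with ET0 obtained
# from Penman-Monteith using weather-station data. Kc values below are illustrative;
# FAO-56 tabulates values per crop and growth stage.
KC = {"corn": {"initial": 0.3, "mid": 1.20, "late": 0.60},
      "bean": {"initial": 0.4, "mid": 1.15, "late": 0.35}}

def eta_fao(et0_mm_day: float, crop: str, stage: str) -> float:
    """Actual evapotranspiration (mm/day) from reference ET0 and a stage-dependent Kc."""
    return KC[crop][stage] * et0_mm_day

print(eta_fao(5.0, "corn", "mid"))   # e.g. 5.0 mm/day of ET0 -> 6.0 mm/day of ETa
```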

Relevance:

30.00%

Publisher:

Abstract:

Background: The evaluation of associations between genotypes and diseases in a case-control framework plays an important role in genetic epidemiology. This paper focuses on the evaluation of the homogeneity of both genotypic and allelic frequencies. The traditional test used to check allelic homogeneity is known to be valid only under Hardy-Weinberg equilibrium, a property that may not hold in practice. Results: We first describe the flaws of the traditional (chi-squared) tests for both allelic and genotypic homogeneity. Besides the known problem of the allelic procedure, we show that whenever these tests are used, an incoherence may arise: sometimes the genotypic homogeneity hypothesis is not rejected, but the allelic hypothesis is. As we argue, this is logically impossible. Some recently proposed methods implicitly rely on the idea that this does not happen. In an attempt to correct this incoherence, we describe an alternative frequentist approach that is appropriate even when Hardy-Weinberg equilibrium does not hold. It is then shown that the problem remains and is intrinsic to frequentist procedures. Finally, we introduce the Full Bayesian Significance Test to test both hypotheses and prove that the incoherence cannot happen with these new tests. To illustrate this, all five tests are applied to real and simulated datasets. Using power analysis, we show that the Bayesian method is comparable to the frequentist one and has the advantage of being coherent. Conclusions: In contrast to more traditional approaches, the Full Bayesian Significance Test for association studies provides a simple, coherent and powerful tool for detecting associations.
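For reference, the traditional chi-squared tests that the paper criticizes can be sketched as follows; the genotype counts are hypothetical, and the allele-level test is the one whose validity requires Hardy-Weinberg equilibrium.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Genotype counts (AA, Aa, aa) for cases and controls -- hypothetical numbers.
genotypes = np.array([[30, 50, 20],    # cases
                      [40, 45, 15]])   # controls

# Traditional genotypic homogeneity test: chi-squared on the 2x3 contingency table.
chi2_g, p_g, _, _ = chi2_contingency(genotypes)

# Traditional allelic homogeneity test: collapse genotypes into allele counts
# (2N alleles per group); valid only under Hardy-Weinberg equilibrium.
alleles = np.column_stack([2 * genotypes[:, 0] + genotypes[:, 1],   # A alleles
                           2 * genotypes[:, 2] + genotypes[:, 1]])  # a alleles
chi2_a, p_a, _, _ = chi2_contingency(alleles)

print(f"genotypic: chi2={chi2_g:.2f}, p={p_g:.3f}")
print(f"allelic:   chi2={chi2_a:.2f}, p={p_a:.3f}")
```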

Relevance:

30.00%

Publisher:

Abstract:

Objectives: This study compared the biomechanical fixation and bone-to-implant contact (BIC) of implants with different surface treatments (an experimental resorbable-blasting-media-processed surface with nanometer-scale roughness, and a dual acid-etched control) in a dog model. Material and methods: Surface characterization was performed on six implants by means of scanning electron microscopy, atomic force microscopy to evaluate roughness parameters, and X-ray photoelectron spectroscopy (XPS) for chemical assessment. The animal model comprised the bilateral placement of control (n = 24) and experimental-surface (n = 24) implants along the proximal tibiae of six mongrel dogs, which remained in place for 2 or 4 weeks. Half of the specimens were biomechanically tested (torque), and the other half was subjected to histomorphologic/morphometric evaluation. BIC and resistance-to-failure measures were each evaluated as a function of time and surface treatment in a mixed-model ANOVA. Results: Surface texturing was significantly higher for the experimental surface compared with the control surface. The survey XPS spectra detected O, C, Al, and Ti in the control group, and Ca (~0.2-0.9%) and P (~1.7-4.1%) in addition to O, C, Al, and Ti in the experimental surfaces. While no statistical difference in BIC was found between experimental and control surfaces or between 2 and 4 weeks in vivo, both longer time and use of the experimental surface significantly increased resistance to failure. Conclusions: The experimental surface resulted in enhanced biomechanical fixation but comparable BIC relative to the control, suggesting higher bone mechanical properties around the experimental implants.

Relevance:

30.00%

Publisher:

Abstract:

Starting from the Fisher matrix for counts in cells, we derive the full Fisher matrix for surveys of multiple tracers of large-scale structure. The key step is the classical approximation, which allows us to write the inverse of the covariance of the galaxy counts in terms of the naive matrix inverse of the covariance in a mixed position-space and Fourier-space basis. We then compute the Fisher matrix for the power spectrum in bins of the 3D wavenumber k, the Fisher matrix for functions of position (or redshift z) such as the linear bias of the tracers and/or the growth function, and the cross-terms of the Fisher matrix that express the correlations between estimates of the power spectrum and estimates of the bias. When the bias and growth function are fully specified, and the Fourier-space bins are large enough that the covariance between them can be neglected, the Fisher matrix for the power spectrum reduces to the widely used result that was first derived by Feldman, Kaiser & Peacock. Assuming isotropy, a fully analytical calculation of the Fisher matrix in the classical approximation can be performed in the case of a constant-density, volume-limited survey.
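For completeness, the widely used Feldman, Kaiser & Peacock limit mentioned above is commonly written as follows (one standard convention; the notation here is an assumption rather than the paper's own):

```latex
F_{ij} \;=\; \frac{1}{2} \int \mathrm{d}^3x \, \frac{\mathrm{d}^3k}{(2\pi)^3}\;
\frac{\partial \ln P(\mathbf{k})}{\partial \theta_i}\,
\frac{\partial \ln P(\mathbf{k})}{\partial \theta_j}
\left[\frac{\bar{n}(\mathbf{x})\,P(\mathbf{k})}{1+\bar{n}(\mathbf{x})\,P(\mathbf{k})}\right]^{2}
```

where n̄(x) is the mean comoving number density of tracers, P(k) the power spectrum, and θ_i the parameters being estimated; the bracketed factor defines the usual effective volume of the survey.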

Relevance:

30.00%

Publisher:

Abstract:

Objective: To compare the gross motor development of preterm infants (PT) without cerebral palsy with that of healthy full-term (FT) infants, according to the Alberta Infant Motor Scale (AIMS); to compare the age of walking attainment between PT and FT infants; and to assess whether the age of walking in PT infants is affected by neonatal variables. Methods: This prospective study compared 101 PT and 52 FT infants monthly, from the first visit until all AIMS items had been observed. Results: Mean scores were similar in their progression, except from the eighth to the tenth month. FT infants attained walking earlier than PT infants. Birth weight, birth length and duration of neonatal nursery stay were related to walking delay. Conclusion: Gross motor development of PT and FT infants was similar, except from the eighth to the tenth month of age. PT infants walked later than FT infants, and the predictive variables were birth weight, birth length and duration of neonatal intensive care unit stay.

Relevance:

30.00%

Publisher:

Abstract:

Background: Recent medical and biological technology advances have stimulated the development of new testing systems that provide huge and varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results: This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability, achieved through a relational database implementation, and 2) correctness of processes, ensured using process algebra. Furthermore, the software allows end users to define genetic tests without requiring any knowledge of business process notation or process algebra. Conclusions: This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have demonstrated the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing through simple end-user interfaces.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: To describe the Brainstem Auditory Evoked Potential (BAEP) results of full-term small-for-gestational-age newborns, comparing them with the results of full-term appropriate-for-gestational-age newborns, in order to verify whether the small-for-gestational-age condition is a risk indicator for retrocochlear hearing impairment. METHODS: This multicentric prospective cross-sectional study assessed 86 full-term newborns - 47 small-for-gestational-age (Study Group) and 39 appropriate-for-gestational-age (Control Group) - of both genders, aged between 2 and 12 days. Newborns with presence of transient evoked otoacoustic emissions and type A tympanometry were included in the study. Quantitative analysis was based on the mean and standard deviation of the absolute latencies of waves I, III and V and the interpeak intervals I-III, III-V and I-V for each group. For qualitative analysis, the BAEP results were classified as normal or altered, considering the newborn's age range at the time of testing. RESULTS: In the Study Group, 18 subjects (38%) had altered BAEP results; in nine of them, the small-for-gestational-age condition was the only risk factor for hearing impairment. In the Control Group, seven subjects (18%) had altered results. Female subjects in the Study Group tended to present more central alterations, whereas in the Control Group male subjects tended to have more alterations. CONCLUSION: Full-term children born small or appropriate for gestational age may present transitory or permanent central hearing impairments, regardless of the presence of risk indicators.

Relevance:

30.00%

Publisher:

Abstract:

The seminal work of Horn and Schunck [8] is the first variational method for optical flow estimation. It introduced a novel framework in which the optical flow is computed as the solution of a minimization problem. From the assumption that pixel intensities do not change over time, the optical flow constraint equation is derived. This equation relates the optical flow to the derivatives of the image. There are infinitely many vector fields that satisfy the optical flow constraint, so the problem is ill-posed. To overcome this, Horn and Schunck introduced an additional regularity condition that restricts the possible solutions. Their method minimizes both the optical flow constraint and the magnitude of the variations of the flow field, producing smooth vector fields. One limitation of this method is that, typically, it can only estimate small motions. In the presence of large displacements, the method fails when the gradient of the image is not smooth enough. In this work, we describe an implementation of the original Horn and Schunck method and also introduce a multi-scale strategy in order to deal with larger displacements. For this multi-scale strategy, we create a pyramidal structure of downsampled images and replace the optical flow constraint equation with a nonlinear formulation. In order to tackle this nonlinear formulation, we linearize it and solve the method iteratively at each scale. At this point there are two common approaches: one computes the motion increment during the iterations, while the one we follow computes the full flow during the iterations. The solutions are incrementally refined over the scales. This pyramidal structure is a standard tool in many optical flow methods.
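As an illustration of the single-scale scheme described above, here is a minimal sketch of the classical Horn-Schunck iterations (Jacobi-style updates derived from the Euler-Lagrange equations). The parameter values and the simplified derivative scheme are assumptions; the multi-scale strategy described in the text would wrap this solver in an image pyramid, warping and upsampling the flow between scales.

```python
import numpy as np
from scipy.signal import convolve2d

def horn_schunck(I1, I2, alpha=15.0, n_iter=200):
    """Minimal single-scale Horn-Schunck solver (sketch).

    I1, I2 : grayscale frames as float arrays of the same shape
    alpha  : regularization weight (larger -> smoother flow)
    """
    # Image derivatives (simple finite differences; the original paper uses
    # averaged forward differences over both frames).
    Ix = np.gradient(I1, axis=1)
    Iy = np.gradient(I1, axis=0)
    It = I2 - I1

    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    # Weighted-neighborhood averaging kernel from the original scheme.
    k_avg = np.array([[1, 2, 1], [2, 0, 2], [1, 2, 1]], float) / 12.0

    for _ in range(n_iter):
        u_avg = convolve2d(u, k_avg, mode="same", boundary="symm")
        v_avg = convolve2d(v, k_avg, mode="same", boundary="symm")
        # Update from the Euler-Lagrange equations of the HS energy.
        t = (Ix * u_avg + Iy * v_avg + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_avg - Ix * t
        v = v_avg - Iy * t
    return u, v
```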

Relevance:

30.00%

Publisher:

Abstract:

In this work we propose a new approach for preliminary epidemiological studies of Standardized Mortality Ratios (SMR) collected over many spatial regions. A preliminary study of SMRs aims to formulate hypotheses to be investigated via individual epidemiological studies, which avoid the bias carried by aggregated analyses. Starting from the collected disease counts and the expected disease counts computed from reference-population disease rates, an SMR is derived in each area as the maximum likelihood estimate under a Poisson assumption for each observation. Such estimators have high standard errors in small areas, i.e. where the expected count is low either because of the small population underlying the area or because of the rarity of the disease under study. Disease mapping models and other techniques for screening disease rates across the map, aiming to detect anomalies and possible high-risk areas, have been proposed in the literature within both the classical and the Bayesian paradigms. Our proposal approaches this issue with a decision-oriented method that focuses on multiple testing control, while retaining the preliminary-study perspective that an analysis of SMR indicators is intended for. We implement control of the false discovery rate (FDR), a quantity widely used to address multiple comparisons problems in the field of microarray data analysis but not usually employed in disease mapping. Controlling the FDR means providing an estimate of the FDR for a set of rejected null hypotheses. The small-areas issue raises difficulties in applying traditional methods for FDR estimation, which are usually based only on p-values (Benjamini and Hochberg, 1995; Storey, 2003). Tests evaluated by a traditional p-value have weak power in small areas, where the expected number of disease cases is small. Moreover, the tests cannot be assumed to be independent when spatial correlation between SMRs is expected, nor are they identically distributed when the population underlying the map is heterogeneous. The Bayesian paradigm offers a way to overcome the inappropriateness of p-value-based methods. Another feature of the present work is the proposal of a hierarchical full Bayesian model for FDR estimation when testing many null hypotheses of absence of risk. We use concepts from Bayesian disease mapping models, referring in particular to the Besag, York and Mollié model (1991), often used in practice for its flexible prior assumptions on the distribution of risks across regions. The borrowing of strength between prior and likelihood, typical of a hierarchical Bayesian model, has the advantage of evaluating a single test (i.e. a test in a single area) by means of all the observations in the map under study, rather than just the single observation. This improves the power of the test in small areas and addresses more appropriately the spatial correlation, which suggests that relative risks are closer in spatially contiguous regions. The proposed model estimates the FDR by means of the MCMC-estimated posterior probabilities b_i of the null hypothesis (absence of risk) in each area. An estimate of the expected FDR conditional on the data (denoted FDR-hat) can be calculated for any set of areas declared high-risk (where the null hypothesis is rejected) by averaging the corresponding b_i values. The FDR-hat can be used to provide a simple decision rule for selecting high-risk areas, i.e. selecting as many areas as possible such that FDR-hat does not exceed a prefixed value; we call these FDR-hat based decision (or selection) rules.
The sensitivity and specificity of such a rule depend on the accuracy of the FDR estimate: over-estimation of the FDR causes a loss of power, and under-estimation produces a loss of specificity. Moreover, our model has the interesting feature of still being able to provide estimates of the relative risk values, as in the Besag, York and Mollié model (1991). A simulation study was set up to evaluate the model's performance in terms of FDR estimation accuracy, sensitivity and specificity of the decision rule, and goodness of the relative risk estimates. We chose a real map from which we generated several spatial scenarios whose disease counts vary according to the degree of spatial correlation, the area sizes, the number of areas where the null hypothesis is true, and the risk level in the remaining areas. In summarizing the simulation results we always consider the FDR estimation over sets formed by all areas whose b_i falls below a threshold t. We show graphs of FDR-hat and of the true FDR (known by simulation) plotted against the threshold t to assess the FDR estimation. By varying the threshold we can learn which FDR values can be accurately estimated by a practitioner willing to apply the model (from the closeness between FDR-hat and the true FDR). By plotting the calculated sensitivity and specificity (both known by simulation) versus FDR-hat we can check the sensitivity and specificity of the corresponding FDR-hat based decision rules. To investigate the degree of over-smoothing of the relative risk estimates, we compare box-plots of such estimates in the high-risk areas (known by simulation) obtained by both our model and the classic Besag, York and Mollié model. All the summary tools are worked out for all simulated scenarios (54 scenarios in total).
Results show that the FDR is well estimated (in the worst case we obtain an over-estimation, hence conservative FDR control) in scenarios with small areas, low risk levels and spatially correlated risks, which are our primary targets. In such scenarios we have good estimates of the FDR for all values less than or equal to 0.10. The sensitivity of FDR-hat based decision rules is generally low, but the specificity is high; in such scenarios, selection rules based on FDR-hat = 0.05 or FDR-hat = 0.10 can be recommended. In cases where the number of true alternative hypotheses (the number of true high-risk areas) is small, FDR values up to 0.15 are also well estimated, and decision rules based on FDR-hat = 0.15 gain power while maintaining high specificity. On the other hand, in scenarios with non-small areas and non-small risk levels the FDR is under-estimated except for very small values (much lower than 0.05), resulting in a loss of specificity for a decision rule based on FDR-hat = 0.05. In such scenarios, decision rules based on FDR-hat = 0.05 or, even worse, FDR-hat = 0.10 cannot be recommended because the true FDR is actually much higher. As regards relative risk estimation, our model achieves almost the same results as the classic Besag, York and Mollié model. For this reason, our model is of interest for its ability to perform both the estimation of relative risk values and FDR control, except in scenarios with non-small areas and large risk levels. A case study is finally presented to show how the method can be used in epidemiology.
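A minimal sketch of the FDR-hat based selection rule described above, assuming the posterior null probabilities b_i have already been produced by the MCMC fit of the hierarchical model; the numbers below are hypothetical.

```python
import numpy as np

def fdr_select(b, q=0.05):
    """Select high-risk areas by the posterior-probability FDR rule sketched above.

    b : posterior probabilities of the null hypothesis (absence of risk), one per area,
        e.g. MCMC estimates from the hierarchical Bayesian model.
    q : target level for the estimated FDR.
    """
    order = np.argsort(b)                                      # most convincing areas first
    fdr_hat = np.cumsum(b[order]) / np.arange(1, b.size + 1)   # mean b_i over each prefix
    n_sel = int(np.sum(fdr_hat <= q))                          # largest prefix with FDR-hat <= q
    selected = order[:n_sel]
    return selected, (fdr_hat[n_sel - 1] if n_sel else 0.0)

# Example with hypothetical posterior null probabilities for eight areas:
b = np.array([0.01, 0.02, 0.03, 0.20, 0.40, 0.70, 0.90, 0.95])
areas, est = fdr_select(b, q=0.10)
print(areas, est)   # -> first four areas selected, estimated FDR = 0.065
```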