Abstract:
Extensive field tests were conducted using the UCD single-wheel tester with three large radial-ply tractor tires in two different soils, four soil conditions, two axle-load levels, and three tire inflation pressures, in order to quantify the benefits of using low/correct inflation pressures. During these tests slip, net traction, gross traction, and dynamic axle load were recorded. In addition, soil moisture content, cone index, and dry bulk density were measured at the test locations. The analysis showed a significant increase in net traction and traction efficiency when low/correct inflation pressure was used, and the benefits were greater in tilled soil conditions.
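The traction-efficiency gain reported above can be made concrete with a standard definition of tractive efficiency. The abstract does not state the exact formulation used in the study, so the relation below (net-to-gross traction ratio times the slip-reduced travel ratio) is an assumption:

```python
def tractive_efficiency(net_traction, gross_traction, slip):
    """One common form of tractive efficiency:
    TE = (net traction / gross traction) * (1 - slip).
    Lower (correct) inflation pressure raises net traction and lowers
    slip, so both factors improve."""
    return (net_traction / gross_traction) * (1.0 - slip)

# Hypothetical readings: 20 kN net, 25 kN gross, 10% slip.
te = tractive_efficiency(20.0, 25.0, 0.10)
print(round(te, 2))  # 0.72
```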
Abstract:
Modal analysis is widely used in the classical theory of transmission-line modeling. The technique is applied to model the three-phase representation of conventional electric systems, taking into account their self and mutual electrical parameters. However, the methodology has some particularities and inaccuracies for specific applications which are not clearly described in the basic references on the topic. This paper provides a thorough review of modal analysis theory applied to line models, followed by an original and simple procedure to overcome the possible errors embedded in the modal decoupling of the three-phase system model. © 2012 IEEE.
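The modal decoupling reviewed above amounts to diagonalizing the product of the line's shunt-admittance and series-impedance matrices. A minimal NumPy sketch, with hypothetical per-unit-length Z and Y values for an untransposed three-phase line (not values from the paper):

```python
import numpy as np

# Hypothetical per-unit-length series impedance Z (ohm/km) and shunt
# admittance Y (S/km) for a three-phase line with mutual coupling.
Z = np.array([[0.35 + 1.2j, 0.10 + 0.5j, 0.10 + 0.4j],
              [0.10 + 0.5j, 0.35 + 1.2j, 0.10 + 0.5j],
              [0.10 + 0.4j, 0.10 + 0.5j, 0.35 + 1.2j]])
Y = 1j * np.array([[3.2, -0.6, -0.3],
                   [-0.6, 3.2, -0.6],
                   [-0.3, -0.6, 3.2]]) * 1e-6

# The eigenvectors of the YZ product form the voltage transformation
# matrix Tv; Tv^-1 (YZ) Tv is diagonal, i.e. the modes propagate
# independently of each other.
eigvals, Tv = np.linalg.eig(Y @ Z)
M = np.linalg.inv(Tv) @ (Y @ Z) @ Tv
off_diag = M - np.diag(np.diag(M))
print(np.max(np.abs(off_diag)))  # ~0: phases are decoupled into modes
```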
Abstract:
Composting has long been used for recycling organic matter. The teaching and research activities of the university veterinary hospital generate a significant accumulation of animal carcasses: every year, Unesp needs to dispose correctly of about 180 tonnes of this waste, and composting appeared to be the most sustainable alternative. Piles of animal carcasses were prepared using peanut hulls and tree prunings as bulking agents, with water added for the first phase of the process. The pH values of the compost extracts posed no impediment to seed germination and indicated that the compost would be a good addition for soil management. The germination index showed no impediment to seed germination for any type of compost, and the extract concentrations did not influence this biological process. None of the parameters studied indicated a risk of contamination from the carcasses for the compost produced at Unesp according to the proposed design. © 2013 Taylor & Francis Group.
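The germination index referred to above is commonly computed, in Zucconi-style phytotoxicity assays, as relative seed germination times relative root elongation. The exact formula and thresholds used in the study are not stated in the abstract, so the following is an illustrative sketch under that assumption:

```python
def germination_index(germ_sample, germ_control,
                      root_len_sample, root_len_control):
    """Germination index (GI, %) = relative germination * relative root
    elongation * 100. A GI of about 80% or more is commonly read as
    'no phytotoxicity'; both formula and threshold are assumptions here."""
    rel_germ = germ_sample / germ_control
    rel_root = root_len_sample / root_len_control
    return rel_germ * rel_root * 100.0

# Hypothetical counts: 45/50 seeds germinated, mean root 30 mm vs 28 mm.
gi = germination_index(45, 50, 30.0, 28.0)
print(gi > 80.0)  # True: no phytotoxic impediment
```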
Abstract:
Modal analysis is widely used in the classical theory of power-system modelling. The technique is also applied to model multiconductor transmission lines and their self and mutual electrical parameters. However, the methodology has some particularities and inaccuracies for specific applications which are not clearly described in the technical literature. This study provides a brief review of modal decoupling applied to digital transmission-line models, and thereafter a novel, simplified computational routine is proposed to overcome the possible errors introduced by the modal decoupling in the simulation/modelling algorithm. © The Institution of Engineering and Technology 2013.
Abstract:
The main problem with cone-beam computed tomography (CT) systems for industrial applications employing 450 kV X-ray tubes is the large amount of scattered radiation added to the primary radiation (signal). This stray radiation leads to a significant degradation of image quality, so a better understanding of the scattering, and methods to reduce its effects, are necessary. Several studies have been carried out in the medical field at lower energies, whereas studies in industrial CT, especially for energies up to 450 kV, are lacking. Moreover, the studies reported in the literature do not consider the scattered radiation generated by the CT system structure and the walls of the X-ray room (environmental scatter). To investigate the scattering in CT projections, a GEANT4-based Monte Carlo (MC) model was developed. The model, validated against experimental data, enabled the calculation of the scattering including the environmental scatter, the optimization of an anti-scatter grid suitable for the CT system, and the optimization of the system's hardware components. The investigation of multiple scattering in the CT projections showed that its contribution is 2.3 times that of the primary radiation for certain objects. The environmental scatter was found to be the major component of the scattering for aluminum box objects with a front size of 70 × 70 mm², and to depend strongly on the thickness of the object and therefore on the projection; its correction is thus one of the key factors for achieving high-quality images. The anti-scatter grid optimized by means of the MC model reduced the scatter-to-primary ratio in the reconstructed images by 20%. The object and environmental scatter calculated in the simulation were used to improve the scatter-correction algorithm, which could be patented by Empa.
The results showed that the cupping effect in the corrected image is strongly reduced. The developed CT simulation is a powerful tool for optimizing the design of the CT system and for evaluating the contribution of the scattered radiation to the image. It has also provided the basis for a new scatter-correction approach with which it has been possible to achieve images with the same spatial resolution as state-of-the-art well-collimated fan-beam CT, with a factor-of-10 gain in reconstruction time. This result has a high economic impact in non-destructive testing and evaluation, and in reverse engineering.
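At its core, the scatter correction discussed above subtracts a simulated scatter estimate from each measured projection; the patented Empa algorithm is certainly more elaborate, so the following is only a schematic sketch of the idea, including the scatter-to-primary ratio (SPR) used to quantify the problem:

```python
import numpy as np

def scatter_to_primary_ratio(scatter, primary):
    """SPR: total scattered signal relative to the primary (direct) signal."""
    return scatter.sum() / primary.sum()

def scatter_corrected(measured, scatter_estimate, floor=1e-6):
    """Subtract the (MC-simulated) scatter estimate from the measured
    projection to approximate the primary signal; clip at a small floor so
    the logarithm taken before reconstruction stays defined."""
    return np.maximum(measured - scatter_estimate, floor)

primary = np.full((2, 2), 2.0)   # hypothetical primary projection
scatter = np.full((2, 2), 1.0)   # hypothetical scatter contribution
measured = primary + scatter
print(scatter_to_primary_ratio(scatter, primary))                  # 0.5
print(np.allclose(scatter_corrected(measured, scatter), primary))  # True
```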
Abstract:
This thesis provides efficient and robust algorithms for computing the intersection curve between a torus and a simple surface (e.g. a plane, a natural quadric, or another torus), based on algebraic and numerical methods. The algebraic part includes the classification of the topological type of the intersection curve and the detection of degenerate situations such as embedded conic sections and singularities. Moreover, reference points are determined for each connected component of the intersection curve. The required computations are realised efficiently, by solving polynomials of at most quartic degree, and exactly, by using exact arithmetic. The numerical part includes algorithms for tracing each intersection-curve component, starting from the previously computed reference points. Using interval arithmetic, accidental errors such as jumping between branches or skipping parts of the curve are prevented, and the neighbourhoods of singularities are treated correctly. Our algorithms are complete in the sense that any kind of input can be handled, including degenerate and singular configurations. They are verified, since the results are topologically correct and approximate the true intersection curve to any given error bound. They are robust, since no human intervention is required, and efficient, since the treatment of algebraic equations of high degree is avoided.
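The degenerate case of embedded conic sections mentioned above can be illustrated with the implicit form of the torus; the surface parameters below are arbitrary examples, not values from the thesis:

```python
import math

def torus_f(p, R=2.0, r=0.5):
    """Implicit torus with major radius R, minor radius r, centred at the
    origin with the z-axis as its axis:
    f(p) = (sqrt(x^2 + y^2) - R)^2 + z^2 - r^2."""
    x, y, z = p
    return (math.hypot(x, y) - R) ** 2 + z ** 2 - r ** 2

# For the plane z = 0 through the centre, the intersection degenerates
# into two embedded circles of radii R - r and R + r; points on those
# circles satisfy f = 0 exactly.
for rad in (1.5, 2.5):
    print(abs(torus_f((rad, 0.0, 0.0))))  # 0.0 on both circles
```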
Abstract:
Comparing latent constructs (loaded by reflective and congeneric measures) across cultures means studying how these unobserved variables vary, and covary with each other, after controlling for potentially disturbing cultural forces. This leads to the so-called 'measurement invariance' problem, which concerns the extent to which data collected with the same multi-item measurement instrument (i.e., a self-report questionnaire of items underlying common latent constructs) are comparable across different cultural environments. It would be unthinkable to explore latent-variable heterogeneity across populations (e.g., latent means, latent variances, latent covariances, or the magnitude of structural path coefficients for causal relations among latent variables) without controlling for cultural bias in the underlying measures. Furthermore, it would be unrealistic to attempt this correction without a framework able to take all of these potential cultural biases across populations into account simultaneously, since the real world 'acts' simultaneously as well. As a researcher, I may therefore want to control for cultural forces by hypothesizing that they all act at the same time across the comparison groups, and to examine whether they inflate or suppress the new estimates by placing hierarchical nested constraints on the originally estimated parameters. Multi-sample Structural Equation Modeling-based Confirmatory Factor Analysis (MS-SEM-based CFA) remains a dominant and flexible statistical framework for working out this potential cultural bias simultaneously.
With this dissertation I attempt to introduce new viewpoints on measurement invariance under the covariance-based SEM framework, by means of a consumer-behavior modeling application on functional food choices.
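In the MS-SEM-based CFA framework described above, nested invariance constraints (e.g. configural vs. metric invariance) are typically evaluated with a chi-square difference (likelihood-ratio) test between the constrained and unconstrained models. A minimal sketch, using SciPy only for the chi-square tail probability; the fit statistics are placeholders, not values from the dissertation:

```python
from scipy.stats import chi2

def chi2_difference_test(chisq_constrained, df_constrained,
                         chisq_free, df_free):
    """Likelihood-ratio (chi-square difference) test between nested CFA
    models: a non-significant difference supports the added cross-group
    equality constraints (i.e., measurement invariance holds)."""
    d_chi = chisq_constrained - chisq_free
    d_df = df_constrained - df_free
    return d_chi, d_df, chi2.sf(d_chi, d_df)

# Placeholder fit statistics for a constrained (metric) vs. free
# (configural) model; p > .05 would support the invariance constraints.
d_chi, d_df, p = chi2_difference_test(112.4, 52, 104.1, 48)
```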
Abstract:
The aim was to investigate the effect of different speech tasks, i.e. recitation of prose (PR), alliteration (AR) and hexameter (HR) verses, and a control task (mental arithmetic (MA) with voicing of the result), on end-tidal CO2 (PETCO2), cerebral hemodynamics and oxygenation. CO2 levels in the blood are known to strongly affect cerebral blood flow, and speech changes the breathing pattern and may therefore affect CO2 levels. Measurements were performed on 24 healthy adult volunteers during the performance of the 4 tasks. Tissue oxygen saturation (StO2) and absolute concentrations of oxyhemoglobin ([O2Hb]), deoxyhemoglobin ([HHb]) and total hemoglobin ([tHb]) were measured by functional near-infrared spectroscopy (fNIRS), and PETCO2 by a gas analyzer. Statistical analysis was applied to the differences between the baseline before the task, the 2 recitation periods, and the 5 baseline periods after the task. The 2 brain hemispheres and the 4 tasks were tested separately. A significant decrease in PETCO2 was found during all 4 tasks, with the smallest decrease during the MA task. During the recitation tasks, a statistically significant (p < 0.05) decrease in StO2 occurred during PR and AR in the right prefrontal cortex (PFC) and during AR and HR in the left PFC. [O2Hb] decreased significantly during PR, AR and HR in both hemispheres. [HHb] increased significantly during the AR task in the right PFC. [tHb] decreased significantly during HR in the right PFC and during PR, AR and HR in the left PFC. During the MA task, StO2 increased and [HHb] decreased significantly. We conclude that changes in breathing (hyperventilation) during the tasks led to lower CO2 pressure in the blood (hypocapnia), which was predominantly responsible for the measured changes in cerebral hemodynamics and oxygenation.
In conclusion, our findings demonstrate that PETCO2 should be monitored during functional brain studies investigating speech with neuroimaging modalities such as fNIRS and fMRI, to ensure a correct interpretation of changes in hemodynamics and oxygenation.
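The within-subject contrasts described above (task vs. baseline, separately per signal and per hemisphere) reduce to paired comparisons. The abstract does not name the exact statistical test used, so the paired t-statistic below is an illustrative assumption:

```python
import numpy as np

def paired_t(baseline, task):
    """Paired t-statistic for a within-subject baseline-vs-task contrast,
    one such contrast per signal (PETCO2, StO2, [O2Hb], ...) and per
    hemisphere: mean difference divided by its standard error."""
    d = np.asarray(task, dtype=float) - np.asarray(baseline, dtype=float)
    return d.mean() / (d.std(ddof=1) / np.sqrt(d.size))

# Hypothetical per-subject PETCO2 values (kPa), baseline vs. recitation:
t = paired_t([5.1, 5.3, 4.9, 5.0], [4.4, 4.8, 4.1, 4.5])
print(t < 0)  # True: PETCO2 dropped during the task
```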
Abstract:
We describe a fast and unambiguous method for haplotyping the (TG)mTn repeat in IVS8 and determining three other single-nucleotide polymorphisms (SNPs) in exons 10, 14a and 24 of the cystic fibrosis transmembrane conductance regulator (CFTR) gene that affect correct splicing of the CFTR pre-mRNA, using primer extension and mass spectrometry. The diagnostic products are generated by primer-extension (PEX) reactions, which require a single detection primer complementary to a region downstream of a target strand's variable site. On addition of a polymerase and an appropriate mixture of dNTPs and 2',3'-dideoxynucleotide triphosphates (ddNTPs), the primer is extended through the mutation region until the first ddNTP is incorporated, and the mass of the extension products determines the composition of the variable site. Analysis of patient DNA assigned the correct and unambiguous haplotype for the (TG)mTn repeat in intron 8 of the CFTR gene. Additional crucial SNPs influencing correct splicing in exons 10, 14a and 24 can easily be detected by biplexing the assay to genotype allelic variants important for correct splicing of the CFTR pre-mRNA. Different PEX reactions with subsequent mass spectrometry generate sufficient data to enable unambiguous and easy haplotyping of the (TG)mTn repeat in the CFTR gene. The method can easily be extended to additional SNPs of interest by biplexing some of the PEX reactions. All experimental steps required for PEX are amenable to the high degree of automation desirable in a high-throughput diagnostic setting, facilitating the work of clinicians involved in the diagnosis of non-classic cystic fibrosis.
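The mass readout described above works because each allele extends the primer by a different run of bases before the first terminating ddNTP, so the products separate by mass. The residue masses and the genotyping scenario below are rough illustrative assumptions, not values from the paper:

```python
# Approximate average masses (Da) of incorporated nucleotide residues;
# both the numbers and the primer mass are illustrative assumptions.
RESIDUE_MASS = {"A": 313.21, "C": 289.18, "G": 329.21, "T": 304.20}

def extension_mass(primer_mass, extended_bases):
    """Mass of a primer-extension (PEX) product: the primer plus every
    residue added up to and including the chain-terminating ddNTP."""
    return primer_mass + sum(RESIDUE_MASS[b] for b in extended_bases)

# Hypothetical biallelic site: one allele terminates immediately at a
# ddT, the other incorporates one dG first, then the ddT.
m_allele1 = extension_mass(5000.0, "T")
m_allele2 = extension_mass(5000.0, "GT")
print(m_allele2 - m_allele1)  # mass gap of one G residue resolves them
```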
Abstract:
An epidural puncture was performed using the lumbosacral approach in 18 dogs, and the loss of resistance to an injection of saline was used to determine that the needle was positioned correctly. The dogs' arterial blood pressure and epidural pressure were recorded. The dogs were randomly assigned to two groups: in one group a mixture of local anaesthetic agents was injected slowly, over 90 seconds, and in the other it was injected over 30 seconds. After 10 minutes, contrast radiography was used to confirm the correct placement of the needle. The mean (sd) initial pressure in the epidural space was 0.1 (0.7) kPa. After the injection, the mean maximum epidural pressure was 5.5 (2.1) kPa in the slowly injected group and 6.0 (1.9) kPa in the rapidly injected group. At the end of the measurement period, the epidural pressure was 0.8 (0.5) kPa in the slow group and 0.7 (0.5) kPa in the rapid group. Waves synchronous with the arterial pulse were observed in 15 of the dogs before the epidural injection, and in all of the dogs after it.
Abstract:
Three-dimensional datasets representing scalar fields are frequently rendered using isosurfaces. For datasets arranged as a cubic lattice, the marching cubes algorithm is the most widely used isosurface-extraction method. However, the marching cubes algorithm produces ambiguities, which have been resolved by different approaches that normally imply a more complex process. One of them is to tessellate the cubes into tetrahedra and to build the isosurface with a similar method (marching tetrahedra). The main drawback of other tessellations is that they do not produce the same isosurface topologies as those generated by improved marching cubes algorithms. We propose an adaptive tessellation that, being independent of the isovalue, preserves the topology. Moreover, the tessellation allows the isosurface to evolve continuously when the isovalue is changed continuously.
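To make the tessellation idea concrete: a cube can be split into six tetrahedra sharing its main diagonal, and marching tetrahedra then finds isosurface crossings by linear interpolation along tetrahedron edges. Whether a given split preserves the marching-cubes topology is precisely the paper's concern; the split below is one common choice, shown only to illustrate the mechanics:

```python
# One common tessellation of a cube into six tetrahedra sharing the main
# diagonal (cube vertices indexed 0..7 in binary (x, y, z) order). This
# is NOT the paper's adaptive tessellation, just a standard example.
CUBE_TO_TETS = [(0, 1, 3, 7), (0, 1, 5, 7), (0, 2, 3, 7),
                (0, 2, 6, 7), (0, 4, 5, 7), (0, 4, 6, 7)]

def edge_crossing(p0, p1, v0, v1, iso):
    """Point where the isosurface crosses a tetrahedron edge, by linear
    interpolation of the scalar values v0, v1 at the endpoints p0, p1."""
    t = (iso - v0) / (v1 - v0)
    return tuple(a + t * (b - a) for a, b in zip(p0, p1))

# The crossing moves continuously as the isovalue changes continuously:
print(edge_crossing((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 0.0, 1.0, 0.25))
```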
Abstract:
The use of pressure waves to confirm the correct position of the epidural needle has been described in several domestic species and proposed as a valid alternative to standard methods, namely contrast radiography and fluoroscopy. The objective of this retrospective clinical study was to evaluate the sensitivity of the epidural pressure-wave test for verifying correct needle placement in the epidural space in dogs, in order to determine whether the technique could be useful not only in the clinical setting but also when certain knowledge of the needle tip's position is required, for instance in clinical research focusing on epidural anaesthesia. Of the 54 client-owned dogs undergoing elective surgery enrolled in this retrospective study, only 45% showed epidural pressure waves both before and after the epidural injection. Twenty-six percent of the animals showed epidural pressure waves only after the injection, whereas 29% showed epidural pressure waves neither before nor after the injection and were classed as false negatives. Our results show that the epidural pressure-wave technique for verifying epidural needle position lacks sensitivity, resulting in many false negatives. Consequently, the applicability of the technique is limited to situations in which precise knowledge of the needle tip's position is not mandatory.
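The reported percentages map directly onto a standard sensitivity calculation, treating a pressure wave observed at any point (before or after injection) as a positive test; that grouping is an interpretive assumption on our part:

```python
def sensitivity(true_positives, false_negatives):
    """Sensitivity (recall) of a diagnostic test: TP / (TP + FN)."""
    return true_positives / (true_positives + false_negatives)

# From the study (n = 54): 45% showed waves before and after injection,
# 26% only after, and 29% never (false negatives, since placement was
# correct). Using the percentages directly as counts out of 100:
sens = sensitivity(45 + 26, 29)
print(round(sens, 2))  # 0.71
```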