Abstract:
Background: The main function of the mucociliary system is the removal of particles or substances that are potentially harmful to the respiratory tract. The therapeutic use of tuning forks for bronchial hygiene has not yet been described in the literature. The optimal vibration frequency for mobilizing secretions is widely debated and varies between 3 and 25 Hz. A tuning fork is expected to generate vibrations in the thorax, facilitating bronchial hygiene. The aim of the present study is to develop tuning forks with different frequencies for use in bronchopulmonary hygiene therapy. Methods: The first tuning fork was made with a fixed frequency of 25 Hz and was registered with the Brazilian patent office. This device generated a frequency of 25 Hz, weighed 521 g, and had a total length of 600 mm. It is characterized by a bottom end containing a transducer 62 mm in diameter and 5/16" (8 mm) thick, a removable rod of 148 mm, a fork length of 362 mm, and bilaterally sinuous extensions at the upper end. The tuning fork must be applied at an angle of 90° directly to the patient's chest wall after pulmonary auscultation to locate secretions. It is activated by squeezing the tips of the extensions together and releasing them in a sudden movement. Results: This study presents the development of three further tuning forks of different dimensions, generating different frequencies. Each device reaches a fixed preset frequency of 12, 15 or 20 Hz, as measured with a digital oscilloscope. Conclusions: The tuning fork models developed in this study generated the different frequencies reported in the scientific literature as effective for mobilizing pulmonary secretions.
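The dependence of frequency on fork dimensions can be sketched with the standard cantilever-beam result for a vibrating prong (a textbook approximation, not a formula taken from the paper; E, I, ρ, A and L are the prong's Young's modulus, second moment of area, density, cross-sectional area and length):

```latex
% Fundamental flexural frequency of a prong modelled as a clamped-free
% (cantilever) Euler-Bernoulli beam; 1.875 is the first-mode constant.
f \approx \frac{(1.875)^2}{2\pi L^2}\sqrt{\frac{EI}{\rho A}}
```

Since f scales as 1/L², progressively longer prongs would yield the lower preset frequencies (20, 15, 12 Hz) for a fixed cross-section and material.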
Abstract:
Prostate cancer is a serious public health problem, accounting for up to 30% of clinical tumors in men. The diagnosis of this disease is made with clinical, laboratory and radiological exams, which may indicate the need for transrectal biopsy. Prostate biopsies are carefully evaluated by pathologists in an attempt to determine the most appropriate conduct. This paper presents a set of techniques for identifying and quantifying regions of interest in prostatic images. Analyses were performed using multi-scale lacunarity and distinct classification methods: decision tree, support vector machine and polynomial classifier. The performance evaluation measures were based on the area under the receiver operating characteristic curve (AUC). The most appropriate region for distinguishing the different tissues (normal, hyperplastic and neoplastic) was defined: the corresponding lacunarity values and a rule model were obtained considering combinations commonly explored by specialists in clinical practice. The best discriminative values (AUC) were 0.906, 0.891 and 0.859 for the neoplastic versus normal, neoplastic versus hyperplastic and hyperplastic versus normal groups, respectively. The proposed protocol offers the advantage of making the findings comprehensible to pathologists.
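The AUC figures reported above can be computed directly from classifier scores as the Mann-Whitney two-sample statistic; a minimal sketch (the lacunarity features and the classifiers themselves are not reproduced here):

```python
def auc(neg_scores, pos_scores):
    """Empirical AUC: the probability that a randomly chosen positive
    case scores higher than a randomly chosen negative case, counting
    ties as one half (Mann-Whitney formulation)."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Perfect separation of the two groups gives AUC = 1.0
print(auc([0.1, 0.2], [0.3, 0.4]))
```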
Abstract:
The algorithm creates a buffer area around the cartographic features of interest in one of the images and compares it with the other image. During the comparison, the algorithm counts the matching and non-matching points and uses these counts to calculate the statistical values of the analysis. One such value is correctness, which shows the user the percentage of points that were correctly extracted. Another is completeness, which shows the percentage of points that really belong to the feature of interest. The third value expresses the quality achieved by the extraction method, since quality is calculated from the previously computed correctness and completeness. In all tests performed with this algorithm, the calculated statistical values could be used to quantitatively represent the quality achieved by the extraction method. It is therefore possible to say that the developed algorithm can be used to analyze extraction methods for cartographic features of interest, since the results obtained were promising.
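The three statistics can be illustrated with the formulas standard in cartographic feature extraction — correctness = TP/(TP+FP), completeness = TP/(TP+FN) and quality = TP/(TP+FP+FN). These exact definitions are an assumption, and the point-in-radius test below is a simplified Euclidean stand-in for the paper's raster buffer:

```python
def extraction_stats(extracted, reference, buffer_radius):
    """Compare extracted 2-D points against reference points: a point is
    a match if it lies within buffer_radius of the other set (Euclidean).
    Returns (correctness, completeness, quality)."""
    def near(p, pts):
        return any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
                   <= buffer_radius ** 2 for q in pts)

    tp = sum(near(p, reference) for p in extracted)      # correctly extracted
    fp = len(extracted) - tp                             # extracted but wrong
    fn = sum(not near(q, extracted) for q in reference)  # reference missed
    correctness = tp / (tp + fp) if extracted else 0.0
    completeness = (len(reference) - fn) / len(reference) if reference else 0.0
    quality = tp / (tp + fp + fn) if (tp + fp + fn) else 0.0
    return correctness, completeness, quality
```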
Abstract:
Wireless networks are widely deployed and have many uses, for example in critical embedded systems. Applications of this kind of network meet the common needs of most embedded systems while addressing the particularities of each scenario, such as limited computing resources and energy supply. Problems such as denial-of-service attacks are commonplace and cause great inconvenience. Thus, this study presents simulations of denial-of-service attacks on 802.11 wireless networks using the network simulator OMNeT++. Furthermore, we present an approach to mitigate such attacks, obtaining significant results for improving wireless networks.
Analysis of oxy-fuel combustion as an alternative to combustion with air in metal reheating furnaces
Abstract:
Using oxygen instead of air in a combustion process is currently being widely discussed as an option to reduce CO2 emissions. One of the possibilities is to maintain the combustion reaction at the same energy release level as burning with air, which reduces fuel consumption and CO2 emission rates. A thermal simulation was carried out for metal reheating furnaces, which operate at temperatures in the range of 1150-1250 °C, using natural gas with a 5% excess of oxygen and maintaining fixed values for pressure and combustion temperature. The theoretical results show that it is possible to reduce fuel consumption, and this reduction depends on the amount of heat that can be recovered during the air pre-heating process. The analysis was further conducted considering the 2012 costs of natural gas and oxygen in Brazil. The use of oxygen proved to be economically viable for large furnaces that operate with conventional heat recovery systems (those that provide pre-heated air at temperatures near 400 °C).
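The fuel saving comes from removing the nitrogen ballast of air; taking natural gas as methane (an assumption, since natural gas composition varies), the stoichiometry is:

```latex
% Combustion with pure oxygen:
\mathrm{CH_4} + 2\,\mathrm{O_2} \;\rightarrow\; \mathrm{CO_2} + 2\,\mathrm{H_2O}
% Combustion with air (about 3.76 mol N2 per mol O2):
\mathrm{CH_4} + 2\,(\mathrm{O_2} + 3.76\,\mathrm{N_2}) \;\rightarrow\;
\mathrm{CO_2} + 2\,\mathrm{H_2O} + 7.52\,\mathrm{N_2}
```

The 7.52 mol of inert N2 per mol of fuel must be heated to furnace temperature and leaves with the flue gas; oxy-fuel firing avoids this loss, which is why the savings depend on how much of that heat an air pre-heater would otherwise have recovered.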
Abstract:
The proliferation of new electronic devices has generated a considerable increase in the acquisition of spatial data; hence these data are becoming more and more widely used. As with conventional data, spatial data need to be analyzed so that interesting information can be retrieved from them. Therefore, data clustering techniques can be used to extract clusters from a set of spatial data. However, current approaches do not consider the implicit semantics that exist between a region and an object's attributes. This paper presents an approach that enhances the spatial data mining process so that it can use the semantics that exist within a region. A framework, OntoSDM, was developed that enables spatial data mining algorithms to communicate with ontologies in order to improve their results. The experiments demonstrated semantically improved results, generating more interesting clusters and therefore reducing the manual analysis work of an expert.
Abstract:
When teaching a motor skill, a physical activity professional evaluates the learner's movement and considers which interventions could be made at that moment. However, instructors often lack resources that could help them evaluate the learner's movement. The skill acquisition process could be facilitated if instructors had an instrument that identifies errors, prioritizing the information to be given to the learner. Considering that the specialized literature lacks information about such a tool, the purpose of this study was to develop, and to determine the objectivity and reliability of, an instrument to assess the movement quality of the basketball free throw. The checklist was developed and evaluated by basketball experts. Additionally, it was used to assess 10 trials (edited video) from four individuals at different learning stages. Data were organized by the critical error and the error sum indicated by the experts on two occasions (one week apart). Contrasting the two evaluations, and also contrasting different experts' assessments of the error sum and critical error, an average error of 16.9% was observed. It was concluded that the checklist for assessing the basketball free throw is reliable and could help instructors make a qualitative analysis. Moreover, the checklist may allow instructors to make assumptions about the motor learning process.
Abstract:
The objective of the present study is to propose a method to dynamically evaluate the discomfort of a passenger seat by measuring the interface pressure between the occupant and the seat during the most common activities of a typical flight. This article reports the results of resting and reading studies performed in a simulator that represents the interior of a commercial aircraft.
Weibull and generalised exponential overdispersion models with an application to ozone air pollution
Abstract:
We consider the problem of estimating the mean and variance of the time between occurrences of an event of interest (inter-occurrence times), where some forms of dependence between two consecutive time intervals are allowed. Two basic density functions are taken into account: the Weibull and the generalised exponential density functions. In order to capture the dependence between two consecutive inter-occurrence times, we assume that the shape and/or scale parameters of the two density functions are given by auto-regressive models. Expressions for the mean and variance of the inter-occurrence times are presented. The models are applied to ozone data from two regions of Mexico City. The estimation of the parameters is performed from a Bayesian point of view via Markov chain Monte Carlo (MCMC) methods.
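For the Weibull case, the moments being estimated have closed forms; with shape k and scale σ they are the standard expressions below, and the final line shows one possible log-linear auto-regressive link for the scale (an illustrative assumed form, not necessarily the parameterisation used in the paper):

```latex
\mathbb{E}(T) = \sigma\,\Gamma\!\left(1 + \tfrac{1}{k}\right),
\qquad
\operatorname{Var}(T) = \sigma^{2}\left[\Gamma\!\left(1 + \tfrac{2}{k}\right)
  - \Gamma^{2}\!\left(1 + \tfrac{1}{k}\right)\right]
% illustrative AR(1) link between consecutive intervals (assumed form):
\log \sigma_{t} = \alpha + \beta \log \sigma_{t-1}
```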
Abstract:
VIBRATIONAL ANALYSIS OF COORDINATION COMPOUNDS OF NICKEL(II): AN APPROACH TO THE TEACHING OF POINT GROUPS. This paper presents an IR and Raman experiment carried out during the course "Chemical Bonds" for undergraduate students of Science and Technology and Chemistry at the Federal University of ABC, in order to facilitate and encourage the teaching and learning of group theory. Some key aspects of this theory are also outlined. We believe that student learning became more significant with the introduction of this experiment, as there was an increase in the level of the discussions and in performance during evaluations. This work also proposes a multidisciplinary approach that includes the use of quantum chemistry tools.
Abstract:
We propose a new general Bayesian latent class model, based on a computationally intensive approach, for evaluating the performance of multiple diagnostic tests in situations in which no gold standard test exists. The modeling represents an interesting and suitable alternative to models with complex structures that involve the general case of several conditionally independent diagnostic tests, covariates, and strata with different disease prevalences. The technique of stratifying the population according to different disease prevalence rates does not add marked complexity to the modeling, but it makes the model more flexible and interpretable. To illustrate the proposed general model, we evaluate the performance of six diagnostic screening tests for Chagas disease, considering some epidemiological variables. Serology at the time of donation (negative, positive, inconclusive) was considered as a stratification factor in the model. The general model with stratification of the population performed better than its counterparts without stratification. The group formed by the testing laboratory Biomanguinhos FIOCRUZ-kit (c-ELISA and rec-ELISA) is the best option in the confirmation process, presenting a false-negative rate of 0.0002% under the serial scheme. We are 100% sure that the donor is healthy when these two tests both give negative results, and that the donor is chagasic when they both give positive results.
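A combined rate as low as 0.0002% is consistent with multiplying individual false-negative rates, which holds when a donor is declared negative only if every test in the scheme is negative and the tests are conditionally independent given infection status; a sketch of that arithmetic (the rates below are made-up illustrative numbers, not the paper's):

```python
def combined_false_negative(fn_rates):
    """Probability that an infected donor is missed by every test in the
    scheme, assuming conditional independence given infection status:
    the product of the individual false-negative rates."""
    p = 1.0
    for rate in fn_rates:
        p *= rate
    return p

# e.g. two tests that each miss 1% and 0.02% of infected samples
# jointly miss about 2e-06 of them (illustrative numbers only)
print(combined_false_negative([0.01, 0.0002]))
```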
Abstract:
A common interest in gene expression data analysis is to identify, from a large pool of candidate genes, the genes that present significant changes in expression levels between a treatment and a control biological condition. Usually this is done using a statistic and a cutoff value that separate the differentially from the nondifferentially expressed genes. In this paper, we propose a Bayesian approach to identify differentially expressed genes by sequentially calculating credibility intervals from predictive densities, which are constructed using the sampled mean treatment effect of all genes under study, excluding the treatment effects of genes previously identified as showing statistical evidence of difference. We compare our Bayesian approach with standard ones based on the t-test and modified t-tests via a simulation study, using the small sample sizes that are common in gene expression data analysis. The results provide evidence that the proposed approach performs better than the standard ones, especially for cases with mean differences and increases in treatment variance relative to the control variance. We also apply the methodologies to a well-known publicly available data set on the bacterium Escherichia coli.
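The sequential idea — flag a gene, then rebuild the interval from the genes not yet flagged — can be caricatured with a normal-theory sketch. Note this replaces the paper's MCMC-based predictive densities with a simple mean ± z·sd interval, purely for illustration:

```python
import statistics

def flag_differential(effects, z=3.0):
    """Sequentially flag effects lying outside a z-sd interval built
    from the remaining unflagged effects (candidate excluded), stopping
    when the most extreme remaining effect falls inside the interval."""
    flagged = set()
    while True:
        active = [i for i in range(len(effects)) if i not in flagged]
        if len(active) < 3:
            break

        def gap(i):
            others = [effects[j] for j in active if j != i]
            return abs(effects[i] - statistics.mean(others))

        cand = max(active, key=gap)  # most extreme remaining gene
        others = [effects[j] for j in active if j != cand]
        m, s = statistics.mean(others), statistics.stdev(others)
        if abs(effects[cand] - m) > z * s:
            flagged.add(cand)        # evidence of differential expression
        else:
            break                    # remaining genes look homogeneous
    return flagged
```

Excluding already-flagged genes (and the current candidate) from the interval mirrors the abstract's key point: extreme genes do not inflate the reference distribution used to judge the rest.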
Abstract:
The birth of a child with ambiguous genitalia is a challenging and distressing event for the family and physician and one with life-long consequences. Most disorders of sexual differentiation (DSD) associated with ambiguous genitalia are the result either of inappropriate virilization of girls or incomplete virilization of boys. It is important to establish a diagnosis as soon as possible, for psychological, social, and medical reasons, particularly for recognizing accompanying life-threatening disorders such as the salt-losing form of congenital adrenal hyperplasia. In most instances, there is sufficient follow-up data so that making the diagnosis also establishes the appropriate gender assignment (infants with congenital adrenal hyperplasia, those with androgen resistance syndromes), but some causes of DSD such as steroid 5 alpha-reductase 2 deficiency and 17 beta-hydroxysteroid dehydrogenase deficiency are associated with frequent change in social sex later in life. In these instances, guidelines for sex assignment are less well established.
Abstract:
In this article, we propose a new Bayesian flexible cure rate survival model, which generalises the stochastic model of Klebanov et al. [Klebanov LB, Rachev ST and Yakovlev AY. A stochastic model of radiation carcinogenesis: latent time distributions and their properties. Math Biosci 1993; 113: 51-75] and has much in common with the destructive model formulated by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de Sao Carlos, Sao Carlos-SP, Brazil, 2009 (accepted in Lifetime Data Analysis)]. In our approach, the accumulated number of lesions or altered cells follows a compound weighted Poisson distribution. This model is more flexible than the promotion time cure model in terms of dispersion. Moreover, it possesses an interesting and realistic interpretation of the biological mechanism of the occurrence of the event of interest, as it includes a destructive process of tumour cells after an initial treatment, or the capacity of an individual exposed to irradiation to repair the altered cells that result in cancer induction. In other words, what is recorded is only the damaged portion of the original number of altered cells not eliminated by the treatment or repaired by the individual's repair system. Markov chain Monte Carlo (MCMC) methods are then used to develop Bayesian inference for the proposed model. Finally, some discussion of model selection and an illustration with the cutaneous melanoma data set analysed by Rodrigues et al. are presented.
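For reference, the promotion time cure model that the proposal is compared against: with N ∼ Poisson(θ) latent altered cells and activation times that are i.i.d. with distribution F, the population survival function is

```latex
S_{\mathrm{pop}}(t)
  = \sum_{n=0}^{\infty} \frac{e^{-\theta}\theta^{n}}{n!}\,[1 - F(t)]^{n}
  = \exp\{-\theta F(t)\},
\qquad
\lim_{t\to\infty} S_{\mathrm{pop}}(t) = e^{-\theta}\ \text{(cure fraction)}.
```

Replacing the Poisson count N with a compound weighted Poisson count, as the proposed model does, relaxes the Poisson's equal-mean-and-variance restriction and allows over- or under-dispersion.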