977 results for Simple methods
Abstract:
The electrohysterogram (EHG) is a new instrument for pregnancy monitoring. It measures the electrical signal of the uterine muscle, which is closely related to uterine contractions. The EHG is described as a viable and more precise alternative to the external tocogram, currently the most widely used method for describing uterine contractions. The EHG has also been indicated as a promising tool in the assessment of preterm delivery risk. This work intends to contribute towards the characterization of the EHG through an inventory of its components, namely: contractions; labor contractions; Alvarez waves; fetal movements; and long duration low frequency waves. The instruments used for cataloging were parametric and non-parametric spectral analysis, energy estimators, time-frequency methods, and tocograms annotated by expert physicians. The EHG signals and respective tocograms were obtained from the Icelandic 16-electrode Electrohysterogram Database, and 288 components were classified. No component database of this type was previously available for consultation. A spectral analysis and power estimation module was added to Uterine Explorer, an EHG analysis software package developed at FCT-UNL. The importance of this component database lies in the need to improve the understanding of the EHG, which is a relatively complex signal, and to contribute towards the detection of preterm birth. Preterm birth accounts for 10% of all births and is one of the most relevant obstetric conditions. Despite the technological and scientific advances in perinatal medicine, prematurity remains the major cause of neonatal death in developed countries. Although various risk factors, such as previous preterm births, infection, uterine malformations, multiple gestation, and a short uterine cervix in the second trimester, have been associated with this condition, its etiology remains unknown [1][2][3].
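As a rough illustration of the non-parametric branch of the cataloging instruments, the sketch below estimates an EHG power spectrum with Welch's method. It is a minimal, hypothetical Python example: the sampling rate, segment length and analysis band are assumptions, not values taken from the thesis or from the Icelandic database.

    import numpy as np
    from scipy.signal import welch

    fs = 200.0                            # assumed sampling rate, Hz
    ehg = np.random.randn(60 * int(fs))   # placeholder for one EHG channel segment

    # EHG energy of interest lies at very low frequencies, so a long window
    # is used to resolve it.
    f, pxx = welch(ehg, fs=fs, nperseg=4096)
    band = (f >= 0.1) & (f <= 4.0)        # assumed analysis band
    print(f[band][np.argmax(pxx[band])])  # dominant frequency in the band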
Abstract:
INTRODUCTION: Molecular analyses are auxiliary tools for detecting Koch's bacillus in clinical specimens from patients with suspected tuberculosis (TB). However, there are still no efficient diagnostic tests that combine high sensitivity and specificity with rapid results in the detection of TB. This study evaluated single-tube nested polymerase chain reaction (STNPCR), a molecular diagnostic test with a low risk of cross-contamination, for detecting Mycobacterium tuberculosis in clinical samples. METHODS: Mycobacterium tuberculosis deoxyribonucleic acid (DNA) was detected in blood and urine samples by STNPCR followed by agarose gel electrophoresis. In this system, reaction tubes were not opened between the two stages of PCR (simple and nested). RESULTS: STNPCR demonstrated good accuracy in clinical samples with no cross-contamination between microtubes. Sensitivity in blood and urine, analyzed in parallel, was 35%-62% for pulmonary and 41%-72% for extrapulmonary TB. The specificity of STNPCR was 100% in most analyses, depending on the type of clinical sample (blood or urine) and the clinical form of the disease (pulmonary or extrapulmonary). CONCLUSIONS: STNPCR was effective in detecting TB, especially the extrapulmonary form, for which sensitivity was higher, and had the advantage of less invasive sample collection from patients for whom a spontaneous sputum sample was unavailable. With a low risk of cross-contamination, STNPCR can be used as an adjunct to conventional methods for diagnosing TB.
Abstract:
INTRODUCTION: The acceptance of the IT LEISH® and of the direct agglutination test produced at the Laboratório de Pesquisas Clínicas (DAT-LPC) by healthcare professionals and patients with suspected visceral leishmaniasis (VL) in Ribeirão das Neves was evaluated. METHODS: Ninety-two patients and 47 professionals completed three questionnaires. RESULTS: Eighty-eight (96%) patients considered fingertip blood collection a positive feature of the test, and 86% (37) and 91% of professionals considered the IT LEISH® easy to perform and easy to interpret, respectively. All professionals classified the DAT-LPC as simple and easy. CONCLUSIONS: Patients and healthcare professionals in Ribeirão das Neves demonstrated a high degree of acceptance of the IT LEISH® and the DAT-LPC.
Abstract:
The hospital pharmacy in large and advanced institutions has evolved from a simple storage and distribution unit into a highly specialized manipulation and dispensation center, responsible for handling hundreds of clinical requests, many of them unique and not obtainable from commercial companies. It was therefore quite natural that in many settings a manufacturing service was gradually established to cater to both the conventional and the extraordinary demands of the medical staff. That was the case at Hospital das Clinicas, where multiple categories of drugs are routinely produced inside the pharmacy. However, cost-containment imperatives dictate that such activities be reassessed in light of their efficiency and essentiality. METHODS: In a prospective study, the output of the Manufacturing Service of the Central Pharmacy during a 12-month period was documented and classified into three types: Group I comprised drugs similar to commercially distributed products, Group II included exclusive formulations for routine consumption, and Group III dealt with special demands related to clinical investigations. RESULTS: The three categories represented 34.4%, 45.3%, and 20.3% of total manufacturing orders, respectively. Costs of production were assessed and compared with market prices for Group I preparations, indicating savings of 63.5%. When extrapolated to the other groups, for which no direct market equivalent existed, these results suggest total yearly savings of over US$5,100,000. Even considering that these calculations leave out many components of cost, notably those concerning marketing and distribution, it can still be concluded that at least part of the savings achieved were real. CONCLUSIONS: The observed savings, allied with the convenience and reliability with which the Central Pharmacy performed its obligations, support the contention that internal manufacture of pharmaceutical formulations was a cost-effective alternative in the described setting.
Abstract:
The study of AC losses in superconducting pancake coils is of utmost importance for the development of superconducting devices. Due to different technical difficulties, this study is usually performed under one of two approaches: superconducting coils with few turns studied over a large frequency range, or superconducting coils with a large number of turns whose AC losses are measured only at low frequencies. In this work, AC losses are studied in 128-turn superconducting coils at frequencies ranging from 50 Hz to 1152 Hz and currents ranging from zero up to the critical current of the coils. Moreover, AC losses under two simultaneous harmonic components are also studied, and the results are compared with the behaviour of the coils operating in a single-frequency regime. Different electrical methods are used to verify the total AC losses in the coil, and a simple calorimetric method is presented to measure AC losses in a multi-harmonic context. Different analytical and numerical methods are implemented and/or used to design the superconducting coils and to compute the total AC losses in the superconducting system, and a comparison is performed to assess the advantages and drawbacks of each method.
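For context, a common analytical baseline for transport AC losses in superconducting tapes is the Norris thin-strip formula, which gives the hysteretic loss per cycle per unit length. The sketch below implements this standard textbook expression, not the specific analytical or numerical methods of this work; the currents are hypothetical.

    import numpy as np

    MU0 = 4e-7 * np.pi  # vacuum permeability, H/m

    def norris_strip_loss(i_peak, i_c):
        # Transport loss per cycle per unit length (J/m) for a thin strip.
        f = i_peak / i_c  # normalized current amplitude, 0 < f < 1
        return (MU0 * i_c**2 / np.pi) * ((1 - f) * np.log(1 - f)
                                         + (1 + f) * np.log(1 + f) - f**2)

    q = norris_strip_loss(i_peak=80.0, i_c=100.0)  # hypothetical currents, A
    for freq in (50.0, 1152.0):                    # frequency range of the study
        print(freq, q * freq)                      # loss power, W per metre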
Abstract:
OBJECTIVES: To determine the efficacy of a simple, short-term, and low-cost eradication treatment for Helicobacter pylori (H. pylori), using omeprazole, tetracycline, and furazolidone, in a Brazilian peptic ulcer population divided into two subgroups: untreated and previously treated for the infection. PATIENTS AND METHODS: Patients with peptic ulcer disease diagnosed by endoscopic examination and with H. pylori infection diagnosed by the rapid urease test (RUT) and histological examination, either untreated or previously unsuccessfully treated with macrolides and nitroimidazoles, received omeprazole 20 mg once daily plus tetracycline 500 mg and furazolidone 200 mg three times a day for 7 days. Another endoscopy or a breath test was performed 12 weeks after the end of treatment. Patients were considered cured of the infection if the RUT and histologic examination proved negative or if the breath test was negative for the bacterium. RESULTS: Sixty-four patients were included in the study. Women predominated (58%), and the mean age was 46 years. Thirty-three percent of the patients were tobacco users, and duodenal ulcer was identified in 80% of patients. Among the 59 patients who underwent follow-up examinations, eradication was verified in 44 (75%); the eradication rate in the intention-to-treat analysis was 69%. The incidence of severe adverse effects was 15%. CONCLUSION: The treatment provides good efficacy for H. pylori eradication, including in patients previously treated without success, but it causes severe adverse effects that prevented adequate use of the medications in 15% of the patients.
Abstract:
Since the financial crisis, risk-based portfolio allocations have gained a great deal of popularity, primarily because they make no assumptions about the expected returns of the assets in the portfolio. These portfolios implicitly put risk management at the heart of asset allocation, hence their recent appeal. This paper compares four well-known risk-based portfolio allocation methods: minimum variance, maximum diversification, inverse volatility, and equally weighted risk contribution. Empirical backtests are performed over rising-interest-rate periods from 1953 to 2015. Additionally, I compare these portfolios to simpler allocation methods, such as equal weighting and a 60/40 asset-allocation mix. This paper helps to answer the question of whether these portfolios can survive in a rising-interest-rate environment.
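Two of the compared allocations have simple closed forms, which the hypothetical sketch below illustrates: inverse volatility weights each asset by the reciprocal of its return standard deviation, while the unconstrained minimum-variance portfolio solves w proportional to the inverse covariance matrix applied to a vector of ones. The returns here are random placeholders, not the 1953-2015 data used in the paper.

    import numpy as np

    returns = np.random.randn(250, 4) * 0.01   # placeholder daily returns, 4 assets
    cov = np.cov(returns, rowvar=False)

    # Inverse volatility: weight by 1/sigma_i, then normalize.
    iv = 1.0 / np.sqrt(np.diag(cov))
    w_iv = iv / iv.sum()

    # Minimum variance (fully invested, no constraints): solve cov @ x = 1.
    ones = np.ones(cov.shape[0])
    w_mv = np.linalg.solve(cov, ones)
    w_mv /= w_mv.sum()

    print(w_iv, w_mv)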
Abstract:
This thesis proposes a Monte Carlo valuation method for worst-of auto-callable equity swaps. The valuation of this type of swap usually requires complex numerical methods implemented in “black-box” valuation systems. The proposed method is an alternative benchmarking tool that is relatively simple to implement and customize. Its performance was evaluated according to the variance and bias of the output, and according to its accuracy when compared with a leading valuation system in the market.
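A heavily simplified sketch of the underlying idea follows: simulate correlated geometric Brownian motion paths for the basket, check the worst-of performance at each autocall date, and discount the resulting cash flows. All parameters (rate, volatilities, correlation, barrier, coupon) are assumptions for illustration; a real worst-of auto-callable swap adds funding legs, conditional coupons and market-implied dynamics that are omitted here.

    import numpy as np

    rng = np.random.default_rng(0)
    r, T = 0.02, 3.0                         # assumed flat rate and maturity
    sigma = np.array([0.25, 0.30])           # assumed per-asset volatilities
    chol = np.linalg.cholesky(np.array([[1.0, 0.6], [0.6, 1.0]]))
    n_paths, n_steps = 100_000, 36
    dt = T / n_steps
    obs_steps = {12, 24, 36}                 # annual autocall observation dates

    s = np.ones((n_paths, 2))                # performance vs. initial level
    payoff = np.zeros(n_paths)
    alive = np.ones(n_paths, dtype=bool)

    for k in range(1, n_steps + 1):
        z = rng.standard_normal((n_paths, 2)) @ chol.T
        s *= np.exp((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z)
        if k in obs_steps:
            worst = s.min(axis=1)
            called = alive & (worst >= 1.0)              # barrier at 100% (assumed)
            payoff[called] = np.exp(-r * k * dt) * 1.05  # redemption + 5% coupon
            alive &= ~called
    # Paths never called redeem at the worst-of performance (no protection).
    payoff[alive] = np.exp(-r * T) * s.min(axis=1)[alive]
    print(payoff.mean(), payoff.std() / np.sqrt(n_paths))  # estimate, MC error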
Abstract:
Grasslands in semi-arid regions, like the Mongolian steppes, are facing desertification and degradation due to climate change. Mongolia's main economic activity consists of extensive livestock production, so this is a pressing matter for decision makers. Remote sensing and Geographic Information Systems provide the tools for advanced ecosystem management and have been widely used for monitoring and managing pasture resources. This study investigates the highest thematic detail that can be achieved with remote sensing to map the steppe vegetation, using medium-resolution earth observation imagery in three districts (soums) of Mongolia: Dzag, Buutsagaan and Khureemaral. After considering different thematic levels of detail for classifying the steppe vegetation, the existing pasture types within the steppe were chosen for mapping. To investigate which combination of data sets yields the best results and which classification algorithm is more suitable for incorporating these data sets, different classification methods were compared over the study area. Sixteen classifications were performed using different combinations of estimators, Landsat-8 data (spectral bands and Landsat-8-derived NDVI) and geophysical data (elevation, mean annual precipitation and mean annual temperature), with two classification algorithms: maximum likelihood and decision tree. Results showed that the best-performing model was the one that incorporated the Landsat-8 bands with mean annual precipitation and mean annual temperature (Model 13), using the decision tree. For maximum likelihood, the model that incorporated the Landsat-8 bands with mean annual precipitation (Model 5) and the one that incorporated the Landsat-8 bands with mean annual precipitation and mean annual temperature (Model 13) achieved the highest accuracies for this algorithm. The decision tree models consistently outperformed the maximum likelihood ones.
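As a small illustration of one of the model inputs: for Landsat-8, NDVI is computed from the red (band 4) and near-infrared (band 5) reflectances as NDVI = (NIR − Red) / (NIR + Red). The arrays below are placeholders for real rasters, not data from this study.

    import numpy as np

    red = np.random.rand(100, 100)   # placeholder Landsat-8 band 4 reflectance
    nir = np.random.rand(100, 100)   # placeholder Landsat-8 band 5 reflectance

    ndvi = (nir - red) / (nir + red + 1e-12)  # epsilon guards against division by zero
    print(ndvi.min(), ndvi.max())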
Abstract:
Since the last decade of the twentieth century, the healthcare industry has been paying attention to the environmental impact of its buildings, and therefore new regulations, policy goals and Healthcare Building Sustainability Assessment (HBSA) methods are being developed and implemented. At present, healthcare is one of the most regulated industries and also one of the largest consumers of energy per net floor area. To assess the sustainability of healthcare buildings it is necessary to establish a set of benchmarks related to their life-cycle performance. Benchmarks are essential both to rate the sustainability of a project and to support designers and other stakeholders in designing and operating a sustainable building, by allowing a project to be compared with conventional and best market practices. This research focuses on the methodology for setting benchmarks for resource consumption, waste production, operating costs and potential environmental impacts related to the operational phase of healthcare buildings. It aims to reduce the subjectivity found in the definition of the benchmarks used in Building Sustainability Assessment (BSA) methods, and it is applied in the Portuguese context. These benchmarks will be used in the development of a Portuguese HBSA method.
Abstract:
As a renewable energy source, forest biomass for electricity generation is advantageous in comparison with fossil fuels; however, the activity of forest biomass power plants causes adverse impacts, particularly on neighbouring communities. The main objective of this study is to estimate the effects of the activity of forest biomass power plants on the welfare of two groups of stakeholders, local residents and the general population, by applying two stated preference methods: contingent valuation and discrete choice experiments, respectively. The former was used to estimate the minimum compensation that residents of communities neighbouring two forest biomass power plants in Portugal would be willing to accept. The latter was applied to the general population to estimate their willingness to pay to avoid specific environmental impacts. The results show that the presence of the selected facilities affects individuals' well-being, and in the discrete choice experiments conducted among the general population all impacts considered were significant determinants of respondents' welfare levels. These results stress the importance of performing an equity analysis of the welfare effects of installing forest biomass power plants on different groups of stakeholders, as the effects are location and impact specific. Policy makers should take into account the views of all stakeholders, whether directly or indirectly involved, when deciding crucial issues regarding the siting of new forest biomass power plants, in order to achieve an efficient and equitable outcome.
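For readers unfamiliar with how discrete choice experiments yield welfare estimates: in a conditional logit with linear utility, the marginal willingness to pay for attribute k is the coefficient ratio −β_k / β_cost. The coefficients below are hypothetical, purely to show the computation, and are not results from this study.

    beta_cost = -0.08     # hypothetical coefficient on the payment attribute
    beta_impact = -0.45   # hypothetical coefficient on a negative impact level
    mwtp = -beta_impact / beta_cost   # marginal WTP for the impact: -5.625
    print(-mwtp)                      # WTP to avoid the impact: 5.625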
Abstract:
To solve a health and safety problem at a waste treatment facility, different multicriteria decision methods were used, including the PROV Exponential decision method. Four alternatives and ten attributes were considered. We found a congruent solution, validated by the different methods. The AHP and the PROV Exponential decision methods led to the same ordering of options, but the latter reinforced one option as the best performing and set apart the least performing one. The ELECTRE I method also led to the same ordering, which allowed the best solution to be identified with reasonable confidence. This paper demonstrates the potential of multicriteria decision methods to support decision making on complex problems such as risk control and accident prevention.
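As a sketch of one step such analyses share, the snippet below derives AHP priority weights from a pairwise comparison matrix via its principal eigenvector and computes the consistency index. The matrix is hypothetical, not the study's data.

    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])  # hypothetical pairwise comparisons

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                    # normalized priority vector
    ci = (eigvals.real[k] - len(A)) / (len(A) - 1)  # consistency index
    print(w, ci)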
Abstract:
Polymer binder modification with inorganic nanomaterials (NM) could be a potential and efficient solution to control the matrix flammability of polymer concrete (PC) materials without sacrificing other important properties. Occupational exposure can occur throughout the life cycle of a NM and of “nanoproducts”, from research through scale-up, product development, manufacturing, and end of life. The main objective of the present study is to analyse and compare different qualitative risk assessment methods during the production of polymer mortars (PM) with NM. The laboratory-scale production process was divided into 3 main phases (pre-production, production and post-production), allowing the assessment methods to be tested in different situations. The risk assessment of the PM manufacturing process used the following qualitative methods: the French Agency for Food, Environmental and Occupational Health & Safety method (ANSES); Control Banding Nanotool (CB Nanotool); the Ecole Polytechnique Fédérale de Lausanne method (EPFL); the Guidance for working safely with nanomaterials and nanoproducts (GWSNN); the Istituto Superiore per la Prevenzione e la Sicurezza del Lavoro method, Italy (ISPESL); the Precautionary Matrix for Synthetic Nanomaterials (PMSN); and Stoffenmanager Nano. The different methods applied produced different final results: in phases 1 and 3 the risk tended to be classified as medium-high, while for phase 2 the most common result was medium. The use of qualitative methods needs to be improved by defining narrower criteria for selecting the methods for each assessed situation, bearing in mind that uncertainty is also a relevant factor when dealing with risks in the nanotechnology field.
Abstract:
Hand gesture recognition, being a natural way of human-computer interaction, is an area of active research in computer vision and machine learning. It has many possible applications, giving users a simpler and more natural way to communicate with robots and system interfaces, without the need for extra devices. The primary goal of gesture recognition research is thus to create systems that can identify specific human gestures and use them to convey information or control devices. For that, vision-based hand gesture interfaces require fast and extremely robust hand detection, and gesture recognition in real time. In this study we try to identify hand features that, in isolation, respond better in various human-computer interaction situations. The extracted features were used to train a set of classifiers with the help of RapidMiner in order to find the best learner. A dataset with our own gesture vocabulary, consisting of 10 gestures recorded from 20 users, was created for later processing. Experimental results show that the radial signature and the centroid distance are the features that obtain the best results when used separately, with accuracies of 91% and 90.1% respectively, obtained with a Neural Network classifier. These two methods also have the advantage of being simple in terms of computational complexity, which makes them good candidates for real-time hand gesture recognition.
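A minimal, hypothetical sketch of the centroid distance feature: the distance from the contour's centroid to each contour point, normalized for scale invariance. A toy closed curve stands in for a segmented hand contour.

    import numpy as np

    theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
    radius = 1 + 0.1 * np.sin(5 * theta)            # toy "hand" outline
    contour = radius[:, None] * np.c_[np.cos(theta), np.sin(theta)]

    centroid = contour.mean(axis=0)
    dist = np.linalg.norm(contour - centroid, axis=1)
    signature = dist / dist.max()   # scale-invariant centroid distance signature
    print(signature.shape)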