947 results for method comparison
Abstract:
OBJECTIVE: The aims of this study were to evaluate the role of high resolution computed tomography of the thorax in detecting abnormalities in chronic asthmatic patients and to determine the behavior of these lesions after at least one year. METHOD: Fourteen persistent asthmatic patients, with a mean forced expiratory volume in 1 second of 63% of predicted and a mean forced expiratory volume in 1 second/forced vital capacity ratio of 60%, underwent two high resolution computed tomography scans separated by an interval of at least one year. RESULTS: All 14 patients had abnormalities on both scans. The most common abnormality was bronchial wall thickening, which was present in all patients on both scans. Bronchiectasis was suggested on the first scan in 5 of the 14 (36%) patients, but on follow-up the bronchial dilatation had disappeared in 2 and diminished in a third. Only one patient had any emphysematous changes: a minimal persistent area of paraseptal emphysema was present on both scans. In 3 patients, a "mosaic" appearance was observed on the first scan, and this persisted on the follow-up scan. Two patients had persistent areas of mucoid impaction. In a third patient, mucus plugging was detected only on the second scan. CONCLUSIONS: We conclude that there are many abnormalities on the high resolution computed tomography scans of patients with persistent asthma. Changes suggestive of bronchiectasis, namely bronchial dilatation, frequently resolve spontaneously. Therefore, the diagnosis of bronchiectasis by high resolution computed tomography in asthmatic patients must be made with caution, since bronchial dilatation can be reversible or can represent false dilatation. Nonsmoking chronic asthmatic subjects in this study had no evidence of centrilobular or panacinar emphysema.
Abstract:
Polymer binder modification with inorganic nanomaterials (NM) could be an efficient solution to control the matrix flammability of polymer concrete (PC) materials without sacrificing other important properties. Occupational exposure can occur throughout the life cycle of an NM and of "nanoproducts", from research through scale-up, product development, manufacturing, and end of life. The main objective of the present study is to analyse and compare different qualitative risk assessment methods during the production of polymer mortars (PM) with NM. The laboratory scale production process was divided into three main phases (pre-production, production and post-production), which allowed testing the assessment methods in different situations. The risk assessment of the PM manufacturing process was carried out using qualitative analyses based on: the French Agency for Food, Environmental and Occupational Health & Safety method (ANSES); Control Banding Nanotool (CB Nanotool); the Ecole Polytechnique Fédérale de Lausanne method (EPFL); Guidance working safely with nanomaterials and nanoproducts (GWSNN); the Istituto Superiore per la Prevenzione e la Sicurezza del Lavoro, Italy, method (ISPESL); the Precautionary Matrix for Synthetic Nanomaterials (PMSN); and Stoffenmanager Nano. The different methods applied produced different final results: in phases 1 and 3 the risk tends to be classified as medium-high, while in phase 2 the most common result is a medium risk level. The use of qualitative methods needs to be improved by defining narrower criteria for method selection in each assessed situation, bearing in mind that uncertainty is also a relevant factor when dealing with risks related to nanotechnologies.
Abstract:
In this work we compare two different numerical schemes for the solution of the time-fractional diffusion equation with a variable diffusion coefficient and a nonlinear source term. The two methods are the implicit numerical scheme presented in [M.L. Morgado, M. Rebelo, Numerical approximation of distributed order reaction-diffusion equations, Journal of Computational and Applied Mathematics 275 (2015) 216-227], adapted to our type of equation, and a collocation method in which Chebyshev polynomials are used to reduce the fractional differential equation to a system of ordinary differential equations.
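Neither scheme is spelled out in the abstract; as a point of reference, the sketch below shows the standard L1 finite-difference approximation of the Caputo time-fractional derivative, the usual building block of implicit schemes of this kind. The function and variable names are my own, and the check against u(t) = t is illustrative only, not the authors' scheme.

```python
import numpy as np
from math import gamma

def caputo_l1(u, alpha, dt):
    """Standard L1 approximation of the Caputo derivative of order 0 < alpha < 1
    at the last time level, given the full history u[0..n] on a uniform grid."""
    n = len(u) - 1
    j = np.arange(n)
    b = (j + 1) ** (1 - alpha) - j ** (1 - alpha)   # L1 weights b_j
    diffs = np.diff(u)                               # u[k+1] - u[k]
    # D^alpha u(t_n) ~ dt^-alpha / Gamma(2-alpha) * sum_k b_{n-1-k} (u[k+1]-u[k])
    return dt ** (-alpha) / gamma(2 - alpha) * np.dot(b[::-1], diffs)

# sanity check with u(t) = t, whose Caputo derivative is t^(1-alpha)/Gamma(2-alpha)
alpha, dt, N = 0.6, 1e-3, 1000
t = np.arange(N + 1) * dt
print(caputo_l1(t, alpha, dt), t[-1] ** (1 - alpha) / gamma(2 - alpha))
```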
Resumo:
The building sector is one of the Europeâ s main energy consumer, making buildings an important target for a wiser energy use, improving indoor comfort conditions and reducing the energy consumption. To achieve the European Union targets for energy consumption and carbon reductions it is crucial to act in new, but also in existing buildings, which constitute the majority of the building stock. In existing buildings, the significant improvement of their efficiency requires important investments. Therefore, costs are a major concern in the decision making process and the analysis of the cost effectiveness of the interventions is an important path in the guidance for the selection of the different renovation scenarios. The Portuguese thermal legislation considers the simple payback method for the calculations of the time for the return of the investment. However, this method does not take into consideration inflation, cash flows and cost of capital, as well as the future costs of energy and the building elements lifetime as it happens in a life cycle cost analysis. In order to understand the impact of the economic analysis method used in the choice of the renovation measures, a case study has been analysed using simple payback calculations and life cycle costs analysis. Overall results show that less far-reaching renovation measures are indicated when using the simple payback calculations which may be leading to solutions less cost-effective in a long run perspective.
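The two appraisal methods contrasted in this abstract can be summarised in a few lines. The sketch below is a minimal illustration, with invented input figures, of simple payback versus a discounted life cycle cost with an assumed energy price escalation rate; it is not the calculation procedure prescribed by the Portuguese legislation or used in the case study.

```python
def simple_payback(investment, annual_saving):
    """Years needed to recover the investment, ignoring discounting, inflation
    and future energy price changes."""
    return investment / annual_saving

def life_cycle_cost(investment, annual_energy_cost, years,
                    discount_rate=0.04, energy_escalation=0.02):
    """Net present cost over the analysis period: upfront investment plus
    discounted yearly energy costs under an assumed escalation rate."""
    lcc = investment
    for t in range(1, years + 1):
        cost_t = annual_energy_cost * (1 + energy_escalation) ** t
        lcc += cost_t / (1 + discount_rate) ** t
    return lcc

# illustrative figures only (not taken from the case study)
print(simple_payback(10_000, 800))          # ~12.5 years
print(life_cycle_cost(10_000, 1_200, 30))   # renovation scenario
print(life_cycle_cost(0, 2_000, 30))        # do-nothing scenario
```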
Abstract:
Extreme value theory (EVT) deals with the occurrence of extreme phenomena. The tail index is a very important parameter appearing in the estimation of the probability of rare events. Under a semiparametric framework, inference requires the choice of a number k of upper order statistics to be considered. This is the crux of the matter and there is no definitive formula for it, since a small k leads to high variance and large values of k tend to increase the bias. Several methodologies have emerged in the literature, especially concerning the most popular Hill estimator (Hill, 1975). In this work we compare through simulation the well-known procedures presented in Drees and Kaufmann (1998), Matthys and Beirlant (2000), Beirlant et al. (2002) and de Sousa and Michailidis (2004) with a heuristic scheme considered in Frahm et al. (2005) for the estimation of a different tail measure but in a similar context. We will see that the new method may be an interesting alternative.
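For readers unfamiliar with the estimator at the centre of this comparison, the sketch below computes the Hill estimator on simulated Pareto data for several choices of k, making the bias-variance trade-off mentioned in the abstract visible. The simulation setup is my own and is not one of the k-selection procedures compared in the paper.

```python
import numpy as np

def hill_estimator(sample, k):
    """Hill (1975) estimator of the extreme value index: mean log-excess of the
    k largest observations over the (k+1)-th largest order statistic."""
    x = np.sort(sample)[::-1]              # descending order statistics
    return np.mean(np.log(x[:k])) - np.log(x[k])

# Pareto(alpha) sample, for which the true extreme value index is 1/alpha = 0.5
rng = np.random.default_rng(0)
alpha = 2.0
sample = rng.pareto(alpha, size=5000) + 1.0
for k in (20, 100, 500, 2000):
    # small k: noisy estimates; large k: systematic bias away from 0.5
    print(k, round(hill_estimator(sample, k), 3))
```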
Abstract:
Objective: The aim of this study is to improve the understanding of self-changes after an intervention for depression focused on implicative dilemmas, a type of cognitive conflict related to identity. As recent research has highlighted the relevance of identity-related dilemmas in clients with depression, we sought to assess the way in which clients resolve such inner conflicts after a tailored dilemma-focused intervention and how this is reflected in the clients’ self-narratives. Method: We used three instruments to observe differences between good (n = 5) and poor (n = 5) outcome cases: (i) the Repertory Grid Technique to track the resolution of dilemmas, (ii) the Change Interview to compile clients’ accounts of changes at posttreatment, and (iii) the Innovative Moments Coding System to examine the emergence of clients’ novelties at the Change Interview. Results: Groups did not differ in terms of the number and relevance of client-identified significantly helpful events. However, between-group differences were found for the resolution of dilemmas and for the proportion of high-level innovative moment (IM) types. Furthermore, a greater self-narrative reconstruction was associated with higher levels of symptom improvement. Conclusions: Good outcome cases seem to be associated with the resolution of conflicts and high-level IMs.
Abstract:
In previous work we applied the environmental multi-region input-output (MRIO) method proposed by Turner et al (2007) to examine the 'CO2 trade balance' between Scotland and the Rest of the UK. In McGregor et al (2008) we constructed an interregional economy-environment input-output (IO) and social accounting matrix (SAM) framework that allows us to investigate methods of attributing responsibility for pollution generation in the UK at the regional level. This facilitates analysis of the nature and significance of environmental spillovers and of the existence of an environmental 'trade balance' between regions. While significant data problems mean that the quantitative results of this study should be regarded as provisional, we argue that such a framework allows us to begin to consider questions such as the extent to which a devolved authority like the Scottish Parliament can and should be responsible for contributing to national emission reduction targets (e.g. the UK commitment to the Kyoto Protocol) when it is limited in the way it can control emissions, particularly with respect to changes in demand elsewhere in the UK. However, while such analysis is useful for accounting for pollution flows in the single time period that the accounts relate to, it is limited when the focus is on modelling the impacts of any marginal change in activity. This is because a conventional demand-driven IO model assumes an entirely passive supply side in the economy (i.e. all supply is infinitely elastic) and is further restricted by the assumption of universal Leontief (fixed proportions) technology implied by the use of the A and multiplier matrices. In this paper we argue that where analysis of marginal changes in activity is required, a more flexible interregional computable general equilibrium (CGE) approach, which models behavioural relationships in a more realistic and theory-consistent manner, is more appropriate and informative. To illustrate our analysis, we compare the results of introducing a positive demand stimulus in the UK economy using both IO and CGE interregional models of Scotland and the rest of the UK. In the case of the latter, we demonstrate how more theory-consistent modelling of both demand- and supply-side behaviour at the regional and national levels affects model results, including the impact on the interregional CO2 'trade balance'.
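The demand-driven IO accounting that the abstract criticises reduces to a small linear-algebra exercise. The following toy sketch (with invented coefficients, not the Scotland/rest-of-UK tables) shows how the Leontief inverse converts final demand into gross output and how emission coefficients then attribute CO2 to that demand; the fixed-proportions assumption embedded in the A matrix is exactly what the CGE alternative relaxes.

```python
import numpy as np

# Toy two-region, two-sector system; all numbers are illustrative assumptions.
A = np.array([[0.10, 0.05, 0.02, 0.00],   # interregional technical coefficients
              [0.03, 0.15, 0.01, 0.04],
              [0.02, 0.00, 0.12, 0.06],
              [0.01, 0.03, 0.05, 0.10]])
d = np.array([100.0, 80.0, 120.0, 60.0])  # final demand by region-sector
e = np.array([0.8, 0.3, 0.6, 0.2])        # CO2 emitted per unit of gross output

L = np.linalg.inv(np.eye(4) - A)          # Leontief inverse (fixed proportions)
x = L @ d                                 # gross output required to meet demand
embodied = (e @ L) * d                    # CO2 attributed to each final demand
print("gross output:", x.round(1))
print("embodied CO2:", embodied.round(1))
```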
Abstract:
Report on the scientific sojourn at the Swiss Federal Institute of Technology Zurich, Switzerland, between September and December 2007. In order to make robots useful assistants in our everyday life, the ability to learn and recognize objects is of essential importance. However, object recognition in real scenes is one of the most challenging problems in computer vision, as many difficulties must be dealt with. Furthermore, in mobile robotics a new challenge is added to the list: computational complexity. In a dynamic world, information about the objects in the scene can become obsolete before it is ready to be used if the detection algorithm is not fast enough. Two recent object recognition techniques have achieved notable results: the constellation approach proposed by Lowe and the bag of words approach proposed by Nistér and Stewénius. The Lowe constellation approach is the one currently being used in the robot localization part of the COGNIRON project. This report is divided into two main sections. The first section briefly reviews the currently used object recognition system, the Lowe approach, and highlights the drawbacks found for object recognition in the context of indoor mobile robot navigation; the proposed improvements to the algorithm are also described. In the second section the alternative bag of words method is reviewed, together with several experiments conducted to evaluate its performance on our own object databases. Furthermore, some modifications to the original algorithm are proposed to make it suitable for object detection in unsegmented images.
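The bag of words pipeline reviewed in the second section follows a standard pattern: extract local descriptors, quantize them against a visual vocabulary, and compare the resulting histograms. The sketch below is a generic illustration of that pattern using ORB descriptors and k-means clustering; the image paths, vocabulary size and descriptor choice are placeholders, not the setup of Nistér and Stewénius or of this report.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans

orb = cv2.ORB_create(nfeatures=500)

def descriptors(img):
    """Local feature descriptors of one grayscale image."""
    _, desc = orb.detectAndCompute(img, None)
    return desc if desc is not None else np.empty((0, 32), np.uint8)

# 1. Build the visual vocabulary by clustering descriptors from training images.
train_images = [cv2.imread(p, cv2.IMREAD_GRAYSCALE)
                for p in ["object_a.png", "object_b.png"]]   # placeholder paths
all_desc = np.vstack([descriptors(im) for im in train_images]).astype(np.float32)
vocab = KMeans(n_clusters=50, n_init=10, random_state=0).fit(all_desc)

# 2. Represent any image as a normalized histogram of visual-word occurrences;
#    histograms can then be matched (e.g. cosine similarity) for recognition.
def bow_histogram(img):
    words = vocab.predict(descriptors(img).astype(np.float32))
    hist = np.bincount(words, minlength=50).astype(float)
    return hist / max(hist.sum(), 1.0)
```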
Abstract:
Lean meat percentage (LMP) is the criterion for carcass classification, and it must be measured on-line and objectively. The aim of this work was to compare the root mean square error of prediction (RMSEP) of the LMP measured with the following devices: Fat-O-Meat'er (FOM), UltraFOM (UFOM), AUTOFOM and VCS2000. For this purpose, the same 99 carcasses were measured with all four devices and dissected according to the European Reference Method. Moreover, a subsample of the carcasses (n=77) was fully scanned with X-ray computed tomography (CT). The RMSEP calculated with leave-one-out cross-validation was lower for FOM and AUTOFOM (1.8% and 1.9%, respectively) and higher for UFOM and VCS2000 (2.3% for both devices). The error obtained with CT was the lowest (0.96%), in accordance with previous results, but CT cannot be used on-line. It can be concluded that FOM and AUTOFOM presented better accuracy than UFOM and VCS2000.
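RMSEP under leave-one-out cross-validation simply means that each carcass is predicted by a model fitted on the remaining ones and the prediction errors are then pooled. The sketch below shows that computation for a linear prediction of LMP from device measurements; the synthetic data and the linear model are stand-ins for illustration, not the calibration equations of the devices compared in the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

def rmsep_loo(X, y):
    """RMSEP under leave-one-out CV: every carcass is predicted by a model
    fitted on all the others, and the squared errors are averaged."""
    pred = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
    return np.sqrt(np.mean((y - pred) ** 2))

# X: device measurements (e.g. fat and muscle depths), y: dissected reference LMP.
# Synthetic data with 99 carcasses, mimicking the sample size of the study.
rng = np.random.default_rng(1)
X = rng.normal(size=(99, 2))
y = 60 - 3 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(scale=1.8, size=99)
print(f"RMSEP = {rmsep_loo(X, y):.2f} percentage points of lean meat")
```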
Abstract:
Immunofluorescence and immunoperoxidase tests directed against early viral antigens, and DNA-DNA hybridization, were compared with viral isolation for their ability to detect cytomegalovirus (CMV) in the urine of 89 HIV-infected patients. Of the 100 urine samples collected, 70 were found positive by at least one method. Considering viral isolation as the "gold standard" technique, immunofluorescence and immunoperoxidase had sensitivities of 92.3% and 88%, respectively, with a specificity of 95% in both cases. DNA-DNA hybridization showed a sensitivity of 90% but a lower specificity (60%). All three assays were effective in detecting CMV in urine, and the technical advantages of each are discussed.
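Sensitivity and specificity against a gold standard reduce to counting agreements and disagreements in a two-by-two table. The sketch below shows that computation on a small set of made-up results; the numbers are hypothetical and are not the study's data.

```python
def sensitivity_specificity(test, gold):
    """Sensitivity and specificity of a test versus a gold-standard method,
    given parallel lists of per-sample results (1 = positive, 0 = negative)."""
    tp = sum(1 for t, g in zip(test, gold) if t and g)
    fn = sum(1 for t, g in zip(test, gold) if not t and g)
    tn = sum(1 for t, g in zip(test, gold) if not t and not g)
    fp = sum(1 for t, g in zip(test, gold) if t and not g)
    return tp / (tp + fn), tn / (tn + fp)

# toy example with viral isolation as the gold standard (hypothetical results)
gold = [1, 1, 1, 1, 1, 1, 0, 0, 0, 0]
ifa  = [1, 1, 1, 1, 1, 0, 0, 0, 0, 1]
sens, spec = sensitivity_specificity(ifa, gold)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```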
Abstract:
The aim of this study is to perform a thorough comparison of quantitative susceptibility mapping (QSM) techniques and their dependence on the assumptions made. The compared methodologies were: two iterative single-orientation methodologies minimizing the l2 norm or the l1 total variation (l1TV) norm using prior knowledge of the edges of the object, one over-determined multiple-orientation method (COSMOS), and a newly proposed modulated closed-form solution (MCF). The performance of these methods was compared using a numerical phantom and in-vivo high resolution (0.65 mm isotropic) brain data acquired at 7 T using a new coil combination method. For all QSM methods, the relevant regularization and prior-knowledge parameters were systematically varied in order to evaluate the optimal reconstruction in the presence and absence of a ground truth. Additionally, the QSM contrast was compared to conventional gradient recalled echo (GRE) magnitude and R2* maps obtained from the same dataset. The QSM reconstruction results of the single-orientation methods show comparable performance. The MCF method has the highest correlation (corrMCF = 0.95, r²MCF = 0.97) with the state-of-the-art method (COSMOS), with the additional advantage of extremely fast computation. The l-curve method gave the visually most satisfactory balance between reduction of streaking artifacts and over-regularization, with the latter being overemphasized when using the COSMOS susceptibility maps as ground truth. R2* and susceptibility maps calculated from the same datasets, although based on distinct features of the data, have a comparable ability to distinguish deep gray matter structures.
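As background to the regularized single-orientation reconstructions being compared, the sketch below implements a generic closed-form l2 (Tikhonov) regularized dipole inversion on a small numerical phantom. It is not the MCF, l1TV or COSMOS method of the study; the kernel definition, regularization weight and phantom geometry are illustrative assumptions.

```python
import numpy as np

def dipole_kernel(shape, voxel_size=(0.65, 0.65, 0.65)):
    """Unit dipole response in k-space, D = 1/3 - kz^2/|k|^2, with B0 along z."""
    ks = [np.fft.fftfreq(n, d=v) for n, v in zip(shape, voxel_size)]
    kx, ky, kz = np.meshgrid(*ks, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    with np.errstate(divide="ignore", invalid="ignore"):
        D = 1.0 / 3.0 - kz**2 / k2
    D[k2 == 0] = 0.0
    return D

def qsm_l2_closed_form(field, lam=0.05, voxel_size=(0.65, 0.65, 0.65)):
    """Closed-form l2 (Tikhonov) regularized single-orientation inversion:
    chi = F^-1[ conj(D) * F(field) / (|D|^2 + lambda) ]."""
    D = dipole_kernel(field.shape, voxel_size)
    F = np.fft.fftn(field)
    chi = np.fft.ifftn(np.conj(D) * F / (np.abs(D) ** 2 + lam))
    return chi.real

# numerical phantom: spherical susceptibility inclusion, forward-simulated field
shape = (64, 64, 64)
z, y, x = np.indices(shape) - 32
chi_true = (x**2 + y**2 + z**2 < 10**2).astype(float) * 0.1
field = np.fft.ifftn(dipole_kernel(shape) * np.fft.fftn(chi_true)).real
chi_rec = qsm_l2_closed_form(field)
```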
Abstract:
We would like to comment on this study recently published in JFS. It is a short technical note proposing an artificial aging technique for the dating of ballpoint pen inks. This is a very difficult and controversial topic, and we are concerned about the nature of this paper. The authors propose several ideas to differentiate fast-aging and slow-aging inks, but their experimental data are not validly presented or discussed. The data are insufficient to draw any conclusions about the potential of the method for ink dating purposes. This lack of information must be addressed before proposing such methods for practical casework. These are preliminary and unconvincing results from development research performed in a laboratory on controlled samples, without due warnings about potential shortcomings. They cannot be used or even compared to results obtained in real situations on uncontrolled specimens of limited size, unknown composition and undefined storage conditions. This can leave an undeserved impression that these methods are ready for implementation when the task of ensuring their scientific validity is still far away.
Abstract:
The ancient Greek medical theory based on the balance or imbalance of humors disappeared in the Western world but survives elsewhere. Is this survival related to a certain degree of health care efficiency? We explored this hypothesis through a study of classical Greco-Arab medicine in Mauritania. Modern general practitioners evaluated the safety and effectiveness of classical Arabic medicine in a Mauritanian traditional clinic, with a prognosis/follow-up method allowing the following comparisons: (i) actual patient progress (clinical outcome) compared with what the traditional 'tabib' had anticipated (prognostic ability), and (ii) patient progress compared with what could be hoped for if the patient were treated by a modern physician in the same neighborhood. The practice appeared fairly safe and, on average, clinical outcome was similar to what could be expected with modern medicine. In some cases, patient progress was better than expected. The ability to correctly predict an individual's clinical outcome did not seem to differ between modern and Greco-Arab theories. Weekly joint meetings of modern and traditional practitioners were spontaneously organized with a modern health centre in the neighborhood. Practitioners of a different medical system can predict patient progress. For the patient, avoiding false expectations of health care and ensuring appropriate referral may be the most important. Prognosis and outcome studies such as the one presented here may help to develop institutions where patients find support in making their choices, not only among several treatment options but also among several medical systems.
Abstract:
At present, most Neisseria gonorrhoeae testing is done with β-lactamase and agar dilution tests with common therapeutic agents. Generally, in bacteriological diagnostic laboratories in Argentina, the study of antibiotic susceptibility of N. gonorrhoeae is based on β-lactamase determination and the agar dilution method with common therapeutic agents. The National Committee for Clinical Laboratory Standards (NCCLS) has recently described a disk diffusion test that produces results comparable to the reference agar dilution method for antibiotic susceptibility of N. gonorrhoeae, using a dispersion diagram to analyse the correlation between both techniques. We obtained 57 gonococcal isolates from patients attending a clinic for sexually transmitted diseases in Tucumán, Argentina. Antibiotic susceptibility tests using the agar dilution and disk diffusion techniques were compared. The established NCCLS interpretive criteria for both susceptibility methods appeared to be applicable to domestic gonococcal strains. The correlation between the MICs and the zones of inhibition was studied for penicillin, ampicillin, cefoxitin, spectinomycin, cefotaxime, cephaloridine, cephalexin, tetracycline, norfloxacin and kanamycin. Dispersion diagrams showed a high correlation between both methods.
Abstract:
Since 1984, DNA tests based on the highly repeated subtelomeric sequences of Plasmodium falciparum (rep 20) have been frequently used in malaria diagnosis. Rep 20 is very specific for this parasite and is made of 21 bp units organized in repeated blocks with direct and inverted orientation. Based on this particular organization, we selected a unique consensus oligonucleotide (pf-21) to drive a PCR reaction coupled with hybridization to non-radioactively labeled probes. The pf-21 unique oligo PCR (pf-21-I) assay produced DNA amplification fingerprints when applied to purified P. falciparum DNA samples (Brazil and Colombia), as well as to patients' blood samples from a large area of Venezuela. The performance of the pf-21-I assay was compared against Giemsa-stained thick blood smears from samples collected in a malaria-endemic area of Bolívar State, Venezuela, at the field station of Malariología in Tumeremo. Coupled to non-radioactive hybridization, the pf-21-I performed better than the traditional microscopic method, with a ratio r = 1.7:1. In the case of mixed infections, the r value for P. falciparum detection increased to 2.5:1. The increased diagnostic sensitivity of the test obtained with this homologous oligonucleotide could provide an alternative for the epidemiological diagnosis of P. falciparum in Venezuelan endemic areas, where low parasitemia levels and asymptomatic malaria are frequent. In addition, the DNA fingerprint could be used in molecular population studies.