80 results for Classification Protocols
Abstract:
Geographic Information Systems (GIS) are indispensable software tools in forest planning. In forestry transportation, GIS can manage road network data and solve transportation problems such as route planning. Therefore, the aim of this study was to determine the pattern of the road network and define transport routes using GIS technology. The research was conducted in a forestry company in the state of Minas Gerais, Brazil. The criteria used to classify the pattern of the forest roads were horizontal geometry, vertical geometry, and pavement type. To determine transport routes, a network analysis data model was created in ArcGIS with the Network Analyst extension, making it possible to find the shortest and the fastest route. The results showed a predominance of the average (3) and bad (4) horizontal geometry classes, indicating the presence of winding roads. For the vertical geometry criterion, the highly mountainous relief class (4) accounted for the greatest extent of roads. Regarding pavement type, secondary coating was the most frequent (75%), followed by primary coating (20%) and asphalt pavement (5%). The best route was the one that allowed the transport vehicle to travel at the highest specific speed as a function of the road pattern found in the study.
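The route-selection step above relies on the Network Analyst extension of ArcGIS, which is proprietary. As an open illustration of the same idea, the sketch below solves shortest-distance and fastest-route queries over a small weighted road graph with networkx; the node names, edge lengths and the class-dependent speeds are hypothetical values, not data from the study.

```python
# Minimal sketch of shortest-distance vs. fastest-route selection over a road
# network, analogous to what the Network Analyst extension does in ArcGIS.
# Edge lengths (km) and speeds (km/h, reflecting the road-pattern classes)
# are hypothetical illustration values.
import networkx as nx

G = nx.Graph()
# (node_a, node_b, length_km, speed_kmh) -- speed depends on the road class
edges = [
    ("harvest_site", "junction_1", 12.0, 30.0),   # secondary coating, winding
    ("harvest_site", "junction_2", 18.0, 60.0),   # primary coating, flatter
    ("junction_1", "mill", 10.0, 30.0),
    ("junction_2", "mill", 9.0, 80.0),            # asphalt pavement
]
for a, b, length, speed in edges:
    G.add_edge(a, b, length=length, time_h=length / speed)

shortest = nx.shortest_path(G, "harvest_site", "mill", weight="length")
fastest = nx.shortest_path(G, "harvest_site", "mill", weight="time_h")
print("Shortest-distance route:", shortest)
print("Fastest route:", fastest)
```

With these illustrative numbers the shortest route and the fastest route differ, which is exactly the comparison the study makes when it favors the route allowing the higher specific speed.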
Abstract:
This study aimed to propose methods to identify croplands cultivated with winter cereals in the northern region of Rio Grande do Sul State, Brazil. Temporal profiles of the Normalized Difference Vegetation Index (NDVI) from the MODIS sensor, from April to December of each year from 2000 to 2008, were analyzed. First, crop masks were generated by subtracting the minimum NDVI image (April to May) from the maximum NDVI image (June to October). Then, an unsupervised classification of the NDVI images was carried out (ISODATA), considering only the crop mask areas. According to the results, the crop masks allowed the identification of the pixels with the greatest variation in green biomass. This variation may or may not be associated with winter cereal areas established for grain production. The unsupervised classification generated classes whose NDVI temporal profiles were associated with water bodies, pastures, winter cereals for grain production and winter cereals for soil cover. The temporal NDVI profiles of the winter cereals for grain production class agreed with the crop patterns of the region (developmental stages, management standards and sowing dates). Therefore, unsupervised classification based on crop masks makes it possible to distinguish and monitor winter cereal crops that are similar in terms of morphology and phenology.
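A minimal sketch of the crop-mask and unsupervised-classification steps is given below, assuming a stack of MODIS NDVI composites held as a NumPy array. The array contents, the composite indices and the 0.3 variation threshold are placeholders, and ISODATA is approximated with k-means because scikit-learn does not provide an ISODATA routine.

```python
# Sketch of the crop-mask step: the mask keeps pixels whose NDVI rose most
# between the April-May minimum and the June-October maximum composites.
# The arrays and the 0.3 threshold are hypothetical; ISODATA is approximated
# here with k-means.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_dates, height, width = 20, 50, 50               # simulated MODIS NDVI series
ndvi = rng.uniform(0.1, 0.9, size=(n_dates, height, width))

ndvi_min = ndvi[:4].min(axis=0)                   # April-May composite
ndvi_max = ndvi[4:16].max(axis=0)                 # June-October composite
crop_mask = (ndvi_max - ndvi_min) > 0.3           # largest green-biomass variation

# Unsupervised classification of the temporal profiles inside the mask
profiles = ndvi[:, crop_mask].T                   # (n_masked_pixels, n_dates)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(profiles)
print("pixels kept by mask:", profiles.shape[0], "classes:", np.unique(labels))
```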
Abstract:
This study compares the precision of three image classification methods, two from remote sensing and one from geostatistics, applied to areas cultivated with citrus. The 5,296.52 ha study area is located in the city of Araraquara, in the central region of the state of São Paulo (SP), Brazil. The multispectral image from the CCD/CBERS-2B satellite was acquired in 2009 and processed in the SPRING Geographic Information System (GIS). Three classification methods were used, one unsupervised (Cluster) and two supervised (Indicator Kriging/IK and Maximum Likelihood/Maxver), in addition to an on-screen classification taken as the field check. The reliability of the classifications was evaluated by the Kappa index. According to the Kappa index, the Indicator Kriging method obtained the highest degree of reliability for bands 2 and 4; moreover, the Cluster method applied to band 2 (green) produced the best-quality classification among all the methods. Indicator Kriging was the classifier whose total citrus area was closest to the field check, differing by -3.01%, whereas Maxver overestimated the total citrus area by 42.94%.
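For reference, the sketch below shows how a Kappa index of the kind used above can be computed by comparing a classified map with the field check, pixel by pixel; the two small label arrays are hypothetical stand-ins for real classified images.

```python
# Sketch of the Kappa computation used to assess classification reliability:
# each classified map is compared pixel-by-pixel against the field check.
import numpy as np
from sklearn.metrics import cohen_kappa_score, confusion_matrix

field_check = np.array([0, 0, 1, 1, 1, 0, 1, 0, 1, 1])   # 1 = citrus, 0 = other
classified  = np.array([0, 1, 1, 1, 0, 0, 1, 0, 1, 1])   # output of one classifier

kappa = cohen_kappa_score(field_check, classified)
print("confusion matrix:\n", confusion_matrix(field_check, classified))
print("Kappa index:", round(kappa, 3))
```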
Abstract:
OBJECTIVE: To evaluate Crohn's disease recurrence and its possible predictors in patients undergoing surgical treatment. METHODS: We conducted a retrospective study of Crohn's disease (CD) patients who underwent surgical treatment between January 1992 and January 2012 and were regularly monitored at the Bowel Clinic of the Hospital das Clínicas of the UFMG. RESULTS: We evaluated 125 patients, 50.4% female, with a mean age of 46.12 years, the majority (63.2%) diagnosed between 17 and 40 years of age. The ileum was involved in 58.4% of cases, whereas stenotic behavior was observed in 44.8% and penetrating behavior in 45.6%. Perianal disease was observed in 26.4% of cases. The average follow-up was 152.40 months. Surgical relapse occurred in 29.6%, with a median time of 68 months from the first operation. CONCLUSION: Ileocolic location, penetrating behavior and perianal involvement (L3B3p) were associated with an increased risk of surgical recurrence.
Abstract:
Gestational trophoblastic neoplasia (GTN) is the term used to describe a set of malignant placental diseases, including invasive mole, choriocarcinoma, placental site trophoblastic tumor and epithelioid trophoblastic tumor. Both invasive mole and choriocarcinoma respond well to chemotherapy, with cure rates greater than 90%. Since the advent of chemotherapy, low-risk GTN has been treated with a single agent, usually methotrexate or actinomycin D. Cases of high-risk GTN, however, should be treated with multiagent chemotherapy, and the regimen usually selected is EMA-CO, which combines etoposide, methotrexate, actinomycin D, cyclophosphamide and vincristine. This study reviews the literature on GTN to discuss current knowledge about its diagnosis and treatment.
Abstract:
Paratuberculosis is an important enteritis of ruminants caused by Mycobacterium avium subsp. paratuberculosis (Map). The disease is officially considered exotic in Brazil, but recent serological surveys and the isolation of the agent suggest that it may occur in Brazilian herds. The aim of this study was to evaluate three different formulations of Herrold's egg yolk agar with mycobactin J (HEYM) and four faecal culture protocols with respect to their ability to support Map growth, as well as their cost and ease of application. The three formulations of HEYM were inoculated with two suspensions of Map. Spiked faeces and naturally contaminated faecal samples were processed by the four faecal culture protocols. The centrifugation protocol and the HEYM formulation recommended by the OIE showed the best results for the recovery of Map.
Abstract:
Avian pathogenic Escherichia coli (APEC) is responsible for various pathological processes in birds and is considered one of the principal causes of morbidity and mortality, associated with economic losses to the poultry industry. The objective of this study was to demonstrate that it is possible to predict the antimicrobial resistance of 256 APEC samples from 38 different virulence factor genes, using artificial neural networks (ANNs). A second objective was to determine the relationship between the pathogenicity index (PI) and resistance to 14 antibiotics by statistical analysis. The results showed that the ANNs were able to correctly classify the behavior of the APEC samples at rates ranging from 74.22 to 98.44%, making it possible to predict antimicrobial resistance. The statistical analysis of the relationship between the pathogenicity index (PI) and resistance to 14 antibiotics showed that these variables are independent, i.e., peaks in PI can occur without a change in antimicrobial resistance, and vice versa.
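As an illustration of the ANN step, the sketch below trains a multilayer perceptron on binary presence/absence profiles of 38 virulence genes to predict resistance to a single antibiotic. The data are randomly generated placeholders and the network size is an assumption, not the architecture used in the study.

```python
# Sketch of an ANN mapping binary profiles of 38 virulence genes to
# resistance/susceptibility for one antibiotic. Data and network size are
# illustrative placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)
X = rng.integers(0, 2, size=(256, 38))            # 256 isolates x 38 virulence genes
y = rng.integers(0, 2, size=256)                  # resistant (1) / susceptible (0)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)
ann = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
ann.fit(X_train, y_train)
print("classification accuracy on held-out isolates:", ann.score(X_test, y_test))
```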
Abstract:
Differences in age and sex distribution, as well as in FAB (French-American-British classification) types, have been reported for acute leukemias in several countries. We studied the demographics and the response to treatment of patients with acute myeloid leukemia (AML) and acute lymphoblastic leukemia (ALL) between 1989 and 2000 in Teresina, Piauí, and compared these results with reports from Brazil and other countries. Complete data on 345 patients (230 ALL, 115 AML) were reviewed. AML occurred predominantly in adults (77%), with a median age of 34 years, similar to that found in the southeast of Brazil but lower than the median age in the United States and Europe (52 years). The FAB distribution was similar in children and adults, and FAB-M2 was the most common type, as also found in Japan. The high frequency of FAB-M3 described in most Brazilian studies and for Hispanics in the United States was not observed. Overall survival for adults was 40%, similar to other studies in Brazil. A high mortality rate was observed during induction. No clinical or hematological parameter influenced survival in the Cox model. ALL presented the characteristic incidence peak between 2 and 8 years of age. Most of the cases were CD10+ pre-B ALL, and in 25% abnormal expression of myeloid antigens was observed. Only 10% of the patients were older than 30 years. Overall survival was better for children, and age and leukocyte count were independent prognostic factors. These data demonstrate that, although there are regional peculiarities, the application of standardized treatments and good supportive care makes it possible to achieve the results observed in other countries with the same chemotherapy protocols.
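The survival analysis mentioned above (a Cox proportional-hazards model with covariates such as age and leukocyte count) could be reproduced along the lines of the sketch below, which assumes the lifelines library and a small synthetic data frame in place of the study data.

```python
# Sketch of a Cox proportional-hazards model testing prognostic factors
# such as age and leukocyte count against overall survival.
# The data frame below is a synthetic placeholder, not the study data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 120
df = pd.DataFrame({
    "survival_months": rng.exponential(24, n).round(1),   # follow-up time
    "death": rng.integers(0, 2, n),                       # 1 = event observed
    "age_years": rng.integers(2, 70, n),
    "leukocytes_x10e9_L": rng.uniform(1, 200, n).round(1),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="survival_months", event_col="death")
print(cph.summary[["coef", "exp(coef)", "p"]])   # hazard ratios and p-values
```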
Abstract:
The predominant type of liver alteration in asymptomatic or oligosymptomatic chronic male alcoholics (N = 169) admitted to a psychiatric hospital for detoxification was classified by two independent methods: liver palpation and multiple quadratic discriminant analysis (QDA), the latter applied to two parameters reported by the patient (duration of alcoholism and daily amount ingested) and to data obtained from eight biochemical blood determinations (total bilirubin, alkaline phosphatase, glycemia, potassium, aspartate aminotransferase, albumin, globulin, and sodium). All 11 soft and sensitive livers and all 13 firm and sensitive livers formed fully concordant groups as determined by QDA. Among the 22 soft and not sensitive livers, 95% were concordant with the QDA grouping. Concordance rates were low (55%) for the 73 firm and not sensitive livers and intermediate (76%) for the 50 not palpable livers. Prediction of the liver palpation characteristics by QDA was 95% correct for the firm and not sensitive livers and moderate for the other groups. On a preliminary basis, the variables considered most informative by QDA were the two anamnestic variables and bilirubin levels, followed by alkaline phosphatase, glycemia and potassium, and then by aspartate aminotransferase and albumin. We conclude that, when biopsies would be too costly or, to varying extents, potentially injurious to the patients, clinical data could be considered valid to guide patient care, at least in the three groups (soft and not sensitive, soft and sensitive, and firm and sensitive livers) in which the two noninvasive procedures were highly concordant in the present study.
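A minimal sketch of the QDA step is shown below, assuming the 10 predictors (the two anamnestic variables plus the eight biochemical determinations) are available as a numeric matrix; the values and group labels are random placeholders, not patient data.

```python
# Sketch of quadratic discriminant analysis (QDA): 10 predictors are used to
# assign each patient to a liver-palpation group. Data are random placeholders.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(7)
n_patients, n_predictors = 169, 10
X = rng.normal(size=(n_patients, n_predictors))   # standardized predictor values
groups = rng.integers(0, 5, size=n_patients)      # 5 palpation groups

qda = QuadraticDiscriminantAnalysis()
qda.fit(X, groups)
predicted = qda.predict(X)
print("within-sample concordance with palpation:",
      round((predicted == groups).mean(), 3))
```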
Abstract:
The total number of CD34+ cells is the most relevant clinical parameter when selecting human umbilical cord blood (HUCB) for transplantation. The objective of the present study was to compare the two most commonly used CD34+ cell quantification methods (the ISHAGE protocol and ProCount™ - BD) and to analyze the CD34+ bright cells, which 7-amino actinomycin D (7AAD) analysis suggests are apoptotic or dead cells. Twenty-six HUCB samples obtained at the Placental Blood Program of the New York Blood Center were evaluated. The absolute numbers of CD34+ cells were determined by the ISHAGE protocol (with exclusion of 7AAD+ cells) and by ProCount™ (with exclusion of CD34+ bright cells). Using the ISHAGE protocol we found 35.6 ± 19.4 CD34+ cells/µL, and with the ProCount™ method we found 36.6 ± 23.2 CD34+ cells/µL. With the ProCount™ method, CD34+ bright cell counts were 9.3 ± 8.2 cells/µL. CD34+ bright and regular cells were analyzed individually by the ISHAGE protocol. Only about 1.8% of the bright CD34+ cells are alive, whereas a small proportion (19.0%) are undergoing apoptosis and most of them (79.2%) are dead. Our study showed that the two methods produce similar results and that 7AAD is important for excluding CD34+ bright cells. These results will help in the correct counting of CD34+ cells and in choosing the best HUCB unit for transplantation, i.e., the unit with the greatest number of potentially viable stem cells for the reconstitution of bone marrow, which increases the likelihood of success of the transplant and, therefore, the survival of the patient.
Abstract:
The authors propose a clinical classification to monitor the evolution of tetanus patients, ranging from grade I to grade IV according to severity. It was applied on admission and repeated on alternate days up to the 10th day in patients aged ≥ 12 years admitted to the State University Hospital, Recife, Brazil. Patients were also classified on admission according to three prognostic indicators to determine whether the proposed classification is in agreement with the traditionally used indicators. On admission, the distribution of the 64 patients among the different levels of the proposed classification was similar for the groups of better and worse prognosis according to the three indicators (P > 0.05), most of the patients belonging to grades I and II of the proposed classification. In the later reclassifications, severe forms of tetanus (grades III and IV) were more frequent in the categories of worse prognosis, and these differences were statistically significant. There was a reduction in the proportion of mild forms (grades I and II) of tetanus over time for the categories of worse prognostic indicators (chi-square for trend: P = 0.00006, 0.03, and 0.00000), whereas no such trend was observed for the categories of better prognosis. This serially applied classification reflected the prognosis given by the traditional indicators and permitted comparison of the dynamics of the disease in different groups. Thus, it is a useful tool for monitoring patients by detecting changes in clinical category over time and for assessing responses to different therapeutic measures.
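The chi-square test for trend reported above can be computed as a Cochran-Armitage statistic; the sketch below implements it directly, with hypothetical counts of mild and severe tetanus across successive reclassification days.

```python
# Sketch of a chi-square test for trend (Cochran-Armitage) on the proportion
# of mild tetanus (grades I-II) over successive reclassifications.
# The counts below are hypothetical.
import numpy as np
from scipy.stats import chi2

scores = np.array([1, 2, 3, 4, 5])          # ordered reclassification days
mild   = np.array([30, 25, 20, 14, 10])     # patients in grades I-II
severe = np.array([ 5, 10, 14, 18, 22])     # patients in grades III-IV

n = mild + severe                           # patients classified at each time
p_bar = severe.sum() / n.sum()              # overall proportion severe

num = (scores * (severe - n * p_bar)).sum() ** 2
den = p_bar * (1 - p_bar) * ((n * scores**2).sum() - (n * scores).sum()**2 / n.sum())
chi2_trend = num / den
p_value = chi2.sf(chi2_trend, df=1)
print("chi-square for trend:", round(chi2_trend, 2), "P =", round(p_value, 5))
```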
Abstract:
Several studies of the quantitative relationship between sodium need and sodium intake in rats are reviewed. With acute diuretic treatment 24 h beforehand, intake matches need fairly accurately when intake is spread out in time by offering a hypotonic NaCl solution; in contrast, with a hypertonic solution, intake is typically double the need. With the same diuretic treatment, although the natriuresis occurs within ~1 h, the appetite appears only slowly over 24 h. Increased plasma levels of aldosterone parallel the increased intake; however, treatment with metyrapone blocks the rise in aldosterone but has no effect on appetite. Satiation of sodium appetite was studied in rats using sodium loss induced by chronic diuretic treatment and daily salt consumption sessions. When a simulated foraging cost was imposed on NaCl access in the form of a progressive ratio lever-press task, rats showed satiation for NaCl (break point) after consuming an amount close to their estimated deficit. The chronic diuretic regimen produced hypovolemia and large increases in plasma aldosterone concentration and renin activity. These parameters were reversed to or toward non-depleted control values at the time of behavioral satiation in the progressive ratio protocol. Satiation mechanisms for sodium appetite thus do appear to exist. However, they do not operate quantitatively when concentrated salt is available at no effort, but instead allow overconsumption. There are reasons to believe that such a bias toward overconsumption may have been beneficial over evolutionary time, but such biasing for salt and other commodities is maladaptive in a resource-rich environment.
Abstract:
The aim of this study was to analyze the clinical aspects, hearing evolution and efficacy of clinical treatment of patients with sudden sensorineural hearing loss (SSNHL). This was a prospective clinical study of 136 consecutive patients with SSNHL divided into three groups after diagnostic evaluation: patients with a defined etiology (DE, N = 13, 10%), concurrent diseases (CD, N = 63, 46.04%) and idiopathic sudden sensorineural hearing loss (ISSHL, N = 60, 43.9%). Initial treatment consisted of prednisone and pentoxifylline. Clinical aspects and hearing evolution were evaluated for up to 6 months. Group CD comprised 73% of patients with metabolic decompensation at the initial evaluation and was significantly older (53.80 years) than groups DE (41.93 years) and ISSHL (39.13 years). Comparison of the mean initial and final hearing loss of the three groups revealed a significant hearing improvement for group CD (P = 0.001) and group ISSHL (P = 0.001), whereas group DE did not present a significant difference in thresholds. The clinical classification of SSNHL allows the identification of significant differences regarding age, initial and final hearing impairment, and likelihood of response to therapy. Older age and the presence of coexisting disease were associated with a greater initial hearing impact and poorer hearing recovery after 6 months, and patients with a defined etiology presented a much more limited response to therapy. The occurrence of decompensated metabolic and cardiovascular diseases and the possibility of a first manifestation of autoimmune disease or cerebellopontine angle tumors justify an adequate protocol for the investigation of SSNHL.
Abstract:
The objective of the present study was to evaluate the characteristics of acute kidney injury (AKI) in AIDS patients and the value of the RIFLE classification for predicting outcome. The study was conducted on AIDS patients admitted to an infectious diseases hospital in Brazil. The patients with AKI were classified according to the RIFLE classification: R (risk), I (injury), F (failure), L (loss), and E (end-stage renal disease). Univariate and multivariate analyses were used to evaluate the factors associated with AKI. A total of 532 patients with a mean age of 35 ± 8.5 years were included in this study. AKI was observed in 37% of the cases, and patients were classified as "R" (18%), "I" (7.7%) and "F" (11%). Independent risk factors for AKI were thrombocytopenia (OR = 2.9, 95%CI = 1.5-5.6, P < 0.001) and elevation of aspartate aminotransferase (AST) (OR = 3.5, 95%CI = 1.8-6.6, P < 0.001). Overall mortality was 25.7% and was higher among patients with AKI (40.2 vs 17%, P < 0.001). AKI was associated with death, and mortality increased according to the RIFLE classification: "R" (OR 2.4), "I" (OR 3.0) and "F" (OR 5.1), P < 0.001. AKI is a frequent complication in AIDS patients and is associated with increased mortality. The RIFLE classification is an important indicator of poor outcome for AIDS patients.
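The multivariate step that yields the odds ratios above is typically a logistic regression; the sketch below shows one way to obtain odds ratios and confidence intervals with statsmodels, using simulated stand-ins for the study variables (e.g., thrombocytopenia and elevated AST).

```python
# Sketch of a logistic regression of AKI on candidate risk factors, reported
# as odds ratios with 95% confidence intervals. Predictors and outcome are
# simulated placeholders, not the study data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 532
df = pd.DataFrame({
    "thrombocytopenia": rng.integers(0, 2, n),
    "elevated_ast": rng.integers(0, 2, n),
})
# Simulated outcome with a built-in association to both predictors
logit = 0.5 * df["thrombocytopenia"] + 0.7 * df["elevated_ast"] - 1.0
df["aki"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = sm.Logit(df["aki"],
                 sm.add_constant(df[["thrombocytopenia", "elevated_ast"]])).fit()
odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```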
Abstract:
High-resolution proton nuclear magnetic resonance spectroscopy (¹H MRS) can be used to detect biochemical changes in vitro caused by distinct pathologies. It can reveal distinct metabolic profiles of brain tumors, although the accurate analysis and classification of different spectra remain a challenge. In this study, the pattern recognition method partial least squares discriminant analysis (PLS-DA) was used to classify 11.7 T ¹H MRS spectra of brain tissue extracts from patients with brain tumors into four classes (high-grade neuroglial, low-grade neuroglial, non-neuroglial, and metastasis) plus a group of control brain tissue. PLS-DA revealed 9 metabolites as the most important for group differentiation: γ-aminobutyric acid, acetoacetate, alanine, creatine, glutamate/glutamine, glycine, myo-inositol, N-acetylaspartate, and choline compounds. Leave-one-out cross-validation showed that PLS-DA was efficient in group characterization. The metabolic patterns detected can be explained on the basis of previous multimodal studies of tumor metabolism and are consistent with neoplastic cell abnormalities, possibly related to high turnover, resistance to apoptosis, osmotic stress and the tumor tendency to use alternative energetic pathways such as glycolysis and ketogenesis.
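A minimal sketch of PLS-DA with leave-one-out cross-validation, the validation scheme described above, is given below. The spectra are simulated placeholders with 9 metabolite intensities per sample, and PLS-DA is implemented as PLS regression on one-hot class labels, a common approximation.

```python
# Sketch of PLS-DA with leave-one-out cross-validation on simulated
# metabolite profiles (9 intensities per sample, 5 groups).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(5)
n_classes, per_class, n_metabolites = 5, 8, 9
X = rng.normal(size=(n_classes * per_class, n_metabolites))  # metabolite intensities
y = np.repeat(np.arange(n_classes), per_class)               # tumor group labels
Y = np.eye(n_classes)[y]                                      # one-hot class matrix

correct = 0
for train_idx, test_idx in LeaveOneOut().split(X):
    pls = PLSRegression(n_components=3).fit(X[train_idx], Y[train_idx])
    pred_class = pls.predict(X[test_idx]).argmax(axis=1)[0]
    correct += int(pred_class == y[test_idx][0])
print("leave-one-out accuracy:", correct / len(y))
```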