986 results for Rainfall event classification
Abstract:
The brief interaction of precipitation with a forest canopy can create high spatial variability in both throughfall and solute deposition. We hypothesized that (i) the variability in natural forest systems is high but depends on system-inherent stability, (ii) the spatial variability of solute deposition shows seasonal dynamics depending on the increase in rainfall frequency, and (iii) spatial patterns persist only in the short term. The study area in the north-western Brazilian state of Rondonia is subject to a climate with a distinct wet and dry season. We collected rain and throughfall on an event basis during the early wet season (n = 14) and the peak of the wet season (n = 14) and analyzed the samples for pH and concentrations of NH4+, Na+, K+, Ca2+, Mg2+, Cl-, NO3-, SO42- and DOC. The coefficient of variation for throughfall based on both sampling intervals was 29%, which is at the lower end of values reported from other tropical forest sites, but higher than in most temperate forests. Coefficients of variation of solute deposition ranged from 29% to 52%. This heterogeneity of solute deposition is neither particularly high nor particularly low compared with a range of tropical and temperate forest ecosystems. We observed an increase in solute deposition variability as the wet season progressed, which was explained by a negative correlation between heterogeneity of solute deposition and antecedent dry period. The temporal stability of throughfall patterns was low during the early wet season, but increased as the wet season progressed. We suggest that rapid plant growth at the beginning of the rainy season is responsible for the lower stability, whereas less vegetative activity during the later rainy season might favor the higher persistence of "hot" and "cold" spots of throughfall quantities.
The relatively high stability of throughfall patterns during later stages of the wet season may influence processes at the forest floor and in the soil. Solute deposition patterns showed less clear trends, but all patterns displayed only short-term stability. The weak stability of those patterns is apt to impede the formation of solute deposition-induced biochemical microhabitats in the soil. (C) 2008 Elsevier B.V. All rights reserved.
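The variability statistic quoted throughout this abstract is the coefficient of variation (sample standard deviation over the mean, in percent). A minimal sketch, with invented collector readings standing in for real throughfall data:

```python
import numpy as np

# Hypothetical throughfall volumes (mm) from a grid of collectors for one event.
throughfall = np.array([12.0, 9.5, 15.2, 11.1, 8.7, 13.4, 10.9, 14.8])

def coefficient_of_variation(x):
    """CV in percent: sample standard deviation (ddof=1) over the mean."""
    return 100.0 * np.std(x, ddof=1) / np.mean(x)

cv = coefficient_of_variation(throughfall)
```

A CV of 29%, as reported above, would mean the event-to-event spread among collectors is about a third of the mean throughfall volume.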
Abstract:
Quality control of toys to avoid children's exposure to potentially toxic elements is of utmost relevance and is a common requirement in national and/or international norms for health and safety reasons. Laser-induced breakdown spectroscopy (LIBS) was recently evaluated at the authors' laboratory for direct analysis of plastic toys, and one of the main difficulties for the determination of Cd, Cr and Pb was the variety of mixtures and types of polymers. As most norms rely on migration (lixiviation) protocols, chemometric classification models from LIBS spectra were tested for screening toys that present potential risk of Cd, Cr and Pb contamination. The classification models were generated from the emission spectra of 51 polymeric toys using Partial Least Squares - Discriminant Analysis (PLS-DA), Soft Independent Modeling of Class Analogy (SIMCA) and K-Nearest Neighbor (KNN). Model building and validation were carried out with 40 and 11 samples, respectively. Best results were obtained with KNN, with correct predictions varying from 95% for Cd to 100% for Cr and Pb. (C) 2011 Elsevier B.V. All rights reserved.
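KNN, the best-performing of the three chemometric models, can be sketched from scratch in a few lines; the spectra and labels below are random stand-ins for real LIBS data, and k = 3 is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical LIBS emission intensities (rows = toy samples, columns =
# spectral channels); labels mark toys flagged by the migration protocol.
X_train = rng.normal(size=(40, 6))
y_train = (X_train[:, 0] > 0).astype(int)   # invented labels for the sketch
X_test = rng.normal(size=(11, 6))

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify each test spectrum by majority vote among its k nearest
    training spectra (Euclidean distance)."""
    preds = []
    for x in X_test:
        dist = np.linalg.norm(X_train - x, axis=1)
        nearest = y_train[np.argsort(dist)[:k]]
        preds.append(np.bincount(nearest).argmax())
    return np.array(preds)

pred = knn_predict(X_train, y_train, X_test)
```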
Abstract:
Objective: We carry out a systematic assessment of a suite of kernel-based learning machines for the task of epilepsy diagnosis through automatic electroencephalogram (EEG) signal classification. Methods and materials: The kernel machines investigated include the standard support vector machine (SVM), the least squares SVM, the Lagrangian SVM, the smooth SVM, the proximal SVM, and the relevance vector machine. An extensive series of experiments was conducted on publicly available data, whose clinical EEG recordings were obtained from five normal subjects and five epileptic patients. The performance levels delivered by the different kernel machines are contrasted in terms of predictive accuracy, sensitivity to the kernel function/parameter value, and sensitivity to the type of features extracted from the signal. For this purpose, 26 values for the kernel parameter (radius) of two well-known kernel functions (namely, Gaussian and exponential radial basis functions) were considered, as well as 21 types of features extracted from the EEG signal, including statistical values derived from the discrete wavelet transform, Lyapunov exponents, and combinations thereof. Results: We first quantitatively assess the impact of the choice of the wavelet basis on the quality of the features extracted. Four wavelet basis functions were considered in this study. Then, we provide the average accuracy values, estimated by cross-validation, delivered by 252 kernel machine configurations; in particular, 40%/35% of the best-calibrated models of the standard and least squares SVMs reached a 100% accuracy rate for the two kernel functions considered. Moreover, we show the sensitivity profiles exhibited by a large sample of the configurations, whereby one can visually inspect their levels of sensitivity to the type of feature and to the kernel function/parameter value.
Conclusions: Overall, the results show that all kernel machines are competitive in terms of accuracy, with the standard and least squares SVMs prevailing more consistently. Moreover, the choice of the kernel function and parameter value, as well as the choice of the feature extractor, are critical decisions, although the choice of the wavelet family seems not to be so relevant. Also, the statistical values calculated over the Lyapunov exponents were good sources of signal representation, but not as informative as their wavelet counterparts. Finally, a typical sensitivity profile emerged among all types of machines, involving regions of stability separated by zones of sharp variation, with some kernel parameter values clearly associated with better accuracy rates (zones of optimality). (C) 2011 Elsevier B.V. All rights reserved.
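The sensitivity analysis described above (cross-validated accuracy as a function of the Gaussian kernel parameter) can be sketched with scikit-learn; the features and labels here are synthetic stand-ins for the EEG-derived features, and the four parameter values are arbitrary:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Stand-in features for EEG segments (e.g. wavelet statistics); two classes.
X = rng.normal(size=(60, 8))
y = (X[:, :2].sum(axis=1) > 0).astype(int)

# Sweep the Gaussian (RBF) kernel parameter and record cross-validated
# accuracy, mirroring the sensitivity profiles described in the abstract.
accuracies = {}
for gamma in [0.01, 0.1, 1.0, 10.0]:
    scores = cross_val_score(SVC(kernel="rbf", gamma=gamma), X, y, cv=5)
    accuracies[gamma] = scores.mean()
```

Plotting `accuracies` against `gamma` reproduces, in miniature, the "zones of stability" and "zones of optimality" picture the conclusions describe.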
Abstract:
Traditionally, chronotype classification is based on the Morningness-Eveningness Questionnaire (MEQ). It is implicit in the classification that intermediate individuals get intermediate scores on most of the MEQ questions. However, a small group of individuals has a different pattern of answers. In some questions they answer as "morning-types" and in some others as "evening-types," resulting in an intermediate total score. "Evening-type" and "morning-type" answers were set as A1 and A4, respectively. Intermediate answers were set as A2 and A3. The following algorithm was applied: Bimodality Index = (ΣA1 × ΣA4)² − (ΣA2 × ΣA3)². Neither-types with positive bimodality scores were classified as bimodal. If our hypothesis is validated by objective data, an update of chronotype classification will be required. (Author correspondence: brunojm@ymail.com)
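The scoring algorithm can be expressed directly in code; the answer vectors below are invented examples, not MEQ data:

```python
def bimodality_index(answers):
    """Bimodality Index from the abstract: (ΣA1 × ΣA4)² − (ΣA2 × ΣA3)².

    `answers` lists one category per question: 1 = evening-type answer,
    4 = morning-type answer, 2 and 3 = intermediate answers.
    """
    a1 = answers.count(1)
    a2 = answers.count(2)
    a3 = answers.count(3)
    a4 = answers.count(4)
    return (a1 * a4) ** 2 - (a2 * a3) ** 2

# A respondent mixing extreme answers scores positive (classified bimodal)...
bimodal = bimodality_index([1, 1, 4, 4, 2, 3])        # (2*2)² - (1*1)² = 15
# ...while a consistently intermediate respondent scores negative.
intermediate = bimodality_index([2, 2, 3, 3, 2, 3])   # (0*0)² - (3*3)² = -81
```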
Abstract:
Aims. The aims of this study were to assess the internal reliability (internal consistency), construct validity, sensitivity and ceiling and floor effects of the Brazilian-Portuguese version of the Impact of Event Scale (IES). Design. Methodological research design. Method. The Brazilian-Portuguese version of the IES was applied to a group of 91 burned patients at three times: the first week after the burn injury (time one), between the fourth and the sixth months (time two) and between the ninth and the 12th months (time three). The internal consistency, construct validity (convergent and dimensionality), sensitivity and ceiling and floor effects were tested. Results. Cronbach's alpha coefficients showed high internal consistency for the total scale (0.87) and for the domains intrusive thoughts (0.87) and avoidance responses (0.76). During hospitalisation (time one), the scale showed low and positive correlations with pain measures immediately before (r = 0.22; p < 0.05) and immediately after baths and dressings (r = 0.21; p < 0.05). After discharge, we found strong and negative correlations with self-esteem (r = -0.52; p < 0.01), strong and positive with depression (r = 0.63; p < 0.01) and low and negative with the Bodily pain (r = -0.24; p < 0.05), Social functioning (r = -0.34; p < 0.01) and Mental health (r = -0.27; p < 0.05) domains of the SF-36 at time two. Regarding sensitivity, no statistically significant differences were observed between mean scale scores according to burned body surface (p = 0.21). A floor effect was observed in most of the IES items. Conclusion. The adapted version of the scale proved reliable and valid for assessing postburn reactions to the impact of the event in the group of patients under analysis.
Relevance to clinical practice. The Impact of Event Scale can be used in research and clinical practice to assess nursing interventions aimed at decreasing stress during rehabilitation.
Abstract:
Oropharyngeal dysphagia is characterized by any alteration in swallowing dynamics which may lead to malnutrition and aspiration pneumonia. Early diagnosis is crucial for the prognosis of patients with dysphagia, and the best method for assessing swallowing dynamics is swallowing videofluoroscopy, an exam performed with X-rays. Because it exposes patients to radiation, videofluoroscopy should not be performed frequently, nor should it be prolonged. This study presents a non-invasive method for the pre-diagnosis of dysphagia based on the analysis of swallowing acoustics, in which the discrete wavelet transform plays an important role in increasing sensitivity and specificity in the identification of dysphagic patients. (C) 2008 Elsevier Inc. All rights reserved.
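The role of the discrete wavelet transform can be illustrated with a single-level Haar decomposition (the simplest wavelet, used here only for illustration; the signal and feature names are synthetic, not the paper's method):

```python
import numpy as np

def haar_dwt(signal):
    """One level of the discrete wavelet transform with the Haar wavelet:
    returns approximation (low-pass) and detail (high-pass) coefficients."""
    s = np.asarray(signal, dtype=float)
    if len(s) % 2:
        s = s[:-1]  # drop the last sample to get an even length
    approx = (s[0::2] + s[1::2]) / np.sqrt(2)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2)
    return approx, detail

# Hypothetical swallowing-sound segment; statistics of the detail coefficients
# could serve as features for separating dysphagic from normal swallows.
t = np.linspace(0, 1, 256)
segment = np.sin(2 * np.pi * 8 * t) + 0.1 * np.sin(2 * np.pi * 60 * t)
approx, detail = haar_dwt(segment)
features = {"detail_mean": detail.mean(), "detail_std": detail.std()}
```

The Haar transform is orthonormal, so the signal's energy is split exactly between the approximation and detail bands, which is what makes band-wise statistics meaningful features.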
Abstract:
Despite modern weed control practices, weeds continue to threaten agricultural production. Given the variability of weeds, a fuzzy-logic methodology is proposed for classifying the risk of infestation in agricultural zones. The inputs for the classification are attributes extracted from maps of weed seed production and weed coverage, estimated using kriging and map analysis, and from the percentage of the surface infested by grass weeds, in order to account for the presence of weed species with a high rate of development and proliferation. The output of the classification predicts the risk of infestation of regions of the field for the next crop. The risk classification methodology described in this paper integrates analysis techniques which may help to reduce costs and improve weed control practices. Results for the risk classification of infestation in a maize crop field are presented. To illustrate the effectiveness of the proposed system, the risk of infestation over the entire field is checked against the yield loss map estimated by kriging and against the average yield loss estimated from a hyperbolic model.
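A fuzzy classification of infestation risk can be sketched with triangular membership functions; the membership breakpoints, risk labels, and the single weed-coverage input below are all invented for illustration and are not the paper's rule base:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function: 0 outside (a, c), peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def infestation_risk(coverage):
    """Assign a risk label from weed coverage (%) by the strongest membership.

    Breakpoints are hypothetical; a real system would combine several inputs
    (seed production, coverage, grass-weed percentage) through fuzzy rules.
    """
    memberships = {
        "low": triangular(coverage, -1, 0, 40),
        "medium": triangular(coverage, 20, 50, 80),
        "high": triangular(coverage, 60, 100, 101),
    }
    return max(memberships, key=memberships.get)

risk = infestation_risk(75.0)
```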
Abstract:
The properties of recycled aggregate produced from mixed (masonry and concrete) construction and demolition (C&D) waste are highly variable, and this restricts the use of such aggregate in structural concrete production. The development of classification techniques capable of reducing this variability is instrumental for quality control purposes and the production of high quality C&D aggregate. This paper investigates how the classification of C&D mixed coarse aggregate according to porosity influences the mechanical performance of concrete. Concretes using a variety of C&D aggregate porosity classes and different water/cement ratios were produced and the mechanical properties measured. For concretes produced with constant volume fractions of water, cement, natural sand and coarse aggregate from recycled mixed C&D waste, the compressive strength and Young's modulus are direct exponential functions of the aggregate porosity. The sink-and-float technique is a simple laboratory density-separation tool that facilitates the separation of cement particles with lower porosity, a difficult task when done only by visual sorting. For this experiment, separation using a 2.2 kg/dm³ suspension produced recycled aggregate (porosity less than 17%) which yielded good performance in concrete production. Industrial gravity separators may lead to the production of high quality recycled aggregate from mixed C&D waste for structural concrete applications.
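The reported exponential relation between aggregate porosity and mechanical properties, f(p) = a·exp(b·p), can be fitted by linear regression in log space; the porosity/strength pairs below are invented, not the paper's measurements:

```python
import numpy as np

# Hypothetical (porosity %, compressive strength MPa) pairs for concretes made
# with recycled C&D aggregate at a fixed water/cement ratio.
porosity = np.array([8.0, 10.0, 12.0, 14.0, 16.0, 18.0])
strength = np.array([42.0, 37.5, 33.0, 29.5, 26.0, 23.0])

# An exponential law strength = a * exp(b * porosity) is linear in log space:
# log(strength) = log(a) + b * porosity, so a degree-1 polyfit recovers a, b.
b, log_a = np.polyfit(porosity, np.log(strength), 1)
a = np.exp(log_a)

def predicted_strength(p):
    return a * np.exp(b * p)
```

The negative fitted `b` expresses the expected trend: strength decays exponentially as aggregate porosity rises.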
Abstract:
Urban rainfall-runoff residuals contain metals such as Cr, Zn, Cu, As, Pb and Cd and are thus reasonable candidates for treatment using Portland cement-based solidification/stabilization (S/S). This research studies the S/S of urban storm water runoff solid residuals in Portland cement with quicklime and sodium bentonite additives. The solidified residuals were analyzed after 28 days of hydration using X-ray powder diffraction (XRD) and solid-state Si-29 nuclear magnetic resonance (NMR) spectroscopy. XRD results indicate that the main cement hydration products are ettringite, calcium hydroxide and hydrated calcium silicates. Zinc hydroxide and lead and zinc silicates are also present, due to the reactions of the waste compounds with the cement and its hydration products. Si-29 NMR analysis shows that the coarse fraction of the waste apparently does not interfere with cement hydration, but the fine fraction retards silica polymerization.
Diagnostic errors and repetitive sequential classifications in on-line process control by attributes
Abstract:
The procedure of on-line process control by attributes, known as Taguchi's on-line process control, consists of inspecting the m-th item (a single item) for every m items produced and deciding, at each inspection, whether the fraction of conforming items has been reduced or not. If the inspected item is nonconforming, production is stopped for adjustment. As the inspection system can be subject to diagnostic errors, a probabilistic model is developed that classifies the examined item repeatedly until either a conforming or b non-conforming classifications are observed. The event that occurs first (a conforming classifications or b non-conforming classifications) determines the final classification of the examined item. Properties of an ergodic Markov chain were used to derive the expression for the average cost of the control system, which can be optimized over three parameters: the sampling interval of the inspections (m), the number of repeated conforming classifications (a), and the number of repeated non-conforming classifications (b). The optimum design is compared with two alternative approaches. The first is a simple preventive policy: the production system is adjusted after every n items produced (no inspection is performed). The second classifies the examined item repeatedly a fixed number of times, r, and considers it conforming if the majority of the classification results are conforming. Results indicate that the current proposal performs better than the procedure that fixes the number of repeated classifications and decides by majority vote. On the other hand, depending on the degree of errors and the costs, the preventive policy can be, on average, the most economical alternative compared with those that require inspection. A numerical example illustrates the proposed procedure. (C) 2009 Elsevier B.V. All rights reserved.
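The repeated-classification rule (final verdict decided by whichever of a conforming or b non-conforming classifications occurs first) can be sketched as a simple recursion; the error rate and the parameter values below are arbitrary illustrations, not the paper's optimized design:

```python
from functools import lru_cache

def prob_final_conforming(p, a, b):
    """Probability that `a` conforming classifications are observed before
    `b` non-conforming ones, when each repeated classification is
    independently judged conforming with probability `p`."""
    @lru_cache(maxsize=None)
    def f(need_c, need_nc):
        if need_c == 0:      # enough conforming verdicts: final = conforming
            return 1.0
        if need_nc == 0:     # enough non-conforming verdicts first
            return 0.0
        return p * f(need_c - 1, need_nc) + (1 - p) * f(need_c, need_nc - 1)
    return f(a, b)

# For a truly conforming item judged correctly 95% of the time (p = 0.95),
# requiring a = 2 conforming vs b = 2 non-conforming classifications raises
# the chance of a correct final classification above a single inspection's.
prob = prob_final_conforming(0.95, 2, 2)
```

This is the mechanism by which repetition trades extra inspection cost for protection against diagnostic errors.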
Abstract:
Objective To describe onset features, classification and treatment of juvenile dermatomyositis (JDM) and juvenile polymyositis (JPM) from a multicentre registry. Methods Inclusion criteria were onset age lower than 18 years and a diagnosis of any idiopathic inflammatory myopathy (IIM) by the attending physician. Bohan & Peter (1975) criteria categorisation was established by a scoring algorithm to define JDM and JPM based on clinical protocol data. Results Of the 189 cases included, 178 were classified as JDM and 9 as JPM (19.8:1), and 2 did not fit the criteria; 6.9% had features of chronic arthritis and connective tissue disease overlap. Diagnosis classification agreement occurred in 66.1%. Median onset age was 7 years; median follow-up duration was 3.6 years. Malignancy was described in 2 (1.1%) cases. Muscle weakness occurred in 95.8%, heliotrope rash in 83.5% and Gottron plaques in 83.1%; 92% had at least one abnormal muscle enzyme result. Muscle biopsy, performed in 74.6%, was abnormal in 91.5%, and electromyogram, performed in 39.2%, was abnormal in 93.2%. Logistic regression analysis was done in 66 cases with all parameters assessed, and only aldolase was significant as an independent variable for definite JDM (OR=5.4, 95%CI 1.2-24.4, p=0.03). Regarding treatment, 97.9% received steroids; 72% received in addition at least one of: methotrexate (75.7%), hydroxychloroquine (64.7%), cyclosporine A (20.6%), IV immunoglobulin (20.6%), azathioprine (10.3%) or cyclophosphamide (9.6%). In this series 24.3% developed calcinosis, and the mortality rate was 4.2%. Conclusion Evaluation of the predefined criteria set for a valid diagnosis indicated aldolase as the most important parameter associated with definite JDM; steroids, in combination with methotrexate, were the most frequently indicated treatment.
Abstract:
The objective of this work was to carry out a descriptive analysis of the monthly precipitation at rainfall stations in Rio de Janeiro State, Brazil, using measures of position and dispersion and graphical analyses, and to verify the presence of seasonality and trend in these data, with a study of the application of time series models. Descriptive statistics were used to characterize the general behavior of the series at three selected stations with consistent historical records. The methodology of analysis of variance in randomized blocks and the fitting of multiple linear regression models, considering years and months as predictor variables, revealed the presence of seasonality, which allowed us to infer the occurrence of repetitive natural phenomena over time, and the absence of trend in the data. Multiple linear regression was then applied to remove the seasonality from these time series: the estimates of the adjusted model were subtracted from the original data, and the analysis of variance in randomized blocks was performed again on the regression residuals. From the results it was possible to conclude that the monthly rainfall presents seasonality and no trend, that multiple regression analysis was efficient in removing the seasonality, and that the rainfall can be studied by means of time series.
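The seasonality-removal step (regressing rainfall on month indicators and keeping the residuals) can be sketched as follows, using synthetic monthly rainfall in place of the stations' historical series:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical monthly rainfall (mm) over 10 years with a seasonal cycle.
months = np.tile(np.arange(12), 10)
seasonal = 100 + 60 * np.sin(2 * np.pi * months / 12)
rainfall = seasonal + rng.normal(scale=10, size=months.size)

# Regress rainfall on month indicator (dummy) variables; the fitted values
# are the monthly means, and the residuals are the deseasonalized series.
X = np.zeros((months.size, 12))
X[np.arange(months.size), months] = 1.0
coef, *_ = np.linalg.lstsq(X, rainfall, rcond=None)
residuals = rainfall - X @ coef
```

After this subtraction the residuals have zero mean in every month, so any remaining block effect found by the repeated ANOVA would point to non-seasonal structure.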
Abstract:
Oxidative stress is a physiological condition that is associated with atherosclerosis, and it can be influenced by diet. Our objective was to group fifty-seven individuals with dyslipidaemia controlled by statins according to four oxidative biomarkers, and to evaluate the differences in diet pattern and blood biochemistry between these groups. Blood samples were collected and the following parameters were evaluated: diet intake; plasma fatty acids; lipoprotein concentration; glucose; oxidised LDL (oxLDL); malondialdehyde (MDA); and total antioxidant activity by 2,2-diphenyl-1-picrylhydrazyl (DPPH) and ferric reducing ability power assays. Individuals were separated into five groups by cluster analysis. All groups showed a difference with respect to at least one of the four oxidative stress biomarkers. The separation of individuals along the first axis was based upon their total antioxidant activity. Clusters located on the right side showed higher total antioxidant activity, higher myristic fatty acid and lower arachidonic fatty acid proportions than clusters located on the left side. A negative correlation was observed between DPPH and the peroxidability index. The second axis showed differences in oxidation status as measured by MDA and oxLDL concentrations. Clusters located on the upper side showed higher oxidative status and lower HDL cholesterol concentration than clusters located on the lower side. There were no differences in diet among the five clusters. Therefore, fatty acid synthesis and HDL cholesterol concentration seem to exert a more significant effect on the oxidative conditions of individuals with dyslipidaemia controlled by statins than does their food intake.
Abstract:
The Biopharmaceutics Classification System (BCS) is a tool created to categorize drugs into different groups according to their solubility and permeability characteristics. Through a combination of these factors and physiological parameters, it is possible to understand the absorption behavior of a drug in the gastrointestinal tract, thus contributing to cost and time reductions in drug development, as well as reducing the exposure of human subjects during in vivo trials. Solubility is determined at equilibrium under physiological pH conditions, while different methods may be employed for evaluating permeability. On the other hand, the intrinsic dissolution rate (IDR), defined as the rate of dissolution of a pure substance under constant temperature, pH, and surface area conditions, among others, may correlate better with the in vivo dissolution dynamics than the solubility test. The purpose of this work is to discuss the intrinsic dissolution test as a tool for determining the solubility of drugs within the scope of the Biopharmaceutics Classification System (BCS).
Abstract:
A chemotaxonomic analysis is described of a database containing various types of compounds from the Heliantheae tribe (Asteraceae), using Self-Organizing Maps (SOM). The numbers of occurrences of 9 chemical classes in different taxa of the tribe were used as variables. The study shows that SOM applied to chemical data can contribute to differentiating genera, subtribes, and groups of subtribes (subtribe branches), as well as to tribal and subtribal classifications of Heliantheae, exhibiting a high hit percentage comparable to expert performance, and in agreement with the previous tribe classification proposed by Stuessy.
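A minimal one-dimensional self-organizing map over occurrence-count profiles can be sketched in plain NumPy; the data, map size, and learning schedule below are all invented for illustration and are not the study's configuration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Rows: hypothetical taxa; columns: occurrence counts of 9 chemical classes.
data = rng.poisson(lam=3.0, size=(30, 9)).astype(float)

# A 1-D SOM: nodes compete for each sample; the winner and its neighbors
# are pulled toward the sample, so nearby nodes learn similar profiles.
n_nodes, n_iter = 5, 200
weights = rng.normal(loc=3.0, size=(n_nodes, 9))
for it in range(n_iter):
    x = data[rng.integers(len(data))]
    winner = np.argmin(np.linalg.norm(weights - x, axis=1))
    lr = 0.5 * (1 - it / n_iter)                  # decaying learning rate
    for j in range(n_nodes):
        h = np.exp(-((j - winner) ** 2) / 2.0)    # neighborhood function
        weights[j] += lr * h * (x - weights[j])

# Each taxon maps to its best-matching node, grouping similar chemistries.
assignments = np.array(
    [np.argmin(np.linalg.norm(weights - x, axis=1)) for x in data]
)
```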