975 results for literature-data integration
Abstract:
This paper presents results of laboratory testing of unrestrained drying shrinkage, over a period of 154 days, of different concrete mixtures from the Brazilian production line that utilize ground granulated blast-furnace slag in their compositions. Three concrete mixtures with water/cement ratios of 0.78 (M1), 0.41 (M2), and 0.37 (M3) were studied. The experimental data were compared with analytical results from prediction models available in the literature: the ACI 209 model (ACI), the B3 model (B3), the Eurocode 2 model (EC2), the GL 2000 model (GL), and the Brazilian NBR 6118 model (NBR), and the efficacy of these models was analyzed against the experimental data. In addition, the development of the mechanical properties (compressive strength and modulus of elasticity) of the studied concrete mixtures was measured in the laboratory up to 126 days. From this study, it could be concluded that the ACI and GL models most closely approximated the experimental drying shrinkage data measured during the analyzed period.
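As a rough illustration of the kind of prediction curve being compared against the measurements, the sketch below evaluates the standard ACI 209R-92 drying-shrinkage form for moist-cured concrete under standard conditions. The nominal ultimate shrinkage of 780 microstrain is the ACI default; the paper's mixtures (M1-M3) would use values corrected for mix and exposure conditions, so this is an assumption-laden sketch, not the paper's calibration.

```python
# Sketch of the ACI 209R-92 drying-shrinkage curve, one of the models
# compared in the paper (moist-cured concrete, standard conditions).
# eps_sh_ultimate = 780 microstrain is the ACI nominal value; real use
# applies correction factors for the specific mixture and environment.

def aci209_shrinkage(t_days, eps_sh_ultimate=780e-6, f=35.0):
    """Shrinkage strain after t_days of drying:
    eps_sh(t) = t / (f + t) * eps_sh_ultimate, with f = 35 days."""
    return t_days / (f + t_days) * eps_sh_ultimate

# Predicted shrinkage over the paper's 154-day measurement window.
for t in (7, 28, 91, 154):
    print(f"day {t:3d}: {aci209_shrinkage(t) * 1e6:.0f} microstrain")
```

Plotting this hyperbolic curve against the laboratory readings is essentially how each model's efficacy was judged.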
Abstract:
Oxidation processes can be used to treat industrial wastewater containing non-biodegradable organic compounds. However, the presence of dissolved salts may inhibit or retard the treatment process. In this study, wastewater desalination by electrodialysis (ED) combined with an advanced oxidation process (photo-Fenton) was applied to an aqueous NaCl solution containing phenol. The influence of process variables on the demineralization factor was investigated for ED at pilot scale, and correlations were obtained between the phenol, salt, and water fluxes and the driving force. The oxidation process was investigated in a laboratory batch reactor, and a model based on artificial neural networks was developed by fitting the experimental data, describing the reaction rate as a function of the input variables. With the experimental parameters of both processes, a dynamic model was developed for ED and a continuous model, using a plug-flow reactor approach, for the oxidation process. Finally, the hybrid model simulation could validate different scenarios of the integrated system and can be used for process optimization.
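The continuous oxidation model mentioned above rests on the plug-flow reactor balance dC/dV = -r(C)/Q. The sketch below integrates that balance numerically; a first-order rate law r = kC stands in for the paper's neural-network rate model, and the values of k, Q, and V are illustrative, not fitted data.

```python
# Minimal plug-flow reactor sketch for the continuous oxidation model.
# A first-order rate law r = k*C is an illustrative stand-in for the
# paper's neural-network rate model.

def pfr_outlet_concentration(c_in, k, volume, flow_rate, n_steps=1000):
    """Integrate dC/dV = -k*C/Q along the reactor with explicit Euler."""
    dv = volume / n_steps
    c = c_in
    for _ in range(n_steps):
        c -= k * c / flow_rate * dv
    return c

# Illustrative run: 100 mg/L phenol in, k = 0.05 1/min, V = 50 L, Q = 10 L/min.
c_out = pfr_outlet_concentration(c_in=100.0, k=0.05, volume=50.0, flow_rate=10.0)
```

In the hybrid model, the rate expression at each volume step would instead be evaluated by the fitted neural network, with the ED dynamic model supplying the inlet salt and phenol concentrations.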
Abstract:
In the context of cancer diagnosis and treatment, we consider the problem of constructing an accurate prediction rule on the basis of a relatively small number of tumor tissue samples of known type containing expression data on very many (possibly thousands of) genes. Recently, results have been presented in the literature suggesting that it is possible to construct a prediction rule from only a few genes such that it has a negligible prediction error rate. However, in these results the test error or the leave-one-out cross-validated error is calculated without allowance for the selection bias. There is no allowance because the rule is either tested on tissue samples that were used in the first instance to select the genes being used in the rule, or because the cross-validation of the rule is not external to the selection process; that is, gene selection is not performed in training the rule at each stage of the cross-validation process. We describe how in practice the selection bias can be assessed and corrected for by either performing a cross-validation or applying the bootstrap external to the selection process. We recommend using 10-fold rather than leave-one-out cross-validation, and, concerning the bootstrap, we suggest using the so-called .632+ bootstrap error estimate designed to handle overfitted prediction rules. Using two published data sets, we demonstrate that when correction is made for the selection bias, the cross-validated error is no longer zero for a subset of only a few genes.
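The key point, cross-validation external to the selection process, is easy to sketch: the gene-selection step must be repeated inside every fold, so the held-out samples never influence which genes are chosen. The selector (top genes by mean difference) and classifier (nearest class centroid) below are deliberately simple stand-ins, not the paper's methods.

```python
import numpy as np

# "External" 10-fold cross-validation: gene selection is redone inside
# each fold, so held-out samples never influence the chosen genes.
# Selector and classifier are simple illustrative stand-ins.

def select_genes(X, y, n_genes):
    # Rank genes by absolute difference of class means (crude score).
    score = np.abs(X[y == 0].mean(0) - X[y == 1].mean(0))
    return np.argsort(score)[-n_genes:]

def nearest_centroid_predict(X_train, y_train, X_test):
    c0, c1 = X_train[y_train == 0].mean(0), X_train[y_train == 1].mean(0)
    d0 = ((X_test - c0) ** 2).sum(1)
    d1 = ((X_test - c1) ** 2).sum(1)
    return (d1 < d0).astype(int)

def external_cv_error(X, y, n_genes=10, n_folds=10, seed=0):
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    errors = 0
    for fold in np.array_split(idx, n_folds):
        train = np.setdiff1d(idx, fold)
        genes = select_genes(X[train], y[train], n_genes)  # inside the fold
        pred = nearest_centroid_predict(X[train][:, genes], y[train],
                                        X[fold][:, genes])
        errors += int((pred != y[fold]).sum())
    return errors / len(y)
```

Doing the selection once on all samples and only cross-validating the classifier afterward is exactly the "internal" procedure the abstract criticizes: it reports optimistically biased, sometimes zero, error rates.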
Abstract:
There are many techniques for electricity market price forecasting. However, most of them are designed for expected price analysis rather than price spike forecasting. An effective method for predicting the occurrence of spikes has not yet appeared in the literature. In this paper, a data-mining-based approach is presented to give a reliable forecast of the occurrence of price spikes. Combined with the spike value prediction techniques developed by the same authors, the proposed approach aims at providing a comprehensive tool for price spike forecasting. Feature selection techniques are first described to identify the attributes relevant to the occurrence of spikes. A brief introduction to the classification techniques is given for completeness. Two algorithms, a support vector machine and a probability classifier, are chosen as the spike occurrence predictors and are discussed in detail. Realistic market data are used to test the proposed model, with promising results.
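To make the "probability classifier" idea concrete, the sketch below estimates the probability of a spike conditional on a discretized feature and flags a spike when that probability crosses a threshold. The single demand-level feature, the binning, and the threshold are illustrative assumptions, not the paper's actual design.

```python
from collections import defaultdict

# Illustrative probability classifier for spike occurrence: estimate
# P(spike | discretized demand level) from historical counts, with
# Laplace smoothing, and flag a spike above a probability threshold.
# Feature choice and threshold are assumptions, not the paper's design.

class SpikeProbabilityClassifier:
    def __init__(self, n_bins=10, threshold=0.5):
        self.n_bins, self.threshold = n_bins, threshold
        self.counts = defaultdict(lambda: [0, 0])  # bin -> [non-spike, spike]

    def _bin(self, x):
        # x is assumed pre-scaled to [0, 1).
        return min(int(x * self.n_bins), self.n_bins - 1)

    def fit(self, demand, spike):
        for x, s in zip(demand, spike):
            self.counts[self._bin(x)][int(s)] += 1
        return self

    def predict_proba(self, x):
        neg, pos = self.counts[self._bin(x)]
        return (pos + 1) / (neg + pos + 2)  # Laplace smoothing

    def predict(self, x):
        return self.predict_proba(x) >= self.threshold
```

A support vector machine would replace the count table with a learned decision boundary over the selected attributes; the surrounding train/threshold workflow stays the same.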
Abstract:
There is a widely held paradigm that mangroves are critical for sustaining production in coastal fisheries through their role as important nursery areas for fisheries species. This paradigm frequently forms the basis for important management decisions on habitat conservation and restoration of mangroves and other coastal wetlands. This paper reviews the current status of the paradigm and synthesises the information on the processes underlying these potential links. In the past, the paradigm has been supported by studies identifying correlations between the areal and linear extent of mangroves and fisheries catch. This paper goes beyond the correlative approach to develop a new framework on which future evaluations can be based. First, the review identifies what type of marine animals are using mangroves and at what life stages. These species can be categorised as estuarine residents, marine-estuarine species and marine stragglers. The marine-estuarine category includes many commercial species that use mangrove habitats as nurseries. The second stage is to determine why these species are using mangroves as nurseries. The three main proposals are that mangroves provide a refuge from predators, high levels of nutrients and shelter from physical disturbances. The recognition of the important attributes of mangrove nurseries then allows an evaluation of how changes in mangroves will affect the associated fauna. Surprisingly few studies have addressed this question. Consequently, it is difficult to predict how changes in any of these mangrove attributes would affect the faunal communities within them and, ultimately, influence the fisheries associated with them. From the information available, it seems likely that reductions in mangrove habitat complexity would reduce the biodiversity and abundance of the associated fauna, and these changes have the potential to cause cascading effects at higher trophic levels with possible consequences for fisheries. 
Finally, there is a discussion of the data that are currently available on mangrove distribution and fisheries catch, the limitations of these data and how best to use the data to understand mangrove-fisheries links and, ultimately, to optimise habitat and fisheries management. Examples are drawn from two relatively data-rich regions, Moreton Bay (Australia) and Western Peninsular Malaysia, to illustrate the data needs and research requirements for investigating the mangrove-fisheries paradigm. Having reliable and accurate data at appropriate spatial and temporal scales is crucial for mangrove-fisheries investigations. Recommendations are made for improvements to data collection methods that would meet these important criteria. This review provides a framework on which to base future investigations of mangrove-fisheries links, based on an understanding of the underlying processes and the need for rigorous data collection. Without this information, the understanding of the relationship between mangroves and fisheries will remain limited. Future investigations of mangrove-fisheries links must take this into account in order to have a good ecological basis and to provide better information and understanding to both fisheries and conservation managers.
Abstract:
Dherte PM, Negrao MPG, Mori Neto S, Holzhacker R, Shimada V, Taberner P, Carmona MJC - Smart Alerts: Development of a Software to Optimize Data Monitoring. Background and objectives: Monitoring is useful for following vital signs and for the prevention, diagnosis, and treatment of several events in anesthesia. Although alarms can be useful in monitoring, they can cause dangerous desensitization of the user. The objective of this study was to describe the development of specific software to integrate intraoperative monitoring parameters, generating "smart alerts" that can help decision making, besides indicating possible diagnoses and treatments. Methods: A system was designed that allowed flexibility in the definition of alerts, combining individual alarms of the monitored parameters to generate a more elaborate alert system. After investigating a set of smart alerts considered relevant in the surgical environment, a prototype was designed and evaluated, and additional suggestions were implemented in the final product. To verify the occurrence of smart alerts, the system was tested with data previously obtained during intraoperative monitoring of 64 patients. The system allows continuous analysis of monitored parameters, verifying the occurrence of smart alerts defined in the user interface. Results: With this system, a potential 92% reduction in alarms was observed. In most situations that did not generate alerts, the individual alarms did not represent risk to the patient. Conclusions: Implementation of the software allows integration of the monitored data to generate information, such as possible diagnoses or interventions. An expressive potential reduction in the number of alarms during surgery was observed. The information displayed by the system can often be more useful than analysis of isolated parameters.
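The core idea, combining individual alarms into composite alerts evaluated over several parameters at once, can be sketched as a small rule engine. The specific rules and thresholds below (e.g., tachycardia plus hypotension suggesting hypovolemia) are illustrative clinical examples, not the alert set validated in the paper.

```python
# Sketch of the "smart alert" idea: instead of one alarm per parameter,
# evaluate rules over combinations of monitored values. The rules and
# thresholds below are illustrative, not the paper's validated alerts.

def smart_alerts(sample):
    """Return a list of composite alerts for one monitoring sample (dict)."""
    alerts = []
    hr = sample.get("heart_rate")
    sbp = sample.get("systolic_bp")
    spo2 = sample.get("spo2")
    if hr is not None and sbp is not None and hr > 100 and sbp < 90:
        alerts.append("possible hypovolemia: tachycardia with hypotension")
    if spo2 is not None and hr is not None and spo2 < 90 and hr < 50:
        alerts.append("possible severe hypoxia with bradycardia")
    return alerts

# A sample that would trip an isolated tachycardia alarm, but whose other
# parameters are normal, produces no composite alert - the mechanism
# behind the reported reduction in alarm volume.
assert smart_alerts({"heart_rate": 110, "systolic_bp": 120, "spo2": 98}) == []
```

In the described system such rules were user-definable in the interface and evaluated continuously against the monitored stream.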
Abstract:
Objective. The purpose of this study was to estimate the Down syndrome detection and false-positive rates for second-trimester sonographic prenasal thickness (PT) measurement alone and in combination with other markers. Methods. Multivariate log-Gaussian modeling was performed using numerical integration. Parameters for the PT distribution, in multiples of the normal gestation-specific median (MoM), were derived from 105 Down syndrome and 1385 unaffected pregnancies scanned at 14 to 27 weeks. The data included a new series of 25 cases and 535 controls combined with 4 previously published series. The means were estimated by the median and the SDs by the 10th-to-90th-centile range divided by 2.563. Parameters for other markers were obtained from the literature. Results. A log-Gaussian model fitted the distribution of PT values well in Down syndrome and unaffected pregnancies. The distribution parameters were as follows: Down syndrome, mean, 1.334 MoM; log10 SD, 0.0772; unaffected pregnancies, 0.995 and 0.0752, respectively. The model-predicted detection rates for 1%, 3%, and 5% false-positive rates for PT alone were 35%, 51%, and 60%, respectively. The addition of PT to a 4-serum-marker protocol increased detection by 14% to 18% compared with serum alone. The simultaneous sonographic measurement of PT and nasal bone length increased detection by 19% to 26%, and with a third sonographic marker, nuchal skin fold, performance was comparable with first-trimester protocols. Conclusions. Second-trimester screening with sonographic PT and serum markers is predicted to have a high detection rate, and further sonographic markers could perform comparably with first-trimester screening protocols.
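The screening calculation behind such figures can be sketched in univariate form: choose a cutoff on the log10 MoM scale that fixes the false-positive rate in unaffected pregnancies, then evaluate the affected distribution above that cutoff. The sketch uses the PT parameters reported in the abstract, but since the paper's figures come from multivariate modeling with numerical integration, this single-marker estimate is only indicative and will not reproduce the published detection rates exactly.

```python
from statistics import NormalDist
from math import log10

# Univariate log-Gaussian screening sketch using the abstract's PT
# parameters (means in MoM, SDs on the log10 scale). Indicative only:
# the paper's results come from multivariate numerical integration.

def detection_rate(fpr, mean_aff, sd_aff, mean_unaff, sd_unaff):
    norm = NormalDist()
    # Cutoff on the log10 MoM scale giving the requested false-positive rate.
    cutoff = log10(mean_unaff) + norm.inv_cdf(1 - fpr) * sd_unaff
    z = (cutoff - log10(mean_aff)) / sd_aff
    return 1 - norm.cdf(z)  # proportion of affected above the cutoff

dr = detection_rate(0.05, mean_aff=1.334, sd_aff=0.0772,
                    mean_unaff=0.995, sd_unaff=0.0752)
```

Raising the allowed false-positive rate moves the cutoff down and the detection rate up, which is the trade-off tabulated in the Results.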
Abstract:
BACKGROUND: Chromoblastomycosis is a subcutaneous mycosis that occurs mainly in rural workers, although it is increasingly found among people working in other sectors. The fungus penetrates the skin after inoculation, and the most frequently isolated agent is Fonsecaea pedrosoi. OBJECTIVES: This study aims at evaluating patients suffering from chromoblastomycosis admitted to the Department of Dermatology of the University Hospital of the Faculty of Medicine of Sao Paulo State during the ten-year period from 1997 to 2007. METHODS: This is a retrospective study in which the medical records of 27 Brazilian patients diagnosed with chromoblastomycosis from 1997 to 2007 at the Dermatology Department of the Medical School, University of Sao Paulo, were reviewed. The following items were analyzed: previous therapeutic approaches; treatment implemented by the group; length of time between the appearance of the lesion and diagnosis; age; gender; profession; origin; site of lesions; and agents isolated in culture and histopathology. RESULTS: Twenty-two patients were from the state of Sao Paulo, whereas the others came from the states of Bahia and Rondonia. Of these, 37% were rural workers. Men were more frequently infected (85%). Lesions were more commonly found on the lower limbs (59.2%). In 52% of the cases the isolated agent was the dematiaceous fungus Fonsecaea pedrosoi. Biopsies showed sclerotic bodies in 92.5% of the cases. CONCLUSION: The data found are in accordance with the medical literature on the subject. The disease had been previously studied in our institution in 1983 by Cuce et al. The present study is the second retrospective study on the characteristics of patients suffering from chromoblastomycosis published in the indexed medical literature in the state of Sao Paulo.
Abstract:
Plant-antivenom is a computational Web system about medicinal plants with anti-venom properties. The system consists of a database of these plants, including scientific publications on the subject and amino acid sequences of active principles from venomous animals. The system relates these data, allowing their integration through different search applications. For the development of the system, surveys of the scientific literature were first conducted, allowing the creation of a publication database in a library for reading and user interaction. Then, classes of categories were created, allowing the use of tags and the organization of content. This database on medicinal plants has information such as family, species, isolated compounds, activity, and inhibited animal venoms, among others. Provision is made for submission of new information by registered users through wiki tools. Submitted content is released in accordance with permission rules defined by the system. The database of biological venom protein amino acid sequences was structured from essential information from the National Center for Biotechnology Information (NCBI). Plant-antivenom's interface is simple, contributing to fast and functional access to the system and the integration of the different data registered in it. The Plant-antivenom system is available on the Internet at http://gbi.fmrp.usp.br/plantantivenom.
Abstract:
The literature shows contradictory results regarding the role of composite shrinkage and elastic modulus as determinants of polymerization stress. The present study aimed at a better understanding of the test mechanics that could explain such divergences among studies. The hypothesis was that the effects of composite shrinkage and elastic modulus on stress depend upon the compliance of the testing system. A commonly used test apparatus was simulated by finite element analysis, with different compliance levels defined by the bonding substrate (steel, glass, composite, or acrylic). Composites with moduli between 1 and 12 GPa and shrinkage values between 0.5% and 6% were modeled. Shrinkage was simulated by thermal analogy. The hypothesis was confirmed. When shrinkage and modulus increased simultaneously, stress increased regardless of the substrate. However, if shrinkage and modulus were inversely related, their magnitudes and interaction with rod material determined the stress response.
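The compliance effect described above can be captured by a simple series-compliance model: the shrinkage displacement is shared between elastic stretch of the composite and deformation of the testing system, giving sigma = eps * h / (h/E + A*C). This lumped sketch is an assumption standing in for the paper's finite element model, and the geometry and compliance values are illustrative.

```python
# Series-compliance sketch of polymerization stress: the shrinkage
# displacement eps*h is shared between composite stretch (sigma*h/E)
# and testing-system deformation (sigma*A*C). Lumped stand-in for the
# paper's finite element model; numbers are illustrative.

def shrinkage_stress(eps, E, h=2e-3, area=2.8e-5, compliance=0.0):
    """Nominal stress (Pa) for shrinkage strain eps, composite modulus E
    (Pa), specimen height h (m), cross-section area (m^2), and system
    compliance C (m/N)."""
    return eps * h / (h / E + area * compliance)

rigid = shrinkage_stress(0.01, 6e9)                  # C = 0: sigma = eps * E
soft = shrinkage_stress(0.01, 6e9, compliance=1e-6)  # compliant bonding substrate
```

In a rigid system the stress scales directly with modulus, whereas in a compliant one the A*C term dominates the denominator and the stress becomes nearly insensitive to E, which is one way the substrate can reverse the apparent role of elastic modulus across studies.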
Abstract:
Objective. The goal of this paper is to undertake a literature search collecting all dentin bond strength data obtained for six adhesives with four tests (shear, microshear, tensile, and microtensile) and to critically analyze the results with respect to average bond strength, coefficient of variation, mode of failure, and product ranking. Method. A PubMed search was carried out for the years between 1998 and 2009, identifying publications on bond strength measurements of resin composite to dentin using four tests: shear, tensile, microshear, and microtensile. The six adhesive resins were selected covering three-step systems (OptiBond FL, Scotch Bond Multi-Purpose Plus), two-step systems (Prime & Bond NT, Single Bond, Clearfil SE Bond), and a one-step system (Adper Prompt L Pop). Results. Pooling results from 147 references showed an ongoing high scatter in the bond strength data regardless of which adhesive and which bond test was used. Coefficients of variation remained high (20-50%), even with the microbond tests. The reported modes of failure for all tests still included a high number of cohesive failures. The ranking seemed to depend on the test used. Significance. The scatter in dentin bond strength data remains regardless of which test is used, confirming finite element analyses predicting non-uniform stress distributions due to a number of geometry, loading, material property, and specimen preparation variables. This reopens the question of whether an interfacial fracture mechanics approach to analyzing the dentin-adhesive bond would not be more appropriate for obtaining better agreement among dentin bond related papers. (C) 2009 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
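The pooled statistics the review reports, mean bond strength and coefficient of variation per adhesive/test combination, amount to a simple calculation over the collected values. The sketch below shows that calculation; the bond strength values are made-up placeholders, not the paper's pooled data.

```python
from statistics import mean, stdev

# Sketch of the pooling statistics in the review: mean bond strength and
# coefficient of variation per (adhesive, test) combination. The values
# below are illustrative placeholders, not the paper's pooled data.

def coefficient_of_variation(values):
    """Sample CV in percent: 100 * SD / mean."""
    return 100.0 * stdev(values) / mean(values)

bond_mpa = {  # (adhesive, test) -> bond strengths in MPa (illustrative)
    ("OptiBond FL", "microtensile"): [38.0, 52.0, 61.0, 44.0, 30.0],
    ("Adper Prompt L Pop", "microtensile"): [18.0, 25.0, 12.0, 30.0, 21.0],
}

for key, values in bond_mpa.items():
    print(key, f"mean={mean(values):.1f} MPa",
          f"CV={coefficient_of_variation(values):.0f}%")
```

Even these tidy placeholder sets land in the 20-50% CV band the review reports, which is what makes ranking adhesives across tests so unstable.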
Abstract:
Examined the barriers faced by people with spinal cord injuries (SCI) when integrating their assistive technology (AT) into the workplace, as well as factors that contribute to successful integration. In-depth interviews were conducted with 5 men (aged 37-50 yrs) with SCI, 3 of their employers, and 2 co-workers. Results indicate that in addition to the barriers previously outlined in the literature related to funding the technology, time delays, information availability, training, and maintenance, other issues were highlighted. Implications for service providers are considered in relation to these barriers and the factors that prompted successful integration. The author discusses limitations of the study and makes recommendations for future research. (PsycINFO Database Record (c) 2007 APA, all rights reserved)
Abstract:
Objective: To test the feasibility of an evidence-based clinical literature search service to help answer general practitioners' (GPs') clinical questions. Design: Two search services supplied GPs who submitted questions with the best available empirical evidence to answer these questions. The GPs provided feedback on the value of the service, and the concordance of answers from the two search services was assessed. Setting: Two literature search services (Queensland and Victoria), operating for nine months from February 1999. Main outcome measures: Use of the service; time taken to locate answers; availability of evidence; value of the service to GPs; and consistency of answers from the two services. Results: 58 GPs asked 160 questions (29 asked one, 11 asked five or more). The questions concerned treatment (65%), aetiology (17%), prognosis (13%), and diagnosis (5%). Answering a question took a mean of 3 hours 32 minutes of personnel time (95% CI, 2.67-3.97 hours); nine questions took longer than 10 hours each to answer, the longest taking 23 hours 30 minutes. Evidence of suitable quality to provide a sound answer was available for 126 (79%) questions. Feedback data for 84 (53%) questions, provided by 42 GPs, showed that they appreciated the service and that asking the questions changed clinical care. There were many minor differences between the answers from the two centres, and substantial differences in the evidence found for 4 of 14 questions. However, the conclusions reached were largely similar, with no or only minor differences for all questions. Conclusions: It is feasible to provide a literature search service, but further assessment is needed to establish its cost-effectiveness.
Abstract:
Outcome after traumatic brain injury (TBI) is characterized by a high degree of variability, which has often been difficult to capture in traditional outcome studies. The purpose of this study was to describe patterns of community integration 2-5 years after TBI. Participants were 208 patients admitted to a Brain Injury Rehabilitation Unit between 1991 and 1995 in Brisbane, Australia. The design comprised retrospective data collection and questionnaire follow-up by mail. Mean follow-up was 3.5 years. Demographic, injury severity, and functional status variables were retrieved from hospital records. Community integration was assessed using the Community Integration Questionnaire (CIQ), and vocational status was measured by a self-administered questionnaire. Data were analysed using cluster analysis, which divided the data into meaningful subsets. Based on the CIQ subscale scores of home, social, and productive integration, a three-cluster solution was selected, with groups labelled as working (n = 78), balanced (n = 46), and poorly integrated (n = 84). Although 38% of the sample returned to a high level of productive activity and 22% achieved a balanced lifestyle, overall community integration was poor for the remainder. The poorly integrated group had more severe injury, characterized by longer periods of acute care and post-traumatic amnesia (PTA) and greater functional disability on discharge. These findings have implications for service delivery prior to and during the process of reintegration after brain injury.
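A cluster analysis over the three CIQ subscales can be sketched with a basic k-means procedure, k = 3, on (home, social, productive) score vectors. This is a generic stand-in: the abstract does not state which clustering algorithm was used, and the data below are synthetic.

```python
import numpy as np

# Generic k-means sketch for a three-cluster solution over the three CIQ
# subscale scores (home, social, productive integration). A stand-in:
# the paper does not specify its clustering algorithm.

def kmeans(X, k, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]  # random initial centers
    for _ in range(n_iter):
        # Assign each sample to its nearest center, then move the centers.
        labels = ((X[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return labels, centers
```

Each patient's label then identifies a profile (e.g., high productive integration vs. uniformly low scores), which is how the working/balanced/poorly-integrated groups were characterized.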
Abstract:
This paper presents a method of formally specifying, refining, and verifying concurrent systems which uses the object-oriented state-based specification language Object-Z together with the process algebra CSP. Object-Z provides a convenient way of modelling the complex data structures needed to define the component processes of such systems, and CSP enables the concise specification of process interactions. The basis of the integration is a semantics of Object-Z classes identical to that of CSP processes. This allows classes specified in Object-Z to be used directly within the CSP part of the specification. In addition to specification, we also discuss refinement and verification in this model. The common semantic basis enables a unified method of refinement to be used, based upon CSP refinement. To enable state-based techniques to be used for the Object-Z components of a specification, we develop state-based refinement relations which are sound and complete with respect to CSP refinement. In addition, a verification method for static and dynamic properties is presented. The method allows us to verify properties of the CSP system specification in terms of its component Object-Z classes by using the laws of the CSP operators together with the logic for Object-Z.