862 results for Integrated user model
Abstract:
Aboveground tropical tree biomass and carbon storage estimates commonly ignore tree height (H). We estimate the effect of incorporating H on tropics-wide forest biomass estimates in 327 plots across four continents using 42 656 H and diameter measurements and harvested trees from 20 sites to answer the following questions: 1. What is the best H-model form and geographic unit to include in biomass models to minimise site-level uncertainty in estimates of destructive biomass? 2. To what extent does including H estimates derived in (1) reduce uncertainty in biomass estimates across all 327 plots? 3. What effect does accounting for H have on plot- and continental-scale forest biomass estimates? The mean relative error in biomass estimates of destructively harvested trees when including H (mean 0.06) was half that when excluding H (mean 0.13). Power- and Weibull-H models provided the greatest reduction in uncertainty, with regional Weibull-H models preferred because they reduce uncertainty in smaller-diameter classes (< 40 cm D) that store about one-third of biomass per hectare in most forests. Propagating the relationships from destructively harvested tree biomass to each of the 327 plots from across the tropics shows that including H reduces errors from 41.8 Mg ha⁻¹ (range 6.6 to 112.4) to 8.0 Mg ha⁻¹ (-2.5 to 23.0).
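For reference, the two height-diameter model forms named above are usually written as follows. This is a sketch of the standard forms (with parameters a, b, c fitted per geographic unit), not the exact parameterisations fitted in the study.

```latex
% Power-law height-diameter model
H = a \, D^{b}
% Three-parameter Weibull height-diameter model
H = a \left( 1 - e^{-b D^{c}} \right)
```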
Abstract:
The aim of this work is to use integrated GIS data to characterize sedimentary processes in a subtropical lagoon environment. The study area was the Cananéia Inlet estuary in the southeastern section of the Cananéia Lagoon Estuarine System (CLES), state of São Paulo, Brazil (25°03'S / 47°53'W). The area is formed by the confluence of two estuarine channels forming a bay-shaped water body locally called "Trapandé Bay". The region is surrounded by one of the most preserved tracts of Atlantic Rain Forest in Southwestern Brazil and presents well-developed mangroves and marshes. In this study, a methodology was developed using an integrated GIS database based on bottom sediment parameters, geomorphological data, remote sensing images, hydrodynamic modelling data and geophysical parameters. The sediment grain size parameters and the bottom morphology of the lagoon were also used to develop models of net sediment transport pathways. It was possible to observe that the sediment transport vectors based on the grain size model correlated well with the transport model based on the bottom topography features and the hydrodynamic model, especially in areas with stronger energetic conditions and a minor contribution of finer sediments. This relation is somewhat less evident near shallower banks and depositional features. In these regions the organic matter content of the sediments was a good complementary tool for inferring the hydrodynamic and depositional conditions (i.e. primary productivity, sedimentation rates, sources, oxidation-reduction rates).
Abstract:
Objectives: To investigate the potential of an active attachment biofilm model as a high-throughput demineralization biofilm model for the evaluation of caries-preventive agents. Methods: Streptococcus mutans UA159 biofilms were grown on bovine dentine discs in a high-throughput active attachment model. Biofilms were first formed in a medium with high buffer capacity for 24 h and then subjected to various photodynamic therapy (PACT) treatments using the combination of Light Emitting Diodes (LEDs, Biotable®) and Photogem®. Viability of the biofilms was evaluated by plate counts. To investigate treatment effects on dentine lesion formation, the treated biofilms were grown in a medium with low buffer capacity for an additional 24 h. Integrated mineral loss (IML) and lesion depth (LD) were assessed by transversal microradiography. Calcium release into the biofilm medium was measured by atomic absorption spectroscopy. Results: Compared to the water-treated control group, a significant reduction in the viability of S. mutans biofilms was observed when the combination of LEDs and Photogem® was applied. LEDs or Photogem® alone did not change biofilm viability. Similar outcomes were also found for dentine lesion formation. Significantly lower IML and LD values were found only in the group subjected to the combined treatment of LEDs and Photogem®. There was a good correlation between the calcium release data and the IML and LD values. Conclusions: The high-throughput active attachment biofilm model is applicable for evaluating novel caries-preventive agents with respect to both biofilm inhibition and demineralization inhibition. PACT had a killing effect on 24 h S. mutans biofilms and could inhibit the demineralization process. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
This article describes a real-world production planning and scheduling problem occurring at an integrated pulp and paper (P&P) mill, which manufactures paper for cardboard out of produced pulp. During the cooking of wood chips in the digester, two by-products are produced: the pulp itself (virgin fibers) and a waste stream known as black liquor. The former is mixed with recycled fibers and processed in a paper machine; here, because of significant sequence-dependent setups in paper-type changeovers, lot sizing and sequencing must be decided simultaneously in order to use capacity efficiently. The latter is converted into electrical energy using a set of evaporators, recovery boilers and counter-pressure turbines. The planning challenge is then to synchronize the material flow as it moves through the pulp mill, the paper mill and the energy plant, maximizing the customer demand met (backlogging is allowed) and minimizing operating costs. Because P&P production is capital intensive, the output of the digester must be maximized. As the production bottleneck is not fixed, we propose a new model that, for the first time, integrates the critical production units of the pulp mill, the paper mill and the energy plant. Simple stochastic mixed-integer-programming-based local search heuristics are developed to obtain good feasible solutions for the problem. The benefits of integrating the three stages are discussed, and the proposed approaches are tested on real-world data. Our work may help P&P companies increase their competitiveness and responsiveness in dealing with demand pattern oscillations. (C) 2012 Elsevier Ltd. All rights reserved.
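To make the lot-sizing-and-sequencing core of such models concrete, a minimal generic sketch of the paper-machine subproblem might look as follows; the notation is illustrative and is not taken from the paper.

```latex
% Illustrative capacitated lot-sizing with sequence-dependent setups (generic sketch):
% x_{jt}: production of grade j in period t;  I^{+}_{jt}, I^{-}_{jt}: inventory and backlog;
% y_{ijt} = 1 if the machine changes over from grade i to grade j in period t.
\min \sum_{j,t}\bigl(h_j I^{+}_{jt} + b_j I^{-}_{jt}\bigr) + \sum_{i,j,t} c_{ij}\, y_{ijt}
\qquad \text{s.t.} \qquad
I^{+}_{j,t-1} - I^{-}_{j,t-1} + x_{jt} - d_{jt} = I^{+}_{jt} - I^{-}_{jt},
\qquad
\sum_{j} p_j x_{jt} + \sum_{i,j} s_{ij}\, y_{ijt} \le C_t .
```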
Abstract:
In the clinical setting, the early detection of myocardial injury induced by doxorubicin (DXR) is still considered a challenge. To assess whether ultrasonic tissue characterization (UTC) can identify early DXR-related myocardial lesions and their correlation with myocardial collagen percentages, we studied 60 rats at baseline and prospectively after intravenous infusion of 2 mg/kg/week DXR. Echocardiographic examinations were conducted at baseline and at cumulative DXR doses of 8, 10, 12, 14 and 16 mg/kg. The left ventricular ejection fraction (LVEF), the shortening fraction (SF), and the UTC indices, namely the corrected coefficient of integrated backscatter (CC-IBS: tissue IBS intensity / phantom IBS intensity) and the cyclic variation magnitude of this intensity curve (MCV), were measured. The variation of each parameter with DXR dose was expressed as the mean and standard error at specific DXR dosages and at baseline. The collagen percentage was calculated in six control-group animals and 24 DXR-group animals. From 8 mg/kg to 16 mg/kg DXR, CC-IBS increased (1.29 ± 0.27 vs. 1.1 ± 0.26 at baseline; p=0.005) and MCV decreased (9.1 ± 2.8 vs. 11.02 ± 2.6 at baseline; p=0.006). LVEF presented only a slight but significant decrease from 8 mg/kg to 16 mg/kg DXR (80.4 ± 6.9% vs. 85.3 ± 6.9% at baseline, p=0.005). CC-IBS was 72.2% sensitive and 83.3% specific in detecting collagen deposition of 4.24% (AUC=0.76). LVEF was not accurate in detecting initial collagen deposition (AUC=0.54). In conclusion, UTC identified DXR-induced myocardial lesions earlier than LVEF, showing good accuracy in detecting the initial collagen deposition in this experimental animal model.
Abstract:
Aircraft composite structures must have high stiffness and strength with low weight, which allows airplane payload to increase without compromising airworthiness. However, the mechanical behavior of composite laminates is very complex due to their inherent anisotropy and heterogeneity. Many researchers have developed different progressive failure analyses and damage models in order to predict the complex failure mechanisms. This work presents a damage model and progressive failure analysis that requires only simple experimental tests and achieves good accuracy. Firstly, the paper explains the damage initiation and propagation criteria and a procedure to identify the material parameters. In the second stage, the model was implemented as a UMAT (User Material Subroutine) linked to the finite element software ABAQUS™, in order to predict the behavior of composite structures. Afterwards, some case studies, mainly off-axis coupons under tensile or compressive loads with different stacking sequences, were analyzed using the proposed material model. Finally, the computational results were compared to the experimental results, verifying the capability of the damage model to predict composite structure behavior. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
This paper presents a performance analysis of a baseband multiple-input single-output ultra-wideband system over scenarios CM1 and CM3 of the IEEE 802.15.3a channel model, incorporating four different pre-distortion schemes: time reversal, zero-forcing pre-equaliser, constrained least squares pre-equaliser, and minimum mean square error pre-equaliser. For the third scheme, a simple solution based on the steepest-descent (gradient) algorithm is adopted and compared with theoretical results. The channel estimates at the transmitter are assumed to be truncated and noisy. Results show that the constrained least squares algorithm offers a good trade-off between intersymbol interference reduction and signal-to-noise ratio preservation, providing performance comparable to the minimum mean square error method but with lower computational complexity. Copyright (C) 2011 John Wiley & Sons, Ltd.
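As an illustration of the steepest-descent approach to a constrained least squares pre-equaliser, a minimal sketch is given below; the channel taps, filter length, unit-energy constraint and step size are all illustrative assumptions, not values or notation from the paper.

```python
import numpy as np

def conv_matrix(h, n_taps):
    """Toeplitz convolution matrix of channel h acting on an n_taps-long filter."""
    n_out = len(h) + n_taps - 1
    H = np.zeros((n_out, n_taps))
    for k in range(n_taps):
        H[k:k + len(h), k] = h
    return H

def cls_preequalizer(h, n_taps=16, delay=8, mu=0.05, iters=500):
    """Constrained least-squares pre-filter via projected steepest descent.

    Minimises ||H w - d||^2 subject to ||w|| = 1 (a unit transmit-energy
    constraint), where d is a delayed unit impulse (the desired response).
    """
    H = conv_matrix(h, n_taps)
    d = np.zeros(H.shape[0])
    d[delay] = 1.0                   # desired overall response: one clean peak
    w = np.zeros(n_taps)
    w[0] = 1.0                       # start from a pass-through filter
    for _ in range(iters):
        grad = H.T @ (H @ w - d)     # gradient of the quadratic cost
        w -= mu * grad               # steepest-descent step
        w /= np.linalg.norm(w)       # project back onto the energy constraint
    return w

# Toy usage with a short, hypothetical multipath channel estimate
h_est = np.array([1.0, 0.6, -0.3, 0.1])
w = cls_preequalizer(h_est)
print(np.round(np.convolve(h_est, w), 3))  # energy should concentrate near the chosen delay
```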
Abstract:
XML similarity evaluation has become a central issue in the database and information communities, with applications ranging over document clustering, version control, data integration and ranked retrieval. Various algorithms for comparing hierarchically structured data, XML documents in particular, have been proposed in the literature. Most of them make use of techniques for finding the edit distance between tree structures, XML documents being commonly modeled as Ordered Labeled Trees. Yet, a thorough investigation of current approaches led us to identify several similarity aspects, i.e., sub-tree-related structural and semantic similarities, which are not sufficiently addressed when comparing XML documents. In this paper, we provide an integrated and fine-grained comparison framework that deals with both structural and semantic similarities in XML documents (detecting the occurrences and repetitions of structurally and semantically similar sub-trees) and allows the end-user to adjust the comparison process according to her requirements. Our framework consists of four main modules for (i) discovering the structural commonalities between sub-trees, (ii) identifying sub-tree semantic resemblances, (iii) computing tree-based edit operation costs, and (iv) computing tree edit distance. Experimental results demonstrate higher comparison accuracy with respect to alternative methods, while timing experiments reflect the impact of semantic similarity on overall system performance.
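To make the tree-edit-distance building block concrete, below is a minimal sketch of a top-down (Selkow-style) edit distance between ordered labelled trees with unit costs; the paper's framework layers structural and semantic costs on top of this kind of primitive, and the Node class here is purely illustrative.

```python
class Node:
    """Ordered labelled tree node (an illustrative stand-in for a parsed XML element)."""
    def __init__(self, label, children=None):
        self.label = label
        self.children = children or []

def tree_size(t):
    return 1 + sum(tree_size(c) for c in t.children)

def tree_dist(a, b):
    """Selkow-style top-down edit distance with unit insert/delete/relabel costs."""
    cost = 0 if a.label == b.label else 1          # relabel the roots if they differ
    m, n = len(a.children), len(b.children)
    # DP over the two child forests: deleting/inserting a child removes/adds its whole subtree
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        D[i][0] = D[i - 1][0] + tree_size(a.children[i - 1])
    for j in range(1, n + 1):
        D[0][j] = D[0][j - 1] + tree_size(b.children[j - 1])
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            D[i][j] = min(
                D[i - 1][j] + tree_size(a.children[i - 1]),              # delete subtree
                D[i][j - 1] + tree_size(b.children[j - 1]),              # insert subtree
                D[i - 1][j - 1] + tree_dist(a.children[i - 1], b.children[j - 1]),
            )
    return cost + D[m][n]

# Toy usage: two small XML-like trees differing in one element name
t1 = Node("book", [Node("title"), Node("author")])
t2 = Node("book", [Node("title"), Node("editor")])
print(tree_dist(t1, t2))   # -> 1 (relabel author -> editor)
```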
Abstract:
The aim of the present study is to provide an ecologically relevant assessment of the ecotoxicological effects of pesticide applications in agricultural areas in the tropics, using an integrated approach with information gathered from the soil and aquatic compartments. Carbofuran, an insecticide/nematicide widely used on sugarcane crops, was selected as a model substance. To evaluate the toxic effects of pesticide spraying on soil biota, as well as the potential indirect effects on aquatic biota resulting from surface runoff and/or leaching, field and laboratory trials (the latter using a cost-effective simulator of pesticide applications) were performed. Standard ecotoxicological tests were performed with soil (Eisenia andrei, Folsomia candida, and Enchytraeus crypticus) and aquatic (Ceriodaphnia silvestrii) organisms, using serial dilutions of soil, eluate, leachate, and runoff samples. Among the soil organisms, the order of sensitivity was E. crypticus < E. andrei < F. candida. Among the aqueous extracts, mortality of C. silvestrii was extreme in runoff samples, whereas eluates were by far the least toxic samples. Generally higher toxicity was found in the bioassays performed with samples from the field trial, indicating the need for improvements in the laboratory simulator. However, the tool developed proved valuable in evaluating the toxic effects of pesticide spraying in soils and the potential risks for aquatic compartments. Environ. Toxicol. Chem. 2012;31:437-445. (C) 2011 SETAC
Abstract:
Background: An important challenge for transcript counting methods such as Serial Analysis of Gene Expression (SAGE), "Digital Northern" or Massively Parallel Signature Sequencing (MPSS) is to carry out statistical analyses that account for the within-class variability, i.e., variability due to the intrinsic biological differences among sampled individuals of the same class, and not only variability due to technical sampling error. Results: We introduce a Bayesian model that accounts for the within-class variability by means of a mixture distribution. We show that the previously available approaches of aggregation in pools ("pseudo-libraries") and the Beta-Binomial model are particular cases of the mixture model. We illustrate our method with a brain tumor vs. normal comparison using SAGE data from public databases. We show examples of tags regarded as differentially expressed with high significance if the within-class variability is ignored, but clearly not so significant if one accounts for it. Conclusion: Using available information about biological replicates, one can transform a list of candidate transcripts showing differential expression into a more reliable one. Our method is freely available, under the GPL/GNU copyleft, through a user-friendly web-based online tool or as R language scripts at the supplemental website.
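For reference, the Beta-Binomial construction that the abstract cites as a particular case of the mixture model can be written as follows; the notation below is generic and not necessarily the paper's.

```latex
% Tag count y_i out of library size n_i, with a library-specific abundance p_i:
y_i \mid p_i \sim \mathrm{Binomial}(n_i, p_i), \qquad p_i \sim \mathrm{Beta}(\alpha, \beta),
% marginalising over p_i yields the Beta-Binomial likelihood, which absorbs
% the within-class (biological) variability on top of the technical sampling error:
P(y_i) = \binom{n_i}{y_i}\, \frac{B(y_i + \alpha,\; n_i - y_i + \beta)}{B(\alpha, \beta)} .
```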
Abstract:
Background: The use of the knowledge produced by the sciences to promote human health is the main goal of translational medicine. To make this feasible we need computational methods to handle the large amount of information that arises from bench to bedside and to deal with its heterogeneity. A computational challenge that must be faced is to promote the integration of clinical, socio-demographic and biological data. In this effort, ontologies play an essential role as a powerful artifact for knowledge representation. Chado is a modular, ontology-oriented database model that gained popularity due to its robustness and flexibility as a generic platform for storing biological data; however, it lacks support for representing clinical and socio-demographic information. Results: We have implemented an extension of Chado, the Clinical Module, to allow the representation of this kind of information. Our approach consists of a framework for data integration through the use of a common reference ontology. The design of this framework has four levels: the data level, to store the data; the semantic level, to integrate and standardize the data through the use of ontologies; the application level, to manage clinical databases, ontologies and the data integration process; and the web interface level, to allow interaction between the user and the system. The Clinical Module was built based on the Entity-Attribute-Value (EAV) model. We also propose a methodology to migrate data from legacy clinical databases to the integrative framework. A Chado instance was initialized using a relational database management system. The Clinical Module was implemented and the framework was loaded using data from a factual clinical research database; clinical and demographic data, as well as biomaterial data, were obtained from patients with tumors of the head and neck. We implemented the IPTrans tool, a complete environment for data migration, which comprises: the construction of a model to describe the legacy clinical data, based on an ontology; the Extraction, Transformation and Load (ETL) process to extract the data from the source clinical database and load it into the Clinical Module of Chado; and the development of a web tool and a Bridge Layer to adapt the web tool to Chado, as well as to other applications. Conclusions: Open-source computational solutions currently available for translational science do not have a model to represent biomolecular information and are not integrated with existing bioinformatics tools. On the other hand, existing genomic data models do not represent clinical patient data. A framework was developed to support translational research by integrating biomolecular information coming from different "omics" technologies with patients' clinical and socio-demographic data. This framework should present certain features: flexibility, compression and robustness. The experiments performed on a use case demonstrated that the proposed system meets the requirements of flexibility and robustness, leading to the desired integration. The Clinical Module can be accessed at http://dcm.ffclrp.usp.br/caib/pg=iptrans.
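As a rough illustration of the Entity-Attribute-Value pattern on which the Clinical Module is built, the sketch below stores clinical variables as rows rather than columns; all table, column and term names are hypothetical and are not Chado's actual schema.

```python
import sqlite3

# Minimal EAV sketch: one row per (patient, attribute, value) instead of one column
# per clinical variable.  Table, column and term names are hypothetical, not Chado's.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient   (patient_id INTEGER PRIMARY KEY, external_code TEXT);
CREATE TABLE attribute (attribute_id INTEGER PRIMARY KEY, name TEXT, ontology_term TEXT);
CREATE TABLE patient_attribute (
    patient_id   INTEGER REFERENCES patient(patient_id),
    attribute_id INTEGER REFERENCES attribute(attribute_id),
    value        TEXT
);
""")
conn.execute("INSERT INTO patient VALUES (1, 'HN-0001')")
conn.executemany("INSERT INTO attribute VALUES (?, ?, ?)",
                 [(1, "tumor_site", "ONT:0000001"),        # placeholder ontology IDs
                  (2, "smoking_status", "ONT:0000002")])
conn.executemany("INSERT INTO patient_attribute VALUES (?, ?, ?)",
                 [(1, 1, "larynx"), (1, 2, "former smoker")])

# Adding a new clinical variable needs no schema change, only a new 'attribute' row,
# which is why EAV suits heterogeneous clinical and socio-demographic data.
for name, value in conn.execute("""
        SELECT a.name, pa.value
        FROM patient_attribute pa JOIN attribute a USING (attribute_id)
        WHERE pa.patient_id = 1"""):
    print(name, "=", value)
```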
Abstract:
Aspects related to users' cooperative work are not considered in the traditional approach to software engineering, since the user is viewed independently of his/her workplace environment or group, and the individual model is generalized to the study of the collective behavior of all users. This work proposes a software requirements process to address issues involving cooperative work in information systems in which coordination of the users' actions is distributed and communication among them occurs indirectly, through the data entered while using the software. To achieve this goal, this research draws on ergonomics, the 3C cooperation model, awareness and software engineering concepts. Action research is used as the research methodology, applied in three cycles during the development of a corporate workflow system at a technological research company. This article discusses the third cycle, which corresponds to the process dealing with the refinement of the cooperative work requirements once the software is in actual use in the workplace, where the introduction of a computer system changes the users' workplace from face-to-face interaction to interaction mediated by the software. The results showed that a higher degree of user awareness of their own activities and of other system users contributes to a decrease in errors and in the inappropriate use of the system.
Abstract:
The reduction of friction and wear in systems with metal-to-metal contacts, as in several mechanical components, represents a traditional challenge in tribology. In this context, this work presents a computational study based on Archard's linear wear law and finite element modeling (FEM) in order to analyze the unlubricated sliding wear observed in typical pin-on-disc tests. The modeling was developed using the finite element software Abaqus® with 3-D deformable geometries and elastic-plastic material behavior for the contact surfaces. Archard's wear model was implemented in a FORTRAN user subroutine (UMESHMOTION) to describe sliding wear. Modeling of debris and oxide formation mechanisms was taken into account by using a global wear coefficient obtained from experimental measurements. The implementation performs an incremental computation of surface wear based on the nodal displacements, by means of adaptive mesh tools that rearrange local nodal positions. In this way, the worn track was obtained and the new surface profile was integrated for mass-loss assessment. This work also presents experimental pin-on-disc tests with AISI 4140 pins on rotating AISI H13 discs at normal loads of 10, 35, 70 and 140 N, which span the mild, transition and severe wear regimes, at a sliding speed of 0.1 m/s. Numerical and experimental results were compared in terms of wear rate and friction coefficient. Furthermore, in the numerical simulation the stress field distribution and the changes in the surface profile across the worn track of the disc were analyzed. The applied numerical formulation proved more appropriate for predicting the mild wear regime than the severe regime, mainly because of the shorter running-in period observed at lower loads, which characterizes the mild regime.
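For context, the linear Archard relation used as the wear driver, and the incremental nodal form typically implemented in a mesh-motion subroutine, can be sketched as follows (symbols here are generic, not the paper's notation):

```latex
% Global (linear) Archard wear law: worn volume V for normal load F_N,
% sliding distance s, hardness H and dimensionless wear coefficient K
V = K \, \frac{F_N \, s}{H}
% Local incremental form applied at each contact node: wear-depth update
% from contact pressure p and sliding increment \Delta s, with k = K / H
\Delta h = k \, p \, \Delta s
```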
Abstract:
In many countries buildings are responsible for a substantial part of energy consumption, and this share varies according to their energy and environmental performance. The potential for major reductions in building consumption has been well documented in Brazil. Opportunities have been identified throughout the life cycle of buildings, partly owing to projects being implemented in diverse locations without the proper adjustments. This article offers a reflection on the design process and how it can be understood in an integrated way, favoring the use of natural resources and lowering energy consumption. It concludes by indicating that the longest phase in the life cycle of a building is also the phase responsible for its largest energy consumption, not only because of its duration but also because of the interaction with the end user. Therefore, in order to harvest the energy-cost reduction potential of future buildings, designers need a holistic view of the surroundings, end users, materials and methodologies.
Abstract:
The quality of temperature and humidity retrievals from the infrared SEVIRI sensors on the geostationary Meteosat Second Generation (MSG) satellites is assessed by means of a one-dimensional variational (1D-Var) algorithm. The study is performed with the aim of improving the spatial and temporal resolution of available observations to feed analysis systems designed for high-resolution regional-scale numerical weather prediction (NWP) models. The non-hydrostatic forecast model COSMO (COnsortium for Small scale MOdelling), in the ARPA-SIM operational configuration, is used to provide background fields. Only clear-sky observations over sea are processed. An optimised 1D-Var set-up comprising the two water vapour channels and the three window channels is selected. It maximises the reduction of errors in the model backgrounds while ensuring ease of operational implementation through accurate bias correction procedures and correct radiative transfer simulations. The 1D-Var retrieval quality is first quantified in relative terms, employing statistics to estimate the reduction in the background model errors. Additionally, the absolute retrieval accuracy is assessed by comparing the analysis with independent radiosonde and satellite observations. The inclusion of satellite data brings a substantial reduction in the warm and dry biases present in the forecast model. Moreover, it is shown that the retrieval profiles generated by the 1D-Var are well correlated with the radiosonde measurements. Subsequently, the 1D-Var technique is applied to two three-dimensional case studies: a false-alarm case that occurred in Friuli-Venezia-Giulia on 8 July 2004 and a heavy-precipitation case that occurred in the Emilia-Romagna region between 9 and 12 April 2005. The impact of satellite data for these two events is evaluated in terms of increments in the column-integrated water vapour and saturation water vapour, in the 2-metre temperature and specific humidity, and in the surface temperature. To improve the 1D-Var technique, a method to calculate flow-dependent model error covariance matrices is also assessed. The approach employs members from an ensemble forecast system generated by perturbing the physical parameterisation schemes inside the model. The improved set-up applied to the case of 8 July 2004 shows a substantially neutral impact.
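For reference, 1D-Var retrievals of this kind minimise the standard variational cost function below, written here in its usual form; x_b is the model background, y the SEVIRI observations, H the observation (radiative transfer) operator, and B and R the background and observation error covariance matrices, the former being the quantity made flow-dependent in the final step described above.

```latex
J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathrm{T}} \mathbf{B}^{-1} (\mathbf{x}-\mathbf{x}_b)
              + \tfrac{1}{2}\,\bigl(\mathbf{y}-H(\mathbf{x})\bigr)^{\mathrm{T}} \mathbf{R}^{-1} \bigl(\mathbf{y}-H(\mathbf{x})\bigr)
```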