973 results for Eco-informatics
Abstract:
Blends of milk fat and canola oil (MF:CNO) were enzymatically interesterified (EIE) by Rhizopus oryzae lipase immobilized on a polysiloxane-polyvinyl alcohol (SiO2-PVA) composite, in a solvent-free system. A central composite design (CCD) was used to optimize the reaction, considering the effects of different mass fractions of binary blends of MF:CNO (50:50, 65:35 and 80:20) and temperatures (45, 55 and 65 degrees C) on the composition and texture properties of the interesterified products, taking the interesterification degree (ID) and consistency (at 10 degrees C) as response variables. For the ID variable, both the mass fraction of milk fat in the blend and the temperature were found to be significant, while for the consistency only the mass fraction of milk fat was significant. Empirical models for ID and consistency were obtained that allowed establishing the best interesterification conditions: a blend with 65 % milk fat and 35 % canola oil, at a temperature of 45 degrees C. Under these conditions, the ID was 19.77 % and the consistency at 10 degrees C was 56 290 Pa. The potential of this eco-friendly process demonstrated that a product could be obtained with the desirable milk fat flavour and better spreadability under refrigerated conditions.
Abstract:
Background: The present work aims at applying decision theory to radiological image quality control (QC) in the diagnostic routine. The main problem addressed in the framework of decision theory is whether to accept or reject a film lot of a radiology service. The probability of each decision for a determined set of variables was obtained from the selected films. Methods: Based on a radiology service routine, a decision probability function was determined for each considered group of combined characteristics. These characteristics were related to film quality control. The parameters were framed in a set of 8 possibilities, resulting in 256 possible decision rules. In order to determine a general utility function to assess the decision risk, we used a simple unique parameter called r. The payoffs chosen were: diagnosis result (correct/incorrect), cost (high/low), and patient satisfaction (yes/no), resulting in eight possible combinations. Results: Depending on the value of r, more or less risk will be associated with the decision-making. The utility function was evaluated in order to determine the probability of a decision. The decision was made with the opinions of patients or administrators from a radiology service center. Conclusion: The model is a formal quantitative approach to making decisions about medical imaging quality, providing an instrument to discriminate what is really necessary to accept or reject a film or a film lot. The method presented herein can help to assess the risk level of an incorrect radiological diagnosis decision.
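The accept/reject rule described in this abstract can be illustrated with a toy expected-utility calculation. Everything below (the payoff values, and the way the single parameter r scales the penalty of a bad outcome) is an illustrative assumption, not the paper's actual formulation:

```python
# Hypothetical sketch of an expected-utility accept/reject decision for a
# film lot. Payoffs and the role of r are assumed for illustration only.

def expected_utility(p_good, payoff_good, payoff_bad):
    """Expected utility of accepting a lot, given the probability that a
    film meets the quality criteria."""
    return p_good * payoff_good + (1 - p_good) * payoff_bad

def decide(p_good, r, payoff_good=1.0, payoff_bad=-1.0, reject_utility=0.0):
    """Accept when the expected utility beats the (fixed) utility of
    rejecting; r > 1 inflates the penalty of accepting a bad film,
    modelling a more risk-averse decision-maker."""
    eu_accept = expected_utility(p_good, payoff_good, r * payoff_bad)
    return "accept" if eu_accept > reject_utility else "reject"
```

With a high probability of good films and a neutral r, the lot is accepted; raising r flips borderline cases to rejection, which is the qualitative behaviour the abstract attributes to r.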
Abstract:
The brief interaction of precipitation with a forest canopy can create high spatial variability of both throughfall and solute deposition. We hypothesized that (i) the variability in natural forest systems is high but depends on system-inherent stability, (ii) the spatial variability of solute deposition shows seasonal dynamics depending on the increase in rainfall frequency, and (iii) spatial patterns persist only in the short term. The study area in the north-western Brazilian state of Rondônia is subject to a climate with distinct wet and dry seasons. We collected rain and throughfall on an event basis during the early wet season (n = 14) and the peak of the wet season (n = 14) and analyzed the samples for pH and concentrations of NH4+, Na+, K+, Ca2+, Mg2+, Cl-, NO3-, SO42- and DOC. The coefficient of variation for throughfall based on both sampling intervals was 29%, which is at the lower end of values reported from other tropical forest sites, but higher than in most temperate forests. Coefficients of variation of solute deposition ranged from 29% to 52%. This heterogeneity of solute deposition is neither particularly high nor particularly low compared with a range of tropical and temperate forest ecosystems. We observed an increase in solute deposition variability as the wet season progressed, which was explained by a negative correlation between the heterogeneity of solute deposition and the antecedent dry period. The temporal stability of throughfall patterns was low during the early wet season, but gained in stability as the wet season progressed. We suggest that rapid plant growth at the beginning of the rainy season is responsible for the lower stability, whereas less vegetative activity during the later rainy season might favor the higher persistence of "hot" and "cold" spots of throughfall quantities.
The relatively high stability of throughfall patterns during later stages of the wet season may influence processes at the forest floor and in the soil. Solute deposition patterns showed less clear trends, but all patterns displayed only short-term stability. The weak stability of those patterns is apt to impede the formation of solute deposition-induced biochemical microhabitats in the soil. (C) 2008 Elsevier B.V. All rights reserved.
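The coefficient-of-variation statistic reported throughout this abstract is straightforward to compute. The sketch below uses hypothetical collector volumes, not the study's data:

```python
import statistics

def coefficient_of_variation(samples):
    """CV (%) = sample standard deviation / mean * 100, the measure of
    spatial heterogeneity used for throughfall and solute deposition."""
    return statistics.stdev(samples) / statistics.mean(samples) * 100.0

# Hypothetical throughfall volumes (mm) from a grid of collectors for
# one rain event; the study pooled many such events per season.
throughfall = [10.2, 8.7, 12.5, 9.9, 11.3, 7.8, 13.0, 10.5]
cv = coefficient_of_variation(throughfall)
```

A CV near 29% for throughfall versus 29-52% for solute deposition, as reported above, simply means the solute fluxes varied more from collector to collector than the water volumes did.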
Abstract:
The accumulation of chemical elements in biological compartments is one of the strategies of tropical species to adapt to low-nutrient soils. This study focuses on the Atlantic Forest because of its eco-environmental importance as a natural reservoir of chemical elements. About 20 elements were determined by INAA in leaf, soil, litter and epiphyte compartments. There was no seasonality in chemical element concentrations in leaves, which probably indicates the maintenance of chemical elements in this compartment. Considering the estimated quantities, past deforestation events could have released large amounts of chemical elements into the environment.
Abstract:
Objective: We carry out a systematic assessment of a suite of kernel-based learning machines on the task of epilepsy diagnosis through automatic electroencephalogram (EEG) signal classification. Methods and materials: The kernel machines investigated include the standard support vector machine (SVM), the least squares SVM, the Lagrangian SVM, the smooth SVM, the proximal SVM, and the relevance vector machine. An extensive series of experiments was conducted on publicly available data, whose clinical EEG recordings were obtained from five normal subjects and five epileptic patients. The performance levels delivered by the different kernel machines are contrasted in terms of predictive accuracy, sensitivity to the kernel function/parameter value, and sensitivity to the type of features extracted from the signal. For this purpose, 26 values for the kernel parameter (radius) of two well-known kernel functions (namely, Gaussian and exponential radial basis functions) were considered, as well as 21 types of features extracted from the EEG signal, including statistical values derived from the discrete wavelet transform, Lyapunov exponents, and combinations thereof. Results: We first quantitatively assess the impact of the choice of wavelet basis on the quality of the extracted features. Four wavelet basis functions were considered in this study. Then, we provide the average cross-validation accuracy values delivered by 252 kernel machine configurations; in particular, 40%/35% of the best-calibrated models of the standard and least squares SVMs reached a 100% accuracy rate for the two kernel functions considered. Moreover, we show the sensitivity profiles exhibited by a large sample of the configurations, whereby one can visually inspect their levels of sensitivity to the type of feature and to the kernel function/parameter value.
Conclusions: Overall, the results show that all kernel machines are competitive in terms of accuracy, with the standard and least squares SVMs prevailing most consistently. Moreover, the choice of the kernel function and parameter value, as well as the choice of the feature extractor, are critical decisions, although the choice of the wavelet family seems not to be so relevant. Also, the statistical values calculated over the Lyapunov exponents were good sources of signal representation, but not as informative as their wavelet counterparts. Finally, a typical sensitivity profile emerged among all types of machines, involving some regions of stability separated by zones of sharp variation, with some kernel parameter values clearly associated with better accuracy rates (zones of optimality). (C) 2011 Elsevier B.V. All rights reserved.
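The two kernel functions whose radius the study sweeps have standard textbook definitions; a minimal sketch (with assumed illustrative points and radii, not the study's EEG features or 26 actual parameter values) shows how the kernel value, and hence the machine's behaviour, changes with the radius:

```python
import math

def gaussian_rbf(x, y, sigma):
    """Gaussian radial basis function kernel: exp(-||x-y||^2 / (2*sigma^2))."""
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-d2 / (2.0 * sigma ** 2))

def exponential_rbf(x, y, sigma):
    """Exponential RBF kernel: uses the distance itself rather than its square."""
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))
    return math.exp(-d / sigma)

# Sweeping the kernel radius over a few values, analogous in spirit to the
# study's 26-value sweep, produces a sensitivity profile for a point pair:
x, y = (0.0, 0.0), (1.0, 1.0)
profile = [gaussian_rbf(x, y, s) for s in (0.1, 0.5, 1.0, 2.0, 5.0)]
```

Small radii make the kernel sharply local (values near 0 for any distinct pair), large radii make it nearly flat (values near 1), which is one intuition for the "zones of sharp variation" between stable regions described in the conclusions.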
Abstract:
Introduction: Internet users are increasingly using the World Wide Web to search for information relating to their health. This situation makes it necessary to create specialized tools capable of supporting users in their searches. Objective: To apply and compare strategies developed to investigate the use of the Portuguese version of the Medical Subject Headings (MeSH) for constructing an automated classifier of Brazilian Portuguese-language web-based content as within or outside the field of healthcare, focusing on the lay public. Methods: 3658 Brazilian web pages were used to train the classifier and 606 Brazilian web pages were used to validate it. The proposed strategies were constructed using content-based vector methods for text classification, such that Naive Bayes was used for the task of classifying vector patterns with characteristics obtained through the proposed strategies. Results: A strategy named InDeCS was developed specifically to adapt MeSH to the problem at hand. This approach achieved better accuracy for this pattern classification task (0.94 for sensitivity, specificity, and area under the ROC curve). Conclusions: Because of the significant results achieved by InDeCS, this tool has been successfully applied to the Brazilian healthcare search portal known as Busca Saúde. Furthermore, it could be shown that MeSH yields important results when used for the task of classifying web-based content aimed at the lay public. It was also possible to show from this study that MeSH was able to map out mutable, non-deterministic characteristics of the web. (c) 2010 Elsevier Inc. All rights reserved.
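The core classification step named in the abstract (Naive Bayes over vectorized text) can be sketched in a few lines. The toy training documents and the "health"/"other" labels below are hypothetical; InDeCS's actual MeSH-derived features are far richer:

```python
import math
from collections import Counter, defaultdict

class NaiveBayesTextClassifier:
    """Minimal multinomial Naive Bayes over bag-of-words counts, in the
    spirit of the classifier described above (illustrative only)."""

    def fit(self, docs, labels):
        self.label_counts = Counter(labels)
        self.word_counts = defaultdict(Counter)
        self.vocab = set()
        for doc, label in zip(docs, labels):
            for word in doc.split():
                self.word_counts[label][word] += 1
                self.vocab.add(word)
        return self

    def predict(self, doc):
        def log_score(label):
            total = sum(self.word_counts[label].values())
            score = math.log(self.label_counts[label] / sum(self.label_counts.values()))
            for word in doc.split():
                # Laplace smoothing avoids zero probabilities for unseen words.
                p = (self.word_counts[label][word] + 1) / (total + len(self.vocab))
                score += math.log(p)
            return score
        return max(self.label_counts, key=log_score)

# Hypothetical two-class training set: health-related vs other pages.
docs = ["diabetes insulin treatment", "symptoms fever flu",
        "football match score", "stock market prices"]
labels = ["health", "health", "other", "other"]
clf = NaiveBayesTextClassifier().fit(docs, labels)
```

In the real system, the vocabulary would come from the Portuguese MeSH adaptation rather than raw tokens, which is precisely the contribution the abstract attributes to InDeCS.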
Abstract:
This paper presents a framework for building medical training applications using virtual reality, together with a tool that helps instantiate the classes of this framework. The main purpose is to simplify the building of virtual reality applications in the medical training area, considering systems that simulate biopsy exams and provide deformation, collision detection, and stereoscopy functionalities. The instantiation of the classes allows quick implementation of tools for such a purpose, thus reducing errors and offering low cost due to the use of open source tools. Using the instantiation tool, the process of building applications is fast and easy. Therefore, computer programmers can obtain an initial application and adapt it to their needs. The tool allows the user to include, delete, and edit parameters in the chosen functionalities, as well as to store these parameters for future use. In order to verify the efficiency of the framework, some case studies are presented.
Abstract:
Research Foundation of the State of Sao Paulo (FAPESP)
Abstract:
State of Sao Paulo Research Foundation (FAPESP)
Abstract:
Ecological niche modelling combines species occurrence points with environmental raster layers in order to obtain models for describing the probabilistic distribution of species. The process to generate an ecological niche model is complex. It requires dealing with a large amount of data, use of different software packages for data conversion, for model generation and for different types of processing and analyses, among other functionalities. A software platform that integrates all requirements under a single and seamless interface would be very helpful for users. Furthermore, since biodiversity modelling is constantly evolving, new requirements are constantly being added in terms of functions, algorithms and data formats. This evolution must be accompanied by any software intended to be used in this area. In this scenario, a Service-Oriented Architecture (SOA) is an appropriate choice for designing such systems. According to SOA best practices and methodologies, the design of a reference business process must be performed prior to the architecture definition. The purpose is to understand the complexities of the process (business process in this context refers to the ecological niche modelling problem) and to design an architecture able to offer a comprehensive solution, called a reference architecture, that can be further detailed when implementing specific systems. This paper presents a reference business process for ecological niche modelling, as part of a major work focused on the definition of a reference architecture based on SOA concepts that will be used to evolve the openModeller software package for species modelling. The basic steps that are performed while developing a model are described, highlighting important aspects, based on the knowledge of modelling experts. In order to illustrate the steps defined for the process, an experiment was developed, modelling the distribution of Ouratea spectabilis (Mart.) Engl. (Ochnaceae) using openModeller. 
As a consequence of the knowledge gained with this work, many desirable improvements on the modelling software packages have been identified and are presented. Also, a discussion on the potential for large-scale experimentation in ecological niche modelling is provided, highlighting opportunities for research. The results obtained are very important for those involved in the development of modelling tools and systems, for requirement analysis and to provide insight on new features and trends for this category of systems. They can also be very helpful for beginners in modelling research, who can use the process and the experiment example as a guide to this complex activity. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
There are several ways of controlling the propagation of a contagious disease. For instance, to reduce the spreading of an airborne infection, individuals can be encouraged to remain in their homes and/or to wear face masks outside their domiciles. However, when a limited number of masks is available, who should use them: the susceptible subjects, the infective persons, or both populations? Here we employ susceptible-infective-recovered (SIR) models described in terms of ordinary differential equations and probabilistic cellular automata in order to investigate how the deletion of links in the random complex network representing the social contacts among individuals affects the dynamics of a contagious disease. The inspiration for this study comes from recent discussions about the impact of the measures usually recommended by public health organizations for preventing the propagation of the swine influenza A (H1N1) virus. Our answer to this question can be valid for other eco-epidemiological systems. (C) 2010 Elsevier B.V. All rights reserved.
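The ODE half of the modelling approach named in this abstract is the classic SIR system. A minimal forward-Euler sketch (illustrative parameter values; the paper's probabilistic cellular automata and network-link deletion are not reproduced here) shows the qualitative effect of reducing the contact rate:

```python
# Classic SIR ordinary differential equations, integrated with forward
# Euler. beta is the transmission rate, gamma the recovery rate; all
# parameter values below are illustrative assumptions.

def sir_step(s, i, r, beta, gamma, dt):
    """One Euler step of ds/dt = -beta*s*i, di/dt = beta*s*i - gamma*i,
    dr/dt = gamma*i (fractions of a normalized population)."""
    ds = -beta * s * i
    di = beta * s * i - gamma * i
    dr = gamma * i
    return s + ds * dt, i + di * dt, r + dr * dt

def simulate(beta=0.5, gamma=0.2, s0=0.99, i0=0.01, steps=1000, dt=0.1):
    s, i, r = s0, i0, 0.0
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma, dt)
    return s, i, r

# Deleting contact links or wearing masks effectively lowers beta,
# shrinking the final epidemic size (the recovered fraction r):
final_r_high = simulate(beta=0.5)[2]
final_r_low = simulate(beta=0.25)[2]
```

This is the standard mechanism behind the interventions the abstract discusses: any measure that cuts effective contacts reduces beta and hence the outbreak's final size.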
Abstract:
The increasing adoption of information systems in healthcare has led to a scenario where patient information security is more and more regarded as a critical issue. Allowing patient information to be put in jeopardy may lead to irreparable physical, moral, and social damage to the patient, potentially shaking the credibility of the healthcare institution. Medical images play a crucial role in this context, given their importance in diagnosis, treatment, and research. Therefore, it is vital to take measures to prevent tampering and to determine their provenance. This demands the adoption of security mechanisms to assure information integrity and authenticity. A number of works have been done in this field, based on two major approaches: the use of metadata and the use of watermarking. However, both approaches still have limitations that must be properly addressed. This paper presents a new method using cryptographic means to improve the trustworthiness of medical images, providing a stronger link between the image and the information on its integrity and authenticity, without compromising image quality for the end user. The use of Digital Imaging and Communications in Medicine (DICOM) structures is also an advantage for ease of development and deployment.
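The general idea of cryptographically binding an image to its integrity/authenticity information can be sketched with a keyed digest. This is a generic illustration under assumed names and a toy key, not the paper's actual scheme (which embeds such information in DICOM structures):

```python
import hashlib
import hmac

def sign_image(image_bytes: bytes, key: bytes) -> str:
    """Return an HMAC-SHA256 tag over the pixel data; anyone holding the
    same key can recompute and check it."""
    return hmac.new(key, image_bytes, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, key: bytes, tag: str) -> bool:
    """Constant-time comparison of the recomputed tag against the stored one."""
    return hmac.compare_digest(sign_image(image_bytes, key), tag)

key = b"institution-secret-key"      # hypothetical shared secret
image = b"\x00\x01fake-pixel-data"   # stand-in for real pixel data
tag = sign_image(image, key)
```

Any tampering with the pixel data invalidates the tag without altering the image shown to the end user, which is the "stronger link" property the abstract emphasizes.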
Abstract:
We describe in this paper a new genus and species of cricetid rodent from the Atlantic Forest of Brazil, one of the most endangered eco-regions of the world. The new form displays some but not all synapomorphies of the tribe Oryzomyini, but a suite of unique characteristics is also observed. This new forest rat possesses anatomical characteristics of arboreal taxa, such as very developed plantar pads, but was collected almost exclusively in pitfall traps. Phylogenetic analyses of morphological (integument, soft tissue, cranial, and dental characters) and molecular [nuclear - interphotoreceptor retinoid-binding protein (Irbp) - and mitochondrial - cytochrome b - genes] datasets using maximum likelihood and cladistic parsimony approaches corroborate the inclusion of the new taxon within oryzomyines. The analyses also place the new form as sister species to Eremoryzomys polius, an Andean rat endemic to the Marañón valley. This biogeographical pattern is unusual amongst small terrestrial vertebrates, as a review of the literature points to few other similar examples of Andean-Atlantic Forest pairings, in hylid frogs, Pionus parrots, and other sigmodontine rodents. (C) 2011 The Linnean Society of London, Zoological Journal of the Linnean Society, 2011, 161, 357-390. doi:10.1111/j.1096-3642.2010.00643.x
Abstract:
Objectives: The aim of this study was to determine the insulin-delivery system and the attributes of insulin therapy that best meet patients' preferences, and to estimate patients' willingness-to-pay (WTP) for them. Methods: This was a cross-sectional discrete choice experiment (DCE) study involving 378 Canadian patients with type 1 or type 2 diabetes. Patients were asked to choose between two hypothetical insulin treatment options made up of different combinations of the attribute levels. Regression coefficients derived using conditional logit models were used to calculate patients' WTP. Stratification of the sample was performed to evaluate WTP by predefined subgroups. Results: A total of 274 patients successfully completed the survey. Overall, patients were willing to pay the most for better blood glucose control, followed by weight gain. Surprisingly, the route of insulin administration was the least important attribute overall. Segmented models indicated that insulin-naive diabetics were willing to pay significantly more for both oral and inhaled short-acting insulin compared with insulin users. Surprisingly, type 1 diabetics were willing to pay $C11.53 for subcutaneous short-acting insulin, while type 2 diabetics were willing to pay $C47.23 to avoid subcutaneous short-acting insulin (p < .05). These findings support the hypothesis of a psychological barrier to initiating insulin therapy; once this barrier has been overcome, patients accommodate and accept injectable therapy as a treatment option. Conclusions: By understanding and addressing patients' preferences for insulin therapy, diabetes educators can use this information to find an optimal treatment approach for each individual patient, which may ultimately lead to improved control, through improved compliance, and better diabetes outcomes.
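In discrete choice experiments of this kind, WTP for an attribute is commonly derived from conditional-logit coefficients as the negative ratio of the attribute coefficient to the cost coefficient. The coefficients below are illustrative assumptions, not the study's estimates:

```python
# Standard WTP computation from conditional-logit output: the marginal
# rate of substitution between an attribute and cost. Coefficient values
# here are hypothetical.

def willingness_to_pay(beta_attribute, beta_cost):
    """WTP = -beta_attribute / beta_cost; beta_cost is normally negative,
    so a positively valued attribute yields a positive WTP."""
    return -beta_attribute / beta_cost

wtp_glucose_control = willingness_to_pay(0.8, -0.02)   # -> 40.0 (currency units)
wtp_avoid_injection = willingness_to_pay(-0.5, -0.02)  # negative attribute coefficient
```

A negative WTP (as for type 2 diabetics and subcutaneous insulin above) simply means respondents would pay to avoid that attribute level rather than to obtain it.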