25 results for Eco-informatics

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance: 20.00%

Abstract:

At present, the cement industry generates approximately 5% of the world's anthropogenic CO2 emissions. This share is expected to increase, since demand for cement-based products is forecast to multiply by a factor of 2.5 within the next 40 years, and the traditional mitigation strategies, focused on cement production, will not be capable of compensating for such growth. Therefore, additional mitigation strategies are needed, including an increase in the efficiency of cement use. This paper proposes indicators for measuring cement use efficiency, presents a benchmark based on literature data and discusses potential gains in efficiency. The binder intensity (bi) index measures the amount of binder (kg m-3) necessary to deliver 1 MPa of mechanical strength, and consequently expresses the efficiency of binder use. The CO2 intensity (ci) index allows the global warming potential of concrete formulations to be estimated. Research benchmarks show that bi ≈ 5 kg m-3 MPa-1 is feasible and has already been achieved for concretes >50 MPa. However, concretes with lower compressive strengths have binder intensities between 10 and 20 kg m-3 MPa-1. These values may result from the minimum cement content established in many standards and reveal a significant potential for performance gains. In addition, combinations of low bi and ci are shown to be feasible. (c) 2010 Elsevier Ltd. All rights reserved.
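As a sketch of the two indices defined in this abstract, both reduce to simple ratios of a material quantity to delivered strength. The mix values below are illustrative assumptions, not data from the paper:

```python
# Sketch of the binder intensity (bi) and CO2 intensity (ci) indices.
# The concrete mix values are hypothetical, chosen only for illustration.

def binder_intensity(binder_kg_per_m3: float, strength_mpa: float) -> float:
    """bi: kg of binder per m3 of concrete per MPa of compressive strength."""
    return binder_kg_per_m3 / strength_mpa

def co2_intensity(co2_kg_per_m3: float, strength_mpa: float) -> float:
    """ci: kg of CO2 per m3 of concrete per MPa of compressive strength."""
    return co2_kg_per_m3 / strength_mpa

# Hypothetical 50 MPa concrete with 300 kg/m3 of binder and 250 kg/m3 of CO2:
bi = binder_intensity(300.0, 50.0)  # 6.0 kg m-3 MPa-1, near the ~5 benchmark
ci = co2_intensity(250.0, 50.0)     # 5.0 kg CO2 m-3 MPa-1
```

Lower values of either index mean a more efficient (or lower-emission) use of binder per unit of strength delivered.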

Relevance: 10.00%

Abstract:

OBJECTIVES: To develop an online educational proposal on pressure ulcers for nursing students and professionals. METHODS: Applied research of technological production, comprising the stages of conception/planning and development, characterized by a set of procedures, documentation, and digitization of information and images. Interactive educational computing resources were used, such as the Cybertutor and the Virtual Man (Homem Virtual). RESULTS: Development of a virtual educational proposal on pressure ulcers (PU), divided into learning modules containing a discussion list, case studies and didactic resources such as photographs and the Virtual Man. CONCLUSIONS: New educational technologies were used with the aim of promoting learning about PU among undergraduate nursing students and enabling the continuing education of nurses, since pressure ulcers remain a challenge to health professionals and health services.

Relevance: 10.00%

Abstract:

INTRODUCTION: Liver diseases have high morbidity and mortality rates, and in advanced stages liver transplantation is a potentially curative and effective treatment, although it cannot be offered to all patients. These diseases are therefore considered a public health problem worldwide. Clinical care to keep the patient fit to wait for and withstand transplantation remains a challenge. CASE REPORT: A 65-year-old woman from Recife, diagnosed with liver cirrhosis secondary to hepatitis C virus, presented marked dyspnea on minimal exertion, with a resting PaO2 of 60 mmHg and O2 saturation of 90%, with normal spirometry. Doppler echocardiography revealed a significant pulmonary shunt. During screening for the transplant waiting list (MELD of 16 in August 2006), hyperbaric oxygen therapy sessions were started in order to improve the respiratory symptoms of hepatopulmonary syndrome. She showed substantial improvement in exercise tolerance after hyperbaric therapy, as well as in PaO2 values on blood gas analysis. She underwent 10 sessions of hyperbaric oxygen therapy, received a liver transplant in October 2007, and has been in outpatient follow-up with a good clinical course and substantial improvement of the dyspnea. CONCLUSION: Improvement of the hepatopulmonary condition was observed after hyperbaric oxygen therapy. It thus emerges as one more tool for the treatment of liver diseases, and further studies evaluating its clinical use should be carried out.

Relevance: 10.00%

Abstract:

Magnetic resonance imaging (MRI) is the most sensitive non-invasive diagnostic imaging method for evaluating soft tissues, particularly the brain, although it is an expensive technique. The method is based on the phenomenon of nuclear magnetic resonance, which occurs when atomic nuclei with magnetic properties in the body are subjected to an intense magnetic field and then excited by radiofrequency energy, generating in turn a radiofrequency signal that can be picked up by a receiver coil and processed mathematically, through the Fourier transform, to form the image. This study aimed to perform 10 complete MRI head examinations on cadavers of normal dogs and to compile an atlas of the identified structures. Images were acquired on a Gyroscan S15/HP Philips scanner with a 1.5 Tesla magnetic field. The cadavers were positioned with the head inside a human head coil, and initial sagittal slices were obtained, from which transverse and dorsal slices were planned in T1, T2 and PD (proton density) spin-echo pulse sequences. T1 used TR = 400 ms and TE = 30 ms; T2 used TR = 2000 ms and TE = 80 ms; and PD used TR = 2000 ms and TE = 30 ms. Slice thickness was 4 mm, the number of averages was 2, the matrix was 256x256, the factor was 1.0 and the field of view was 14 cm. The complete head examination lasted 74.5 minutes. The images obtained with these sequences and the human head coil were of good quality. On T1, fat appeared hyperintense and fluid hypointense. On T2, fat was less hyperintense and fluid hyperintense. Cortical bone and air were hypointense in all sequences owing to their low proton density. The PD sequence showed the best contrast between white and gray matter compared with T2 and T1. T2 highlighted the cerebrospinal fluid, making it possible to distinguish the cerebral sulci and gyri. Through the MRI examination it was possible, by contrast, to identify the bony structures of the region, muscles, large venous and arterial vessels, structures of the central nervous system, and elements of the digestive and respiratory systems and structures of the eyes, among others. In this study, the MR images acquired in the T1, PD and T2 sequences were complementary for the study of the anatomical aspects of the dog's head, demonstrating them in rich detail. The time required for the complete head examination is compatible with use in live animals, provided they are properly anesthetized and monitored. The results of this work open the way, in our setting, for the study of live animals and for the investigation of diseases, especially those of neurological origin, since this technique is excellent for visualizing the brain.
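The contrast behaviour reported for the T1 and T2 sequences follows from the standard spin-echo signal equation, S = PD * (1 - exp(-TR/T1)) * exp(-TE/T2). The sketch below uses the study's TR/TE settings but textbook-style relaxation times that are assumptions, not measurements from this work:

```python
import math

def spin_echo_signal(pd: float, t1_ms: float, t2_ms: float,
                     tr_ms: float, te_ms: float) -> float:
    """Relative spin-echo signal: S = PD * (1 - exp(-TR/T1)) * exp(-TE/T2)."""
    return pd * (1.0 - math.exp(-tr_ms / t1_ms)) * math.exp(-te_ms / t2_ms)

# Illustrative (assumed) relaxation times for fat and cerebrospinal fluid:
fat = dict(pd=1.0, t1_ms=250.0, t2_ms=70.0)
csf = dict(pd=1.0, t1_ms=4000.0, t2_ms=2000.0)

# T1-weighted sequence from the study: TR = 400 ms, TE = 30 ms
s_fat_t1 = spin_echo_signal(tr_ms=400, te_ms=30, **fat)
s_csf_t1 = spin_echo_signal(tr_ms=400, te_ms=30, **csf)
# T2-weighted sequence from the study: TR = 2000 ms, TE = 80 ms
s_fat_t2 = spin_echo_signal(tr_ms=2000, te_ms=80, **fat)
s_csf_t2 = spin_echo_signal(tr_ms=2000, te_ms=80, **csf)

# Fat is bright and fluid dark on T1; the relationship reverses on T2,
# matching the contrast pattern described in the abstract.
```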

Relevance: 10.00%

Abstract:

Blends of milk fat and canola oil (MF:CNO) were enzymatically interesterified (EIE) by Rhizopus oryzae lipase immobilized on a polysiloxane-polyvinyl alcohol (SiO2-PVA) composite, in a solvent-free system. A central composite design (CCD) was used to optimize the reaction, considering the effects of different mass fractions of binary MF:CNO blends (50:50, 65:35 and 80:20) and temperatures (45, 55 and 65 degrees C) on the composition and texture properties of the interesterified products, taking the interesterification degree (ID) and consistency (at 10 degrees C) as response variables. For the ID variable, both the mass fraction of milk fat in the blend and the temperature were found to be significant, while for the consistency only the mass fraction of milk fat was significant. Empirical models for ID and consistency were obtained that allowed the best interesterification conditions to be established: a blend with 65 % milk fat and 35 % canola oil, at a temperature of 45 degrees C. Under these conditions, the ID was 19.77 % and the consistency at 10 degrees C was 56 290 Pa. The potential of this eco-friendly process demonstrated that a product could be obtained with the desirable milk fat flavour and better spreadability under refrigerated conditions.
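A central composite design like the one used here combines factorial corners, axial points, and a centre point. The sketch below is a generic face-centred CCD construction for two factors mapped onto the levels quoted in the abstract; it is an illustration of the design type, not the authors' exact layout:

```python
import itertools

def ccd_points(alpha: float = 1.0):
    """Coded points of a two-factor central composite design:
    4 factorial corners, 4 axial points, and the centre point."""
    factorial = list(itertools.product((-1.0, 1.0), repeat=2))
    axial = [(-alpha, 0.0), (alpha, 0.0), (0.0, -alpha), (0.0, alpha)]
    return factorial + axial + [(0.0, 0.0)]

def decode(coded: float, centre: float, step: float) -> float:
    """Map a coded level (-1/0/+1) back to natural units."""
    return centre + coded * step

# Natural levels matching the abstract: milk fat 50/65/80 %, 45/55/65 degC.
points = [(decode(x1, 65.0, 15.0), decode(x2, 55.0, 10.0))
          for x1, x2 in ccd_points()]
# e.g. the coded corner (-1, -1) decodes to (50.0, 45.0):
# 50 % milk fat interesterified at 45 degrees C.
```

With alpha = 1 (face-centred), each factor takes exactly three natural levels, which is consistent with the three blend ratios and three temperatures reported.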

Relevance: 10.00%

Abstract:

Background: The present work applies decision theory to radiological image quality control (QC) in the diagnostic routine. The main problem addressed in the framework of decision theory is whether to accept or reject a film lot of a radiology service. The probability of each decision for a determined set of variables was obtained from the selected films. Methods: Based on a radiology service routine, a decision probability function was determined for each considered group of combined characteristics. These characteristics were related to film quality control. The parameters were framed in a set of 8 possibilities, resulting in 256 possible decision rules. In order to determine a general utility function for assessing decision risk, we used a single parameter called r. The payoffs chosen were: diagnostic result (correct/incorrect), cost (high/low), and patient satisfaction (yes/no), resulting in eight possible combinations. Results: Depending on the value of r, more or less risk is associated with the decision-making. The utility function was evaluated in order to determine the probability of a decision. The decision was made with the opinions of patients or administrators from a radiology service center. Conclusion: The model is a formal quantitative approach to decisions about medical imaging quality, providing an instrument to discriminate what is really necessary in order to accept or reject a film or a film lot. The method presented herein can help to assess the risk level of an incorrect radiological diagnosis decision.
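The 256 decision rules arise because each rule assigns accept/reject to each of the eight payoff combinations (2^8 = 256). The enumeration below is a hedged sketch of that combinatorial structure; the utility shape and the uniform state probabilities are illustrative assumptions, not the authors' exact function:

```python
import itertools

# The 8 payoff combinations: (correct diagnosis, low cost, patient satisfied)
states = list(itertools.product((True, False), repeat=3))  # 8 states

# A rule maps each of the 8 states to accept (True) or reject (False):
rules = list(itertools.product((True, False), repeat=len(states)))  # 256 rules

def utility(state, accept: bool, r: float = 0.5) -> float:
    """Illustrative utility: reward accepting good outcomes, penalise
    accepting bad ones; r scales how much weight the cost payoff carries."""
    correct, low_cost, satisfied = state
    score = ((2.0 if correct else -2.0) + (r if low_cost else -r)
             + (1.0 if satisfied else -1.0))
    return score if accept else -0.1 * score

def expected_utility(rule, probs, r: float = 0.5) -> float:
    return sum(p * utility(s, a, r) for s, a, p in zip(states, rule, probs))

probs = [1.0 / len(states)] * len(states)  # assumed uniform state probabilities
best = max(rules, key=lambda rule: expected_utility(rule, probs))
```

Varying r reweights the cost payoff, so different r values can select different optimal rules, which mirrors the paper's point that the risk taken depends on r.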

Relevance: 10.00%

Abstract:

The brief interaction of precipitation with a forest canopy can create high spatial variability in both throughfall and solute deposition. We hypothesized that (i) the variability in natural forest systems is high but depends on system-inherent stability, (ii) the spatial variability of solute deposition shows seasonal dynamics depending on the increase in rainfall frequency, and (iii) spatial patterns persist only in the short term. The study area in the north-western Brazilian state of Rondonia is subject to a climate with a distinct wet and dry season. We collected rain and throughfall on an event basis during the early wet season (n = 14) and the peak of the wet season (n = 14) and analyzed the samples for pH and concentrations of NH4+, Na+, K+, Ca2+, Mg2+, Cl-, NO3-, SO42- and DOC. The coefficient of variation for throughfall based on both sampling intervals was 29%, which is at the lower end of values reported from other tropical forest sites, but higher than in most temperate forests. Coefficients of variation of solute deposition ranged from 29% to 52%. This heterogeneity of solute deposition is neither particularly high nor particularly low compared with a range of tropical and temperate forest ecosystems. We observed an increase in solute deposition variability as the wet season progressed, which was explained by a negative correlation between the heterogeneity of solute deposition and the antecedent dry period. The temporal stability of throughfall patterns was low during the early wet season, but increased as the wet season progressed. We suggest that rapid plant growth at the beginning of the rainy season is responsible for the lower stability, whereas less vegetative activity during the later rainy season might favor the higher persistence of "hot" and "cold" spots of throughfall quantities. The relatively high stability of throughfall patterns during later stages of the wet season may influence processes at the forest floor and in the soil. Solute deposition patterns showed less clear trends, but all patterns displayed only short-term stability. The weak stability of those patterns is apt to impede the formation of solute-deposition-induced biochemical microhabitats in the soil. (C) 2008 Elsevier B.V. All rights reserved.
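The coefficient of variation used throughout this study is simply the sample standard deviation expressed as a percentage of the mean. A minimal sketch, with throughfall volumes invented purely for illustration:

```python
import statistics

def coefficient_of_variation(samples) -> float:
    """CV in percent: sample standard deviation over the mean."""
    mean = statistics.mean(samples)
    return 100.0 * statistics.stdev(samples) / mean

# Hypothetical throughfall volumes (mm) from collectors under one canopy:
throughfall_mm = [12.1, 7.8, 17.3, 11.0, 6.7, 16.2, 9.5, 14.6]
cv = coefficient_of_variation(throughfall_mm)
# The spread here is chosen so the CV lands in the same general range
# (roughly 30%) as the throughfall variability reported in the abstract.
```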

Relevance: 10.00%

Abstract:

The accumulation of chemical elements in biological compartments is one of the strategies by which tropical species adapt to low-nutrient soils. This study focuses on the Atlantic Forest because of its eco-environmental importance as a natural reservoir of chemical elements. About 20 elements were determined by INAA in leaf, soil, litter and epiphyte compartments. There was no seasonality in chemical element concentrations in leaves, which probably indicates the maintenance of chemical elements in this compartment. Considering the estimated quantities, past deforestation events could have released large amounts of chemical elements into the environment.

Relevance: 10.00%

Abstract:

Objective: We carry out a systematic assessment of a suite of kernel-based learning machines for the task of epilepsy diagnosis through automatic electroencephalogram (EEG) signal classification. Methods and materials: The kernel machines investigated include the standard support vector machine (SVM), the least squares SVM, the Lagrangian SVM, the smooth SVM, the proximal SVM, and the relevance vector machine. An extensive series of experiments was conducted on publicly available data, whose clinical EEG recordings were obtained from five normal subjects and five epileptic patients. The performance levels delivered by the different kernel machines are contrasted in terms of predictive accuracy, sensitivity to the kernel function/parameter value, and sensitivity to the type of features extracted from the signal. For this purpose, 26 values for the kernel parameter (radius) of two well-known kernel functions (namely, Gaussian and exponential radial basis functions) were considered, as well as 21 types of features extracted from the EEG signal, including statistical values derived from the discrete wavelet transform, Lyapunov exponents, and combinations thereof. Results: We first quantitatively assess the impact of the choice of wavelet basis on the quality of the features extracted. Four wavelet basis functions were considered in this study. Then, we provide the average accuracy (i.e., cross-validation error) values delivered by 252 kernel machine configurations; in particular, 40%/35% of the best-calibrated models of the standard and least squares SVMs reached a 100% accuracy rate for the two kernel functions considered. Moreover, we show the sensitivity profiles exhibited by a large sample of the configurations, whereby one can visually inspect their levels of sensitivity to the type of feature and to the kernel function/parameter value.
Conclusions: Overall, the results show that all kernel machines are competitive in terms of accuracy, with the standard and least squares SVMs prevailing more consistently. Moreover, the choice of the kernel function and parameter value, as well as the choice of the feature extractor, are critical decisions, although the choice of the wavelet family seems not to be so relevant. Also, the statistical values calculated over the Lyapunov exponents were good sources of signal representation, but not as informative as their wavelet counterparts. Finally, a typical sensitivity profile emerged among all types of machines, involving regions of stability separated by zones of sharp variation, with some kernel parameter values clearly associated with better accuracy rates (zones of optimality). (C) 2011 Elsevier B.V. All rights reserved.
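The two kernel functions whose radius parameter was swept in these experiments are standard radial basis functions. A hedged sketch of both, showing why the kernel value is sensitive to the radius (the sample points and radii below are arbitrary stand-ins, not EEG features):

```python
import math

def gaussian_rbf(x, y, radius: float) -> float:
    """Gaussian RBF: k(x, y) = exp(-||x - y||^2 / (2 * radius^2))."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-sq_dist / (2.0 * radius ** 2))

def exponential_rbf(x, y, radius: float) -> float:
    """Exponential RBF: k(x, y) = exp(-||x - y|| / (2 * radius^2))."""
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))
    return math.exp(-dist / (2.0 * radius ** 2))

# Sensitivity to the radius, in the spirit of the paper's 26-value sweep
# (the two feature vectors here are arbitrary two-dimensional stand-ins):
x, y = (0.0, 1.0), (1.5, -0.5)
sweep = [(r, gaussian_rbf(x, y, r)) for r in (0.1, 0.5, 1.0, 2.0, 5.0)]
# Small radii drive the kernel toward 0 for distinct points; large radii
# push all kernel values toward 1, flattening the decision surface.
```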

Relevance: 10.00%

Abstract:

Introduction: Internet users are increasingly using the World Wide Web to search for information relating to their health. This situation makes it necessary to create specialized tools capable of supporting users in their searches. Objective: To apply and compare strategies developed to investigate the use of the Portuguese version of Medical Subject Headings (MeSH) for constructing an automated classifier of Brazilian Portuguese-language web-based content as within or outside the field of healthcare, focusing on the lay public. Methods: 3658 Brazilian web pages were used to train the classifier and 606 Brazilian web pages were used to validate it. The strategies proposed were constructed using content-based vector methods for text classification, with Naive Bayes used for the task of classifying vector patterns whose characteristics were obtained through the proposed strategies. Results: A strategy named InDeCS was developed specifically to adapt MeSH to the problem at hand. This approach achieved better accuracy for this pattern classification task (0.94 sensitivity, specificity and area under the ROC curve). Conclusions: Because of the significant results achieved by InDeCS, this tool has been successfully applied to the Brazilian healthcare search portal known as Busca Saude. Furthermore, it could be shown that MeSH yields important results when used for the task of classifying web-based content aimed at the lay public. This study also showed that MeSH was able to map out mutable, non-deterministic characteristics of the web. (c) 2010 Elsevier Inc. All rights reserved.
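The abstract couples MeSH-derived features with a Naive Bayes classifier. The following is a self-contained sketch of the Naive Bayes step on bag-of-words tokens; the tiny corpus and labels are invented for illustration, and the real InDeCS features come from the Portuguese MeSH rather than raw words:

```python
import math
from collections import Counter

def train_naive_bayes(docs):
    """docs: list of (tokens, label). Returns per-class log-priors and
    Laplace-smoothed per-word log-likelihoods, plus the vocabulary."""
    vocab = {w for tokens, _ in docs for w in tokens}
    classes = {label for _, label in docs}
    counts = {c: Counter() for c in classes}
    n_docs = {c: 0 for c in classes}
    for tokens, label in docs:
        counts[label].update(tokens)
        n_docs[label] += 1
    model = {}
    for c in classes:
        total = sum(counts[c].values()) + len(vocab)
        model[c] = (math.log(n_docs[c] / len(docs)),
                    {w: math.log((counts[c][w] + 1) / total) for w in vocab})
    return model, vocab

def classify(model, vocab, tokens):
    def score(c):
        prior, likelihood = model[c]
        return prior + sum(likelihood[w] for w in tokens if w in vocab)
    return max(model, key=score)

# Toy corpus: is a page health-related ("health") or not ("other")?
docs = [(["dengue", "sintomas", "febre"], "health"),
        (["vacina", "gripe", "febre"], "health"),
        (["futebol", "campeonato", "gols"], "other"),
        (["receita", "bolo", "chocolate"], "other")]
model, vocab = train_naive_bayes(docs)
label = classify(model, vocab, ["febre", "vacina"])  # -> "health"
```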

Relevance: 10.00%

Abstract:

This paper presents a framework for building medical training applications using virtual reality, together with a tool that supports class instantiation of this framework. The main purpose is to simplify the building of virtual reality applications in the medical training area, targeting systems that simulate biopsy exams and providing deformation, collision detection, and stereoscopy functionalities. Instantiation of the classes allows quick implementation of tools for this purpose, thus reducing errors and keeping costs low through the use of open-source tools. Using the instantiation tool, the process of building applications is fast and easy: programmers can obtain an initial application and adapt it to their needs. The tool allows the user to include, delete, and edit parameters in the chosen functionalities, as well as to store these parameters for future use. In order to verify the efficiency of the framework, some case studies are presented.
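The class-instantiation workflow described above can be sketched as a configuration-driven pattern: a parameter set per application, with the functionalities toggled on or off and the parameters stored for reuse. All class and field names below are hypothetical, invented only to illustrate the idea, and do not reflect the framework's real API:

```python
import json
from dataclasses import asdict, dataclass, field

@dataclass
class TrainingAppConfig:
    """Hypothetical parameter set for one instantiation of the framework."""
    name: str
    deformation: bool = False
    collision_detection: bool = False
    stereoscopy: bool = False
    extra: dict = field(default_factory=dict)

def save_config(cfg: TrainingAppConfig, path: str) -> None:
    """Store the chosen parameters for future reuse, as the tool allows."""
    with open(path, "w") as fh:
        json.dump(asdict(cfg), fh)

def load_config(path: str) -> TrainingAppConfig:
    with open(path) as fh:
        return TrainingAppConfig(**json.load(fh))

# Instantiating a biopsy-simulation application with two functionalities on:
biopsy = TrainingAppConfig("biopsy_sim", deformation=True,
                           collision_detection=True)
```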

Relevance: 10.00%

Abstract:

Research Foundation of the State of Sao Paulo (FAPESP)

Relevance: 10.00%

Abstract:

State of Sao Paulo Research Foundation (FAPESP)

Relevance: 10.00%

Abstract:

Ecological niche modelling combines species occurrence points with environmental raster layers in order to obtain models for describing the probabilistic distribution of species. The process to generate an ecological niche model is complex. It requires dealing with a large amount of data, use of different software packages for data conversion, for model generation and for different types of processing and analyses, among other functionalities. A software platform that integrates all requirements under a single and seamless interface would be very helpful for users. Furthermore, since biodiversity modelling is constantly evolving, new requirements are constantly being added in terms of functions, algorithms and data formats. This evolution must be accompanied by any software intended to be used in this area. In this scenario, a Service-Oriented Architecture (SOA) is an appropriate choice for designing such systems. According to SOA best practices and methodologies, the design of a reference business process must be performed prior to the architecture definition. The purpose is to understand the complexities of the process (business process in this context refers to the ecological niche modelling problem) and to design an architecture able to offer a comprehensive solution, called a reference architecture, that can be further detailed when implementing specific systems. This paper presents a reference business process for ecological niche modelling, as part of a major work focused on the definition of a reference architecture based on SOA concepts that will be used to evolve the openModeller software package for species modelling. The basic steps that are performed while developing a model are described, highlighting important aspects, based on the knowledge of modelling experts. In order to illustrate the steps defined for the process, an experiment was developed, modelling the distribution of Ouratea spectabilis (Mart.) Engl. (Ochnaceae) using openModeller. 
As a consequence of the knowledge gained from this work, many desirable improvements to the modelling software packages have been identified and are presented. A discussion of the potential for large-scale experimentation in ecological niche modelling is also provided, highlighting opportunities for research. The results are important for those involved in the development of modelling tools and systems, for requirements analysis, and for providing insight into new features and trends in this category of systems. They can also be very helpful for beginners in modelling research, who can use the process and the example experiment as a guide to this complex activity. (c) 2008 Elsevier B.V. All rights reserved.
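The core steps of the reference business process (gather environmental values at occurrence points, fit a model, project it over new cells) can be sketched as a minimal pipeline. The envelope algorithm below is a deliberately simple bioclim-style stand-in, not openModeller's actual implementation, and the environmental samples for Ouratea spectabilis are invented:

```python
def fit_envelope(occurrences):
    """occurrences: list of environmental-value tuples sampled at the
    species' occurrence points. Fits a min/max envelope per layer."""
    lows = [min(vals) for vals in zip(*occurrences)]
    highs = [max(vals) for vals in zip(*occurrences)]
    return lows, highs

def project(model, cell) -> float:
    """1.0 if the cell's environmental values fall inside the envelope."""
    lows, highs = model
    inside = all(lo <= v <= hi for v, lo, hi in zip(cell, lows, highs))
    return 1.0 if inside else 0.0

# Invented environmental samples (temperature degC, annual rainfall mm)
# at hypothetical occurrence points:
occurrences = [(22.0, 1300.0), (24.5, 1500.0), (23.1, 1450.0)]
model = fit_envelope(occurrences)

suitable = project(model, (23.0, 1400.0))   # inside the envelope -> 1.0
unsuitable = project(model, (30.0, 600.0))  # outside the envelope -> 0.0
```

Real niche-modelling algorithms (GARP, Maxent, etc.) replace the envelope step with far richer statistics, but the fit/project decomposition above is the shape the reference process describes.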