30 results for bibliographic reference managers
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
Tick-borne zoonoses (TBZ) are emerging diseases worldwide. A large amount of information (e.g. case reports, results of epidemiological surveillance, etc.) is dispersed across various reference sources (ISI and non-ISI journals, conference proceedings, technical reports, etc.). An integrated database, derived from the ICTTD-3 project (http://www.icttd.nl), was developed in order to gather TBZ records in the (sub-)tropics, collected both by the authors and by collaborators worldwide. A dedicated website (http://www.tickbornezoonoses.org) was created to promote collaboration and circulate information. Data collected are made freely available to researchers for analysis by spatial methods, integrating mapped ecological factors for predicting TBZ risk. The authors present the assembly process of the TBZ database: the compilation of an updated list of TBZ relevant to the (sub-)tropics, the database design and its structure, the method of bibliographic search, and the assessment of spatial precision of geo-referenced records. At the time of writing, 725 records extracted from 337 publications related to 59 countries in the (sub-)tropics had been entered in the database. TBZ distribution maps were also produced. Imported cases have also been accounted for. The most important datasets with geo-referenced records were those on Spotted Fever Group rickettsiosis in Latin America and Crimean-Congo Haemorrhagic Fever in Africa. The authors stress the need for international collaboration in data collection to update and improve the database. Supervision of entered data always remains necessary. Means to foster collaboration are discussed. The paper is also intended to describe the challenges encountered in assembling spatial data from various sources and to help develop similar data collections.
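The abstract does not disclose the actual database structure, so the following is only a minimal hypothetical sketch of what a geo-referenced TBZ record and a spatial-precision filter could look like; every field name here is an illustrative assumption, not the ICTTD-3 schema.

```python
# Hypothetical sketch of a TBZ record; field names are illustrative
# assumptions, not the actual ICTTD-3 database schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TBZRecord:
    disease: str                           # e.g. "Spotted Fever Group rickettsiosis"
    country: str                           # country name or code
    latitude: Optional[float]              # decimal degrees, None if not geo-referenced
    longitude: Optional[float]
    spatial_precision_km: Optional[float]  # assessed precision of the coordinates
    source: str                            # publication the record was extracted from
    imported_case: bool = False            # True if the case was acquired elsewhere

def georeferenced(records, max_precision_km=10.0):
    """Keep only records precise enough for spatial analysis (threshold is arbitrary)."""
    return [r for r in records
            if r.latitude is not None
            and r.spatial_precision_km is not None
            and r.spatial_precision_km <= max_precision_km]
```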
Abstract:
Technical evaluation of analytical data is extremely relevant, since it can be used for comparisons with environmental quality standards and for decision-making related to the management of dredged sediment disposal and the evaluation of salt and brackish water quality in accordance with CONAMA Resolution 357/05. It is therefore essential that the project manager discusses the environmental agency's technical requirements with the contracted laboratory, both for the follow-up of the analyses underway and with a view to possible re-analysis when anomalous data are identified. The main technical requirements are: (1) method quantitation limits (QLs) should fall below environmental standards; (2) analyses should be carried out in laboratories whose analytical scope is accredited by the National Institute of Metrology (INMETRO) or qualified or accepted by a licensing agency; (3) chain of custody should be provided in order to ensure sample traceability; (4) control charts should be provided to prove method performance; (5) certified reference material analysis or, if that is not available, matrix spike analysis should be undertaken; and (6) chromatograms should be included in the analytical report. Within this context, and with a view to helping environmental managers evaluate analytical reports, this work aims to discuss the limitations of applying SW-846 US EPA methods to marine samples and the consequences of reporting data based on method detection limits (MDL) rather than sample quantitation limits (SQL), and to present possible modifications of the principal method applied by laboratories in order to comply with environmental quality standards.
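Requirement (1) amounts to a simple comparison per analyte: the laboratory's quantitation limit must sit below the applicable quality standard, otherwise the data cannot demonstrate compliance. The sketch below illustrates that check; the analyte names and concentrations are placeholders, not actual CONAMA 357/05 limits or laboratory QLs.

```python
# Minimal sketch of requirement (1): check whether each analyte's quantitation
# limit (QL) falls below the applicable environmental quality standard.
# The numbers below are placeholders, not actual CONAMA 357/05 limits.

quality_standards_ug_L = {"cadmium": 0.2, "benzo(a)pyrene": 0.018}   # hypothetical
laboratory_ql_ug_L = {"cadmium": 0.1, "benzo(a)pyrene": 0.05}        # hypothetical

def check_quantitation_limits(standards, qls):
    """Return analytes whose QL exceeds the standard (results cannot prove compliance)."""
    return [a for a, limit in standards.items() if qls.get(a, float("inf")) > limit]

print(check_quantitation_limits(quality_standards_ug_L, laboratory_ql_ug_L))
# -> ['benzo(a)pyrene']: the laboratory would need a lower QL for this analyte
```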
Abstract:
The decentralization of the Unified Health System (Sistema Único de Saúde, SUS) still faces important challenges, in particular the search for alternatives for large municipalities. Because it is an eminently political process, political-institutional variables, among which the management capacity at the local level stands out, are decisive in shaping decentralization in each context. Using the government triangle framework to assess management capacity, a case study was carried out to analyze the SUS decentralization process in the municipality of São Paulo, Brazil, the largest Brazilian metropolis. Through the analysis of interviews with selected managers and of management documents, a movement toward centralization of health care was identified in the 2005-2008 municipal administration, accompanied by the disarray of the local-regional structures of the Municipal Health Secretariat, which resulted in the technical and political weakening of these bodies. Despite the limits of decentralization, its potential as an operational strategy for achieving the goals of the SUS stands out. The study points to the need to resume the health decentralization process in the municipality of São Paulo, a process that, in addition to advancing toward local-regional bodies, should be articulated with the decentralization of municipal public management.
Abstract:
The objective of the present study was to evaluate the prevalence of inadequate nutrient intake in a group of adolescents from São Bernardo do Campo, SP. Energy and nutrient intake data were obtained through 24-hour dietary recalls applied to 89 adolescents. The prevalence of inadequacy was calculated using the EAR cut-point method, after adjustment for intrapersonal variability using the procedure developed by Iowa State University. The Dietary Reference Intakes (DRI) were used as the reference values for intake. For nutrients without an established EAR, the intake distribution was compared with the AI. The highest prevalences of inadequacy in both sexes were observed for magnesium (99.3 percent for males and 81.8 percent for females), zinc (44.0 percent for males and 23.5 percent for females), vitamin C (57.2 percent for males and 59.9 percent for females) and folate (34.8 percent for females). The proportion of individuals with intake above the AI was negligible (less than 2.0 percent) in both sexes.
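Under the EAR cut-point method, the prevalence of inadequacy is simply the proportion of the usual-intake distribution that falls below the EAR, once day-to-day (within-person) variability has been removed. The sketch below illustrates that final step only; it assumes the intakes have already been adjusted, it is not the full Iowa State University procedure, and the values are made up.

```python
# Simplified illustration of the EAR cut-point method: the prevalence of
# inadequacy is the proportion of usual intakes below the EAR. Assumes the
# intakes were already adjusted for within-person variability; this is not
# the full Iowa State University procedure, and the values are made up.
import numpy as np

rng = np.random.default_rng(0)
usual_intake_mg = rng.normal(loc=180.0, scale=45.0, size=89)  # hypothetical magnesium intakes
ear_mg = 340.0                                                # hypothetical EAR for illustration

prevalence_of_inadequacy = np.mean(usual_intake_mg < ear_mg)
print(f"Estimated prevalence of inadequacy: {prevalence_of_inadequacy:.1%}")
```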
Abstract:
With the advent and development of technology, mainly the Internet, more and more electronic services are being offered to customers in all areas of business, especially in the offering of information services, as in virtual libraries. This article proposes a new opportunity to provide services to virtual library customers, presenting a methodology for the implementation of electronic services oriented by these customers' life situations. Through analytical observation of some national virtual library sites, it was identified that offering services based on life situations and relationship interest situations can improve the service provided to customers, yielding greater satisfaction and, consequently, better quality in the offer of information services. The visits to those sites and the critical analysis of the data collected during the visits, supported by the results of bibliographic research, enabled the description of this methodology. The conclusion is that providing services in isolation, or only according to the user's profile, on virtual library sites is not always enough to meet the needs and expectations of customers, which suggests offering these services based on life situations and relationship interest situations as a complement that adds value to the virtual library's business. This is relevant because it points to new opportunities to provide virtual library services with quality, serving as a guide for managers of information providers and enabling new means of access to information services for such customers, aiming at proactivity and service integration in order to solve real problems definitively.
Abstract:
For obtaining accurate and reliable gene expression results, it is essential that quantitative real-time RT-PCR (qRT-PCR) data are normalized with appropriate reference genes. The current exponential increase in postgenomic studies on the honey bee, Apis mellifera, makes the standardization of qRT-PCR results an important task for ongoing community efforts. For this aim we selected four candidate reference genes (actin, ribosomal protein 49, elongation factor 1-alpha, TBP-associated factor) and used three software-based approaches (geNorm, BestKeeper and NormFinder) to evaluate the suitability of these genes as endogenous controls. Their expression was examined during honey bee development, in different tissues, and after juvenile hormone exposure. Furthermore, the importance of choosing an appropriate reference gene was investigated for two developmentally regulated target genes. The results led us to consider all four candidate genes as suitable for normalization in A. mellifera. However, each condition evaluated in this study revealed a specific set of genes as the most appropriate.
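Of the three approaches named, geNorm is the simplest to summarize: it ranks candidates by a stability measure M, defined as the mean standard deviation of a gene's log2 expression ratios against every other candidate across samples, with lower M meaning more stable expression. The sketch below reimplements that idea for illustration only; it is not the geNorm software, and the expression values are made up.

```python
# Simplified sketch of the geNorm stability measure M: for each candidate
# reference gene, M is the mean standard deviation of its log2 expression
# ratios against every other candidate across samples (lower M = more stable).
# Illustrative reimplementation, not the geNorm software; data are synthetic.
import numpy as np

genes = ["actin", "rp49", "ef1-alpha", "tbp-af"]
rng = np.random.default_rng(1)
# rows = samples (developmental stages, tissues, treatments), columns = genes
expression = rng.lognormal(mean=2.0, sigma=0.3, size=(12, len(genes)))

def genorm_m(expr):
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    m_values = []
    for j in range(n_genes):
        pairwise_sd = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
                       for k in range(n_genes) if k != j]
        m_values.append(np.mean(pairwise_sd))
    return dict(zip(genes, m_values))

print(genorm_m(expression))  # rank genes by ascending M to pick the most stable ones
```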
Abstract:
A weathering classification for granitic rock materials from southeastern Brazil was framed based on core characteristics. The classification was substantiated by a detailed petrographic study. Indirect assessment of weathering grades by density, ultrasonic and Schmidt hammer index tests was performed. Rebound values from Schmidt hammer multiple impacts at one representative point were more efficient in predicting weathering grades than averaged single-impact rebound values, P-wave velocities and densities. Uniaxial compression tests revealed that a large range of uniaxial compressive strength (153-214 MPa) exists in the Grade I category, where weathering does not seem to have played any role. It was concluded that variability in the occurrence of quartz intragranular cracks and in biotite percentage, distribution and orientation might have played a key role in accelerating or decelerating the failure processes of the Grade I specimens. Deterioration of uniaxial compressive strength and elastic modulus and increase in Poisson's ratio with increasing weathering intensity could be attributed to alteration of minerals, disruption of the rock skeleton and microcrack augmentation. A crude relation between failure modes and weathering grades also emerged.
Abstract:
Determining reference concentrations in rivers and streams is an important tool for environmental management. Reference conditions for eutrophication-related water variables are unavailable for Brazilian freshwaters. We aimed to establish reference baselines for São Paulo State tropical rivers and streams for total phosphorus (TP), total nitrogen (TN), ammonia nitrogen (NH4+) and Biochemical Oxygen Demand (BOD) through the best professional judgment and the trisection methods. Data from 319 sites monitored by the São Paulo State Environmental Company (2005 to 2009) and from the 22 Water Resources Management Units in São Paulo State were assessed (N = 27,131). We verified that data from different management units dominated by similar land cover could be analyzed together (Analysis of Variance, P = 0.504). Cumulative frequency diagrams showed that industrialized management units were characterized by the worst water quality (e.g. average TP of 0.51 mg/L), followed by agricultural watersheds. TN and NH4+ were associated with urban land-cover percentages and population density (Spearman Rank Correlation Test, P < 0.05). The best professional judgment and trisection (median of the lower third of all sites) methods for determining reference concentrations showed agreement: 0.03 and 0.04 mg/L (TP), 0.31 and 0.34 mg/L (TN), 0.06 and 0.10 mg-N/L (NH4+) and 2 and 2 mg/L (BOD), respectively. Our reference concentrations were similar to TP and TN reference values proposed for temperate water bodies. These baselines can help with water management in São Paulo State, as well as providing some of the first such information for tropical ecosystems.
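As described above, the trisection approach derives the reference baseline from the least-impacted portion of the data: sort the site concentrations, keep the lower third, and take its median. The sketch below illustrates that calculation; the TP values are invented for the example and are not the monitoring data used in the study.

```python
# Simplified sketch of the trisection approach to reference concentrations:
# sort the site values, keep the lower third, and take its median as the
# reference baseline. The concentrations below are made up for illustration.
import numpy as np

def trisection_reference(values):
    """Median of the lower third of the sorted values."""
    ordered = np.sort(np.asarray(values, dtype=float))
    lower_third = ordered[: max(1, len(ordered) // 3)]
    return float(np.median(lower_third))

site_tp_mg_L = [0.02, 0.03, 0.04, 0.05, 0.08, 0.12, 0.20, 0.35, 0.51]  # hypothetical TP values
print(f"Reference TP baseline: {trisection_reference(site_tp_mg_L):.2f} mg/L")
```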
Abstract:
This paper presents the lifecycle assessment (LCA) of fuel ethanol, as 100% of the vehicle fuel, from sugarcane in Brazil. The functional unit is 10,000 km run in an urban area by a car with a 1,600 cm³ engine running on hydrated ethanol fuel, and the resulting reference flow is 1,000 kg of ethanol. The product system includes agricultural and industrial activities, distribution, cogeneration of electricity and steam, ethanol use during car driving, and recycling of industrial by-products to irrigate sugarcane fields. The use of sugarcane by the ethanol agribusiness is one of the foremost financial resources for the economy of the Brazilian rural area, which occupies extensive areas and provides far-reaching potential for renewable fuel production. However, there are environmental impacts during the fuel ethanol lifecycle, which this paper intends to analyze, including addressing the main activities responsible for such impacts and indicating some suggestions to minimize them. This study is classified as applied quantitative research, and the technical procedure to achieve the exploratory goal is based on bibliographic revision, documental research, primary data collection, and case studies at sugarcane farms and fuel ethanol industries in the northeast of São Paulo State, Brazil. The methodological structure for this LCA study is in agreement with the International Organization for Standardization (ISO), and the method used is the Environmental Design of Industrial Products (EDIP). The lifecycle impact assessment (LCIA) covers the following emission-related impact categories: global warming, ozone formation, acidification, nutrient enrichment, ecotoxicity, and human toxicity. The results of the fuel ethanol LCI demonstrate that even though ethanol is considered a renewable fuel because it comes from biomass (sugarcane), it uses a high quantity and diversity of nonrenewable resources over its lifecycle. The input of renewable resources is also high, mainly because of the water consumption in the industrial phases due to the sugarcane washing process. During the lifecycle of ethanol, there is a surplus of electric energy due to the cogeneration activity. Another focal point is the quantity of emissions to the atmosphere and the diversity of the substances emitted. Harvesting is the unit process that contributes most to global warming. For photochemical ozone formation, harvesting is also the activity with the strongest contribution, due to the burning during harvesting and the emissions from diesel fuel use. The acidification impact potential is mostly due to the NOx emitted by the combustion of ethanol during use, to the sulfuric acid used in the industrial process, and to the NOx emitted by the burning during harvesting. The main consequence of the intensive use of fertilizers in the field is the high nutrient enrichment impact potential associated with this activity. The main contributions to the ecotoxicity impact potential come from chemical applications during crop growth. The activity with the highest human toxicity (HT) impact potential via air and via soil is harvesting. Via water, the HT potential is high in harvesting due to lubricant use on the machines. The normalization results indicate that nutrient enrichment, acidification, and human toxicity via air and via water are the most significant impact potentials for the lifecycle of fuel ethanol.
The fuel ethanol lifecycle contributes to all the impact potentials analyzed: global warming, ozone formation, acidification, nutrient enrichment, ecotoxicity, and human toxicity. Concerning energy, the lifecycle consumes less energy than it produces, largely because of the electricity cogeneration system, but this process is highly dependent on water. The main causes of the largest impact potentials indicated by the normalization are nutrient application, the burning during harvesting, and the use of diesel fuel. The recommendations for the ethanol lifecycle are: harvesting the sugarcane without burning; more environmentally benign agricultural practices; using renewable fuel rather than diesel; not washing sugarcane and implementing water recycling systems during industrial processing; and improving the control of gas emissions during the use of ethanol in cars, mainly for NOx. Further studies on fuel ethanol from sugarcane may analyze in more detail the social aspects, biodiversity, and land use impacts.
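Behind statements such as "harvesting is the unit process that contributes most to global warming" sits the standard LCIA arithmetic: each inventoried emission is multiplied by a characterization factor, summed per impact category, and then normalized against a reference. The sketch below shows that arithmetic only; the factors and quantities are placeholders, not EDIP factors or the study's inventory.

```python
# Minimal sketch of the LCIA arithmetic: emissions x characterization factors,
# summed per category, then normalized. All numbers are placeholders, not the
# EDIP method's factors or the study's inventory data.

# kg of substance emitted per functional unit (hypothetical)
inventory = {"CO2": 1200.0, "CH4": 3.0, "NOx": 8.0}

# global warming characterization factors in kg CO2-eq per kg (illustrative)
gwp_factors = {"CO2": 1.0, "CH4": 25.0, "NOx": 0.0}

global_warming = sum(inventory[s] * gwp_factors[s] for s in inventory)
normalization_reference = 8700.0   # hypothetical kg CO2-eq per person-year
normalized_gw = global_warming / normalization_reference

print(f"Global warming potential: {global_warming:.0f} kg CO2-eq "
      f"({normalized_gw:.2f} person-equivalents)")
```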
Abstract:
Ecological niche modelling combines species occurrence points with environmental raster layers in order to obtain models describing the probabilistic distribution of species. The process of generating an ecological niche model is complex. It requires dealing with a large amount of data and using different software packages for data conversion, model generation and different types of processing and analyses, among other functionalities. A software platform that integrates all requirements under a single, seamless interface would be very helpful for users. Furthermore, since biodiversity modelling is constantly evolving, new requirements are constantly being added in terms of functions, algorithms and data formats. This evolution must be accompanied by any software intended to be used in this area. In this scenario, a Service-Oriented Architecture (SOA) is an appropriate choice for designing such systems. According to SOA best practices and methodologies, the design of a reference business process must be performed prior to the architecture definition. The purpose is to understand the complexities of the process (business process in this context refers to the ecological niche modelling problem) and to design an architecture able to offer a comprehensive solution, called a reference architecture, that can be further detailed when implementing specific systems. This paper presents a reference business process for ecological niche modelling, as part of a larger effort focused on the definition of a reference architecture based on SOA concepts that will be used to evolve the openModeller software package for species modelling. The basic steps performed while developing a model are described, highlighting important aspects based on the knowledge of modelling experts. In order to illustrate the steps defined for the process, an experiment was developed, modelling the distribution of Ouratea spectabilis (Mart.) Engl. (Ochnaceae) using openModeller. As a consequence of the knowledge gained with this work, many desirable improvements to modelling software packages have been identified and are presented. A discussion on the potential for large-scale experimentation in ecological niche modelling is also provided, highlighting opportunities for research. The results obtained are very important for those involved in the development of modelling tools and systems, for requirements analysis, and for providing insight into new features and trends for this category of systems. They can also be very helpful for beginners in modelling research, who can use the process and the experiment example as a guide to this complex activity. (c) 2008 Elsevier B.V. All rights reserved.
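The core modelling step the process revolves around, combining occurrence points with environmental layers to score habitat suitability, can be illustrated with a toy environmental-envelope (BIOCLIM-like) model. The sketch below uses synthetic layers and hypothetical occurrence cells; it is not openModeller's implementation or API, only an illustration of the kind of algorithm the reference business process wraps.

```python
# Toy environmental-envelope (BIOCLIM-like) niche model with synthetic data.
# Not openModeller's implementation or API; purely illustrative.
import numpy as np

rng = np.random.default_rng(42)

# Two "environmental layers" on a 100x100 grid (e.g. temperature, precipitation)
temperature = rng.normal(24.0, 3.0, size=(100, 100))
precipitation = rng.normal(1500.0, 300.0, size=(100, 100))
layers = np.stack([temperature, precipitation])           # shape (n_layers, rows, cols)

# Occurrence points given as (row, col) cell indices (hypothetical records)
occurrences = [(10, 12), (30, 55), (71, 40), (85, 90), (50, 50)]
occ_values = np.array([layers[:, r, c] for r, c in occurrences])  # env values at points

# Envelope = per-layer min/max observed at the occurrence points
env_min = occ_values.min(axis=0)
env_max = occ_values.max(axis=0)

# A cell is "suitable" if every layer value falls inside the envelope
inside = (layers >= env_min[:, None, None]) & (layers <= env_max[:, None, None])
suitability = inside.all(axis=0)

print(f"Predicted suitable cells: {suitability.sum()} of {suitability.size}")
```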
Abstract:
Grass reference evapotranspiration (ETo) is an important agrometeorological parameter for climatological and hydrological studies, as well as for irrigation planning and management. There are several methods to estimate ETo, but their performance in different environments is diverse, since all of them have some empirical background. The FAO Penman-Monteith (FAO PM) method has been considered a universal standard for estimating ETo for more than a decade. This method considers many parameters related to the evapotranspiration process: net radiation (Rn), air temperature (T), vapor pressure deficit (Δe), and wind speed (U); and it has presented very good results when compared to data from lysimeters populated with short grass or alfalfa. In some conditions, the use of the FAO PM method is restricted by the lack of input variables. In these cases, when data are missing, the option is to calculate ETo by the FAO PM method using estimated input variables, as recommended by FAO Irrigation and Drainage Paper 56. Based on that, the objective of this study was to evaluate the performance of the FAO PM method for estimating ETo when Rn, Δe, and U data are missing, in Southern Ontario, Canada. Other alternative methods were also tested for the region: Priestley-Taylor, Hargreaves, and Thornthwaite. Data from 12 locations across Southern Ontario, Canada, were used to compare ETo estimated by the FAO PM method with a complete data set and with missing data. The alternative ETo equations were also tested and calibrated for each location. When relative humidity (RH) and U data were missing, the FAO PM method was still a very good option for estimating ETo for Southern Ontario, with RMSE smaller than 0.53 mm day⁻¹. For these cases, U data were replaced by the normal values for the region and Δe was estimated from temperature data. The Priestley-Taylor method was also a good option for estimating ETo when U and Δe data were missing, mainly when calibrated locally (RMSE = 0.40 mm day⁻¹). When Rn was missing, the FAO PM method was not good enough for estimating ETo, with RMSE increasing to 0.79 mm day⁻¹. When only T data were available, the adjusted Hargreaves and modified Thornthwaite methods were better options for estimating ETo than the FAO PM method, since the RMSEs from these methods, respectively 0.79 and 0.83 mm day⁻¹, were significantly smaller than that obtained by FAO PM (RMSE = 1.12 mm day⁻¹). (C) 2009 Elsevier B.V. All rights reserved.
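For reference, the two equations most central to the comparison are the standard FAO-56 Penman-Monteith daily formula and the original Hargreaves equation. The sketch below implements the textbook forms, taking soil heat flux G as zero for daily steps and saturation vapor pressure at the mean temperature; it does not include the local calibrations or adjusted variants tested in the study, and the sample inputs are made up.

```python
# Textbook FAO-56 Penman-Monteith daily ETo and original Hargreaves ETo.
# Simplified sketch (G = 0, es taken at mean temperature); not the locally
# calibrated variants used in the study. Sample inputs are hypothetical.
import math

def fao56_pm_eto(t_mean_c, rn_mj_m2_day, u2_m_s, ea_kpa, altitude_m=100.0, g_mj=0.0):
    """Daily grass reference ET (mm/day) by FAO-56 Penman-Monteith."""
    es = 0.6108 * math.exp(17.27 * t_mean_c / (t_mean_c + 237.3))       # sat. vapor pressure (kPa)
    delta = 4098.0 * es / (t_mean_c + 237.3) ** 2                       # slope of the curve (kPa/degC)
    pressure = 101.3 * ((293.0 - 0.0065 * altitude_m) / 293.0) ** 5.26  # atmospheric pressure (kPa)
    gamma = 0.000665 * pressure                                         # psychrometric constant (kPa/degC)
    num = (0.408 * delta * (rn_mj_m2_day - g_mj)
           + gamma * 900.0 / (t_mean_c + 273.0) * u2_m_s * (es - ea_kpa))
    return num / (delta + gamma * (1.0 + 0.34 * u2_m_s))

def hargreaves_eto(t_mean_c, t_max_c, t_min_c, ra_mj_m2_day):
    """Daily ETo (mm/day) by the original Hargreaves equation (temperature and Ra only)."""
    return 0.0023 * (t_mean_c + 17.8) * math.sqrt(t_max_c - t_min_c) * 0.408 * ra_mj_m2_day

# Hypothetical summer day
print(round(fao56_pm_eto(t_mean_c=22.0, rn_mj_m2_day=15.0, u2_m_s=2.0, ea_kpa=1.8), 2), "mm/day")
print(round(hargreaves_eto(t_mean_c=22.0, t_max_c=28.0, t_min_c=16.0, ra_mj_m2_day=41.0), 2), "mm/day")
```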
Abstract:
Purpose - Using Brandenburger and Nalebuff's 1995 co-opetition model as a reference, the purpose of this paper is to develop a tool that, based on the tenets of classical game theory, would enable scholars and managers to identify which games may be played in response to the different conflict-of-interest situations faced by companies in their business environments. Design/methodology/approach - The literature on game theory and business strategy is reviewed and a conceptual model, the strategic games matrix (SGM), is developed. Two novel games are described and modeled. Findings - The co-opetition model is not sufficient to realistically represent most of the conflict-of-interest situations faced by companies. This paper seeks to address that problem through development of the SGM, which expands upon Brandenburger and Nalebuff's model by providing a broader perspective, through incorporation of an additional dimension (power ratio between players) and three novel postures (rival, individualistic, and associative). Practical implications - The proposed model, based on the concepts of game theory, should be used to train decision- and policy-makers to better understand, interpret and formulate conflict management strategies. Originality/value - A practical and original tool to use game models in conflict-of-interest situations is generated. Basic classical games, such as Nash, Stackelberg, Pareto, and Minimax, are mapped on the SGM to suggest the situations in which they could be useful. Two innovative games are described to fit four different types of conflict situations that so far have no corresponding game in the literature. A test application of the SGM to a classic Intel Corporation strategic management case, in the complex personal computer industry, shows that the proposed method is able to describe, interpret, analyze, and prescribe optimal competitive and/or cooperative strategies for each conflict-of-interest situation.
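As background for the classical games the SGM maps, the sketch below shows how one such game can be represented and analyzed: a payoff matrix and a pure-strategy Nash equilibrium search, using the standard Prisoner's Dilemma payoffs. This is generic game theory for illustration, not the SGM itself or the paper's novel games.

```python
# Representing a classical game and finding its pure-strategy Nash equilibria.
# Generic game theory illustration (standard Prisoner's Dilemma), not the SGM.
from itertools import product

# payoffs[(row_action, col_action)] = (row_payoff, col_payoff)
payoffs = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}
actions = ["cooperate", "defect"]

def pure_nash_equilibria(payoffs, actions):
    """Return action profiles where neither player gains by deviating unilaterally."""
    equilibria = []
    for a_row, a_col in product(actions, repeat=2):
        row_pay, col_pay = payoffs[(a_row, a_col)]
        row_best = all(payoffs[(alt, a_col)][0] <= row_pay for alt in actions)
        col_best = all(payoffs[(a_row, alt)][1] <= col_pay for alt in actions)
        if row_best and col_best:
            equilibria.append((a_row, a_col))
    return equilibria

print(pure_nash_equilibria(payoffs, actions))  # -> [('defect', 'defect')]
```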
Abstract:
The purpose of this study is to analyze the relevance of Controllership as a support for risk management in non-financial companies. Risk management is a widely discussed and disseminated subject among financial institutions. It is obvious that economic uncertainties and, consequently, prevention and control must also exist in non-financial companies. To enable managers to make safe decisions, it is essential that they can count on instrumental support that provides timely and adequate information, ensuring lower levels of mistakes and risk exposure. However, discussion concerning risk management in non-financial companies is still in its early stages in Brazil. Considering this gap, this study aims at assessing how Controllership has been acting in companies from a risk perspective and how it can contribute to risk management in non-financial companies. To achieve the proposed goal, field research was carried out with non-financial companies located in the city of São Paulo and listed on the São Paulo Stock Exchange (Bovespa). The research was carried out using questionnaires sent to the Risk Officers and Controllers of those companies with the purpose of evaluating their perception of the subject. The results of the research allow us to conclude that Controllership offers support to risk management through information that contributes to the mitigation of risks in non-financial companies.
Abstract:
To determine reference values for tissue Doppler imaging (TDI) and pulsed Doppler echocardiography for left ventricular diastolic function analysis in a healthy Brazilian adult population. Observations were based on a randomly selected healthy population from the city of Vitoria, Espirito Santo, Brazil. Healthy volunteers (n = 275, 61.7% women) without prior histories of cardiovascular disease underwent transthoracic echocardiography. We analyzed 175 individuals by TDI and evaluated mitral annulus E'- and A'-waves from the septum (S) and lateral wall (L) to calculate E'/A' ratios. Using pulsed Doppler echocardiography, we further analyzed the mitral E- and A-waves, E/A ratios, isovolumetric relaxation times (IRTs), and deceleration times (DTs) of 275 individuals. Pulsed Doppler mitral inflow mean values for men were as follows: E-wave: 71 +/- 16 cm/sec, A-wave: 68 +/- 15 cm/sec, IRT: 74.8 +/- 9.2 ms, DT: 206 +/- 32.3 ms, E/A ratio: 1.1 +/- 0.3. Pulsed Doppler mitral inflow mean values for women were as follows: E-wave: 76 +/- 17 cm/sec, A-wave: 69 +/- 14 cm/sec, IRT: 71.2 +/- 10.5 ms, DT: 197 +/- 33.3 ms, E/A ratio: 1.1 +/- 0.3. IRT and DT values were higher in men than in women (P = 0.04 and P = 0.007, respectively). TDI values in men were as follows: E'S: 11 +/- 3 cm/sec, A'S: 13 +/- 2 cm/sec, E'S/A'S: 0.89 +/- 0.2, E'L: 14 +/- 3 cm/sec, A'L: 14 +/- 2 cm/sec, E'L/A'L: 1.1 +/- 0.4, E-wave/E'S ratio: 6.9 +/- 2.2, E-wave/E'L ratio: 4.9 +/- 1.7. In this study, we determined pulsed Doppler and TDI derived parameters for left ventricular diastolic function in a large sample of healthy Brazilian adults. (Echocardiography 2010;27:777-782).
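The indices reported above (E/A, E'/A', E/E') are simple quotients of the measured velocities. The sketch below computes them for one illustrative set of measurements; the numbers are not taken from the study sample.

```python
# E/A, E'/A', and E/E' are quotients of the measured velocities.
# Illustrative measurements only, not data from the study sample.
e_wave_cm_s = 71.0            # mitral inflow early filling velocity
a_wave_cm_s = 68.0            # mitral inflow atrial contraction velocity
e_prime_septal_cm_s = 11.0    # TDI early diastolic velocity, septal annulus
a_prime_septal_cm_s = 13.0    # TDI late diastolic velocity, septal annulus

e_over_a = e_wave_cm_s / a_wave_cm_s
e_prime_over_a_prime = e_prime_septal_cm_s / a_prime_septal_cm_s
e_over_e_prime = e_wave_cm_s / e_prime_septal_cm_s

print(f"E/A = {e_over_a:.2f}, E'/A' = {e_prime_over_a_prime:.2f}, E/E' = {e_over_e_prime:.2f}")
```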
Concepts and determination of reference values for human biomonitoring of environmental contaminants
Abstract:
Human biomonitoring (HBM) of environmental contaminants plays an important role in estimating exposure and evaluating risk, and thus it has been increasingly applied in the environmental field. The results of HBM must be compared with reference values (RV). The term "reference values" has always been related to the interpretation of clinical laboratory tests. For physicians, RV indicate "normal values" or "limits of normal"; in turn, toxicologists prefer the terms "background values" or "baseline values" to refer to the presence of contaminants in biological fluids. This discrepancy leads to the discussion of which population should be selected to determine RV. Whereas clinical chemistry employs an altered health state as the main exclusion criterion when selecting a reference population (that is, a "healthy" population would be selected), in environmental toxicology the exclusion criterion is abnormal exposure to xenobiotics. Therefore, the choice of population to determine RV is based on the very purpose of the RV to be determined. The present paper discusses the concepts and methodology used to determine RV for biomarkers of chemical environmental contaminants.
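A common way to operationalize RV in HBM (e.g. the German HBM Commission's RV95) is to take an upper percentile, typically the 95th, of the biomarker distribution in a reference population selected after applying the relevant exclusion criteria. The sketch below illustrates that calculation with synthetic data and a simulated exclusion flag; it does not reproduce the specific methodology discussed in the paper.

```python
# Illustrative derivation of a reference value as the 95th percentile of a
# biomarker distribution in a reference population, after excluding abnormally
# exposed individuals. Synthetic data; not the paper's methodology.
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical blood lead concentrations (ug/dL) for a surveyed population
blood_lead = rng.lognormal(mean=0.8, sigma=0.5, size=500)
occupationally_exposed = rng.random(500) < 0.05   # simulated exclusion criterion

# Exclude abnormally exposed individuals before deriving the reference value
reference_population = blood_lead[~occupationally_exposed]
rv95 = np.percentile(reference_population, 95)

print(f"Reference value (95th percentile): {rv95:.1f} ug/dL "
      f"(n = {reference_population.size})")
```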