866 results for BEST AVAILABLE TECHNOLOGY


Relevance:

80.00%

Publisher:

Abstract:

Summer rainfall over China has experienced substantial variability on longer time scales during the last century, and the question remains whether this is due to natural, internal variability or is part of the emerging signal of anthropogenic climate change. Using the best available observations over China, the decadal variability and recent trends in summer rainfall are investigated, with emphasis on changes in the seasonal evolution and on the temporal characteristics of daily rainfall. The possible relationships with global warming are reassessed. Substantial decadal variability in summer rainfall has been confirmed during the period 1958–2008; this is not unique to this period but is also seen in the earlier decades of the twentieth century. Two dominant patterns of decadal variability have been identified that contribute substantially to the recent trend of southern flooding and northern drought. Natural decadal variability appears to dominate in general, but for rainfall intensity and the frequency of rainfall days, particularly light rain days, the dominant EOFs have a rather different character, being of one sign over most of China and having principal components (PCs) that appear more trend-like. The increasing intensity of rainfall throughout China and the decrease in light rainfall days, particularly in the north, could at least partially be of anthropogenic origin, both global and regional, linked to increased greenhouse gases and increased aerosols.
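
As a hedged illustration of the decomposition referred to above, the sketch below computes EOFs and their principal components from a synthetic rainfall-anomaly matrix via a singular value decomposition; the grid size, record length and data are placeholders, not the study's observations.

```python
# Minimal sketch of an EOF (empirical orthogonal function) analysis of a
# gridded rainfall anomaly field, the kind of decomposition used to isolate
# dominant patterns of decadal variability. The data here are synthetic and
# purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_years, n_gridpoints = 51, 400                      # e.g. 1958-2008 summers on a coarse grid
rainfall = rng.normal(size=(n_years, n_gridpoints))  # stand-in for observed anomalies

# Remove the time mean at each grid point so the decomposition acts on anomalies.
anom = rainfall - rainfall.mean(axis=0)

# SVD of the anomaly matrix: rows of Vt are the EOF spatial patterns,
# U * S gives the principal-component (PC) time series.
U, S, Vt = np.linalg.svd(anom, full_matrices=False)
eofs = Vt        # each row: one spatial pattern
pcs = U * S      # each column: the PC time series of one EOF

# Fraction of total variance explained by each mode.
explained = S**2 / np.sum(S**2)
print("Variance explained by the first two EOFs:", explained[:2])
```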

Relevance:

80.00%

Publisher:

Abstract:

In response to evidence of insect pollinator declines, organisations in many sectors, including the food and farming industry, are investing in pollinator conservation. They are keen to ensure that their efforts use the best available science. We convened a group of 32 ‘conservation practitioners’ with an active interest in pollinators and 16 insect pollinator scientists. The conservation practitioners include representatives from UK industry (including retail), environmental non-government organisations and nature conservation agencies. We collaboratively developed a long list of 246 knowledge needs relating to conservation of wild insect pollinators in the UK. We refined and selected the most important knowledge needs, through a three-stage process of voting and scoring, including discussions of each need at a workshop. We present the top 35 knowledge needs as scored by conservation practitioners or scientists. We find general agreement in priorities identified by these two groups. The priority knowledge needs will structure ongoing work to make science accessible to practitioners, and help to guide future science policy and funding. Understanding the economic benefits of crop pollination, basic pollinator ecology and impacts of pesticides on wild pollinators emerge strongly as priorities, as well as a need to monitor floral resources in the landscape.

Relevance:

80.00%

Publisher:

Abstract:

This paper aims to develop a mathematical model based on semi-group theory that makes it possible to improve quality of service (QoS), including reducing the carbon footprint, in the pervasive environment of a Mobile Virtual Network Operator (MVNO). The paper generalises an interrelationship Machine-to-Machine (M2M) mathematical model based on semi-group theory, and demonstrates that, using available technology together with a solid mathematical model, it is possible to streamline the relationships between building agents and to control pervasive spaces so as to reduce the carbon footprint through the reduction of GHG emissions.
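
As a loose illustration of the algebraic structure the abstract invokes, the sketch below treats building-agent actions as state transformations whose composition is associative, i.e. forms a semi-group; the state variables and actions are hypothetical and are not taken from the paper's model.

```python
# Minimal sketch of the semi-group idea underlying compositional M2M models:
# device actions are modelled as state transformations, and composition of
# actions is associative, so chains of machine-to-machine interactions can be
# analysed as elements of a semi-group. All names below are illustrative.
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class BuildingState:
    hvac_power_kw: float
    lighting_power_kw: float

Action = Callable[[BuildingState], BuildingState]

def compose(f: Action, g: Action) -> Action:
    """Semi-group operation: apply g first, then f."""
    return lambda s: f(g(s))

def dim_lights(s: BuildingState) -> BuildingState:
    return BuildingState(s.hvac_power_kw, s.lighting_power_kw * 0.7)

def raise_setpoint(s: BuildingState) -> BuildingState:
    return BuildingState(s.hvac_power_kw * 0.85, s.lighting_power_kw)

def idle(s: BuildingState) -> BuildingState:
    return s

s0 = BuildingState(hvac_power_kw=40.0, lighting_power_kw=12.0)
# Associativity: (f . g) . h == f . (g . h) on any state.
left = compose(compose(dim_lights, raise_setpoint), idle)(s0)
right = compose(dim_lights, compose(raise_setpoint, idle))(s0)
assert left == right
print(left)
```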

Relevance:

80.00%

Publisher:

Abstract:

Numerical climate models constitute the best available tools to tackle the problem of climate prediction. Two assumptions lie at the heart of their suitability: (1) a climate attractor exists, and (2) the numerical climate model's attractor lies on the actual climate attractor, or at least on the projection of the climate attractor on the model's phase space. In this contribution, the Lorenz '63 system is used both as a prototype system and as an imperfect model to investigate the implications of the second assumption. By comparing results drawn from the Lorenz '63 system and from numerical weather and climate models, the implications of using imperfect models for the prediction of weather and climate are discussed. It is shown that the imperfect model's orbit and the system's orbit are essentially different, purely due to model error and not to sensitivity to initial conditions. Furthermore, if a model is a perfect model, then the attractor, reconstructed by sampling a collection of initialised model orbits (forecast orbits), will be invariant to forecast lead time. This conclusion provides an alternative method for the assessment of climate models.
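
The imperfect-model argument can be illustrated with a short numerical sketch, assuming standard Lorenz '63 parameters and a small perturbation to one of them as the "model error"; the time step and perturbation size below are illustrative choices, not values from the paper.

```python
# The same Lorenz '63 equations integrated twice from an identical initial
# condition: once with standard parameters ("the system") and once with a
# slightly perturbed rho ("the imperfect model"). The divergence of the two
# orbits is then attributable to model error alone, not to initial-condition
# error.
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_orbit(x0, steps, dt=0.01, **params):
    orbit = [np.asarray(x0, dtype=float)]
    for _ in range(steps):
        x = orbit[-1]
        k1 = lorenz63(x, **params)
        k2 = lorenz63(x + 0.5 * dt * k1, **params)
        k3 = lorenz63(x + 0.5 * dt * k2, **params)
        k4 = lorenz63(x + dt * k3, **params)
        orbit.append(x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0)
    return np.array(orbit)

x0 = [1.0, 1.0, 1.0]
truth = rk4_orbit(x0, steps=2000)                # "the system"
model = rk4_orbit(x0, steps=2000, rho=28.5)      # "imperfect model": perturbed rho

separation = np.linalg.norm(truth - model, axis=1)
print("orbit separation after 5, 10, 20 time units:",
      separation[500], separation[1000], separation[2000])
```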

Relevance:

80.00%

Publisher:

Abstract:

Observations by the EISCAT experiments “POLAR” and Common Programme CP-3 reveal non-Maxwellian ion velocity distributions in the auroral F-region ionosphere. Analysis of data from three periods is presented. During the first period, convection velocities are large (≈2 km s⁻¹) and constant over part of a CP-3 latitude scan; the second period is one of POLAR data containing a short-lived (<1 min) burst of rapid (>1.5 km s⁻¹) flow. We concentrate on these two periods as they allow the study of a great many features of the ion-neutral interactions which drive the plasma non-thermal, and provide the best available experimental test for models of the 3-dimensional ion velocity distribution function. The third period is included to illustrate the fact that non-thermal plasma frequently exists in the auroral ionosphere: the data, also from the POLAR experiment, cover a three-hour period of typical auroral zone flow, and analysis reveals that the ion distribution varies from Maxwellian to the threshold of a toroidal form.

Relevance:

80.00%

Publisher:

Abstract:

Algorithms for computer-aided diagnosis of dementia based on structural MRI have demonstrated high performance in the literature, but are difficult to compare as different data sets and methodology were used for evaluation. In addition, it is unclear how the algorithms would perform on previously unseen data, and thus, how they would perform in clinical practice when there is no real opportunity to adapt the algorithm to the data at hand. To address these comparability, generalizability and clinical applicability issues, we organized a grand challenge that aimed to objectively compare algorithms based on a clinically representative multi-center data set. Using clinical practice as the starting point, the goal was to reproduce the clinical diagnosis. Therefore, we evaluated algorithms for multi-class classification of three diagnostic groups: patients with probable Alzheimer's disease, patients with mild cognitive impairment and healthy controls. The diagnosis based on clinical criteria was used as reference standard, as it was the best available reference despite its known limitations. For evaluation, a previously unseen test set was used consisting of 354 T1-weighted MRI scans with the diagnoses blinded. Fifteen research teams participated with a total of 29 algorithms. The algorithms were trained on a small training set (n = 30) and optionally on data from other sources (e.g., the Alzheimer's Disease Neuroimaging Initiative, the Australian Imaging Biomarkers and Lifestyle flagship study of aging). The best performing algorithm yielded an accuracy of 63.0% and an area under the receiver-operating-characteristic curve (AUC) of 78.8%. In general, the best performances were achieved using feature extraction based on voxel-based morphometry or a combination of features that included volume, cortical thickness, shape and intensity. The challenge is open for new submissions via the web-based framework: http://caddementia.grand-challenge.org.
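
For readers unfamiliar with the evaluation metrics, the sketch below shows how multi-class accuracy and a one-vs-rest AUC can be computed with scikit-learn; the labels and class probabilities are random placeholders, since the challenge test diagnoses were blinded.

```python
# Multi-class accuracy and AUC for three diagnostic groups
# (class indices 0, 1, 2 standing for AD, MCI and controls).
import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(42)
n_scans = 354

y_true = rng.integers(0, 3, size=n_scans)        # placeholder reference diagnoses
proba = rng.dirichlet(np.ones(3), size=n_scans)  # a classifier's class probabilities
y_pred = proba.argmax(axis=1)

acc = accuracy_score(y_true, y_pred)
auc = roc_auc_score(y_true, proba, multi_class="ovr")  # one-vs-rest multi-class AUC
print(f"accuracy = {acc:.3f}, multi-class AUC = {auc:.3f}")
```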

Relevance:

80.00%

Publisher:

Abstract:

This paper investigates the feasibility of using approximate Bayesian computation (ABC) to calibrate and evaluate complex individual-based models (IBMs). As ABC evolves, various versions are emerging, but here we only explore the most accessible version, rejection-ABC. Rejection-ABC involves running models a large number of times, with parameters drawn randomly from their prior distributions, and then retaining the simulations closest to the observations. Although well-established in some fields, whether ABC will work with ecological IBMs is still uncertain. Rejection-ABC was applied to an existing 14-parameter earthworm energy budget IBM for which the available data consist of body mass growth and cocoon production in four experiments. ABC was able to narrow the posterior distributions of seven parameters, estimating credible intervals for each. ABC’s accepted values produced slightly better fits than literature values do. The accuracy of the analysis was assessed using cross-validation and coverage, currently the best available tests. Of the seven unnarrowed parameters, ABC revealed that three were correlated with other parameters, while the remaining four were found to be not estimable given the data available. It is often desirable to compare models to see whether all component modules are necessary. Here we used ABC model selection to compare the full model with a simplified version which removed the earthworm’s movement and much of the energy budget. We are able to show that inclusion of the energy budget is necessary for a good fit to the data. We show how our methodology can inform future modelling cycles, and briefly discuss how more advanced versions of ABC may be applicable to IBMs. We conclude that ABC has the potential to represent uncertainty in model structure, parameters and predictions, and to embed the often complex process of optimizing an IBM’s structure and parameters within an established statistical framework, thereby making the process more transparent and objective.
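
A minimal sketch of rejection-ABC as summarised above is given below; a toy growth model stands in for the 14-parameter earthworm IBM, and the priors, data and acceptance fraction are illustrative assumptions.

```python
# Rejection-ABC: draw parameters from the prior, run the simulator, and retain
# the draws whose simulated output lies closest to the observations.
import numpy as np

rng = np.random.default_rng(1)

def simulate(growth_rate, asymptotic_mass, t):
    """Toy body-mass growth model standing in for the earthworm IBM."""
    return asymptotic_mass * (1.0 - np.exp(-growth_rate * t))

t_obs = np.arange(0, 100, 10)
observed = simulate(0.05, 0.8, t_obs) + rng.normal(0, 0.02, size=t_obs.size)

n_draws, keep_fraction = 100_000, 0.001
prior_rate = rng.uniform(0.0, 0.2, n_draws)   # prior on growth rate
prior_mass = rng.uniform(0.1, 2.0, n_draws)   # prior on asymptotic mass

# Distance between each simulation and the observations (root-mean-square error).
sims = simulate(prior_rate[:, None], prior_mass[:, None], t_obs)
distance = np.sqrt(np.mean((sims - observed) ** 2, axis=1))

# Retain the simulations closest to the data: these form the ABC posterior sample.
keep = distance.argsort()[: int(n_draws * keep_fraction)]
posterior_rate, posterior_mass = prior_rate[keep], prior_mass[keep]
print("95% credible interval for growth rate:",
      np.percentile(posterior_rate, [2.5, 97.5]))
```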

Relevance:

80.00%

Publisher:

Abstract:

The intelligent algorithm is designed for a hybrid photovoltaic/fuel-cell system backed by a battery source. Its main function is to automate the hybrid system so that it decides, according to the environmental conditions, when to utilize photovoltaic/solar energy and, in its absence, fuel-cell energy. To enhance the performance of the fuel cell and the photovoltaic cell, a battery bank is used that acts as a buffer and supplies continuous current to the load. A fuzzy-logic-based controller was used to develop the system, because fuzzy logic is feasible both for controlling the decision process and for predicting the availability of energy on the basis of the current photovoltaic and battery conditions. The intelligent algorithm is designed to optimize the performance of the system and to select the best available energy source(s) with regard to the input parameters; its further function is to predict the use of the available energy resources and to turn on the particular source that gives efficient energy utilization. The fuzzy-logic-based controller is designed in the MATLAB-Simulink environment: the fuzzy rules were built first, then a MATLAB-based simulation was designed and implemented, and the whole proposed model was simulated and tested for the accuracy of its design and the performance of the system.
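
A minimal sketch of the fuzzy decision step is given below, assuming triangular membership functions for solar irradiance and battery state of charge and a three-rule base; these choices are illustrative and not the controller actually built in MATLAB-Simulink.

```python
# Fuzzy rule-based selection of the energy source: fuzzify the inputs,
# compute rule firing strengths (min of antecedent memberships), and pick the
# source with the strongest support.
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def select_source(irradiance_wm2, battery_soc_pct):
    sun_low = tri(irradiance_wm2, -1, 0, 400)
    sun_high = tri(irradiance_wm2, 300, 1000, 1400)
    batt_low = tri(battery_soc_pct, -1, 0, 50)
    batt_high = tri(battery_soc_pct, 40, 100, 140)

    support = {
        "photovoltaic": sun_high,                 # rule: sun HIGH -> use PV
        "battery":      min(sun_low, batt_high),  # rule: sun LOW and battery HIGH -> battery
        "fuel_cell":    min(sun_low, batt_low),   # rule: sun LOW and battery LOW -> fuel cell
    }
    return max(support, key=support.get), support

print(select_source(irradiance_wm2=850, battery_soc_pct=70))
print(select_source(irradiance_wm2=50, battery_soc_pct=15))
```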

Relevance:

80.00%

Publisher:

Abstract:

Industrial heritage tourism has been the focus of many academic studies; tourism is an alternative development tool for mines and contributes to their economic success. This thesis is about the Falu Mine in Dalarna, Sweden, which has held World Heritage status since 2001 and is one of the biggest attractions in the region. Its history and cultural importance are reasons for the importance of preserving the heritage. The Falu Mine is under the management of the Great Copper Mountain Trust, and one of their ambitions is to ensure its continuing popularity among domestic and international visitors. In order to gain a better understanding of the visitors and to find strategies to improve performance, a visitor survey was conducted in the summer of 2011. It is the author's belief that the guides of the Falu Mine have the best available insight and that their perceptions help to add to the understanding of the visitors. Therefore, this thesis aims to explore the guides' perceptions of their visitors, to investigate how those perceptions correspond to the statistical results, and to study whether there are any differences between domestic and international visitors. A mixed-methods approach increases the depth and accuracy of the results by linking qualitative with quantitative data. The results show that differences between domestic and international visitors exist, as shown both by interviews with the guides and by the visitor survey. These differences occur in factors such as the visitors' level of education, group size and number of children in the group, the visitors' knowledge prior to and after the visit, sources of information, and the fulfilment of visitor expectations. The guides' perceptions emphasize how these differences affect the guided tours. The guides of the Falu Mine need to be aware of these differences in order to adjust the tour accordingly, and the management of the Falu Mine can use this knowledge to identify strategies for improving performance.

Relevance:

80.00%

Publisher:

Abstract:

Current scientific applications are often structured as workflows and rely on workflow systems to compile abstract experiment designs into enactable workflows that utilise the best available resources. The automation of this step and of the workflow enactment hides the details of how results have been produced. Knowing how compilation and enactment occurred allows results to be reconnected with the experiment design. We investigate how provenance helps scientists to connect their results with the actual execution that took place, their original experiment and its inputs and parameters.
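
As a hedged sketch of the provenance idea, the snippet below wraps a hypothetical workflow step so that its inputs, parameters, output and timing are recorded and a result can be traced back to the enactment that produced it; real workflow systems capture this in richer provenance models (e.g. PROV).

```python
# Record one provenance entry per step invocation, linking outputs back to the
# inputs and parameters of the enactment that produced them.
import functools, json, time

PROVENANCE_LOG = []

def traced(step):
    """Decorator that records a provenance entry each time a step runs."""
    @functools.wraps(step)
    def wrapper(*args, **kwargs):
        start = time.time()
        result = step(*args, **kwargs)
        PROVENANCE_LOG.append({
            "step": step.__name__,
            "inputs": [repr(a) for a in args],
            "parameters": {k: repr(v) for k, v in kwargs.items()},
            "output": repr(result),
            "started": start,
            "duration_s": time.time() - start,
        })
        return result
    return wrapper

@traced
def normalise(values, scale=1.0):
    total = sum(values)
    return [scale * v / total for v in values]

normalise([2.0, 3.0, 5.0], scale=100.0)
print(json.dumps(PROVENANCE_LOG, indent=2))
```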

Relevance:

80.00%

Publisher:

Abstract:

We aim to contribute to the development of popular productive credit (microcredit) in Brazil. One step in this direction is to reduce the information asymmetry between public-policy makers and their target public. This study takes advantage of the best available opportunity to explore information on very small businesses: the survey on the Urban Informal Economy (ECINF), carried out by IBGE in 1997, which interviewed almost 50,000 own-account workers and employers with up to five employees. Using the ECINF, we describe the forms of access to credit of these establishments. The microcredit market proves to be incipient in the country's urban areas: only 7% of very small businesses obtained access to credit in the three months preceding the survey. We describe the pattern of correlations between the use of popular productive credit and other variables, in particular those linked to the possession of real guarantees or social collateral in Brazilian urban areas. A link to certain elements of social capital is correlated with obtaining credit: the advantage is 33% higher for those associated with a union, association or cooperative than for those with no such link. Legal status also shows a strong correlation with access to credit: those with a formal legal constitution have a 55% higher advantage than those without. The variable indicating ownership of equipment stands out; it is for this variable that we observe one of the largest estimates, with the advantage for those who use equipment being approximately twice as high as for those who do not. Being in a metropolitan region has little influence on obtaining credit; the advantage is only 10% higher than for people in other urban areas. In general, the results are consistent with the importance attributed in the literature to real and alternative guarantees in obtaining credit.

Relevance:

80.00%

Publisher:

Abstract:

We live in an increasingly globalized society, one that is attentive, informed and demanding with regard to the services provided by public and private organizations. Technology makes an enormous set of information and opinions available to an ever larger number of people in real time, allowing them to interact with different cultures and, distances notwithstanding, uniting them in a way never before experienced by humanity. The influence of technology has had such profound consequences for our way of life that, with each new invention, new experiences can be lived, and words such as innovation, reinvention and change are constantly on the agenda. It is in this context that knowledge management has become increasingly important in corporate environments, bringing the right people together and developing and sharing new knowledge, in short, making it easier to deal with constant change and the insecurity it generates. This study therefore assessed the level of knowledge management practised within a body of the Brazilian federal public administration, the Brazilian Navy, and, specifically within its Supply Corps (Corpo de Intendentes), measured in eight of its main military organizations, by means of qualitative and quantitative methods, how this tool has been used. Finally, based on the results of the assessments and on the theoretical review, a knowledge management structure was proposed that makes it possible to improve the levels of knowledge management detected, helping actions undertaken in this direction to be more efficacious, efficient and effective, thereby accelerating the development of these organizations.

Relevance:

80.00%

Publisher:

Abstract:

INTRODUCTION: Subclinical hypothyroidism (SCH), defined by elevated TSH concentrations in the presence of normal thyroid hormone levels, has a high prevalence in Brazil, particularly among women and the elderly. Although a growing number of studies have associated SCH with a higher risk of coronary artery disease and mortality, there is no randomized clinical trial on the benefit of levothyroxine treatment in reducing these risks, and treatment remains controversial. OBJECTIVE: This consensus, sponsored by the Thyroid Department of the Brazilian Society of Endocrinology and Metabolism and developed by Brazilian experts with extensive clinical experience in thyroid disease, presents evidence-based recommendations for the clinical management of patients with SCH in Brazil. MATERIALS AND METHODS: After the clinical questions were structured, the search for the available evidence in the literature was performed initially in the MedLine-PubMed database and subsequently in the Embase and SciELO - Lilacs databases. The strength of the evidence, assessed by the Oxford classification system, was established on the basis of the study design used, considering the best available evidence for each question and the Brazilian experience. RESULTS: The topics covered were definition and diagnosis, natural history, clinical significance, treatment, and pregnancy, resulting in 29 recommendations for the clinical management of adult patients with SCH. CONCLUSION: Treatment with levothyroxine was recommended for all patients with persistent SCH with serum TSH levels > 10 mU/L and for some special subgroups of patients.

Relevance:

80.00%

Publisher:

Abstract:

INTRODUCTION: Hyperthyroidism is characterized by increased synthesis and release of thyroid hormones by the thyroid gland. Thyrotoxicosis refers to the clinical syndrome resulting from an excess of circulating thyroid hormones, whether secondary to hyperthyroidism or not. This article describes evidence-based guidelines for the management of thyrotoxicosis. OBJECTIVE: This consensus, prepared by Brazilian experts and sponsored by the Thyroid Department of the Brazilian Society of Endocrinology and Metabolism, aims to address the management, diagnosis and treatment of patients with thyrotoxicosis in accordance with the most recent evidence in the literature and appropriate to the clinical reality of the country. MATERIALS AND METHODS: After the clinical questions were structured, the search for the available evidence in the literature was performed initially in the MedLine-PubMed database and subsequently in the Embase and SciELO - Lilacs databases. The strength of the evidence, assessed by the Oxford classification system, was established on the basis of the study design used, considering the best available evidence for each question. RESULTS: Thirteen questions on the initial clinical approach to diagnosis and treatment were defined, resulting in 53 recommendations covering aetiological investigation, treatment with antithyroid drugs, radioactive iodine, and surgery. Hyperthyroidism in children, adolescents and pregnant patients was also addressed, as was the management of hyperthyroidism in patients with Graves' ophthalmopathy and with other, diverse causes of thyrotoxicosis. CONCLUSIONS: The clinical diagnosis of hyperthyroidism generally poses no difficulty, and diagnostic confirmation should be obtained by measuring serum concentrations of TSH and thyroid hormones. Treatment may consist of antithyroid drugs, radioiodine therapy or surgery, according to the aetiology of the thyrotoxicosis, the clinical features, the local availability of methods, and the preferences of the attending physician and the patient.

Relevance:

80.00%

Publisher:

Abstract:

This paper deals with a hybrid method for transient stability analysis that combines time-domain simulation and a direct method. At present, step-by-step simulation is the best available tool because it allows the use of detailed models and provides reliable results. The main limitations of this approach are the long computation time and the absence of a stability margin. On the other hand, direct methods, which demand less CPU time, have not yet shown sufficient reliability and applicability. The best way forward seems to be hybrid solutions, in which a direct method is incorporated into a time-domain simulation tool. This work studies a direct method that uses the transient potential and kinetic energy of the critical machine only. In this paper the critical machine is identified by a fast and efficient method, and the proposal of using it to obtain stability margins within hybrid approaches is new. Results from test systems, such as a 16-machine system, provide stability indices for dynamic security assessment. © 2001 IEEE.
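
As a hedged illustration of the transient-energy reasoning behind such direct methods, the sketch below reduces the problem to a single machine against an infinite bus (rather than the paper's critical-machine formulation) and computes a first-swing stability margin; all parameter values are illustrative.

```python
# Transient-energy stability margin for a single machine / infinite bus:
# kinetic plus potential energy at fault clearing is compared with the
# critical energy at the controlling unstable equilibrium.
import numpy as np

M = 0.1      # inertia constant (illustrative, p.u. * s^2 / rad)
Pm = 0.8     # mechanical power input (p.u.)
Pmax = 2.0   # maximum post-fault electrical power (p.u.)

delta_s = np.arcsin(Pm / Pmax)   # stable post-fault equilibrium angle
delta_u = np.pi - delta_s        # controlling unstable equilibrium angle

def potential_energy(delta):
    """Transient potential energy relative to the stable equilibrium."""
    return -Pm * (delta - delta_s) - Pmax * (np.cos(delta) - np.cos(delta_s))

def energy_margin(delta_cl, omega_cl):
    """Critical energy minus total transient energy at fault clearing."""
    v_cl = 0.5 * M * omega_cl**2 + potential_energy(delta_cl)
    v_cr = potential_energy(delta_u)
    return v_cr - v_cl           # > 0: first-swing stable, < 0: unstable

# Example: clearing at 0.9 rad with the rotor 3 rad/s above synchronous speed.
print("stability margin:", energy_margin(delta_cl=0.9, omega_cl=3.0))
```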