884 results for Problem analysis
Abstract:
Forests play a crucial ecological role, and continued forest loss can have colossal effects on the environment. As Armenia is one of the least forested countries in the world, this problem is especially critical. Continuous forest disturbances, mainly caused by illegal logging that began in the early 1990s, have severely damaged the forest ecosystem, decreasing forest productivity and leaving more areas vulnerable to erosion. Another problem for the Armenian forest is the lack of continuous monitoring and the absence of accurate estimates of the level of cutting in some years. To gain insight into the forest and its disturbances over a long period of time, we used Landsat TM/ETM+ images. The Google Earth Engine JavaScript API, an online tool that enables access to and analysis of large amounts of satellite imagery, was used. To overcome the data-availability problems caused by the gap in the Landsat series in 1988-1998, extensive cloud cover in the study area, and missing scan lines, we used pixel-based compositing for the leaf-on vegetation temporal window (June to late September). Subsequently, pixel-based linear regression analyses were performed. Vegetation indices derived from the 10 biannual composites for the years 1984-2014 were used for trend analysis. To derive disturbances only within forests, a forest cover layer was aggregated and the original composites were masked. It was found that around 23% of the forests were disturbed during the study period.
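As a loose illustration of the workflow described above, the sketch below builds leaf-on NDVI composites and fits a per-pixel linear trend with the Earth Engine Python API (the study itself used the JavaScript API); the collection ID, the 3-year compositing step and the forest-mask asset path are assumptions for illustration, not the study's actual inputs.

```python
# Sketch of a pixel-based NDVI trend analysis in Google Earth Engine (Python API).
# The collection ID, the 3-year compositing step and the forest-mask asset path
# are illustrative assumptions, not the study's actual inputs.
import ee

ee.Initialize()

def leaf_on_ndvi(year):
    """Median leaf-on (June to late September) NDVI composite for one year."""
    col = (ee.ImageCollection('LANDSAT/LT05/C02/T1_L2')
           .filterDate(f'{year}-06-01', f'{year}-09-30')
           # NIR / Red bands; surface-reflectance scaling factors omitted for brevity.
           .map(lambda img: img.normalizedDifference(['SR_B4', 'SR_B3']).rename('ndvi')))
    return col.median().addBands(ee.Image.constant(year).toFloat().rename('year'))

# Build composites (a step of 3 years is chosen only for illustration).
composites = ee.ImageCollection([leaf_on_ndvi(y) for y in range(1984, 2015, 3)])

# Per-pixel linear regression of NDVI against year: linearFit expects [x, y] bands.
trend = composites.select(['year', 'ndvi']).reduce(ee.Reducer.linearFit())

# Keep only forested pixels (hypothetical forest-mask asset) and flag negative
# slopes as candidate disturbance pixels.
forest_mask = ee.Image('users/example/forest_mask')
disturbance = trend.select('scale').lt(0).updateMask(forest_mask)
```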
Abstract:
INTRODUCTION: A time series study of admissions, deaths and acute cases was conducted in order to evaluate the situation of Chagas disease in Pernambuco. METHODS: Data reported to the Information Technology Department of the Brazilian National Health Service between 1980 and 2008 were collected for the regions and Federal Units of Brazil, and for the microregions and municipalities of Pernambuco. Rates (per 100,000 inhabitants) of hospitalization, mortality and acute cases were calculated using the national hospital database (SIH), the national mortality database (SIM) and the national Information System for Notifiable Diseases (SINAN), respectively. RESULTS: The national average rate of Chagas disease admissions was 0.99 from 1995 to 2008. Pernambuco had a mean of 0.39 over the same period, with the highest rates concentrated in the interior of the state. The state had a mean mortality rate of 1.56 between 1980 and 2007, lower than the national average (3.66). The mortality rate has tended to decline nationally, while it has remained relatively unchanged in Pernambuco. When national rates of admissions and deaths are compared, mortality rates were higher than hospitalization rates between 1995 and 2007. The same occurred in Pernambuco, except in 2003. Between 2001 and 2006, rates of acute cases were 0.56 and 0.21 for Brazil and Pernambuco, respectively. CONCLUSIONS: Although Chagas mortality has decreased in Brazil, the disease remains a serious public health problem, especially in the Northeast region. It is thus essential that medical care, prevention and control of Chagas disease be maintained and improved.
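For reference, the rates quoted above follow the standard per-100,000-inhabitants normalisation; a minimal sketch with invented numbers (not data from SIH, SIM or SINAN):

```python
# Minimal sketch of the per-100,000-inhabitants rate used above.
# The counts and population figure below are invented for illustration;
# the study itself used the SIH, SIM and SINAN databases.
def rate_per_100k(events: int, population: int) -> float:
    """Crude rate per 100,000 inhabitants."""
    return events / population * 100_000

# Example: 150 hypothetical admissions in a state of 9.5 million inhabitants.
print(round(rate_per_100k(150, 9_500_000), 2))  # -> 1.58
```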
Abstract:
Ion Mobility Spectrometry coupled with Multi Capillary Columns (MCC-IMS) is a fast analytical technique that works at atmospheric pressure with high sensitivity and selectivity, making it suitable for the analysis of complex biological matrices. MCC-IMS analysis generates its information as a 3D spectrum in which each detected substance corresponds to a peak, providing quantitative and qualitative information. Sometimes the peaks of different substances overlap, making the quantification of substances present in biological matrices difficult. In the present work we use the peaks of isoprene and acetone as a model for this problem; these two volatile organic compounds (VOCs) produce two overlapping peaks when detected by MCC-IMS. We propose an algorithm to identify and quantify these two peaks. The algorithm uses image processing techniques to treat the spectra and to detect the position of the peaks, and then fits the data to a custom model in order to separate them. Once the peaks are separated, it calculates the contribution of each peak to the data.
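As a rough illustration of the peak-separation step, the sketch below fits a sum of two 2D Gaussians to synthetic data with SciPy; the Gaussian mixture stands in for the paper's custom model, and all grid values and peak parameters are invented.

```python
# Sketch of separating two overlapping peaks by fitting a two-component model.
# A sum of two 2D Gaussians stands in for the paper's custom model; the grid,
# initial guesses and peak parameters are invented, not MCC-IMS data.
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(coords, amp, x0, y0, sx, sy):
    x, y = coords
    return amp * np.exp(-((x - x0) ** 2 / (2 * sx ** 2) + (y - y0) ** 2 / (2 * sy ** 2)))

def two_peaks(coords, a1, x1, y1, sx1, sy1, a2, x2, y2, sx2, sy2):
    return gauss2d(coords, a1, x1, y1, sx1, sy1) + gauss2d(coords, a2, x2, y2, sx2, sy2)

# Synthetic "spectrum": two overlapping peaks plus a little noise.
x, y = np.meshgrid(np.linspace(0, 10, 80), np.linspace(0, 10, 80))
coords = np.vstack([x.ravel(), y.ravel()])
data = two_peaks(coords, 1.0, 4.0, 5.0, 0.8, 0.8, 0.6, 5.5, 5.2, 0.7, 0.9)
data += np.random.default_rng(0).normal(0, 0.02, data.size)

# Fit the two-peak model, then quantify each substance by its fitted component.
p0 = [1, 4, 5, 1, 1, 0.5, 6, 5, 1, 1]                 # rough initial guesses
popt, _ = curve_fit(two_peaks, coords, data, p0=p0)
peak1 = gauss2d(coords, *popt[:5]).sum()              # contribution of peak 1
peak2 = gauss2d(coords, *popt[5:]).sum()              # contribution of peak 2
```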
Abstract:
The release of chloroethene compounds into the environment often results in groundwater contamination, which puts people at risk of exposure through drinking contaminated water. The accumulation of cDCE (cis-1,2-dichloroethene) in subsurface environments is a common environmental problem, due to stagnation and partial degradation of precursor chloroethene species. Polaromonas sp. strain JS666 apparently requires no exotic growth factors to be used as a bioaugmentation agent for aerobic cDCE degradation. Although it is the only suitable microorganism found to be capable of this, further studies are needed to improve intrinsic bioremediation rates and to fully understand the metabolic processes involved. To this end, a metabolic model, iJS666, was reconstructed from genome annotation and available bibliographic data. FVA (Flux Variability Analysis) and FBA (Flux Balance Analysis) techniques were used to satisfactorily validate the predictive capabilities of the iJS666 model. The iJS666 model was able to predict biomass growth for different previously tested conditions, allowed the design of key experiments that should be carried out for further model improvement, and also produced viable predictions for the use of biostimulant metabolites in cDCE biodegradation.
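A minimal sketch of the FBA/FVA workflow described above, using COBRApy; the SBML file name is a placeholder for however the reconstructed iJS666 model is actually distributed.

```python
# Sketch of the FBA/FVA workflow described above, using COBRApy.
# The SBML file name 'iJS666.xml' is a placeholder, not the model's actual
# distribution format or location.
from cobra.io import read_sbml_model
from cobra.flux_analysis import flux_variability_analysis

model = read_sbml_model('iJS666.xml')

# Flux Balance Analysis: maximise the model's biomass objective.
solution = model.optimize()
print('Predicted growth rate:', solution.objective_value)

# Flux Variability Analysis: flux ranges compatible with near-optimal growth.
fva = flux_variability_analysis(model, fraction_of_optimum=0.95)
print(fva.head())
```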
Abstract:
Based on the report for the curricular unit “Métodos Interactivos de Participação e Decisão A” (Interactive Methods of Participation and Decision A), coordinated by Prof. Lia Maldonado Teles de Vasconcelos and Prof. Nuno Miguel Ribeiro Videira Costa. The unit was offered in the PhD Program in Technology Assessment in 2015/2016.
Abstract:
Information systems are widespread and used by anyone with a computing device, as well as by corporations and governments. It is often the case that security leaks are introduced during the development of an application. The reasons for these security bugs are multiple, but among them one can easily identify that it is very hard to define and enforce relevant security policies in modern software. This is because modern applications often rely on container sharing and multi-tenancy where, for instance, data can be stored in the same physical space but is logically mapped into different security compartments or data structures. In turn, the security compartments into which data is classified by security policies can also be dynamic and depend on runtime data. In this thesis we introduce and develop the novel notion of dependent information flow types, and focus on the problem of ensuring data confidentiality in data-centric software. Dependent information flow types fit within the standard framework of dependent type theory but, unlike usual dependent types, crucially allow the security level of a type, rather than just the structural data type itself, to depend on runtime values. Our dependent function and dependent sum information flow types provide a direct, natural and elegant way to express and enforce fine-grained security policies on programs, namely programs that manipulate structured data types in which the security level of a structure field may depend on values dynamically stored in other fields. The main contribution of this work is an efficient analysis that allows programmers to verify, during the development phase, whether programs have information leaks; that is, it verifies whether programs protect the confidentiality of the information they manipulate. As such, we also implemented a prototype typechecker that can be found at http://ctp.di.fct.unl.pt/DIFTprototype/.
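As a loose, purely illustrative analogue (a runtime check in Python, not the thesis's static dependent type system), the idea that the security level of one field can depend on the runtime value of another field might look like this:

```python
# Toy runtime analogue of a value-dependent security label (NOT the thesis's
# static type discipline): the confidentiality level of record['content']
# depends on the runtime value of record['owner'].
LEVELS = {'public': 0, 'user': 1, 'admin': 2}

def label_of(record: dict) -> str:
    """Security level of the content field, determined by another field's value."""
    return 'admin' if record['owner'] == 'admin' else 'user'

def read_content(record: dict, clearance: str) -> str:
    """Allow the read only if the reader's clearance dominates the data's label."""
    if LEVELS[clearance] < LEVELS[label_of(record)]:
        raise PermissionError('information-flow violation')
    return record['content']

post = {'owner': 'admin', 'content': 'internal memo'}
print(read_content(post, 'admin'))   # permitted
# read_content(post, 'user')         # would raise PermissionError
```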
Abstract:
Author proof
Abstract:
Master's dissertation in Informatics Engineering
Abstract:
The monitoring data collected during tunnel excavation can be used in inverse analysis procedures to identify more realistic geomechanical parameters and thereby increase knowledge about the formations of interest. These more realistic parameters can be used in real time to adapt the design to the actual in situ behaviour of the structure. However, monitoring plans are normally designed for safety assessment and not specifically for the purpose of inverse analysis. In fact, there is a lack of knowledge about what types and quantities of measurements are needed to successfully identify the parameters of interest. The optimisation algorithm chosen for the identification procedure may also be important in this respect. In this work, the problem is addressed using a theoretical case with which a thorough parametric study was carried out using two optimisation algorithms based on different calculation paradigms, namely a conventional gradient-based algorithm and an evolution strategy algorithm. Calculations were carried out for different sets of parameters to be identified and for several combinations of types and amounts of monitoring data. The results clearly show the high importance of the available monitoring data and of the chosen algorithm for the success rate of the inverse analysis process.
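As a schematic illustration of the two optimisation paradigms compared (not the paper's geomechanical model), the sketch below back-calculates two parameters of an invented forward model from synthetic monitoring data, once with a gradient-based optimiser and once with a simple (1+1) evolution strategy:

```python
# Schematic back-analysis: identify parameters of an invented forward model from
# synthetic "monitoring data", once with a gradient-based optimiser and once
# with a simple (1+1) evolution strategy. Purely illustrative.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 20)                      # e.g. positions of monitoring points

def forward(params):
    """Stand-in for the numerical model mapping parameters to measurements."""
    a, b = params
    return a * np.exp(-b * x)

true_params = np.array([2.0, 3.0])
observed = forward(true_params) + rng.normal(0, 0.01, x.size)

def misfit(params):
    return np.sum((forward(params) - observed) ** 2)

# 1) Conventional gradient-based algorithm.
grad_result = minimize(misfit, x0=[1.0, 1.0], method='L-BFGS-B')

# 2) Simple (1+1) evolution strategy with a fixed mutation step.
best, sigma = np.array([1.0, 1.0]), 0.3
for _ in range(2000):
    candidate = best + rng.normal(0, sigma, size=2)
    if misfit(candidate) < misfit(best):
        best = candidate

print('gradient-based:', grad_result.x, 'evolution strategy:', best)
```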
Abstract:
The MAP-i Doctoral Program of the Universities of Minho, Aveiro and Porto
Abstract:
This paper examines the numerous problems of conventional economic analysis in the evaluation of climate change mitigation policies. It points out the many limitations, omissions and instances of arbitrariness that have characterized most evaluation models applied to date. These shortcomings have overwhelmingly biased the results towards recommending less aggressive emission mitigation policies. Consequently, this paper questions whether such results provide an appropriate answer to the problem. Finally, various points that an analysis coherent with sustainable development should take into account are presented.
Abstract:
Domestic action on climate change is increasingly important in the light of the difficulties with international agreements and requires a combination of solutions, in terms of institutions and policy instruments. One way of achieving government carbon policy goals may be the creation of an independent body to advise, set or monitor policy. This paper critically assesses the Committee on Climate Change (CCC), which was created in 2008 as an independent body to help move the UK towards a low carbon economy. We look at the motivation for its creation in terms of: information provision, advice, monitoring, or policy delegation. In particular we consider its ability to overcome a time inconsistency problem by comparing and contrasting it with another independent body, the Monetary Policy Committee of the Bank of England. In practice the Committee on Climate Change appears to be the ‘inverse’ of the Monetary Policy Committee, in that it advises on what the policy goal should be rather than being responsible for achieving it. The CCC incorporates both advisory and monitoring functions to inform government and achieve a credible carbon policy over a long time frame. This is a similar framework to that adopted by Stern (2006), but the CCC operates on a continuing basis. We therefore believe the CCC is best viewed as a "Rolling Stern plus" body. There are also concerns as to how binding the budgets actually are and how the budgets interact with other energy policy goals and instruments, such as Renewable Obligation Contracts and the EU Emissions Trading Scheme. The CCC could potentially be reformed to include: an explicit information provision role; consumption-based accounting of emissions and control of a policy instrument such as a balanced-budget carbon tax.
Abstract:
A statistical methodology is developed by which realised outcomes can be used to identify, for calendar years between 1974 and 2012, when policy makers in ‘advanced’ economies have successfully pursued single objectives of different kinds, or multiple objectives. A simple criterion is then used to distinguish between multiple objectives pure and simple and multiple objectives subject to a price stability constraint. The overall and individual country results which this methodology produces seem broadly plausible. Unconditional and conditional analyses of the inflation and growth associated with different types of objectives reveal that multiple objectives subject to a price stability constraint are associated with roughly as good economic performance as the single objective of inflation. A proposal is then made as to how the remit of an inflation-targeting central bank could be adjusted to allow it to pursue other objectives in extremis without losing the credibility effects associated with inflation targeting.
Abstract:
What genotype should the scientist specify when conducting a database search to try to find the source of a low-template-DNA (lt-DNA) trace? In answering this question, the scientist makes a decision. Here, we approach this decision problem from a normative point of view by defining a decision-theoretic framework for answering it for one locus. This framework combines the probability distribution describing the uncertainty over the trace donor's possible genotypes with a loss function describing the scientist's preferences concerning the false exclusions and false inclusions that may result from the database search. According to this approach, the scientist should choose the genotype designation that minimizes the expected loss. To illustrate the results produced by this approach, we apply it to two hypothetical cases: (1) observing one peak for allele x_i on a single electropherogram, and (2) observing one peak for allele x_i on one replicate, and a pair of peaks for alleles x_i and x_j, i ≠ j, on a second replicate. Given that the probabilities of allele drop-out are defined as functions of the observed peak heights, the threshold values marking the turning points at which the scientist should switch from one designation to another are derived in terms of the observed peak heights. For each case, sensitivity analyses show the impact of the model's parameters on these threshold values. The results support the conclusion that the procedure should not focus on a single threshold value for making this decision for all alleles, all loci and all laboratories.
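A minimal numerical sketch of the expected-loss criterion; the candidate designations, posterior probabilities and loss values below are invented placeholders, not the paper's drop-out or loss models:

```python
# Sketch of choosing the genotype designation that minimises expected loss.
# The candidate designations, posterior probabilities and loss values are
# invented placeholders, not the paper's drop-out or loss models.
candidates = ['x_i/x_i', 'x_i/x_j', 'x_i/F']        # F: allow any second allele

# P(true genotype | observed peak heights) for the trace donor (hypothetical).
posterior = {'x_i/x_i': 0.55, 'x_i/x_j': 0.45}

# loss[designation][true genotype]: cost of false exclusions vs false inclusions.
loss = {
    'x_i/x_i': {'x_i/x_i': 0.0,  'x_i/x_j': 10.0},  # risks a false exclusion
    'x_i/x_j': {'x_i/x_i': 10.0, 'x_i/x_j': 0.0},
    'x_i/F':   {'x_i/x_i': 1.0,  'x_i/x_j': 1.0},   # broader search, more false inclusions
}

def expected_loss(designation):
    return sum(posterior[g] * loss[designation][g] for g in posterior)

best = min(candidates, key=expected_loss)
print({d: expected_loss(d) for d in candidates}, '->', best)
```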
Abstract:
There is increasing awareness that the articulation of forensic science and criminal investigation is critical to the resolution of crimes. However, models and methods to support effective collaboration between these partners are still poorly expressed or even lacking. Three propositions are borrowed from crime intelligence methods in order to bridge this gap: (a) the general intelligence process; (b) the analysis of investigative problems along principal perspectives: entities and their relationships, time and space, and quantitative aspects; and (c) visualisation methods as a mode of expression of a problem along these dimensions. Indeed, in a collaborative framework, different kinds of visualisations integrating forensic case data can play a central role in supporting decisions. Among them, link charts are scrutinised for their ability to structure and ease the analysis of a case by describing how relevant entities are connected. However, designing an informative chart that does not bias the reasoning process is not straightforward. Using visualisation as a catalyst for a collaborative approach integrating forensic data thus calls for better specifications.
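A minimal sketch of a link chart built from case entities and their relationships, using networkx; all entities and links are invented for illustration:

```python
# Minimal link-chart sketch: entities (people, phones, traces) as nodes and
# their relationships as labelled edges. All entities are invented examples.
import networkx as nx

chart = nx.Graph()
chart.add_edge('Suspect A', 'Phone A', relation='subscriber of')
chart.add_edge('Phone A', 'Phone B', relation='called')
chart.add_edge('Suspect B', 'Phone B', relation='subscriber of')
chart.add_edge('Suspect A', 'DNA trace T1', relation='matches')
chart.add_edge('DNA trace T1', 'Burglary case 17', relation='recovered at scene')

# One way to surface central entities before drawing the chart.
for node, score in sorted(nx.degree_centrality(chart).items(), key=lambda kv: -kv[1]):
    print(f'{node}: {score:.2f}')
```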