919 results for Superiority and Inferiority Multi-criteria Ranking (SIR) Method


Relevance: 100.00%

Abstract:

β-methylamino-L-alanine (BMAA) is a neurotoxin linked to neurodegeneration, which is manifested in the devastating human diseases amyotrophic lateral sclerosis, Alzheimer's disease and Parkinson's disease. This neurotoxin is known to be produced by almost all tested species within the cyanobacterial phylum, including free-living as well as symbiotic strains. The global distribution of BMAA producers ranges from a terrestrial ecosystem on the island of Guam in the Pacific Ocean to an aquatic ecosystem in Northern Europe, the Baltic Sea, where massive surface blooms occur annually. BMAA has been shown to accumulate in the Baltic Sea food web, with the highest levels in bottom-dwelling fish species as well as in mollusks. One of the aims of this thesis was to test the bottom-dwelling bioaccumulation hypothesis using a larger number of samples, allowing a statistical evaluation. Hence, a large set of fish individuals from Lake Finjasjön were caught, and the BMAA concentrations in different tissues were related to the season of catching, fish gender, total weight and species. The results reveal that fish total weight and fish species were positively correlated with BMAA concentration in the fish brain: significantly higher concentrations of BMAA were detected in the brains of plankti-benthivorous fish species and of heavier (potentially older) individuals. Another goal was to investigate the potential production of BMAA by other phytoplankton organisms. Diatom cultures were therefore investigated and confirmed to produce BMAA, even in higher concentrations than cyanobacteria. All diatom cultures studied during this thesis work were shown to contain BMAA, as was one dinoflagellate species. This might imply that the environmental spread of BMAA in aquatic ecosystems is even greater than previously thought. Earlier reports on the concentration of BMAA in different organisms have shown highly variable results, and the methods used for quantification have been intensively discussed in the scientific community. In the most recent studies, liquid chromatography-tandem mass spectrometry (LC-MS/MS) has become the instrument of choice, due to its high sensitivity and selectivity. Even so, different studies report quite variable concentrations of BMAA. In this thesis, three of the most common BMAA extraction protocols were evaluated in order to find out whether the extraction could be one of the sources of variability. It was found that the method involving precipitation of proteins using trichloroacetic acid gave the best performance, complying with all in-house validation criteria. However, extractions of diatom and cyanobacteria cultures with this validated method, quantified using LC-MS/MS, still resulted in variable BMAA concentrations, which suggests that biological factors also contribute to the discrepancies. The current knowledge on the environmental factors that can induce or reduce BMAA production is still limited. In cyanobacteria, production of BMAA was earlier shown to be negatively correlated with nitrogen availability, both in laboratory cultures and in natural populations. Based on this observation, it was suggested that in unicellular non-diazotrophic cyanobacteria, BMAA might take part in nitrogen metabolism. In order to find out if BMAA has a similar role in diatoms, BMAA was added to two diatom species in culture, in concentrations corresponding to those earlier found in the diatoms.
The results suggest that BMAA might induce a nitrogen starvation signal in diatoms, as was earlier observed in cyanobacteria. However, the diatoms recovered shortly afterwards, owing to the extracellular presence of excreted ammonia. Thus, BMAA might also be involved in the cellular nitrogen balance of diatoms.

Relevance: 100.00%

Abstract:

This paper proposes the joint use of the AHP (Analytic Hierarchy Process) and the ICB (IPMA Competence Baseline) as a tool for the decision-making process of selecting the most suitable managers for projects. A hierarchical structure, comprising the contextual, behavioural and technical competence elements of IPMA's ICB 3.0, is constructed for the selection of project managers. The paper also describes the AHP implementation, illustrating the whole process with an example using all 46 ICB competence elements as model criteria. This tool can be of high interest to decision-makers because it allows comparing candidates for managing a project through a systematic and rigorous process based on a rich set of proven criteria.
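
As a rough illustration of the AHP machinery the paper relies on, the sketch below computes priority weights from a single pairwise comparison matrix via the principal eigenvector, together with Saaty's consistency ratio. The candidate count, matrix entries and single-criterion setting are invented for the example; the paper's full model combines 46 ICB competence elements.

```python
import numpy as np

# Illustrative pairwise comparison of three hypothetical candidates on one
# competence element, using Saaty's 1-9 scale (values are made up).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Principal right eigenvector -> AHP priority weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio (random index RI = 0.58 for a 3x3 matrix in Saaty's tables).
lambda_max = eigvals.real[k]
ci = (lambda_max - len(A)) / (len(A) - 1)
cr = ci / 0.58
print("priorities:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```

In the full method, one such priority vector is computed per criterion and the results are aggregated with the criteria weights from the upper levels of the hierarchy.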

Relevance: 100.00%

Abstract:

This dissertation contains four essays that all share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling and forecasting of financial asset volatility and correlations. The first two chapters provide useful tools for univariate applications while the last two chapters develop multivariate methodologies. In chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on the basis of a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains from the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models. In chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model allowing for a time-varying intercept and implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is a joint work with David Veredas. We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite-sample properties are studied under four data-generating processes, in the presence or absence of microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient and easy-to-implement alternative for measuring integrated covariances on the basis of noisy and asynchronous prices. Along these lines, a minimum-variance portfolio application shows the superiority of this disentangled realized estimator in terms of numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities, which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting including short-selling constraints and transaction costs.
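
As background for the realized measures that FloGARCH and Realized (L)GARCH-type models take as inputs, here is a minimal sketch of a standard daily realized variance computed from intraday prices. The simulated 5-minute series and the annualisation convention are assumptions for the example, not taken from the dissertation.

```python
import numpy as np
import pandas as pd

def realized_variance(prices: pd.Series) -> float:
    """Daily realized variance: sum of squared intraday log returns.

    `prices` is an intraday price series for one trading day (e.g. 5-minute
    bars), giving the kind of realized measure paired with daily returns
    in Realized GARCH-type models."""
    r = np.log(prices).diff().dropna()
    return float((r ** 2).sum())

# Toy example: simulated 5-minute prices for one day (78 bars plus the open).
rng = np.random.default_rng(0)
idx = pd.date_range("2024-01-02 09:30", periods=79, freq="5min")
prices = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 0.001, 79))), index=idx)
rv = realized_variance(prices)
print(f"realized variance: {rv:.6f}, annualised volatility: {np.sqrt(252 * rv):.2%}")
```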

Relevance: 100.00%

Abstract:

In the current study, we have developed a magnetic resonance imaging-based method for non-invasive detection of complement activation in the placenta and foetal brain in vivo in utero. Using this method, we found that anti-complement C3-targeted ultrasmall superparamagnetic iron oxide (USPIO) nanoparticles bind within the inflamed placenta and foetal brain cortical tissue, causing a shortening of the T2* relaxation time. We used two mouse models of pregnancy complications: a mouse model of obstetric antiphospholipid syndrome (APS) and a mouse model of preterm birth (PTB). We found that detection of C3 deposition in the placenta in the APS model was associated with placental insufficiency, characterised by increased oxidative stress, decreased vascular endothelial growth factor and placental growth factor levels, and intrauterine growth restriction. We also found that foetal brain C3 deposition was associated with cortical axonal cytoarchitecture disruption and increased neurodegeneration in the mouse models of APS and PTB. In the APS model, foetuses that showed increased C3 in their brains additionally expressed anxiety-related behaviour after birth. Importantly, USPIO did not affect pregnancy outcomes or liver function in the mother and the offspring, suggesting that this method may be useful for detecting complement activation in vivo in utero and predicting placental insufficiency and abnormal foetal neurodevelopment leading to neuropsychiatric disorders.

Relevance: 100.00%

Abstract:

We present a detailed analysis of the application of a multi-scale Hierarchical Reconstruction method to a family of ill-posed linear inverse problems. When the observations on the unknown quantity of interest and the observation operators are known, these inverse problems are concerned with recovering the unknown from its observations. Although the observation operators we consider are linear, they are inevitably ill-posed in various ways. In this context we recall the classical Tikhonov regularization method with a stabilizing function that targets the specific ill-posedness of the observation operators and preserves desired features of the unknown. Having studied the mechanism of Tikhonov regularization, we propose a multi-scale generalization of the Tikhonov regularization method, the so-called Hierarchical Reconstruction (HR) method. The HR method can be traced back to the Hierarchical Decomposition method in image processing. The HR method successively extracts information from the previous hierarchical residual into the current hierarchical term at a finer hierarchical scale. As the sum of all the hierarchical terms, the hierarchical sum from the HR method provides a reasonable approximate solution to the unknown when the observation matrix satisfies certain conditions with specific stabilizing functions. Compared to the Tikhonov regularization method on the same inverse problems, the HR method is shown to decrease the total number of iterations, reduce the approximation error, and offer control over the approximation distance between the hierarchical sum and the unknown, thanks to its ladder of finitely many hierarchical scales. We report numerical experiments supporting our claims about these advantages of the HR method over the Tikhonov regularization method.
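
A minimal sketch of the hierarchical idea described above, assuming a plain quadratic (ridge) stabilizing function: each level solves a Tikhonov problem on the residual left by the previous level with a halved regularization parameter, and the hierarchical sum of the terms approximates the unknown. The toy operator, noise level and parameter schedule are illustrative, not the thesis's settings.

```python
import numpy as np

def tikhonov(A, b, lam):
    """Solve min ||A x - b||^2 + lam * ||x||^2 via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

def hierarchical_reconstruction(A, b, lam0=10.0, levels=8):
    """HR-style multi-scale scheme with a quadratic stabilizer: each level
    fits the residual of the previous level at a finer scale (smaller lam),
    and the hierarchical sum of the terms approximates the unknown."""
    x_sum = np.zeros(A.shape[1])
    residual = b.copy()
    lam = lam0
    for _ in range(levels):
        term = tikhonov(A, residual, lam)   # hierarchical term at this scale
        x_sum += term
        residual -= A @ term                # pass what is left to finer scales
        lam /= 2.0
    return x_sum

# Toy ill-conditioned problem.
rng = np.random.default_rng(1)
A = rng.normal(size=(60, 40)) @ np.diag(np.logspace(0, -4, 40))
x_true = rng.normal(size=40)
b = A @ x_true + 1e-3 * rng.normal(size=60)
x_hr = hierarchical_reconstruction(A, b)
print("relative error:", np.linalg.norm(x_hr - x_true) / np.linalg.norm(x_true))
```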

Relevance: 100.00%

Abstract:

Homomorphic encryption is a particular type of encryption that enables computing over encrypted data. This has a wide range of real-world ramifications, such as being able to blindly compute the result of a search query sent to a remote server without revealing its content. In the first part of this thesis, we discuss how database search queries can be made secure using a homomorphic encryption scheme based on the ideas of Gahi et al. Gahi's method builds on the integer-based fully homomorphic encryption scheme proposed by van Dijk et al. We propose a new database search scheme called the Homomorphic Query Processing Scheme, which can be used with the ring-based fully homomorphic encryption scheme proposed by Brakerski. In the second part of this thesis, we discuss the cybersecurity of the smart electric grid. Specifically, we use the Homomorphic Query Processing scheme to construct a keyword search technique for the smart grid. Our work is based on the Public Key Encryption with Keyword Search (PEKS) method introduced by Boneh et al. and a Multi-Key Homomorphic Encryption scheme proposed by López-Alt et al. A summary of the results of this thesis (specifically the Homomorphic Query Processing Scheme) was published at the 14th Canadian Workshop on Information Theory (CWIT).
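
To make "computing over encrypted data" concrete, the toy below uses the classic Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts. This is deliberately not the Gahi-style or ring-based fully homomorphic scheme discussed in the thesis, and the key size is far too small for any real use.

```python
from math import gcd
import random

# Toy Paillier cryptosystem (additively homomorphic), demonstration only.
p, q = 2357, 2551                              # tiny primes, insecure by design
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p - 1, q - 1)
g = n + 1
mu = pow(lam, -1, n)                           # with g = n + 1, L(g^lam mod n^2) = lam mod n

def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    L = (pow(c, lam, n2) - 1) // n
    return (L * mu) % n

c1, c2 = encrypt(123), encrypt(456)
c_sum = (c1 * c2) % n2                         # homomorphic addition of plaintexts
print(decrypt(c_sum))                          # -> 579, computed without decrypting c1 or c2
```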

Relevance: 100.00%

Abstract:

Sustainability in software systems is still a new practice that most software developers and companies are trying to incorporate into their software development lifecycle, and it has been widely discussed in academia. Sustainability is a complex concept viewed from economic, environmental and social dimensions, and the several definitions that have been proposed sometimes make the concept very fuzzy and difficult to apply and assess in software systems. This has hindered the adoption of sustainability in the software industry. Little research explores sustainability as a quality property of software products and services, answering questions such as: How can sustainability be quantified as a quality construct in the same way as other quality attributes such as security, usability and reliability? How can it be applied to software systems? What are the measures and measurement scales of sustainability? The goal of this research is to investigate the definitions, perceptions and measurement of sustainability from the quality perspective. Grounded in the general theory of software measurement, the aim is to develop a method that decomposes sustainability into factors, criteria and metrics. The result is a method to quantify and assess the sustainability of software systems while incorporating management and user concerns. Conclusion: the method will enable companies to adopt sustainability more easily, while facilitating its integration into the software development process and tools. It will also help companies to measure the sustainability of their software products along economic, environmental, social, individual and technological dimensions.
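
A minimal sketch of what a factor/criterion/metric decomposition with a weighted roll-up could look like; the factor names, weights and normalised metric values are invented placeholders, not the method developed in this research.

```python
# Each factor carries a weight and a set of criteria; each criterion carries a
# weight and a normalised metric value in [0, 1]. All names and numbers are
# illustrative placeholders.
model = {
    "environmental": (0.4, {"energy_efficiency": (0.6, 0.72), "carbon_footprint": (0.4, 0.55)}),
    "economic":      (0.3, {"maintenance_cost": (1.0, 0.40)}),
    "social":        (0.3, {"accessibility": (0.5, 0.80), "user_satisfaction": (0.5, 0.65)}),
}

def sustainability_score(model: dict) -> float:
    """Roll normalised metric values up through criterion and factor weights."""
    total = 0.0
    for factor_weight, criteria in model.values():
        factor_score = sum(w * value for w, value in criteria.values())
        total += factor_weight * factor_score
    return total

print(f"overall sustainability score: {sustainability_score(model):.2f}")
```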

Relevance: 100.00%

Abstract:

Brazil is internationally acknowledged for its renewable sources, most notably hydroelectric power plant projects, which account for 65% of the electricity supplied to the National Interconnected System. The main question behind this research is: what are the weights and the relative importance of the variables that influence the decision-making process for the expansion of hydroelectric generation projects in Paraná? The main objective is to propose a multi-criteria decision procedure, associated with water-source options, that takes into consideration the weight and relative importance of the alternatives influencing enterprises' decisions on electricity generation in the state of Paraná. As far as the approach to the problem is concerned, this research can be classified as using mixed methodologies, applying Content Analysis, the Delphi technique and the Analytic Hierarchy Process. Following the Delphi methodology, a group of 21 experts, all linked to the Paraná hydroelectricity market, was selected for data collection. The main result was the construction of a decision tree in which it was possible to identify the importance and relative weight of the elements associated with the four dimensions of energy. In the environmental dimension, the highest relative weight was placed on the carrying capacity of the Paraná system; in the economic dimension, on the amortization of investment; in the social dimension, on the generation of direct jobs; and in the institutional dimension, on the availability of suitable sources of financing. Policy makers and business managers make their decisions based on specific criteria related to the organization's segment, market information, and economic and political behaviour, among other indicators that guide them in dealing with the tradeoffs typical of projects in the hydropower area. The results obtained in the decision tree show that the economic bias is still the main factor in investment decisions. However, the weights assigned to environmental impacts on the state's carrying capacity, to income generation and opportunities for direct as well as indirect jobs, and, at the institutional level, to the absence of suitable funding sources show that the experts' perception reaches beyond the logic of development per se. The order of priority of the variables in this study indicates that, in the current environment of uncertainty in the Brazilian economy, many variables must be analyzed and compared in order to optimize the scarce resources available to expand local development of the Paraná water-based energy matrix.

Relevance: 100.00%

Abstract:

International audience

Relevance: 100.00%

Abstract:

Master's in Forest and Natural Resources Engineering - Instituto Superior de Agronomia - UL

Relevance: 100.00%

Abstract:

Most new housing in Australia is occurring on greenfield sites on the edges of the capital cities. These housing developments are often criticised for their social and environmental unsustainability. These unsustainable suburbs are a legacy for future generations. They will create dire social and environmental problems if a serious economic downturn were to occur or a resource shortage (e.g. of oil) were to make accessibility impossible. Coupled with these threats is the social 'undesirability' of isolated suburbs where only those on low incomes make their home. Most of those on higher incomes seek established suburbs which have 'character', social amenities and ease of access; typically, these are older suburbs close to city centres. This paper describes a methodology that has been developed to analyse past and future housing developments. The results of the analysis can provide a guide to improving the sustainability of these suburbs. The methodology uses several criteria to reflect the fact that no single criterion is adequate to describe or analyse the sustainability of a housing development. Sustainability should embrace social and environmental perspectives, so a multi-criteria analysis is appropriate. The theoretical framework for this methodology has been described elsewhere; however, that previous work considered only five criteria: energy use, resource use, neighbourhood character, neighbourhood connectedness and social diversity. In each case, high and low sustainability practice has been identified so that ranking is possible. This paper initially summarizes the way in which these five criteria are assessed and then adds a sixth criterion, social connectedness, to address a perceived gap in the previous assessment. The results of the analysis of three suburbs reported in the previous work are updated. They score poorly in terms of social connectedness, underlining the need to 'repair' these suburbs in order to improve their overall sustainability.
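
A minimal sketch of the kind of multi-criteria scoring this implies, using the six criteria named above with equal weights and a simple weighted sum; the suburb names, scores and weighting scheme are illustrative assumptions rather than the paper's assessment procedure.

```python
import numpy as np

# Six criteria from the paper; scores run from 0 (low sustainability practice)
# to 1 (high). Suburb names, scores and equal weights are made up.
criteria = ["energy use", "resource use", "neighbourhood character",
            "neighbourhood connectedness", "social diversity", "social connectedness"]
suburbs = {
    "Suburb A": [0.3, 0.4, 0.2, 0.5, 0.4, 0.2],
    "Suburb B": [0.6, 0.5, 0.7, 0.6, 0.5, 0.4],
    "Suburb C": [0.4, 0.3, 0.5, 0.4, 0.6, 0.3],
}
weights = np.full(len(criteria), 1 / len(criteria))

scores = {name: float(np.dot(weights, vals)) for name, vals in suburbs.items()}
for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {score:.2f}")
```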

Relevance: 100.00%

Abstract:

Raman spectroscopy is among the primary techniques for the characterisation of graphene materials, as it provides insights into the quality of measured graphenes, including their structure and conductivity as well as the presence of dopants. However, our ability to draw conclusions based on such spectra is limited by a lack of understanding regarding the origins of the peaks. Consequently, traditional characterisation techniques, which estimate the quality of the graphene material using the intensity ratio between the D and the G peaks, are unreliable for both graphene oxide (GO) and reduced graphene oxide (rGO). Herein we reanalyse the Raman spectra of graphenes and show that traditional methods rely upon an apparent G peak which is in fact a superposition of the G and D' peaks. We use this understanding to develop a new Raman characterisation method for graphenes that considers the D' peak by using its overtone, the 2D' peak. We demonstrate the superiority and consistency of this method for calculating the oxygen content of graphenes, and use the relationship between the D' peak and graphene quality to define three regimes. This has important implications for purification techniques because, once GO is reduced beyond a critical threshold, further reduction offers limited gain in conductivity.
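
To illustrate the G/D' overlap at the heart of the argument, the sketch below fits two Lorentzians to a synthetic "apparent G band" and separates the G and D' contributions. The synthetic spectrum, peak positions and the use of SciPy's curve_fit are assumptions for illustration; they are not the paper's 2D'-based protocol.

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amp, centre, width):
    return amp * width**2 / ((x - centre)**2 + width**2)

def g_dprime(x, a_g, c_g, w_g, a_d, c_d, w_d):
    """Apparent 'G band' modelled as overlapping G and D' Lorentzians."""
    return lorentzian(x, a_g, c_g, w_g) + lorentzian(x, a_d, c_d, w_d)

# Synthetic spectrum: G near 1580 cm^-1 and D' near 1620 cm^-1, plus noise.
x = np.linspace(1500, 1700, 400)
rng = np.random.default_rng(2)
y = g_dprime(x, 1.0, 1582, 15, 0.45, 1620, 12) + rng.normal(0, 0.01, x.size)

# Deconvolve the apparent G peak into its G and D' components.
p0 = [1.0, 1580, 15, 0.4, 1620, 12]
popt, _ = curve_fit(g_dprime, x, y, p0=p0)
print("fitted G centre: %.1f cm^-1, D' centre: %.1f cm^-1" % (popt[1], popt[4]))
print("I(D')/I(G) amplitude ratio: %.2f" % (popt[3] / popt[0]))
```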

Relevance: 100.00%

Abstract:

In this paper, we propose and study a unified mixed-integer programming model that simultaneously optimizes fluence weights and multi-leaf collimator (MLC) apertures in the treatment planning optimization of VMAT, Tomotherapy, and CyberKnife. The contribution of our model is threefold: (i) our model optimizes the fluence and MLC apertures simultaneously for a given set of control points; (ii) our model can incorporate all volume limits or dose upper bounds for organs at risk (OAR) and dose lower bound limits for planning target volumes (PTV) as hard constraints, but it can also relax either of these constraint sets in a Lagrangian fashion and keep the other set as hard constraints; (iii) for faster solutions, we propose several heuristic methods based on the MIP model, as well as a meta-heuristic approach. The meta-heuristic is very efficient in practice, being able to generate dose- and machinery-feasible solutions for problem instances of clinical scale, e.g., obtaining feasible treatment plans for cases with 180 control points, 6,750 sample voxels and 18,000 beamlets in 470 seconds, or cases with 72 control points, 8,000 sample voxels and 28,800 beamlets in 352 seconds. With discretization and down-sampling of voxels, our method is capable of tackling a treatment field of 8,000-64,000 cm³, depending on the ratio of critical structures to unspecified tissues.
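
A toy, single-control-point sketch of the simultaneous fluence/aperture idea, written with the open-source PuLP/CBC stack (an assumption; the paper does not specify this solver): binary variables open or close each beamlet as a crude stand-in for MLC apertures, continuous variables set the fluence, and the PTV lower bounds and OAR upper bounds appear as hard constraints. The dose-influence matrix, dose bounds and problem size are made up.

```python
import pulp

D_ptv = [[1.0, 0.8, 0.2], [0.3, 1.1, 0.7]]   # dose per unit fluence at two PTV voxels
D_oar = [[0.2, 0.1, 0.6]]                    # dose per unit fluence at one OAR voxel
ptv_min, oar_max, w_max = 60.0, 20.0, 100.0

prob = pulp.LpProblem("toy_fluence_aperture", pulp.LpMinimize)
w = [pulp.LpVariable(f"w{j}", lowBound=0) for j in range(3)]      # fluence weights
z = [pulp.LpVariable(f"z{j}", cat="Binary") for j in range(3)]    # beamlet open/closed

prob += pulp.lpSum(z)                         # objective: open as few beamlets as possible
for j in range(3):
    prob += w[j] <= w_max * z[j]              # fluence only flows through open beamlets
for row in D_ptv:
    prob += pulp.lpSum(row[j] * w[j] for j in range(3)) >= ptv_min   # PTV dose lower bound
for row in D_oar:
    prob += pulp.lpSum(row[j] * w[j] for j in range(3)) <= oar_max   # OAR dose upper bound

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("fluence:", [v.value() for v in w], "open:", [v.value() for v in z])
```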

Relevance: 100.00%

Abstract:

OBJECTIVE: Evidence suggests that TV viewing is associated with body mass index (BMI) and metabolic syndrome (MetS) in adolescents. However, it is unclear whether dietary intake mediates these relationships.

METHODS: A cross-sectional analysis was conducted in adolescents (12-19 years) participating in the 2003-2006 United States National Health and Nutrition Examination Survey. BMI z scores (zBMI) (n = 3,161) and MetS (n = 1,379) were calculated using age- and sex-specific criteria for adolescents. TV viewing (h/day) was measured via a self-reported questionnaire, and dietary intake was assessed using two 24-h recalls. Using the MacKinnon method, a series of mediation analyses were conducted examining five dietary mediators (total energy intake, fruit and vegetable intake, discretionary snacks, sugar-sweetened beverages and diet quality) of the relationships between TV viewing and zBMI and MetS.
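
A minimal sketch of the product-of-coefficients logic behind a MacKinnon-style mediation analysis, with a percentile bootstrap for the indirect effect. The simulated variables, continuous outcome and the absence of survey weights and covariates are simplifications relative to the NHANES analysis reported here.

```python
import numpy as np
import statsmodels.api as sm

# Simulated stand-ins for TV viewing (X), sugar-sweetened beverages (mediator M)
# and zBMI (outcome Y); effect sizes are arbitrary.
rng = np.random.default_rng(3)
n = 1000
tv = rng.normal(2.0, 1.0, n)
ssb = 0.3 * tv + rng.normal(0, 1.0, n)
zbmi = 0.2 * tv + 0.25 * ssb + rng.normal(0, 1.0, n)

a = sm.OLS(ssb, sm.add_constant(tv)).fit().params[1]                             # X -> M
b = sm.OLS(zbmi, sm.add_constant(np.column_stack([tv, ssb]))).fit().params[2]    # M -> Y given X
indirect = a * b

# Percentile bootstrap for the indirect effect, as commonly recommended.
boot = []
for _ in range(500):
    idx = rng.integers(0, n, n)
    a_b = sm.OLS(ssb[idx], sm.add_constant(tv[idx])).fit().params[1]
    b_b = sm.OLS(zbmi[idx], sm.add_constant(np.column_stack([tv[idx], ssb[idx]]))).fit().params[2]
    boot.append(a_b * b_b)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect a*b = {indirect:.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```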

RESULTS: Small positive relationships were observed between TV viewing and zBMI (β = 0.99, p < 0.001) and TV viewing and MetS (OR = 1.18, p = 0.046). No dietary element appeared to mediate the relationship between TV viewing and zBMI. However, sugar-sweetened beverage consumption and fruit and vegetable intake partially mediated the relationship between TV viewing and MetS, explaining 8.7% and 4.1% of the relationship, respectively.

CONCLUSIONS: These findings highlight the complexity of the relationships between TV viewing, dietary intake and cardiometabolic health outcomes, and that TV viewing should remain a target for interventions.