24 results for Zero order
Abstract:
In the recent past, hardly anyone could have predicted this course of GIS development. GIS is moving from the desktop to the cloud. Web 2.0 enabled people to contribute data to the web, and these data are increasingly geolocated. The resulting volume of data has come to be known as "Big Data", and scientists still do not fully know how to handle it. Various Data Mining tools are used to extract useful information from Big Data. In our study, we deal with one part of these data: User Generated Geographic Content (UGGC). The Panoramio initiative allows people to upload photos and describe them with tags. These photos are geolocated, meaning they have an exact location on the Earth's surface according to a certain spatial reference system. Using Data Mining tools, we investigate whether it is possible to extract land use information from Panoramio photo tags, and to what extent this information is accurate. Finally, we compared different Data Mining methods in order to determine which performs best on this kind of data, namely text. Our answers are quite encouraging: with more than 70% accuracy, we showed that extracting land use information is possible to some extent. We also found the Memory Based Reasoning (MBR) method to be the most suitable for this kind of data in all cases.
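As a rough illustration of the MBR approach mentioned above, the sketch below labels a photo's land use from its tags using a k-nearest-neighbour vote over bag-of-words cosine similarity (MBR is essentially memory-based k-NN). The tag sets, labels and classes are hypothetical stand-ins, not taken from the Panoramio dataset used in the study.

```python
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words tag vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def mbr_classify(tags, training, k=3):
    """Label a photo by majority vote of its k most similar training photos."""
    query = Counter(tags)
    ranked = sorted(training, key=lambda ex: cosine(query, Counter(ex[0])), reverse=True)
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# Hypothetical Panoramio-style tag sets with land-use labels
training = [
    (["beach", "sand", "sea"], "coastal"),
    (["harbour", "boats", "sea"], "coastal"),
    (["wheat", "field", "tractor"], "agricultural"),
    (["barn", "cows", "field"], "agricultural"),
    (["skyscraper", "street", "traffic"], "urban"),
    (["square", "cafe", "street"], "urban"),
]

print(mbr_classify(["sea", "sand", "sunset"], training))  # coastal
```

In practice the training memory would hold many thousands of tagged photos, and similarity could be weighted by tag frequency (e.g. TF-IDF) rather than raw counts.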
Abstract:
Field Lab Entrepreneurial Innovative Ventures
Analysis of metabolic flux distributions in relation to the extracellular environment in Avian cells
Abstract:
Continuous cell lines that proliferate in chemically defined and simple media have been highly regarded as suitable alternatives for vaccine production. One such cell line is the AG1.CR.pIX avian cell line developed by PROBIOGEN. This cell line can be cultivated in a fully scalable suspension culture and adapted to grow in a chemically defined, calf-serum-free medium [1]–[5]. The medium composition and cultivation strategy are important factors for reaching high virus titers. In this project, a series of computational methods was used to simulate the cell's response to different environments. The study is based on the metabolic model of the central metabolism proposed in [1]. In a first step, Metabolic Flux Analysis (MFA) was used along with measured uptake and secretion fluxes to estimate intracellular flux values. The network and data were found to be consistent. In a second step, Flux Balance Analysis (FBA) was performed to assess the cell's biological objective. The objective that produced the best fit between predicted and experimental data was the minimization of oxidative phosphorylation. Employing this objective, Flux Variability Analysis (FVA) was then used to characterize the flux solution space. Furthermore, various scenarios simulating a reaction deletion (elimination of the corresponding compound from the medium) were run, and the flux solution space for each scenario was calculated. Growth restrictions caused by essential and non-essential amino acids were accurately predicted. Fluxes related to essential amino acid uptake and catabolism, lipid synthesis and ATP production via the TCA cycle were found to be essential to exponential growth. Finally, the data gathered during the previous steps were analyzed using principal component analysis (PCA) in order to assess potential changes in the physiological state of the cell. Three metabolic states were found, corresponding to zero, partial and maximum biomass growth rate.
Elimination of non-essential amino acids or pyruvate from the medium showed no impact on the cell's assumed normal metabolic state.
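The FBA step described above reduces to a linear program: optimize an objective flux subject to steady-state mass balances S·v = 0 and flux bounds. A minimal sketch on a hypothetical three-reaction toy network (not the avian-cell model from [1]), maximizing a biomass flux:

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix over metabolites {A, B} and reactions
# v1: uptake -> A,  v2: A -> B,  v3: B -> biomass (objective flux)
S = np.array([
    [1, -1,  0],   # mass balance for A
    [0,  1, -1],   # mass balance for B
])

# FBA: maximize v3 subject to S v = 0 and flux bounds.
c = [0, 0, -1]                              # linprog minimizes, so negate
bounds = [(0, 10), (0, None), (0, None)]    # uptake capped at 10
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)  # steady state forces v1 = v2 = v3, so all hit the uptake cap: [10, 10, 10]
```

A reaction-deletion scenario, as in the study, amounts to setting that reaction's bounds to (0, 0) and re-solving; FVA repeats the solve with each flux in turn as the objective, holding the original optimum fixed, to bound the solution space.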
Abstract:
The catastrophic disruption of the USA financial system in the wake of the financial crisis prompted the Federal Reserve to launch a Quantitative Easing (QE) programme in late 2008. In line with Pesaran and Smith (2014), I use a policy effectiveness test to assess whether this massive asset purchase programme was effective in stimulating economic activity in the USA. Specifically, I employ an Autoregressive Distributed Lag (ARDL) model in order to obtain a counterfactual for the USA real GDP growth rate. Using data from 1983Q1 to 2009Q4, the results show that the beneficial effects of QE appear to be weak and rather short-lived. The null hypothesis of policy ineffectiveness is not rejected, which suggests that QE did not have a meaningful impact on output growth.
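The counterfactual exercise described above can be sketched as: fit an ARDL(1,1) by OLS, then project the outcome path with the policy variable switched off. The series, coefficients and break date below are simulated stand-ins, not the actual USA data or the Pesaran-Smith test statistic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated quarterly series standing in for output growth (y) and a
# policy variable (x); the actual study uses USA data, 1983Q1-2009Q4.
T = 120
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t-1] + 0.3 * x[t] + 0.1 * x[t-1] + rng.normal(scale=0.2)

# ARDL(1,1) estimated by OLS: y_t = a + b*y_{t-1} + c*x_t + d*x_{t-1} + e_t
X = np.column_stack([np.ones(T - 1), y[:-1], x[1:], x[:-1]])
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
a, b, c, d = coef

# Counterfactual: project output with the policy variable set to zero
# after a hypothetical intervention date (here t = 80)
y_cf = y.copy()
for t in range(80, T):
    y_cf[t] = a + b * y_cf[t-1]     # x_t = x_{t-1} = 0 under "no policy"
effect = y[80:] - y_cf[80:]         # estimated policy-effect path
```

The policy effectiveness test then asks whether the average of this effect path differs significantly from zero, given the forecast uncertainty of the counterfactual.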
Abstract:
This study addresses the new concept of Nearly Zero Energy Buildings (NZEB) in the European building stock, namely the residential buildings that will be constructed from 2020 onwards. These buildings will have to meet minimum energy-efficiency requirements in order to reduce the dependence on fossil fuels, which has been increasing significantly over recent years. In this work, a single-family residential building located in the Lisbon area was modelled and dimensioned, applying techniques and strategies that increase its energy efficiency. The modelling was carried out with the DesignBuilder software, followed by an energy analysis using EnergyPlus. After this analysis, a study was performed with the SolTerm software to integrate renewable energy production systems into the building, in order to meet its energy needs for space heating, domestic hot water (DHW) and electricity. To cover the cooling and heating needs that the solar thermal system cannot supply, a heat pump was introduced, reducing electricity consumption and making the building more sustainable. For electricity production, a solar photovoltaic system was added. At this stage, the cost-benefit ratio of each of these systems was also evaluated, in order to choose the option that maximises coverage of the building's needs while remaining economically viable. Finally, the results were critically analysed and framed against what is required of an NZEB.
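The cost-benefit step at the end of that workflow can be sketched as a simple payback comparison between candidate systems. All figures below are hypothetical placeholders, not results from the study:

```python
# Hypothetical capital costs and annual energy savings (EUR) per system;
# placeholder figures only, not values from the DesignBuilder/SolTerm study
systems = {
    "solar_thermal": {"capex": 4000.0, "annual_saving": 450.0},
    "photovoltaic":  {"capex": 6000.0, "annual_saving": 700.0},
    "heat_pump":     {"capex": 5000.0, "annual_saving": 650.0},
}

# Simple payback period in years: capex / annual saving, lower is better
payback = {name: s["capex"] / s["annual_saving"] for name, s in systems.items()}
best = min(payback, key=payback.get)
print(best, round(payback[best], 2))
```

A fuller appraisal would discount future savings (net present value) and account for the interaction between systems, e.g. the heat pump reducing the load the photovoltaic array must cover.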
Abstract:
The study of AC losses in superconducting pancake coils is of utmost importance for the development of superconducting devices. Due to various technical difficulties, this study usually follows one of two approaches: coils with few turns whose AC losses are measured over a wide frequency range, or coils with a large number of turns whose AC losses are measured only at low frequencies. In this work, AC losses are studied in 128-turn superconducting coils, considering frequencies ranging from 50 Hz to 1152 Hz and currents ranging from zero up to the critical current of the coils. Moreover, AC losses are also studied under two simultaneous harmonic components, and the results are compared to the behaviour of the coils when operating in a single-frequency regime. Different electrical methods are used to verify the total AC losses in the coil, and a simple calorimetric method is presented in order to measure AC losses in a multi-harmonic context. Different analytical and numerical methods are implemented and/or used to design the superconducting coils and to compute the total AC losses in the superconducting system, and a comparison is performed to verify the advantages and drawbacks of each method.
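Since hysteresis loss per cycle in a superconductor is, to first order, frequency-independent, the dissipated power scales roughly linearly with frequency over a 50-1152 Hz sweep. As a single-tape analytical estimate (coil-level losses require the numerical methods mentioned above), the Norris thin-strip formula gives the transport-current loss per cycle; the critical current below is a hypothetical figure:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability [H/m]

def norris_strip_loss(i_peak: float, i_c: float) -> float:
    """Norris thin-strip hysteresis loss per cycle per unit length [J/m].

    i_peak: peak transport current [A]; i_c: critical current [A].
    Valid for 0 <= i_peak < i_c.
    """
    f = i_peak / i_c
    return (MU0 * i_c**2 / math.pi) * (
        (1 - f) * math.log(1 - f) + (1 + f) * math.log(1 + f) - f**2
    )

i_c = 100.0                          # hypothetical tape critical current [A]
q = norris_strip_loss(50.0, i_c)     # loss per cycle at I = Ic/2 [J/m]
# Loss per cycle is frequency-independent, so power per unit length
# grows linearly with frequency:
print(q * 50, q * 1152)              # [W/m] at 50 Hz vs. 1152 Hz
```

In a multi-harmonic regime this simple scaling no longer holds directly, which is one motivation for the calorimetric measurement presented in the work.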
Abstract:
Despite the extensive literature on finding new models to replace the Markowitz model, or on increasing the accuracy of its input estimates, there are fewer studies on how the choice of optimization algorithm affects the results. This paper adds to this field by comparing the performance of two optimization algorithms in drawing the Markowitz Efficient Frontier and in real-world investment strategies. Second-order cone programming is faster and appears to be more efficient, but it is impossible to assert which algorithm is better: quadratic programming often shows superior performance in real investment strategies.
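With only the two equality constraints (full investment and a target return) and short sales allowed, the Markowitz frontier has a well-known closed form; the comparison between quadratic programming and second-order cone programming matters once inequality constraints (e.g. no short sales) rule this closed form out and a numerical solver is needed. A sketch with hypothetical inputs, not the paper's data:

```python
import numpy as np

# Hypothetical annualised inputs for three assets (placeholder values)
mu = np.array([0.08, 0.10, 0.12])            # expected returns
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.16]])        # covariance matrix

inv = np.linalg.inv(Sigma)
ones = np.ones(len(mu))

# Scalars of the classic two-constraint solution:
# minimise w' Sigma w  s.t.  w'mu = r_target,  w'1 = 1
A = ones @ inv @ ones
B = ones @ inv @ mu
C = mu @ inv @ mu
D = A * C - B * B

def frontier_weights(r_target: float) -> np.ndarray:
    """Minimum-variance weights hitting r_target (short sales allowed)."""
    lam = (A * r_target - B) / D
    gam = (C - B * r_target) / D
    return inv @ (lam * mu + gam * ones)

w = frontier_weights(0.10)
# By construction w sums to 1 and earns the 10% target return
```

Sweeping `r_target` over a grid of returns traces the whole frontier; adding a long-only constraint (w >= 0) turns each point into the QP/SOCP problem the paper benchmarks.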
Abstract:
Branding Lab
Abstract:
This work project explores how a men's luxury fashion brand (a subsidiary) associated with a luxury car brand (its parent company) should develop its communication strategy in order to increase brand awareness in Europe. For this purpose, quantitative research was conducted to determine whether the company in question had low brand awareness among European luxury consumers. Subsequently, qualitative research revealed important insights into luxury communication among male luxury consumers. The results of this research, together with recommendations from luxury experts, laid the foundation for a solution-oriented communication strategy. The analysis highlights the importance of the shared heritage and the synergy effects, of which the subsidiary should make extensive use in its communication.