924 results for specification
Abstract:
Doctoral thesis, Fisheries Sciences and Technology, Faculdade de Ciências do Mar e do Ambiente, Universidade do Algarve, 2005
Abstract:
The Montado ecosystem of the Alentejo Region, in the south of Portugal, exhibits enormous agro-ecological and economic heterogeneity. Homogeneous sub-units were defined within this heterogeneous ecosystem, but only partial statistical information on the allocation of soil to agro-forestry activities is available for them. The paper proposes to recover the unknown soil allocation in each homogeneous sub-unit by disaggregating a complete data set for the Montado ecosystem area using the incomplete information available at the sub-unit level. The methodological framework is based on a Generalized Maximum Entropy approach developed in three steps: the specification of an r-order Markov process, the estimation of aggregate transition probabilities, and the disaggregation of the data to recover the unknown soil allocation in each homogeneous sub-unit. The quality of the results is evaluated using the predicted absolute deviation (PAD) and the "Disaggregation Information Gain" (DIG), and shows very acceptable estimation errors.
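The abstract does not spell out the estimator, but the flavour of a maximum-entropy recovery of transition probabilities under aggregate constraints can be illustrated with a small sketch. The land-use classes, shares and the plain entropy objective below are illustrative assumptions, not the paper's three-step GME formulation.

```python
# Toy sketch of the maximum-entropy flavour of the approach (not the paper's
# exact GME formulation): estimate a land-use transition matrix P that is as
# uniform (maximum entropy) as possible while reproducing observed aggregate
# shares, then reuse it for a sub-unit with incomplete data.
import numpy as np
from scipy.optimize import minimize

uses = ["cork oak", "pasture", "crops"]        # hypothetical land-use classes
x_t  = np.array([0.50, 0.30, 0.20])            # aggregate shares at time t
x_t1 = np.array([0.45, 0.35, 0.20])            # aggregate shares at time t+1
n = len(uses)

def neg_entropy(p_flat):
    p = np.clip(p_flat, 1e-12, 1.0)
    return np.sum(p * np.log(p))               # minimising this maximises entropy

constraints = (
    # each row of P must sum to 1 (valid transition probabilities)
    {"type": "eq", "fun": lambda p: p.reshape(n, n).sum(axis=1) - 1.0},
    # aggregate consistency: shares at t+1 equal P' applied to shares at t
    {"type": "eq", "fun": lambda p: p.reshape(n, n).T @ x_t - x_t1},
)
res = minimize(neg_entropy, np.full(n * n, 1.0 / n), bounds=[(0, 1)] * (n * n),
               constraints=constraints, method="SLSQP")
P = res.x.reshape(n, n)

# Apply the aggregate transition matrix to a sub-unit whose time-t shares are
# known, giving a first estimate of its unknown time-t+1 allocation.
sub_unit_t = np.array([0.70, 0.20, 0.10])
print(np.round(P, 3))
print("estimated sub-unit shares at t+1:", np.round(P.T @ sub_unit_t, 3))
```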
Abstract:
This study presents and discusses the tsunami hazard posed by an updated CSZ earthquake scenario to the coastal communities of Port Angeles and Port Townsend, based on the results of a high-resolution GeoClaw simulation with 2/3 arc-second resolution (about 20.56 meters) around these towns. In addition, we present the results of a coarse regional simulation of the Strait of Juan de Fuca. This coarse study encompasses 28 regions that span the Strait's coast, including the communities of Anacortes, Bellingham, Friday Harbor, and Victoria, BC, in addition to extended areas around Port Angeles and Port Townsend. The finest grids for these 28 regions, on which we collected results, had 2 arc-second resolution (around 62 meters). Finally, we discuss some inherent uncertainties in the specification of the earthquake scenario, the limitations of the GeoClaw model, and the associated uncertainties in the results.
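For reference, the quoted cell sizes follow from the ground length of an arc second of latitude. The sketch below uses a standard spherical-Earth approximation; the radius and latitude are assumptions for illustration, not values stated in the abstract.

```python
# Rough sketch of how the quoted grid sizes follow from arc-second resolution.
# Assumes a spherical Earth of radius 6371 km and a latitude near Port Angeles;
# the exact figures depend on the convention the authors used.
import math

EARTH_RADIUS_M = 6_371_000.0

def arcsec_to_meters(arcsec, latitude_deg=48.1):
    """Approximate ground span of one grid cell of `arcsec` arc seconds."""
    deg = arcsec / 3600.0
    north_south = math.radians(deg) * EARTH_RADIUS_M
    east_west = north_south * math.cos(math.radians(latitude_deg))
    return north_south, east_west

for res in (2 / 3, 2.0):
    ns, ew = arcsec_to_meters(res)
    print(f"{res:.3f} arcsec: ~{ns:.1f} m north-south, ~{ew:.1f} m east-west")
```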
Abstract:
Adulteration of Ginkgo products sold as unregistered supplements within the very large market of Ginkgo products (reputedly £650 million annually), through the post-extraction addition of cheaper (e.g. buckwheat-derived) rutin, is suspected to allow sub-standard products to appear satisfactory to third parties, e.g. secondary buyers along the value chain or regulatory authorities. This study was therefore carried out to identify products that did not conform to their label specification and may have been actively adulterated to enable access to the global markets. A 500 MHz Bruker NMR spectrometer combined with Topspin version 3.2 and a CAMAG HPTLC system (HPTLC Association method for the analysis of Ginkgo biloba leaf) were used to generate NMR spectra (focusing on the 6–8 ppm region for analysis) and chromatograms, respectively. Out of the 35 samples of Ginkgo biloba analysed, 33 were found to contain elevated levels of rutin and/or quercetin, or low levels of Ginkgo metabolites, when compared with the reference samples. Samples with disproportionate levels of rutin or quercetin compared with other Ginkgo metabolites are likely to be adulterated, either by accident or intentionally, and samples with low or non-existent Ginkgo metabolite content may have been produced using poor extraction techniques. Only two of the investigated samples were found to match the High-Performance Thin-Layer Chromatography (HPTLC) fingerprint of the selected reference material; all others deviated significantly. One product contained a 5-hydroxytryptophan derivative, which is not a natural constituent of Ginkgo biloba. Overall, these findings suggest either poor extraction techniques or deliberate adulteration along the value chain. Investigating the ratio of different flavonoids, e.g. quercetin and kaempferol, using NMR spectroscopy and HPTLC will provide further evidence as to the degree and kind of adulteration of Ginkgo supplements. From a consumer perspective, the equivalence in identity and overall quality of the products needs to be guaranteed for supplements too, and not only for products produced according to a quality standard or pharmacopoeial monograph.
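As a rough illustration of the kind of ratio-based screening the abstract points to, a small sketch is given below. The sample values and thresholds are invented for illustration; the study reports no numerical cut-offs here.

```python
# Hypothetical sketch of ratio-based screening: flag samples whose rutin or
# quercetin signal is disproportionate relative to Ginkgo marker compounds.
# Peak areas and thresholds are invented for illustration only.
samples = {
    # sample_id: relative signal areas (arbitrary units)
    "A01": {"rutin": 1.2, "quercetin": 0.8, "ginkgolides": 1.0},
    "A02": {"rutin": 9.5, "quercetin": 0.9, "ginkgolides": 0.4},   # rutin-spiked?
    "A03": {"rutin": 0.1, "quercetin": 0.1, "ginkgolides": 0.05},  # weak extract?
}
MAX_FLAVONOL_RATIO = 3.0   # illustrative threshold only
MIN_MARKER_LEVEL = 0.2     # illustrative threshold only

for sid, s in samples.items():
    ratio = (s["rutin"] + s["quercetin"]) / max(s["ginkgolides"], 1e-9)
    if ratio > MAX_FLAVONOL_RATIO:
        print(f"{sid}: flavonol/marker ratio {ratio:.1f} -> possible post-extraction spiking")
    elif s["ginkgolides"] < MIN_MARKER_LEVEL:
        print(f"{sid}: low marker content -> possible poor extraction")
    else:
        print(f"{sid}: consistent with reference profile")
```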
Abstract:
Previous research on the prediction of fiscal aggregates has shown evidence that simple autoregressive models often provide better forecasts of fiscal variables than multivariate specifications. We argue that the multivariate models considered by previous studies are small-scale, probably burdened by overparameterization, and not robust to structural changes. Bayesian Vector Autoregressions (BVARs), on the other hand, allow the information contained in a large data set to be summarized efficiently, and can also allow for time variation in both the coefficients and the volatilities. In this paper we explore the performance of BVARs with constant and drifting coefficients for forecasting key fiscal variables such as government revenues, expenditures, and interest payments on the outstanding debt. We focus on both point and density forecasting, as assessments of a country’s fiscal stability and overall credit risk should typically be based on the specification of a whole probability distribution for the future state of the economy. Using data from the US and the largest European countries, we show that both the adoption of a large system and the introduction of time variation help in forecasting, with the former playing a relatively more important role in point forecasting, and the latter being more important for density forecasting.
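As a point of reference, the sketch below fits a small constant-coefficient VAR with statsmodels and produces a point forecast. It is a deliberately simplified stand-in for the large BVARs with drifting coefficients and volatilities evaluated in the paper, and the data are simulated placeholders, not the US or European fiscal series.

```python
# Minimal sketch of the forecasting setup, using a small constant-coefficient
# VAR from statsmodels rather than the large Bayesian VARs studied in the paper.
# The three series are simulated placeholders for fiscal aggregates.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
data = pd.DataFrame(rng.normal(size=(120, 3)).cumsum(axis=0) * 0.1,
                    columns=["revenues", "expenditures", "interest_payments"])

model = VAR(data)
results = model.fit(2)                                   # fixed lag length of 2
point_forecast = results.forecast(data.values[-results.k_ar:], steps=8)
print(pd.DataFrame(point_forecast, columns=data.columns).round(3))
```

A density forecast, as emphasised in the abstract, would require simulating from the estimated (or posterior) distribution rather than reporting the point path alone.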
Abstract:
Meeting European emissions targets is reliant on innovative renewable technologies, particularly 'renewable heat' from heat pumps. Heat pump performance is driven by Carnot efficiency, and optimum performance requires the lowest possible space-heating flow temperatures, leading to greater sensitivity to poor design, installation and operation. Does sufficient training and installer capacity exist for this technology? This paper situates the results of heat pump field-trial performance in a socio-technical context, identifying how far installer competence requirements are met within the current vocational education and training (VET) system, and considers possible futures. Few UK installers have formal heat pump qualifications at National Vocational Qualification (NVQ) level 3, and heat pump VET is generally delivered through short courses whose structure is largely unregulated, with no strict adherence to a common syllabus or a detailed training centre specification. Prerequisites for short-course trainees, specifically the demand for heating-system knowledge based on metric design criteria, are limited, and proof of 'experience' is an accepted alternative to formal educational qualifications. The lack of broader educational content and deficiencies in engineering knowledge will have profound negative impacts on both the performance and market acceptance of heat pumps. Possible futures to address this problem are identified.
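The link between flow temperature and performance can be made concrete with the ideal (Carnot) heating COP, T_hot / (T_hot - T_cold) in kelvin. The temperatures in the sketch below are illustrative, not field-trial values, and real heat pumps achieve only a fraction of the ideal figure, but the trend is the same.

```python
# Small illustration of why lower flow temperatures matter: the ideal (Carnot)
# heating COP is T_hot / (T_hot - T_cold) with temperatures in kelvin.
# Temperatures are illustrative, not taken from the field trials in the paper.
def carnot_cop_heating(flow_temp_c, source_temp_c):
    t_hot = flow_temp_c + 273.15
    t_cold = source_temp_c + 273.15
    return t_hot / (t_hot - t_cold)

source = 2.0  # outdoor air temperature in deg C
for flow in (35.0, 45.0, 55.0):  # underfloor vs radiator-style flow temperatures
    print(f"flow {flow:.0f} degC -> ideal COP {carnot_cop_heating(flow, source):.1f}")
```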
Abstract:
Project presented to the Instituto Superior de Contabilidade e Administração do Porto to obtain the Master's degree in Management Assistance (Assessoria de Administração)
Abstract:
Dissertation presented to the Instituto Superior de Contabilidade to obtain the Master's degree in Entrepreneurship and Internationalisation, supervised by Professor Doutor José de Freitas Santos
Abstract:
In the pharmaceutical industry, the cleaning of equipment and surfaces is very important in the manufacturing/packaging of pharmaceutical products. Potential contaminating residues must be removed from the equipment and surfaces involved in the process. According to Good Manufacturing Practices (GMP), the cleaning procedures and the analytical methods used to determine residue levels must be validated. The analytical method, combined with the sampling method used to collect samples, must be subjected to a recovery test. This work presents an innovative strategy for cleaning validation of semi-solid pharmaceutical forms. The proposed sampling method consists of collecting a sample directly after manufacture, with residue analysis performed directly on that sample. The products chosen to evaluate the strategy were two dermatological medicines, presented as ointments and produced in a multi-product manufacturing unit by Schering Plough Farma / Merck Sharp & Dohme (Cacém, Portugal). For residue quantification, validated spectrophotometric (HPLC) methods used in the analysis of the finished product were employed. Cleaning validation was assessed by analysing a known quantity of ointment (product B (*)) using the analytical method of the previously manufactured ointment (product A (*)), in order to verify whether cleaning agent and active substances remained after cleaning following product A, and vice versa. The residual concentrations of the active substances and of the cleaning agent found after cleaning were nil, i.e. below the limit of detection (LOD); the cleaning acceptance criteria used were 6.4 × 10⁻⁴ mg/g for active substance 1 (*), 1.0 × 10⁻² mg/g for active substance 2 (*), 1.0 × 10⁻³ mg/g for active substance 3 (*), and 10 ppm for the cleaning agent. In the recovery test, results above 70% were obtained for all active substances and for the cleaning agent in both ointments. Before carrying out this recovery test, the chromatographic conditions of the analytical methods for both products and for the cleaning agent had to be adjusted so as to obtain system suitability values (tailing factor and resolution) within specifications. The precision of the results, reported as relative standard deviation (RSD), was below 2.0%, except in the tests involving active substance 3, whose specification is below 10.0%. The results obtained demonstrate that the cleaning procedures used in the manufacturing unit in question are effective, thus ruling out cross-contamination.
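The two acceptance calculations mentioned at the end of the abstract, percent recovery and relative standard deviation (RSD), can be sketched as follows; the figures below are illustrative, not the study's measurements.

```python
# Sketch of the two acceptance calculations referred to in the abstract:
# percent recovery of a spiked amount and relative standard deviation (RSD)
# of replicate determinations. The numbers are illustrative only.
import statistics

def percent_recovery(found_mg, spiked_mg):
    return 100.0 * found_mg / spiked_mg

def rsd_percent(replicates):
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

spiked, found = 0.50, 0.41              # mg of active substance, hypothetical
replicates = [0.41, 0.40, 0.42, 0.41]   # replicate determinations, hypothetical

print(f"recovery {percent_recovery(found, spiked):.0f}% (acceptance: > 70%)")
print(f"RSD {rsd_percent(replicates):.2f}% (acceptance: < 2.0%, or < 10.0% for active substance 3)")
```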
Abstract:
The emergence of new business models, namely the establishment of partnerships between organizations and the opportunity companies have to enrich their information with existing data on the web, especially the semantic web, has highlighted problems in databases, particularly those related to data quality. Poor data can result in a loss of competitiveness for the organizations holding them, and may even lead to their disappearance, since many of their decision-making processes are based on these data. For this reason, data cleaning is essential. Current approaches to these problems are closely tied to database schemas and specific domains. For data cleaning to be usable across different repositories, computer systems must be able to understand the data, i.e., associated semantics are needed. The solution presented in this paper uses ontologies: (i) for the specification of data cleaning operations, and (ii) as a way of solving the semantic heterogeneity problems of data stored in different sources. With data cleaning operations defined at a conceptual level, and given mappings between domain ontologies and an ontology derived from a database, the operations can be instantiated and proposed to the expert/specialist for execution over that database, thus enabling interoperability.
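As an illustration of what an ontology-level specification of a cleaning operation might look like, a sketch using rdflib is given below. The namespace, class and property names are invented for this example; they are not the ontology proposed in the paper.

```python
# Hypothetical sketch of specifying a data cleaning operation at the ontology
# level with rdflib. The vocabulary below is invented for illustration.
from rdflib import Graph, Namespace, Literal, RDF

CLEAN = Namespace("http://example.org/cleaning#")
DOM = Namespace("http://example.org/domain#")

g = Graph()
g.bind("clean", CLEAN)
g.bind("dom", DOM)

op = CLEAN["op1"]
g.add((op, RDF.type, CLEAN.StandardizationOperation))      # conceptual-level operation
g.add((op, CLEAN.appliesToProperty, DOM.customerCountry))  # domain-ontology concept
g.add((op, CLEAN.rule, Literal("map country names to ISO 3166-1 alpha-2 codes")))

# With a mapping from dom:customerCountry to, say, a concrete column in a
# database-derived ontology, the same operation could be instantiated there.
print(g.serialize(format="turtle"))
```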
Abstract:
Cloud computing is increasingly being adopted in different scenarios, such as social networking, business applications, and scientific experiments. Relying on virtualization technology, the construction of these computing environments targets improvements in the infrastructure, such as power efficiency and fulfillment of users' SLA specifications. The methodology usually applied is to pack all the virtual machines onto the appropriate physical servers. However, failures in these networked computing systems can have a substantial negative impact on system performance, deviating the system from our initial objectives. In this work, we propose adapted algorithms to dynamically map virtual machines to physical hosts in order to improve cloud infrastructure power efficiency with low impact on users' required performance. Our decision-making algorithms leverage proactive fault-tolerance techniques to deal with system failures, combined with virtual machine technology to share node resources in an accurate and controlled manner. The results indicate that our algorithms perform better with respect to power efficiency and SLA fulfillment in the face of cloud infrastructure failures.
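As a baseline for the kind of consolidation such algorithms extend, the sketch below packs VMs onto hosts with a first-fit decreasing heuristic by CPU demand so that fewer hosts stay powered on. It is not the paper's adapted, failure-aware algorithm, and the capacities and demands are hypothetical.

```python
# Sketch of a baseline consolidation heuristic: first-fit decreasing packing of
# VMs onto identical hosts by CPU demand. Capacities and demands are hypothetical.
def first_fit_decreasing(vm_demands, host_capacity):
    """Return a list of hosts, each a list of (vm_id, demand)."""
    hosts = []  # each entry: [remaining_capacity, [(vm_id, demand), ...]]
    for vm_id, demand in sorted(vm_demands.items(), key=lambda kv: -kv[1]):
        for host in hosts:
            if host[0] >= demand:          # fits on an already-open host
                host[0] -= demand
                host[1].append((vm_id, demand))
                break
        else:                              # no open host fits: power on a new one
            hosts.append([host_capacity - demand, [(vm_id, demand)]])
    return [h[1] for h in hosts]

vms = {"vm1": 0.6, "vm2": 0.5, "vm3": 0.3, "vm4": 0.2, "vm5": 0.2}  # CPU shares
for i, host in enumerate(first_fit_decreasing(vms, host_capacity=1.0), 1):
    print(f"host {i}: {host}")
```

A failure-aware variant, as the abstract describes, would additionally avoid placing VMs on hosts predicted to fail and migrate them proactively.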
Abstract:
Final Master's project to obtain the Master's degree in Mechanical Engineering, with specialisation in Energy, Air Conditioning and Refrigeration
Abstract:
Project work to obtain the Master's degree in Informatics and Computer Engineering
Abstract:
Scientific dissertation to obtain the Master's degree in Civil Engineering, in the specialisation area of Buildings
Abstract:
ARINC Specification 653-2 describes the interface between application software and the underlying middleware in a distributed real-time avionics system. The real-time workload in this system comprises partitions, where each partition consists of one or more processes. Processes incur blocking and preemption overheads and can communicate with other processes in the system. In this work we develop compositional techniques for the automated scheduling of such partitions and processes. At present, system designers manually schedule partitions based on interactions they have with the partition vendors. This approach is not only time-consuming, but can also result in under-utilization of resources. In contrast, the technique proposed in this paper is a principled approach to scheduling ARINC-653 partitions and should therefore facilitate system integration.
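To give a feel for table-driven partition scheduling in ARINC 653 systems, the sketch below derives a major frame and lists when each partition's budget must be served. The periods and budgets are hypothetical, and this is not the paper's compositional analysis, which constructs such schedules automatically from process-level requirements.

```python
# Sketch of the table-driven style of ARINC 653 partition scheduling: a fixed
# major frame subdivided into per-partition windows. Periods and budgets are
# hypothetical; an actual schedule would serialize overlapping release points.
from math import gcd
from functools import reduce

partitions = {          # partition: (period_ms, budget_ms)
    "FlightCtrl": (25, 10),
    "Navigation": (50, 10),
    "Display":    (100, 20),
}

# Major frame = least common multiple of the partition periods.
major_frame = reduce(lambda a, b: a * b // gcd(a, b), (p for p, _ in partitions.values()))
utilization = sum(budget / period for period, budget in partitions.values())
print(f"major frame: {major_frame} ms, total utilization: {utilization:.2f}")

# List when each partition's budget must be served within the major frame.
windows = []
for name, (period, budget) in partitions.items():
    for start in range(0, major_frame, period):
        windows.append((start, name, budget))
for start, name, budget in sorted(windows):
    print(f"t={start:3d} ms: {name} requires {budget} ms of budget")
```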