Abstract:
Sulfamethoxazole (SMX) is among the antibiotics employed in aquaculture for prophylactic and therapeutic purposes. Environmental and food spread may be prevented by controlling its levels at several stages of fish farming. For this purpose, the present work proposes new SMX-selective electrodes for the potentiometric determination of this sulphonamide in water. The selective membranes were made of polyvinyl chloride (PVC) with tetraphenylporphyrin manganese(III) chloride or cyclodextrin-based compounds acting as ionophores. 2-Nitrophenyl octyl ether was employed as plasticizer, and tetraoctylammonium, dimethyldioctadecylammonium bromide or potassium tetrakis(4-chlorophenyl)borate was used as anionic or cationic additive. The best analytical performance was reported for ISEs of tetraphenylporphyrin manganese(III) chloride with 50 mol% of potassium tetrakis(4-chlorophenyl)borate relative to the ionophore. Nernstian behaviour was observed from 4.0 × 10−5 to 1.0 × 10−2 mol/L (10.0 to 2500 µg/mL), and the limit of detection was 1.2 × 10−5 mol/L (3.0 µg/mL). In general, the electrodes displayed steady potentials in the pH range of 6 to 9. EMF equilibrium was reached within 15 s at all concentration levels. The electrodes revealed good discriminating ability in environmental samples. The analytical application to contaminated waters showed recoveries from 96 to 106%.
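The Nernstian slope behind the reported linear range can be checked with a short calculation. The sketch below is illustrative only: it assumes a monovalent SMX anion (the charge is not stated in the abstract) and a temperature of 25 °C.

```python
import math

# Nernst slope at 25 °C for a monovalent ion: (R*T*ln 10)/(z*F) ≈ 59.2 mV/decade
R, F, T = 8.314, 96485.0, 298.15
z = 1  # assumed charge of the sulfamethoxazole anion

slope_mV = 1000 * R * T * math.log(10) / (z * F)

# Decades spanned by the reported linear range (4.0e-5 to 1.0e-2 mol/L)
decades = math.log10(1.0e-2 / 4.0e-5)
delta_emf_mV = slope_mV * decades  # expected EMF span over the linear range

print(round(slope_mV, 1))   # ≈ 59.2 mV per decade
print(round(decades, 2))    # ≈ 2.40 decades
print(round(delta_emf_mV, 1))
```

A measured slope close to this theoretical value is what the abstract's "Nernstian behaviour" refers to.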
Abstract:
20th International Conference on Reliable Software Technologies (Ada-Europe 2015), 22-26 June 2015, Madrid, Spain.
Abstract:
A vitamin E extraction method for rainbow trout flesh was optimized, validated, and applied to fish fed commercial and Gracilaria vermiculophylla-supplemented diets. Five extraction methods were compared. Vitamers were analyzed by HPLC/DAD/fluorescence. A solid-liquid extraction with n-hexane, which showed the best performance, was optimized and validated. Among the eight vitamers, only α- and γ-tocopherol were detected in muscle samples. The final method showed good linearity (>0.999), intra-day (<3.1%) and inter-day (<2.6%) precision, and recoveries (>96%). Detection and quantification limits were 39.9 and 121.0 ng/g of muscle for α-tocopherol, and 111.4 and 337.6 ng/g for γ-tocopherol, respectively. Compared with the control group, the dietary inclusion of 5% G. vermiculophylla resulted in a slight reduction of lipids in muscle and, consequently, of α- and γ-tocopherol. Nevertheless, the vitamin E profile in lipids was maintained. In general, the results may be explained by the lower vitamin E level in the seaweed-containing diet. Practical Applications: Based on the validation results and the low solvent consumption, the developed method can be used to analyze vitamin E in rainbow trout. The results of this work are also a valuable information source for fish feed industries and aquaculture producers, which can focus on improving seaweed inclusion in feeds as a source of vitamin E in fish muscle and thus take full advantage of bioactive components with an important role in fish health and flesh quality.
Abstract:
The Internet of Things (IoT) has emerged as a paradigm over the last few years as a result of the tight integration of the computing and the physical worlds. The requirement of remote sensing makes low-power wireless sensor networks one of the key enabling technologies of IoT. These networks face several challenges, especially in communication and networking, due to their inherent constraints: low-power operation, deployment in harsh and lossy environments, and limited computing and storage resources. The IPv6 Routing Protocol for Low-Power and Lossy Networks (RPL) [1] was proposed by the IETF ROLL (Routing Over Low-power and Lossy networks) working group and has been an IETF standard (RFC 6550) since March 2012. Although RPL largely satisfies the requirements of low-power and lossy sensor networks, several issues remain open for improvement and specification, in particular with respect to Quality of Service (QoS) guarantees and support for mobility. In this paper, we focus mainly on the RPL routing protocol. We propose some enhancements to the standard specification in order to provide QoS guarantees for static as well as mobile LLNs. For this purpose, we propose OF-FL (Objective Function based on Fuzzy Logic), a new objective function that overcomes the limitations of the standardized objective functions designed for RPL by considering important link and node metrics, namely end-to-end delay, number of hops, ETX (expected transmission count) and LQL (link quality level). In addition, we present the design of Co-RPL, an extension to RPL based on the corona mechanism that supports mobility, in order to overcome the problem of slow reactivity to frequent topology changes and thus provide better quality of service, mainly in dynamic network applications.
Performance evaluation results show that both OF-FL and Co-RPL yield a great improvement over the standard specification, mainly in terms of packet loss ratio and average network latency.
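As a rough illustration of the idea behind a fuzzy objective function (not the OF-FL design itself, whose membership functions and rule base are not given here), a candidate parent can be scored by mapping each metric to a [0, 1] quality and combining the scores with a fuzzy AND. All metric bounds below are assumptions.

```python
def quality(value, best, worst):
    """Map a raw metric to a [0, 1] quality score (1 = best), linearly.
    A simple stand-in for a fuzzy membership function."""
    if best == worst:
        return 1.0
    q = (worst - value) / (worst - best)
    return max(0.0, min(1.0, q))

def rank_parent(delay_ms, hops, etx, lql):
    # Assumed bounds for each metric, for illustration only
    scores = [
        quality(delay_ms, best=0, worst=500),  # end-to-end delay
        quality(hops,     best=1, worst=20),   # hop count
        quality(etx,      best=1, worst=10),   # expected transmission count
        quality(lql,      best=1, worst=7),    # link quality level (1 = best)
    ]
    # Fuzzy AND via minimum: a parent is only as good as its worst metric
    return min(scores)

# A node would prefer the candidate parent with the highest score:
candidates = {"A": rank_parent(80, 3, 1.4, 2), "B": rank_parent(300, 2, 4.0, 5)}
print(max(candidates, key=candidates.get))  # "A" under these assumed bounds
```

Real fuzzy-logic objective functions use overlapping membership sets and an inference rule base rather than a plain minimum, but the ranking principle is the same.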
Abstract:
The complexity of systems is considered an obstacle to the progress of the IT industry. Autonomic computing is presented as an alternative to cope with this growing complexity. It is a holistic approach, in which systems are able to configure, heal, optimize, and protect themselves. Web-based applications are an example of systems where the complexity is high. The number of components, their interoperability, and workload variations are factors that may lead to performance failures or unavailability scenarios. The occurrence of these scenarios affects the revenue and reputation of businesses that rely on these types of applications. In this article, we present a self-healing framework for Web-based applications (SHõWA). SHõWA is composed of several modules, which monitor the application, analyze the data to detect and pinpoint anomalies, and execute recovery actions autonomously. The monitoring is done by a small aspect-oriented programming agent. This agent does not require changes to the application source code and includes adaptive and selective algorithms to regulate the level of monitoring. The anomalies are detected and pinpointed by means of statistical correlation. The data analysis detects changes in the server response time and analyzes whether those changes are correlated with the workload or are due to a performance anomaly. In the presence of performance anomalies, the data analysis pinpoints the anomaly. Upon the pinpointing of anomalies, SHõWA executes a recovery procedure. We also present a study about the detection and localization of anomalies, the accuracy of the data analysis, and the performance impact induced by SHõWA. Two benchmarking applications, exercised through dynamic workloads, and different types of anomaly were considered in the study.
The results reveal that (1) SHõWA detects and pinpoints anomalies while the number of end users affected is still low; (2) SHõWA was able to detect anomalies without raising any false alarm; and (3) SHõWA does not induce a significant performance overhead (throughput was affected by less than 1%, and the response time delay was no more than 2 milliseconds).
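The core of the statistical-correlation idea described above can be sketched in a few lines: if a rise in response time tracks the workload, it is load-driven; if it does not, it looks like a performance anomaly. This is a minimal sketch, not the SHõWA implementation; the correlation threshold is an assumption.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def classify(resp_times, workload, threshold=0.8):
    """If the response-time change tracks the workload, treat it as
    load-driven; otherwise flag a performance anomaly (assumed threshold)."""
    r = pearson(resp_times, workload)
    return "workload-driven" if r >= threshold else "performance anomaly"

# Response time rising together with the request rate -> expected behaviour
print(classify([10, 12, 15, 20, 26], [100, 120, 150, 200, 260]))
# Response time rising while the workload stays flat -> anomaly
print(classify([10, 14, 22, 35, 60], [100, 101, 99, 100, 102]))
```

In a real deployment the windows would slide over time-series data and the threshold would be calibrated, but the decision rule is the same.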
Abstract:
Introduction: Cardiac Rehabilitation (CR) is increasingly started early in treatment, addressing the patient's needs so as to promote autonomy and responsibility for recovery through a multidisciplinary approach. Home-based programmes and the inclusion of information and communication technologies are attractive solutions for increasing the participation of selected patients and for including patient groups that are currently under-represented. Objectives: To systematize the current scientific evidence on the effectiveness of home-based cardiac rehabilitation programmes with remote monitoring through new technologies, compared with centre-based/hospital-based rehabilitation, in terms of adherence and physical activity. Methods: This work is a systematic review of the literature published between 2007 and 2014, based on a search of several scientific electronic databases (Elsevier - Science Direct, PEDro, PubMed, Scielo Portugal and B-on) with the keywords cardiac rehabilitation, home-based, centre-based, hospital-based, exercise-based rehabilitation, telemonitoring, smartphone, internet and physical activity, in all possible combinations. The studies were analysed independently by two reviewers for inclusion criteria and study quality. Results: Of the 101 studies identified, only ten were included. On the PEDro scale, four studies scored 5, four scored 6 and two scored 7 out of 10. The studies were conducted in adults aged between 18 and 80 years. The intervention programmes were divided into physical activity planning and self-management. All the exercise programmes led to an increase in exercise capacity and, consequently, better control of risk factors.
Given the adherence levels to home-based CR programmes and the positive results across different parameters compared with centre-based/hospital-based rehabilitation, the effectiveness of home-based telemonitoring is notable. Conclusion: Home telemonitoring is a key element in solving many of these patients' problems, providing simple, easy-to-use methods for achieving good adherence rates. Indeed, the use of information and communication technologies enables effective delivery and management of health care at home.
Abstract:
Project work presented as a partial requirement for the degree of Master in Statistics and Information Management
Abstract:
International Conference AZULEJAR, University of Aveiro, 10-12 October 2012
Abstract:
The paper presented herein proposes a reliability-based framework for quantifying structural robustness considering the occurrence of a major earthquake (mainshock) and subsequent cascading hazard events, such as aftershocks triggered by the mainshock. These events can significantly increase the probability of failure of buildings, especially structures already damaged by the mainshock. The application of the proposed framework is exemplified through three numerical case studies, corresponding to three SAC steel moment frame buildings of 3, 9, and 20 stories, designed to pre-Northridge codes and standards. Two-dimensional nonlinear finite element models of the buildings are developed using the Open System for Earthquake Engineering Simulation framework (OpenSees), with a finite-length plastic hinge beam model and a bilinear constitutive law with deterioration, and are subjected to multiple mainshock-aftershock seismic sequences. For the three buildings analyzed herein, it is shown that the structural reliability under a single seismic event can be significantly different from that under a sequence of seismic events. The reliability-based robustness indicator used shows that structural robustness is influenced by the extent to which a structure can distribute damage.
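The qualitative point that a damaging mainshock raises the failure probability under subsequent aftershocks can be illustrated with a toy Monte Carlo model. This is not the authors' reliability framework: all probabilities below are invented for illustration, and real analyses use nonlinear time-history simulation, not coin flips.

```python
import random

random.seed(42)

# Assumed conditional probabilities, for illustration only:
P_FAIL_MAIN = 0.05           # collapse during the mainshock
P_DAMAGE_MAIN = 0.30         # survives the mainshock but is damaged
P_FAIL_AFTER_INTACT = 0.02   # aftershock collapse, undamaged structure
P_FAIL_AFTER_DAMAGED = 0.15  # aftershock collapse, damaged structure

def simulate_sequence():
    """One mainshock-aftershock sequence; returns True on failure."""
    if random.random() < P_FAIL_MAIN:
        return True
    damaged = random.random() < P_DAMAGE_MAIN
    p_after = P_FAIL_AFTER_DAMAGED if damaged else P_FAIL_AFTER_INTACT
    return random.random() < p_after

N = 100_000
p_seq = sum(simulate_sequence() for _ in range(N)) / N
print(f"single event: {P_FAIL_MAIN:.3f}, full sequence: {p_seq:.3f}")
```

Even with these modest assumed numbers, the sequence failure probability roughly doubles relative to the mainshock alone, which is the effect the framework quantifies rigorously.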
Abstract:
Dissertation submitted for the degree of Master in Chemical and Biochemical Engineering
Abstract:
Dissertation submitted for the degree of Doctor in Chemical and Biochemical Engineering
Abstract:
Cloud computing has been one of the most important topics in information technology, aiming to ensure scalable and reliable on-demand services over the Internet. Expanding the application scope of cloud services requires cooperation between clouds from different providers with heterogeneous functionalities. This collaboration between cloud vendors can provide better Quality of Service (QoS) at a lower price. However, current cloud systems have been developed without concern for seamless cloud interconnection, and they do not support inter-cloud interoperability to enable collaboration between cloud service providers. Hence, this PhD work addresses the interoperability issue between cloud providers as a challenging research objective. The thesis proposes a new framework that supports inter-cloud interoperability in a heterogeneous cloud computing resource environment, with the goal of dispatching the workload to the most effective clouds available at runtime. Analysing the methodologies that have been applied to various interoperability problem scenarios led us to exploit Model Driven Architecture (MDA) and Service Oriented Architecture (SOA) as appropriate approaches for our inter-cloud framework. Moreover, since distributing operations in a cloud-based environment is an NP-complete problem, a Genetic Algorithm (GA) based job scheduler is proposed as part of the interoperability framework, offering workload migration with the best performance at the least cost. A new Agent-Based Simulation (ABS) approach is proposed to model the inter-cloud environment with three types of agents: Cloud Subscriber agents, Cloud Provider agents, and Job agents. The ABS model is used to evaluate the proposed framework.
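A GA-based job scheduler of the kind mentioned above can be sketched in miniature: assign each job to a cloud so that total cost is minimized. The encoding, operators, cost matrix and parameters below are assumptions for illustration, not the scheduler described in the thesis.

```python
import random

random.seed(1)

# Hypothetical cost of running each job on each cloud (4 jobs x 3 clouds):
COSTS = [[4, 2, 7], [3, 9, 1], [5, 5, 2], [8, 1, 6]]
N_JOBS, N_CLOUDS = len(COSTS), len(COSTS[0])

def fitness(assign):
    """Total cost of an assignment (list: job index -> cloud index)."""
    return sum(COSTS[j][c] for j, c in enumerate(assign))

def crossover(a, b):
    cut = random.randrange(1, N_JOBS)  # single-point crossover
    return a[:cut] + b[cut:]

def mutate(assign, rate=0.1):
    return [random.randrange(N_CLOUDS) if random.random() < rate else c
            for c in assign]

def evolve(pop_size=30, generations=50):
    pop = [[random.randrange(N_CLOUDS) for _ in range(N_JOBS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                 # lower cost = fitter
        elite = pop[: pop_size // 2]          # truncation selection
        children = [mutate(crossover(random.choice(elite),
                                     random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=fitness)

best = evolve()
# For these costs the true optimum is [1, 2, 2, 1] with total cost 6
print(best, fitness(best))
```

A production scheduler would score chromosomes on runtime estimates, migration cost and QoS constraints rather than a static cost matrix, but the evolutionary loop is the same.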
Abstract:
This paper studies the drivers of heuristic application in different decision types. The study compares the frequencies of heuristic classes such as recognition, one-reason choice and trade-off in memory-based versus stimulus-based choices, as well as in high- and low-involvement decisions. The study was conducted online among 205 participants from 28 countries.
Abstract:
The work described in this thesis was performed at the Laboratory for Intense Lasers (L2I) of Instituto Superior Técnico, University of Lisbon (IST-UL). Its main contribution is a feasibility study of the broadband dispersive stages for an optical parametric chirped pulse amplifier based on the nonlinear crystal yttrium calcium oxyborate (YCOB). In particular, the main goal of this work was the characterization and implementation of the several optical devices involved in expanding and compressing the amplified pulses to durations of the order of a few optical cycles (20 fs). Laser systems of this type find application in fields such as medicine, telecommunications and machining, which require high-energy, ultrashort (sub-100 fs) pulses. The main challenge was the preliminary study of the performance of the broadband amplifier, which is essential for successfully handling pulses with bandwidths exceeding 100 nm when amplified from the μJ level to 20 mJ per pulse. In general, the control, manipulation and characterization of optical phenomena on the scale of a few tens of fs, at powers that can reach the PW level, are extremely difficult due to the complexity and nonlinearity of radiation-matter interaction at this time scale and power level. For this purpose, the main dispersive components were characterized in detail, specifically addressing the demonstration of pulse expansion and compression. The tested bandwidths are narrower than the final ones, in order to confirm the parameters of these elements and predict the performance for the broadband pulses. The work also led to additional tasks, such as a detailed characterization of the laser oscillator seeding the laser chain and the detection and cancellation of additional sources of dispersion.
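The stretching role of the dispersive stages can be illustrated with the standard Gaussian-pulse formula relating group delay dispersion (GDD) to output duration. The GDD value below is an order-of-magnitude assumption for a grating stretcher, not a parameter of the L2I system.

```python
import math

def broadened_duration(tau_in_fs, gdd_fs2):
    """FWHM duration (fs) of a transform-limited Gaussian pulse after
    accumulating group delay dispersion gdd_fs2 (standard formula)."""
    k = 4 * math.log(2) * gdd_fs2 / tau_in_fs**2
    return tau_in_fs * math.sqrt(1 + k * k)

tau0 = 20.0   # fs, the compressed duration targeted in the thesis
gdd = 2.0e6   # fs^2, assumed stretcher dispersion for illustration

print(f"{broadened_duration(tau0, gdd) / 1e3:.0f} ps")  # ~277 ps
```

The compressor must then supply nearly equal and opposite dispersion; residual uncompensated terms are among the "additional sources of dispersion" whose detection and cancellation the thesis describes.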