122 results for Web content adaptation
Abstract:
Concepts such as Globalization, Internationalization, Localization and Translation arise, in the reality of the language industry, strongly interconnected but with perspectives that do not always agree. Our work therefore analyses the different definitions proposed, paying special attention to the concept of localization, since our project examines the localization process of a Web page carried out in a real business context. Our focus falls more specifically on the localization of Web pages, the object of our project, identifying not only its particularities, but also the participants, the type of competences needed to work in this area of translation, and the tools available to the translation/localization professional, among which freeware stands out. The localization process imposes demanding quality targets, so, starting from the definition of the concept of quality, we analyse the requirements needed for a correct definition of tasks and objectives in the localization context. This definition of concepts and analysis of the localization process then supported the development of the object of our project - the localization of the Web page of the company Pinto & Cruz - while also allowing a prior identification of all the steps to be taken and the kinds of difficulties and problems to be faced. Thus, given the constraints imposed by the page's management model, we defined a workflow in which we identify the different phases and participants, using a work platform made available by the site's webmaster. The process followed to localize the page is described, its specificities are documented, and the doubts and difficulties are identified. With this project and the description provided, we aimed to systematize an approach to the process and to draw attention to the difficulties inherent in carrying it out, especially for those doing so for the first time.
Abstract:
The current models are not simple enough to allow a quick estimation of the remediation time. This work reports the development of an easy and relatively rapid procedure for forecasting the remediation time of vapour extraction. Sandy soils contaminated with cyclohexane and prepared with different water contents were studied. The remediation times estimated through the mathematical fitting of experimental results were compared with those of real soils. The main objectives were: (i) to predict, through a simple mathematical fitting, the remediation time of soils with water contents different from those used in the experiments; and (ii) to analyse the influence of soil water content on: (ii1) the remediation time; (ii2) the remediation efficiency; and (ii3) the distribution of contaminants in the different phases present in the soil matrix after the remediation process. For sandy soils with negligible contents of clay and natural organic matter, artificially contaminated with cyclohexane before vapour extraction, it was concluded that: (i) if the soil water content belonged to the range considered in the experiments with the prepared soils, then the remediation time of real soils with similar characteristics could be successfully predicted, with relative differences no higher than 10%, through a simple mathematical fitting of experimental results; and (ii) increasing the soil water content from 0% to 6% had the following consequences: (ii1) it increased the remediation time (from 1.8 h to 4.9 h); (ii2) it decreased the remediation efficiency (from 99% to 97%); and (ii3) it decreased the amount of contaminant adsorbed onto the soil and in the non-aqueous liquid phase, thus increasing the amount of contaminant in the aqueous and gaseous phases.
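A minimal sketch, in Python, of the kind of "simple mathematical fitting" described above. Only the 0% -> 1.8 h and 6% -> 4.9 h endpoints come from the abstract; the exponential form and the intermediate data points are illustrative assumptions, not the paper's fitted model.

    # Fit remediation time vs. soil water content and interpolate inside the
    # experimental range. Middle data points and the model form are assumed.
    import numpy as np
    from scipy.optimize import curve_fit

    water = np.array([0.0, 2.0, 4.0, 6.0])   # soil water content (%)
    t_rem = np.array([1.8, 2.7, 3.7, 4.9])   # remediation time (h); middle values assumed

    def model(w, a, b):
        return a * np.exp(b * w)             # assumed exponential growth with water content

    (a, b), _ = curve_fit(model, water, t_rem, p0=(1.8, 0.1))
    print(f"predicted time at 3% water: {model(3.0, a, b):.1f} h")

Consistent with the paper's conclusions, such a fit would only be trusted for water contents inside the range covered by the prepared-soil experiments.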
Abstract:
This work reports an analysis of the efficiency and time of soil remediation using vapour extraction and compares results obtained with prepared and real soils. The main objectives were: (i) to analyse the efficiency and time of remediation according to the water and natural organic matter content of the soil; and (ii) to assess whether a previous study, performed using prepared soils, could help predict the viability of the process under real conditions. For sandy soils with negligible clay content, artificially contaminated with cyclohexane before vapour extraction, it was concluded that: (i) increasing the soil water content and, above all, the natural organic matter content influenced the remediation process negatively, making it less efficient, more time consuming and consequently more expensive; and (ii) a previous study using prepared soils of similar characteristics proved helpful for predicting the viability of the process under real conditions.
Abstract:
Over the last decades we have witnessed economic, technological, political and social transformations that have directly influenced the way of thinking and acting in organizations. The concept of competences, increasingly valued, emerges as an alternative to the function-based approach to human resource management, responding to the current challenges of the market: the need for flexibility, adaptation to continuous change, growing market demands and the competitiveness of organizations in that market. The health area, and specifically the Nursing profession, has also evolved, with a new way of operationalizing the career of these professionals emerging in 2009. Regarding nurses with management functions, the functional content is described; however, there is no clear definition of the competences required of these professionals. This exploratory research, using a qualitative methodology, aimed to propose a strategy for defining a competency model for nurses with management functions in Portugal. To that end, we defined categories of competences through the analysis of the literature and legislation. This was followed by interviews with a panel of twelve experts and a content analysis of the data (mixed-type categorization). We then compared the empirically collected competences with those from the theoretical collection and defined a list of 10 competences for nurses' management functions: Technical Management Competences; Interpersonal Competences; Communication; Human Resource Management; Critical Thinking; Knowledge of Health Policies; Technical Nursing Competences; Organization and Planning; Teamwork; Concern for Quality. To complement the study, we sought to identify the perceived competence gaps of nurses with management functions and the competence development processes considered most relevant for these professionals. The gaps identified in the competences of current nurses with management functions, relative to the most valued competences, are small and scattered, so we consider them of little significance. The form of competence development most valued by the expert panel was training (academic and in a professional context). The importance of individual commitment in this process was also highlighted, as well as the assessment of competences before nurses take on management functions. We believe this research contributes to the literature on Competency-Based Management, to the literature on defining the competences of nurses with management functions, to the nursing profession (namely, to nurses' management functions) and to the SNS itself, since it makes some proposals and suggestions for the evolution of people management practices.
Abstract:
Folk medicine is a relevant and effective part of indigenous healthcare systems which are, in practice, totally dependent on traditional healers. An outstanding coincidence between indigenous medicinal plant uses and the scientifically proven pharmacological properties of several phytochemicals has been observed over the years. This work focused on the leaves of a medicinal plant traditionally used for therapeutic benefits (Angolan Cymbopogon citratus), in order to evaluate their nutritional value. The bioactive phytochemical composition and antioxidant activity of leaf extracts prepared with different solvents (water, methanol and ethanol) were also evaluated. The plant leaves contained ~60% carbohydrates, protein (~20%), fat (~5%), ash (~4%) and moisture (~9%). The phytochemical screening revealed the presence of tannins, flavonoids and terpenoids in all extracts. Methanolic extracts also contained alkaloids and steroids. Several methods were used to evaluate the total antioxidant capacity of the different extracts (DPPH•, NO• and H2O2 scavenging assays, reducing power, and FRAP). Ethanolic extracts presented a significantly higher antioxidant activity (p < 0.05), except for FRAP, in which the best results were achieved by the aqueous extracts. Methanolic extracts showed the lowest radical scavenging activities for both DPPH• and NO• radicals.
Abstract:
In the last two decades there has been a proliferation of programming exercise formats, which hinders interoperability in automatic assessment. In the absence of a widely accepted standard, a pragmatic solution is to convert content among the existing formats. BabeLO is a programming exercise converter providing services to a network of heterogeneous e-learning systems such as contest management systems, programming exercise authoring tools, evaluation engines and repositories of learning objects. Its main feature is the use of a pivotal format to achieve greater extensibility. This approach simplifies the extension to other formats, requiring only the conversion to and from the pivotal format. This paper starts with an analysis of programming exercise formats representative of the existing diversity. This analysis sets the context for the proposed approach to exercise conversion and for the description of the pivotal data format. The abstract service definition is the basis for the design of BabeLO, its components and its web service interface. The paper includes a report on the use of BabeLO in two concrete scenarios: to relocate exercises to a different repository, and to use an evaluation engine in a network of heterogeneous systems.
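A minimal Python sketch of the pivotal-format idea: each supported format needs only a reader to, and a writer from, a common intermediate representation, so n formats require n readers and n writers instead of n*(n-1) direct converters. All names below are hypothetical; BabeLO's actual interfaces are not given in the abstract.

    from dataclasses import dataclass, field

    @dataclass
    class PivotExercise:                     # assumed pivotal representation
        title: str
        statement: str
        tests: list = field(default_factory=list)   # (input, expected output) pairs

    READERS, WRITERS = {}, {}                # format name -> conversion function

    def register(fmt, reader, writer):
        READERS[fmt], WRITERS[fmt] = reader, writer

    def convert(data, src_fmt, dst_fmt):
        # Any-to-any conversion goes through the pivot, so adding a format
        # never touches the code of the formats already supported.
        return WRITERS[dst_fmt](READERS[src_fmt](data))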
Abstract:
When exploring a virtual environment, realism depends mainly on two factors: realistic images and real-time feedback (motions, behaviour, etc.). In this context, the photorealism and physical validity of computer-generated images required by emerging applications, such as advanced e-commerce, still pose major challenges in rendering research, while the complexity of lighting phenomena further demands powerful and predictable computing when time constraints must be met. In this technical report we address the state of the art in rendering, focusing on approaches, techniques and technologies that might enable real-time interactive web-based client-server rendering systems. The focus is on the end systems and not on the networking technologies used to interconnect client(s) and server(s).
Abstract:
Broadcast networks characterised by different physical layers (PhL) demand some kind of traffic adaptation between segments in order to avoid traffic congestion in linking devices. In many LANs, this problem is solved by the linking devices themselves, which use some kind of flow control mechanism that either tells transmitting stations to pause the transmission or simply discards frames. In this paper, we address the case of token-passing fieldbus networks operating in a broadcast fashion and involving message transactions over heterogeneous (wired or wireless) physical layers. For this case, real-time and reliability requirements demand a different solution to the traffic adaptation problem. Our approach relies on inserting an appropriate idle time before a station issues a request frame. In this way, we guarantee that the linking devices' queues do not grow to the point where the timeliness properties of the overall system become unsuitable for the targeted applications.
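A toy illustration of the idle-time idea under assumed numbers: when a request crosses from a fast wired segment to a slower wireless one, the initiator waits long enough that the linking device's queue cannot build up. The bit rates and the formula are illustrative assumptions, not the paper's schedulability analysis.

    # Idle time inserted before issuing a request frame so that a bridge
    # between a fast and a slow segment does not accumulate backlog.
    FAST_BPS = 1_500_000       # wired segment bit rate (assumed)
    SLOW_BPS = 250_000         # wireless segment bit rate (assumed)

    def idle_time(frame_bits):
        # The frame arrives at the bridge at FAST_BPS but drains at SLOW_BPS;
        # waiting out the rate difference keeps the bridge queue bounded.
        return frame_bits / SLOW_BPS - frame_bits / FAST_BPS

    print(f"idle time for a 1000-bit request: {idle_time(1000) * 1e3:.2f} ms")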
Abstract:
This paper describes how MPEG-4 object-based video (OBV) can be used to allow selected objects to be inserted into the play-out stream to a specific user, based on a profile derived for that user. The application scenario described here is personalized product placement, and the paper considers the value of this application in the current and evolving commercial media distribution market, given the huge emphasis media distributors are currently placing on targeted advertising. This level of application of video content requires a sophisticated content description and metadata system (e.g., MPEG-7). The scenario considers the requirement for global libraries to provide the objects to be inserted into the streams. The paper then considers the commercial trading of objects between the libraries, video service providers, advertising agencies and other parties involved in the service. Consequently, a brokerage of video objects is proposed, based on negotiation and trading using intelligent agents representing the various parties. The proposed Media Brokerage Platform is a multi-agent system structured in two layers. In the top layer, there is a collection of coarse-grain agents representing the real-world players – the providers and deliverers of media contents and the market regulator profiler – and, in the bottom layer, there is a set of finer-grain agents constituting the marketplace – the delegate agents and the market agent. For knowledge representation (domain, strategic and negotiation protocols) we propose a Semantic Web approach based on ontologies. The media components' contents should be represented in MPEG-7, and the metadata describing the objects to be traded should follow a specific ontology. The top-layer content providers and deliverers are modelled by intelligent autonomous agents that express their will to transact – buy or sell – media components by registering at a service registry. The market regulator profiler creates, according to the selected profile, a market agent, which, in turn, checks the service registry for potential trading partners for a given component and invites them to the marketplace. The subsequent negotiation and actual transaction are performed by delegate agents in accordance with their profiles and the predefined rules of the market.
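A highly simplified Python sketch of the two-layer flow just described: providers and deliverers register their will to transact at a service registry, and a market agent created by the regulator profiler looks up potential partners and invites them to the marketplace. All names are hypothetical, and the actual negotiation protocols are not reproduced.

    registry = []                            # (agent, role, component) entries

    def register(agent, role, component):
        # A coarse-grain agent expresses its will to buy or sell a component.
        registry.append((agent, role, component))

    def open_marketplace(component):
        # The market agent checks the registry for potential trading partners
        # for a given component and invites them to the marketplace.
        return [agent for agent, role, c in registry if c == component]

    register("StudioA", "sell", "sports-clip")
    register("DelivererB", "buy", "sports-clip")
    print(open_marketplace("sports-clip"))   # ['StudioA', 'DelivererB']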
Abstract:
This paper proposes a novel business model to support media content personalisation: an agent-based business-to-business (B2B) brokerage platform for media content producer and distributor businesses. Distributors aim to provide viewers with a personalised content experience, and producers wish to ensure that their media objects are watched by as many targeted viewers as possible. In this scenario, viewers and media objects (main programmes and candidate objects for insertion) have profiles and, in the case of main programme objects, are annotated with placeholders representing personalisation opportunities, i.e., locations for the insertion of personalised media objects. The MultiMedia Brokerage (MMB) platform is a multiagent multilayered brokerage composed of agents that act as sellers and buyers of viewer stream timeslots and/or media objects on behalf of the registered businesses. These agents engage in negotiations to select the media objects that best match the current programme and viewer profiles.
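A minimal sketch of the matching step, assuming profiles are sets of tags and that overlap with the viewer and programme profiles is a reasonable score. The scoring rule is an assumption; the MMB agents reach their selection through negotiation rather than a fixed formula.

    # Pick the candidate object for a placeholder that best matches the
    # current viewer and programme profiles (illustrative scoring only).
    def score(obj_tags, viewer, programme):
        return len(obj_tags & viewer) + len(obj_tags & programme)

    candidates = {
        "sneaker_ad": {"sport", "youth"},
        "sedan_ad":   {"family", "travel"},
    }
    viewer_profile = {"sport", "travel"}
    programme_profile = {"youth"}

    best = max(candidates,
               key=lambda o: score(candidates[o], viewer_profile, programme_profile))
    print(best)                              # sneaker_ad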
Abstract:
Due to the growing complexity and dynamism of many embedded application domains (including consumer electronics, robotics, automotive and telecommunications), it is increasingly difficult to react to load variations and adapt the system's performance in a controlled fashion within a useful and bounded time. This is particularly noticeable when intending to benefit from the full potential of an open distributed cooperating environment, where service characteristics are not known beforehand and tasks may exhibit unrestricted QoS inter-dependencies. This paper proposes a novel anytime adaptive QoS control policy in which the online search for the best set of QoS levels is combined with each user's personal preferences on their services' adaptation behaviour. Extensive simulations demonstrate that the proposed anytime algorithms quickly find a good initial solution and effectively optimise the rate at which the quality of the current solution improves as the algorithms are given more time to run, with minimal overhead compared with their traditional versions.
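A minimal sketch of the anytime property, assuming per-service QoS levels with known utilities and costs: the search keeps a feasible assignment at all times and greedily improves it while the deadline allows, so interrupting it early still yields a valid configuration. The tables and the greedy rule are assumptions; the paper's policy additionally weighs user preferences and QoS inter-dependencies.

    import time

    services = {                             # level -> (utility, cost), assumed
        "video": [(1, 1), (3, 2), (5, 4)],
        "audio": [(1, 1), (2, 2)],
    }
    CAPACITY = 5                             # assumed resource capacity

    def anytime_qos(deadline):
        levels = {s: 0 for s in services}    # cheapest feasible start
        while time.monotonic() < deadline:
            cost = sum(services[s][l][1] for s, l in levels.items())
            best = None
            for s, l in levels.items():      # best improving upgrade that fits
                if l + 1 < len(services[s]):
                    gain = services[s][l + 1][0] - services[s][l][0]
                    extra = services[s][l + 1][1] - services[s][l][1]
                    if cost + extra <= CAPACITY and (best is None or gain > best[0]):
                        best = (gain, s)
            if best is None:
                break                        # no upgrade fits the capacity
            levels[best[1]] += 1
        return levels                        # feasible even if interrupted early

    print(anytime_qos(time.monotonic() + 0.01))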
Abstract:
A QoS adaptation to dynamically changing system conditions that takes into consideration the user's constraints on the stability of service provisioning is presented. The goal is to allow the system to make QoS adaptation decisions in response to fluctuations in task traffic flow, under the control of the user. We pay special attention to the case where monitoring the stability period and resource load variation of Service Level Agreements for different types of services is used to dynamically adapt future stability periods, according to a feedback control scheme. The system's adaptation behaviour can be configured according to a desired confidence level on future resource usage. The viability of the proposed approach is validated by preliminary experiments.
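A toy sketch of the feedback scheme, assuming a simple proportional rule: a low observed load variation lengthens the next promised stability period, a high one shortens it. The target, gain and floor values are assumptions; the paper ties the adjustment to a configurable confidence level on future resource usage.

    # Proportional feedback adaptation of an SLA stability period (illustrative).
    def next_stability_period(period_s, load_variation, target=0.1, gain=0.5):
        error = target - load_variation      # stable windows -> positive error
        return max(1.0, period_s * (1.0 + gain * error / target))

    period = 10.0
    for variation in (0.05, 0.08, 0.30):     # observed load variation per window
        period = next_stability_period(period, variation)
        print(f"next stability period: {period:.1f} s")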
Abstract:
Fingerprinting is an indoor location technique, based on wireless networks, in which data stored during the offline phase is compared with data collected by the mobile device during the online phase. In most real-life scenarios, the mobile node used during the offline phase is different from the mobile nodes that will be used during the online phase. This means that there might be very significant differences between the Received Signal Strength (RSS) values acquired by the mobile node and the ones stored in the Fingerprinting Map. As a consequence, this difference between RSS values might increase the location estimation error. One possible solution to minimize these differences is to adapt the RSS values acquired during the online phase before sending them to the Location Estimation Algorithm. The internal parameters of the Location Estimation Algorithms, for example the weights of the Weighted k-Nearest Neighbour, might also need to be tuned for every type of terminal. This paper addresses both approaches, using Direct Search optimization methods to adapt the Received Signal Strength and to tune the Location Estimation Algorithm parameters. As a result, it was possible to decrease the location estimation error compared with that originally obtained without any calibration procedure.
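A minimal sketch of the Weighted k-Nearest Neighbour estimator with a linear RSS adaptation applied before matching. The map entries, the linear correction and the inverse-distance weights are illustrative assumptions; in the paper, such parameters are tuned with Direct Search methods.

    import math

    # Tiny Fingerprinting Map: position (x, y) -> stored RSS vector (dBm).
    fmap = {
        (0.0, 0.0): [-40, -70],
        (5.0, 0.0): [-55, -60],
        (0.0, 5.0): [-60, -50],
    }

    def wknn(rss, k=2, gain=1.0, offset=0.0):
        # Linear adaptation of the online RSS before matching (assumed form);
        # gain and offset would be tuned per terminal, e.g. by Direct Search.
        rss = [gain * v + offset for v in rss]
        nearest = sorted((math.dist(rss, stored), pos)
                         for pos, stored in fmap.items())[:k]
        w = [1.0 / (d + 1e-9) for d, _ in nearest]   # inverse-distance weights
        x = sum(wi * pos[0] for wi, (_, pos) in zip(w, nearest)) / sum(w)
        y = sum(wi * pos[1] for wi, (_, pos) in zip(w, nearest)) / sum(w)
        return x, y

    print(wknn([-45, -68]))                  # estimate weighted towards (0, 0)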
Abstract:
Constrained and unconstrained Nonlinear Optimization Problems often appear in many engineering areas. In some of these cases it is not possible to use derivative-based optimization methods, because the objective function is unknown, too complex, or non-smooth. In such cases Direct Search Methods might be the most suitable optimization methods. An Application Programming Interface (API) including some of these methods was implemented using Java technology. This API can be accessed either by applications running on the same computer where it is installed or remotely, through a LAN or the Internet, using web services. From the engineering point of view, the information needed from the API is the solution to the provided problem. From the point of view of optimization methods researchers, however, the solution alone is not enough: additional information about the iterative process is also useful, such as the number of iterations, the value of the solution at each iteration, and the stopping criteria. This paper presents the features added to the API to allow users to access the iterative process data.
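A compact sketch of one classic Direct Search method (compass search) that also records the kind of per-iteration data the abstract mentions: iteration number, current solution value and the stopping criterion. This is a generic textbook version in Python, not the API's actual Java implementation.

    # Compass search: derivative-free minimisation by polling axis directions.
    def compass_search(f, x, step=1.0, tol=1e-6, max_iter=200):
        trace = []                           # iterative process data for the user
        for it in range(max_iter):
            fx = f(x)
            trace.append({"iter": it, "x": list(x), "f": fx, "step": step})
            moved = False
            for i in range(len(x)):
                for d in (step, -step):      # poll +/- along each coordinate
                    y = list(x)
                    y[i] += d
                    if f(y) < fx:            # accept the first improving point
                        x, moved = y, True
                        break
                if moved:
                    break
            if not moved:
                step /= 2                    # shrink the poll step
                if step < tol:
                    trace.append({"iter": it, "stop": "step < tol"})
                    break
        return x, trace

    xmin, trace = compass_search(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2,
                                 [0.0, 0.0])
    print(xmin, len(trace))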
Abstract:
Nonlinear Optimization Problems are common in many engineering fields. Due to their characteristics, the objective functions of some problems might not be differentiable, or their derivatives might have complex expressions. There are even cases where an analytical expression of the objective function cannot be determined, either because of its complexity or its cost (monetary, computational, time, ...). In these cases Nonlinear Optimization methods must be used. An API including several methods and algorithms to solve constrained and unconstrained optimization problems was implemented. This API can be accessed not only in the traditional way, by installing it on the developer's and/or user's computer, but also remotely using Web Services. As long as there is a network connection to the server where the API is installed, applications always access the latest API version. A Web-based application using the proposed API was also developed. This application is intended for users who do not want to integrate the methods into their own applications and simply want a tool to solve Nonlinear Optimization Problems.
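A minimal sketch of what a remote call to such a web service could look like from a client. The endpoint URL, payload fields and response shape are entirely hypothetical, since the abstract does not specify the service interface, and the snippet needs a live server to actually run.

    import json
    from urllib import request

    payload = {
        "method": "compass_search",               # assumed method name
        "objective": "(x0 - 1)^2 + (x1 + 2)^2",   # assumed expression format
        "start": [0.0, 0.0],
    }
    req = request.Request(
        "http://example.org/optimization/solve",  # placeholder URL
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:            # requires a real endpoint
        print(json.load(resp))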