79 results for Test Template Framework
Abstract:
The Intel® Xeon Phi™ is the first processor based on Intel's MIC (Many Integrated Cores) architecture. It is a co-processor specially tailored for data-parallel computations, whose basic architectural design is similar to that of GPUs (Graphics Processing Units), leveraging many simple integrated cores to perform parallel computations. The main novelty of the MIC architecture, relative to GPUs, is its compatibility with the Intel x86 architecture. This enables the use of many of the tools commonly available for the parallel programming of x86-based architectures, which may lead to a smaller learning curve. However, programming the Xeon Phi still entails aspects intrinsic to accelerator-based computing in general, and to the MIC architecture in particular. In this thesis we advocate the use of algorithmic skeletons for programming the Xeon Phi. Algorithmic skeletons abstract the complexity inherent to parallel programming, hiding details such as resource management, parallel decomposition, and inter-execution flow communication, thus removing these concerns from the programmer's mind. In this context, the goal of the thesis is to lay the foundations for the development of a simple but powerful and efficient skeleton framework for the programming of the Xeon Phi processor. For this purpose we build upon Marrow, an existing framework for the orchestration of OpenCL™ computations in multi-GPU and CPU environments. We extend Marrow to execute both OpenCL and C++ parallel computations on the Xeon Phi. To evaluate the newly developed framework, several well-known benchmarks, such as Saxpy and N-Body, are used to compare its performance not only against the existing framework when executing on the co-processor, but also on the Xeon Phi versus a multi-GPU environment.
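As an illustrative sketch only (not Marrow's actual API), the following minimal C++ "map" skeleton shows the kind of detail a skeleton framework hides from the programmer: thread creation, work partitioning, and synchronisation. The name `map_skeleton` and the Saxpy-like usage are hypothetical.

```cpp
#include <algorithm>
#include <thread>
#include <vector>

// Minimal "map" skeleton: applies fn to every element in parallel,
// hiding thread creation, work partitioning, and joining from the caller.
template <typename T, typename Fn>
void map_skeleton(std::vector<T>& data, Fn fn,
                  unsigned workers = std::thread::hardware_concurrency()) {
    if (workers == 0) workers = 1;
    std::vector<std::thread> pool;
    const std::size_t chunk = (data.size() + workers - 1) / workers;
    for (unsigned w = 0; w < workers; ++w) {
        const std::size_t begin = w * chunk;
        const std::size_t end = std::min(data.size(), begin + chunk);
        if (begin >= end) break;
        pool.emplace_back([&data, fn, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                data[i] = fn(data[i]);
        });
    }
    for (auto& t : pool) t.join();  // barrier: wait for all workers
}

int main() {
    std::vector<float> xs(1 << 20, 1.0f);
    const float a = 2.0f;
    // Saxpy-like update expressed only as "what", not "how":
    map_skeleton(xs, [a](float x) { return a * x + 1.0f; });
}
```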
Abstract:
Teleoperation is a concept born with the rapid evolution of technology, with the intuitive meaning of "operating at a distance." The first teleoperation systems were created in the mid-1950s to handle chemicals. Remote-controlled systems are nowadays present in various types of applications. This dissertation presents the development of a mobile application to perform the teleoperation of a mobile service robot. The application integrates a distributed surveillance system (the result of a QREN research project) and led to the development of a communication interface between the robot (the result of another QREN project) and the surveillance system. It was necessary to specify a communication protocol between the two systems, which was implemented over the 0MQ (Zero Message Queue) communication framework. For testing, three prototype applications were developed before performing the tests on the robot.
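The abstract does not detail the protocol messages themselves; as a hedged sketch of how a request/reply exchange over 0MQ between the surveillance system and the robot might look, using the libzmq C API from C++. The endpoint `tcp://robot.local:5555` and the `MOVE` command are assumptions, not taken from the dissertation.

```cpp
#include <zmq.h>
#include <cstdio>
#include <cstring>

int main() {
    // REQ socket: the surveillance side sends a command and waits for a reply.
    void* ctx = zmq_ctx_new();
    void* sock = zmq_socket(ctx, ZMQ_REQ);
    zmq_connect(sock, "tcp://robot.local:5555");  // hypothetical robot endpoint

    const char cmd[] = "MOVE 0.5 0.0";            // hypothetical protocol message
    zmq_send(sock, cmd, strlen(cmd), 0);

    char reply[256];
    int n = zmq_recv(sock, reply, sizeof(reply) - 1, 0);
    if (n >= 0) {
        reply[n] = '\0';
        printf("robot replied: %s\n", reply);
    }

    zmq_close(sock);
    zmq_ctx_destroy(ctx);
}
```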
Abstract:
Generally, smart campus applications do not consider the user's role together with his/her position in a university environment; consequently, irrelevant information is delivered to users. This dissertation proposes a location-based access control model, named Smart-RBAC, which extends the functionality of the Role-Based Access Control (RBAC) model by including the user's location as a contextual attribute, to solve the aforementioned problem. The Smart-RBAC model is designed with a focus on content delivery to the user, in order to offer a feasible level of flexibility, which was missing in existing location-based access control models. An instance of the model, derived from Liferay's RBAC, is implemented by creating a portal application to test and validate the Smart-RBAC model. Additionally, portlet-based applications are developed to assess the suitability of the model in a smart campus environment. The evaluation of the model, based on a popular theoretical framework, demonstrates the model's capability to achieve security goals such as "Dynamic Separation of Duty" and "Accountability". We believe that the Smart-RBAC model will improve existing smart campus applications, since it utilizes both the role and the location of the user to deliver content.
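A minimal sketch of the core Smart-RBAC idea: a permission is granted only when both the user's role and his/her current location allow it. All names and the table-based structure below are hypothetical, not Liferay's API nor the thesis's implementation.

```cpp
#include <iostream>
#include <map>
#include <set>
#include <string>
#include <utility>

// Hypothetical Smart-RBAC core: a permission is granted only if the
// user's role holds it AND the user is in a zone where the role may use it.
struct SmartRbac {
    std::map<std::string, std::set<std::string>> rolePerms;           // role -> permissions
    std::map<std::pair<std::string, std::string>, bool> zoneAllowed;  // (role, zone) -> allowed

    bool check(const std::string& role, const std::string& zone,
               const std::string& perm) const {
        auto r = rolePerms.find(role);
        if (r == rolePerms.end() || !r->second.count(perm)) return false;  // RBAC check
        auto z = zoneAllowed.find({role, zone});
        return z != zoneAllowed.end() && z->second;                        // location check
    }
};

int main() {
    SmartRbac ac;
    ac.rolePerms["student"] = {"view_timetable", "book_room"};
    ac.zoneAllowed[{"student", "library"}] = true;       // role active in this zone
    ac.zoneAllowed[{"student", "server_room"}] = false;  // role blocked in this zone

    std::cout << ac.check("student", "library", "book_room") << "\n";      // 1
    std::cout << ac.check("student", "server_room", "book_room") << "\n";  // 0
}
```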
Abstract:
Nowadays, the consumption of goods and services on the Internet is constantly increasing. Small and Medium Enterprises (SMEs), mostly from traditional industry sectors, usually do business in weak and fragile market sectors, where customized products and services prevail. To survive and compete in today's markets they have to readjust their business strategies by creating new manufacturing processes and establishing new business networks through new technological approaches. In order to compete with big enterprises, these partnerships aim at sharing resources, knowledge and strategies to boost the sector's business consolidation through the creation of dynamic manufacturing networks. To meet this demand, we propose the development of a centralized information system that allows enterprises to select partners and create dynamic manufacturing networks capable of monitoring the entire manufacturing process, including the assembly, packaging and distribution phases. Even networking partners that come from the same area have multiple, heterogeneous representations of the same knowledge, each denoting its own view of the domain. Thus, different conceptual, semantic and, consequently, lexical knowledge representations may occur in the network, causing non-transparent sharing of information and interoperability inconsistencies. This calls for a framework, supported by a tool, that flexibly enables the identification, classification and resolution of such semantic heterogeneities. The tool will support the network in establishing semantic mappings, facilitating the integration of the various enterprises' information systems.
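As a hedged illustration of what a semantic mapping between two partners' vocabularies might look like, the sketch below relates terms across hypothetical partners; partner names, terms, and the relation labels are all assumptions, not drawn from the thesis.

```cpp
#include <iostream>
#include <string>
#include <vector>

// Hypothetical semantic-mapping entry: relates a term in one partner's
// vocabulary to a term in another's, with the kind of relation identified.
struct Mapping {
    std::string sourcePartner, sourceTerm;
    std::string targetPartner, targetTerm;
    std::string relation;  // e.g. "equivalent", "broader", "narrower"
};

int main() {
    std::vector<Mapping> mappings = {
        {"FactoryA", "order_ref", "FactoryB", "purchaseOrderId", "equivalent"},
        {"FactoryA", "packaging", "FactoryB", "boxing",          "equivalent"},
        {"FactoryA", "component", "FactoryB", "assembly_part",   "narrower"},
    };
    // Translate a FactoryA term before sending a message to FactoryB.
    for (const auto& m : mappings)
        if (m.sourceTerm == "order_ref" && m.targetPartner == "FactoryB")
            std::cout << "send as: " << m.targetTerm << "\n";  // purchaseOrderId
}
```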
Abstract:
As the complexity of markets and the dynamicity of systems evolve, the need for interoperable systems capable of strengthening enterprise communication effectiveness increases. This is particularly significant when it comes to collaborative enterprise networks, like manufacturing supply chains, where several companies work, communicate, and depend on each other in order to achieve a specific goal. Once interoperability is achieved, that is, once all network parties are able to communicate with and understand each other, organisations can exchange information in a stable environment that follows agreed rules. However, as markets adapt to new requirements and demands, an evolutionary behaviour is triggered, giving rise to interoperability problems and thus disrupting the sustainability of interoperability, which raises the need to develop monitoring activities capable of detecting and preventing unexpected behaviour. This work seeks to contribute to the development of monitoring techniques for interoperable SOA-based enterprise networks. It focuses on the automatic detection of harmonisation-breaking events during real-time communications, and strives to develop and propose a methodological approach to handle these disruptions with minimal or no human intervention, hence providing existing service-based networks with the ability to detect and promptly react to interoperability issues.
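As a hedged sketch of one way a harmonisation-breaking event could be detected automatically, the snippet below checks incoming messages against the fields the network has agreed on and flags a mismatch; the contract, field names, and the notification step are illustrative assumptions.

```cpp
#include <iostream>
#include <map>
#include <set>
#include <string>

// Hypothetical monitor: flags a harmonisation-breaking event when a
// partner's message no longer carries the fields agreed for the network.
bool conformsToContract(const std::map<std::string, std::string>& msg,
                        const std::set<std::string>& agreedFields) {
    for (const auto& field : agreedFields)
        if (!msg.count(field)) return false;  // missing agreed field -> break
    return true;
}

int main() {
    std::set<std::string> contract = {"orderId", "quantity", "dueDate"};
    std::map<std::string, std::string> incoming = {
        {"orderId", "42"}, {"qty", "10"}, {"dueDate", "2015-06-01"}};  // "quantity" renamed

    if (!conformsToContract(incoming, contract))
        std::cout << "harmonisation break detected: trigger reaction workflow\n";
}
```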
Abstract:
Proceedings of the 16th Annual Conference organized by the Insurance Law Association of Serbia and the German Foundation for International Legal Cooperation (IRZ), entitled "Insurance law, governance and transparency: basics of the legal certainty", Palić, Serbia, 17-19 April 2015.
Abstract:
Digital Businesses have become a major driver for economic growth and have seen an explosion of new startups. At the same time, the field also includes mature enterprises that have become global giants in a relatively short period of time. Digital Businesses have unique characteristics that make running and managing a Digital Business very different from running traditional offline businesses. Digital businesses respond to online users who are highly interconnected and networked. This enables a rapid flow of word of mouth, at a pace far greater than ever envisioned when dealing with traditional products and services. The relatively low cost of adding an incremental user has led to a variety of innovations in the pricing of digital products, including various forms of free and freemium pricing models. This thesis explores the unique characteristics and complexities of Digital Businesses and their implications for the design of Digital Business Models and Revenue Models. The thesis proposes an Agent-Based Modeling Framework that can be used to develop Simulation Models that capture the complex dynamics of Digital Businesses and the interactions between users of a digital product. Such Simulation Models can be used for a variety of purposes, such as simple forecasting, analysing the impact of market disturbances, analysing the impact of changes in pricing models, and optimising pricing for maximum revenue generation or for a balance between growth in usage and revenue generation. These models can be developed for a mature enterprise with a long historical record of user growth as well as for early-stage enterprises without much historical data. Through three case studies, the thesis demonstrates the applicability of the Framework and its potential applications.
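As a hedged, minimal agent-based sketch of the dynamics the abstract describes: each period, existing users spread word of mouth to non-adopters, and a fraction of free users converts to paid (freemium). All rates, the population size, and the price are illustrative assumptions, not estimates from the thesis's case studies.

```cpp
#include <algorithm>
#include <iostream>
#include <random>

int main() {
    const int population = 100000;
    const double contactRate = 0.00002;  // chance one adopter converts one non-adopter
    const double conversion = 0.03;      // free -> paid conversion per period
    const double price = 5.0;            // revenue per paid user per period

    std::mt19937 rng(7);
    int freeUsers = 100, paidUsers = 0;

    for (int t = 1; t <= 24; ++t) {
        const int adopters = freeUsers + paidUsers;
        const int nonAdopters = population - adopters;
        // Word-of-mouth adoptions this period, drawn as a binomial.
        std::binomial_distribution<int> wom(
            nonAdopters, std::min(1.0, contactRate * adopters));
        freeUsers += wom(rng);
        // Freemium conversion: some free users become paying users.
        std::binomial_distribution<int> conv(freeUsers, conversion);
        const int converted = conv(rng);
        freeUsers -= converted;
        paidUsers += converted;
        std::cout << "t=" << t << " free=" << freeUsers << " paid=" << paidUsers
                  << " revenue=" << paidUsers * price << "\n";
    }
}
```

Changing the pricing parameters and rerunning such a simulation is the kind of what-if analysis (market disturbances, pricing changes, revenue optimisation) the Framework is proposed to support.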
Abstract:
The lives of humans and of most living beings depend on sensation and perception for the best assessment of the surrounding world. Sensorial organs acquire a variety of stimuli that are interpreted and integrated in our brain, for immediate use or stored in memory for later recall. Among other reasoning aspects, a person has to decide what to do with the available information. Emotions are classifiers of collected information, assigning a personal meaning to objects, events and individuals, and making part of our own identity. Emotions play a decisive role in cognitive processes such as reasoning, decision-making and memory by assigning relevance to collected information. Access to pervasive computing devices, empowered by the ability to sense and perceive the world, provides new forms of acquiring and integrating information. But prior to assessing data for its usefulness, systems must capture it and ensure that it is properly managed for diverse possible goals. Portable and wearable devices are now able to gather and store information from the environment and from our body, using cloud-based services and Internet connections. Systems' limitations in handling sensorial data, compared with our sensorial capabilities, constitute one identified problem. Another problem is the lack of interoperability between humans and devices, as devices do not properly understand human emotional states and human needs. Addressing those problems is the motivation for the present research work. The mission hereby assumed is to feed sensorial and physiological data into a Framework that manages collected data towards human cognitive functions, supported by a new data model. By learning from selected human functional and behavioural models and reasoning over collected data, the Framework aims at evaluating a person's emotional state, empowering human-centric applications, along with the capability of storing episodic information on a person's life, with physiological indicators of emotional states, to be used by a new generation of applications.
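The abstract does not give the data model itself; as a hedged sketch of what an episodic record pairing physiological signals with an inferred emotional state might look like, all field names and values below are hypothetical.

```cpp
#include <cstdint>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical episodic record: raw physiological indicators captured by a
// wearable, plus the emotional state the framework inferred for that episode.
struct PhysioSample {
    std::int64_t timestampMs;
    double heartRateBpm;
    double skinConductanceUs;  // microsiemens
};

struct Episode {
    std::string context;                // e.g. "meeting", "commute"
    std::vector<PhysioSample> samples;  // raw signals over the episode
    std::string inferredEmotion;        // e.g. "calm", "stressed"
};

int main() {
    Episode e{"commute",
              {{1420070400000, 92.0, 4.1}, {1420070460000, 101.0, 6.3}},
              "stressed"};
    std::cout << e.context << " -> " << e.inferredEmotion << " ("
              << e.samples.size() << " samples)\n";
}
```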
Abstract:
The catastrophic disruption of the USA financial system in the wake of the financial crisis prompted the Federal Reserve to launch a Quantitative Easing (QE) programme in late 2008. In line with Pesaran and Smith (2014), I use a policy effectiveness test to assess whether this massive asset purchase programme was effective in stimulating economic activity in the USA. Specifically, I employ an Autoregressive Distributed Lag (ARDL) model in order to obtain a counterfactual for the USA real GDP growth rate. Using data from 1983Q1 to 2009Q4, the results show that the beneficial effects of QE appear to be weak and rather short-lived. The null hypothesis of policy ineffectiveness is not rejected, which suggests that QE did not have a meaningful impact on output growth.
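The abstract names the ARDL counterfactual approach but not its equations; as a hedged sketch, a generic ARDL(p, q) for real GDP growth with a policy variable looks as follows. The lag orders, regressors, and the pre-QE cutoff shown are illustrative, not the thesis's exact specification.

```latex
% Generic ARDL(p,q) for real GDP growth y_t with policy variable x_t:
y_t = \alpha + \sum_{i=1}^{p} \phi_i \, y_{t-i}
            + \sum_{j=0}^{q} \beta_j \, x_{t-j} + \varepsilon_t
% Counterfactual: estimate on pre-intervention data (illustratively, up to
% 2008Q3), project y_t under the no-QE path of x_t, and take the gap between
% realized and counterfactual growth as the policy effect:
\widehat{d}_t = y_t - \widehat{y}_t^{\,\text{no-QE}}
% The policy effectiveness test asks whether the \widehat{d}_t are jointly
% significantly different from zero over the post-intervention quarters.
```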
Abstract:
In this work I propose an additional test to be implemented in EDP's residential electricity use feedback trials, within the scope of the InovCity project. The proposed product to be tested consists of an interface between the smart meter and the television, through a set-top box. I provide a theoretical framework on the importance of feedback, an analysis of results from past studies involving smart metering, and a detailed description of my proposal. The results of a self-developed questionnaire related to the proposal and to segmentation issues are also analyzed. Finally, general conclusions are drawn and potential future improvements and challenges are presented.
Abstract:
Materials engineering focuses on tailoring materials' properties to design new products with the best performance. By using sub-micrometer size materials in the production of composites, it is possible to obtain objects with properties that none of their components show individually. Since three-dimensional materials can be easily customized to obtain desired properties, much interest has been paid to nanostructured polymers as building blocks for biocompatible devices. Over the past years, thermosensitive microgels have become more common in the framework of biomaterials, with potential applicability in therapy and/or diagnostics. In addition, high-aspect-ratio biopolymer fibers have been produced using the cost-effective method called electrospinning. Taking advantage of both microgels and electrospun fibers, surfaces with enhanced functionalities can be obtained and therefore employed in a wide range of applications. This dissertation reports on the confinement of stimuli-responsive microgels through the colloidal electrospinning process. The process mainly depends on the composition, properties and patterning of the precursor materials within the polymer jet. The microgels as well as the electrospun non-woven mats were investigated to correlate the starting materials with the final morphology of the composite fibers. PNIPAAm and PNIPAAm/Chitosan thermosensitive microgels with different compositions were obtained via surfactant-free emulsion polymerization (SFEP) and characterized in terms of chemical structure, morphology, thermal stability, swelling properties and thermosensitivity. Finally, colloidal electrospinning was carried out from spinning solutions composed of the stable microgel dispersions (up to a concentration of about 35 wt.% microgels) and a polymer solution of a PEO/water/ethanol mixture acting as the fiber template solution. The confinement of the microgels was confirmed by Scanning Electron Microscopy (SEM). The electrospinning process was statistically analysed, providing the optimum set of parameters to minimize the fiber diameter, which gave rise to electrospun PNIPAAm microgel/PEO nanofibers with a mean fiber diameter of 63 ± 25 nm.
Abstract:
Considering language as a product of society, but also as a fundamental means of establishing relations among people, we seek to understand its place in the globalized society, with the aim of developing a methodology for terminological analysis that contributes to a higher quality of specialized communication in the network society. This work is organized in two parts. The first is devoted to a reflection on the role of language in the network society, focusing on essential questions around the tension between multilingualism and the hegemony of English as a lingua franca, especially in the European space. We are interested, on the one hand, in reflecting on the definition of language policies, specifically in the multilingual Europe of the 28, and, on the other, in highlighting the preponderant role that language plays in the transmission of knowledge. The second part carries out the research outlined in the first, based on the analysis of financial reporting, a domain of knowledge that is not only inherently multilingual (because its application is transnational) but also reflects the tension identified in the first part, insofar as English assumes, in the business world in general and in the financial markets in particular, the hegemonic role of lingua franca. The terminological approach we advocate is semasiological for onomasiological purposes, so we start from the analysis of specialized texts, organized into specialized corpora. We subsequently discuss the results of our analysis with the specialists who will validate them and whose collaboration at various moments of the terminological and conceptual analysis process is fundamental to guarantee the quality of the terminological resources produced. From this perspective, we explore a corpus of legislative texts within the scope of the Portuguese Accounting Standardization System (SNC), in order to outline a working methodology that, in the future, will lead to the construction of a terminological database of financial reporting. At the same time, we also carry out a study on the Conceptual Framework of the SNC, for which we draw a comparison at the level of specialized translation in financial reporting, based on a parallel corpus composed of the international accounting legislation endorsed by the European Union. We use this parallel corpus, consisting of texts originally written in English and translated into Portuguese, in articulation with the specialized corpus built from the legislation on the Portuguese accounting standards, to test a methodology for extracting equivalents. Finally, we argue that harmonization in financial reporting, beyond being governed by common accounting policies, must also rest on terminological questions. It is therefore necessary to harmonize the terminology of financial reporting, enabling specialists to communicate in Portuguese free from the interference of English inherited from the international standards, through the two processes we identify: the translation and the adaptation of the International Accounting Standards.
Abstract:
The end of the Cold War is an unprecedented case of peaceful change in the international structure, in which the United States and the Soviet Union transcended the bipolar divide to decide the terms of peace within the framework of the institutions that define the multilateral model of order, consolidating its legitimacy. In this context, unlike the preceding cases of international reconstruction at the end of a hegemonic war, the new post-Cold War system, characterized by unipolarity, regionalization and homogenization, took shape within a framework of institutional continuity. The post-Cold War political order is a mixed system in which the tensions between unipolar hierarchy and multipolar anarchy, between global integration and regional fragmentation, and between political, ideological and cultural homogeneity and heterogeneity condition the strategies of the powers. International crises would put to the test the stability of the new order and its capacity to guarantee peaceful change. The first post-Cold War decade showed the preponderance of the United States and its growing confidence, evident in the Persian Gulf and Balkan Wars, as well as in the Taiwan (Formosa) Strait crisis. The reaction to the attacks of September 11 revealed an imperial temptation of the unipolar power, notably with the invasion of Iraq, which provoked a deep crisis in the Western security community. The vulnerability of the centre of the international order was confirmed by the European constitutional crisis and by the global financial crisis. These crises did not alter the structure of power, but they accelerated the erosion of the multilateral order and created a new framework of possibilities for international evolution, which includes an escalation of conflicts in a setting of regional multipolarity, a new polarization between the conservative democratic powers and an authoritarian revisionist coalition, as well as the restoration of a concert among the main international powers.
Abstract:
This case study examined the use of the BeGloCal Framework applied to B2C e-commerce in a European fast-moving consumer goods manufacturing firm. It explains how the framework supported the team within the company in identifying the right local market in which to start the project; the company's problem was to find the most appealing area in which to invest resources. By going through all the steps of the framework, the findings led the company to London (Kensington and Chelsea). The case shows how managers should act when they have to find a trade-off between standardization and adaptation.
Abstract:
The year is 2015 and the startup and tech business ecosphere has never seen more activity. In New York City alone, the tech startup industry is on track to amass $8 billion in total funding – the highest in 7 years (CB Insights, 2015). According to the Kauffman Index of Entrepreneurship (2015), this figure represents just 20% of the total funding in the United States. Thanks to platforms that link entrepreneurs with investors, there are simply more funding opportunities than ever, and funding can be initiated in a variety of ways (angel investors, venture capital firms, crowdfunding). And yet, in spite of all this, according to Forbes Magazine (2015), nine of ten startups will fail. Because of the unpredictable nature of the modern tech industry, it is difficult to pinpoint exactly why 90% of startups fail – but the general consensus amongst top tech executives is that "startups make products that no one wants" (Fortune, 2014). In 2011, author Eric Ries wrote a book called The Lean Startup in an attempt to solve this all-too-familiar problem. It was in this book that he developed the framework for the Hypothesis-Driven Entrepreneurship Process, an iterative process that aims at proving a market before actually launching a product. Ries discusses concepts such as the Minimum Viable Product, the smallest set of activities necessary to disprove a hypothesis (or business model characteristic). Ries encourages acting quickly and often: if you are to fail, then fail fast. In today's fast-moving economy, an entrepreneur cannot afford to waste his own time, nor his customer's time. The purpose of this thesis is to conduct an in-depth analysis of the Hypothesis-Driven Entrepreneurship Process, in order to test the market viability of a real-life startup idea, ShowMeAround. This analysis will follow the scientific Lean Startup approach, for the purpose of developing a functional business model and business plan. The objective is to conclude with an investment-ready startup idea, backed by rigorous entrepreneurial study.