876 results for Computing and software systems


Relevance:

100.00%

Publisher:

Abstract:

In the last few years, the vision of our connected and intelligent information society has evolved to embrace novel technological and research trends. The diffusion of ubiquitous mobile connectivity and advanced handheld portable devices has amplified the importance of the Internet as the communication backbone for accessing services and data. The diffusion of mobile and pervasive computing devices, featuring advanced sensing technologies and processing capabilities, has triggered the adoption of innovative interaction paradigms: touch-responsive surfaces, tangible interfaces and gesture or voice recognition are finally entering our homes and workplaces. We are experiencing the proliferation of smart objects and sensor networks, embedded in our daily living and interconnected through the Internet. This ubiquitous network of always-available interconnected devices is enabling new applications and services, ranging from enhancements to home and office environments to remote healthcare assistance and the birth of smart environments. This work will present some evolutions in the hardware and software development of embedded systems and sensor networks. Different hardware solutions will be introduced, ranging from smart objects for interaction to advanced inertial sensor nodes for motion tracking, focusing on system-level design. They will be accompanied by the study of innovative data processing algorithms developed and optimized to run on board the embedded devices. Gesture recognition, orientation estimation and data reconstruction techniques for sensor networks will be introduced and implemented, with the goal of optimizing the tradeoff between performance and energy efficiency. Experimental results will provide an evaluation of the accuracy of the presented methods and validate the efficiency of the proposed embedded systems.
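
The abstract does not spell out the on-board algorithms; as a rough illustration of the kind of lightweight orientation estimation it refers to, the sketch below implements a standard complementary filter that fuses gyroscope and accelerometer samples. The function names, units and filter gain are illustrative assumptions, not taken from the thesis.

```python
import math

def complementary_filter(samples, dt, alpha=0.98):
    """Estimate pitch and roll from interleaved gyro/accel samples.

    samples: iterable of (gx, gy, ax, ay, az) tuples, gyro rates in rad/s,
             accelerometer in any consistent unit (e.g. m/s^2).
    dt:      sampling period in seconds.
    alpha:   weight of the integrated gyro estimate (0..1).
    """
    pitch = roll = 0.0
    history = []
    for gx, gy, ax, ay, az in samples:
        # Integrate gyro rates (responsive, but drifts over time).
        pitch_gyro = pitch + gy * dt
        roll_gyro = roll + gx * dt
        # Absolute but noisy estimate from the gravity direction.
        pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        roll_acc = math.atan2(ay, az)
        # Blend: gyro for short-term dynamics, accel for long-term stability.
        pitch = alpha * pitch_gyro + (1.0 - alpha) * pitch_acc
        roll = alpha * roll_gyro + (1.0 - alpha) * roll_acc
        history.append((pitch, roll))
    return history

# Example: a motionless node lying flat should stay near pitch ~ 0, roll ~ 0.
flat = [(0.0, 0.0, 0.0, 0.0, 9.81)] * 100
print(complementary_filter(flat, dt=0.01)[-1])
```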

Relevance:

100.00%

Publisher:

Abstract:

This thesis investigates methods and software architectures for discovering the typical and frequently occurring structures used for organizing knowledge on the Web. We identify these structures as Knowledge Patterns (KPs). KP discovery needs to address two main research problems: the heterogeneity of sources, formats and semantics on the Web (i.e., the knowledge soup problem) and the difficulty of drawing a relevant boundary around data that captures the meaningful knowledge with respect to a certain context (i.e., the knowledge boundary problem). Hence, we introduce two methods that provide different solutions to these two problems by tackling KP discovery from two different perspectives: (i) the transformation of KP-like artifacts into KPs formalized as OWL2 ontologies; (ii) the bottom-up extraction of KPs by analyzing how data are organized in Linked Data. The two methods address the knowledge soup and boundary problems in different ways. The first method provides a solution to the two aforementioned problems based on a purely syntactic transformation step from the original source to RDF, followed by a refactoring step whose aim is to add semantics to the RDF by selecting meaningful RDF triples. The second method draws boundaries around RDF data in Linked Data by analyzing type paths. A type path is a possible route through an RDF graph that takes into account the types associated with the nodes of the path. We then present K~ore, a software architecture conceived as the basis for developing KP discovery systems and designed according to two software architectural styles, i.e., Component-based and REST. Finally, we provide an example of KP reuse based on Aemoo, an exploratory search tool that exploits KPs to perform entity summarization.
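
The abstract does not define type paths formally; as a toy illustration of the idea, under the assumption that a type path lifts each resource along a property path to its rdf:type, the following sketch enumerates such paths over a set of plain (subject, predicate, object) tuples. All identifiers and the handling of untyped resources are illustrative choices, not the thesis's implementation.

```python
from collections import defaultdict

RDF_TYPE = "rdf:type"

def type_paths(triples, start, length):
    """Enumerate type paths of a given length starting from `start`.

    triples: iterable of (subject, predicate, object) tuples.
    A type path replaces each resource on a property path with its rdf:type,
    so structurally similar data converge on the same path.
    """
    out_edges = defaultdict(list)   # subject -> [(predicate, object)]
    types = defaultdict(set)        # resource -> {types}
    for s, p, o in triples:
        if p == RDF_TYPE:
            types[s].add(o)
        else:
            out_edges[s].append((p, o))

    def walk(node, path):
        if len(path) == length:
            yield tuple(path)
            return
        for pred, obj in out_edges[node]:
            for t in types.get(obj, {"owl:Thing"}):   # fallback for untyped nodes
                yield from walk(obj, path + [(pred, t)])

    for start_type in types.get(start, {"owl:Thing"}):
        yield from ((start_type,) + p for p in walk(start, []))

triples = [
    ("ex:Bologna", RDF_TYPE, "ex:City"),
    ("ex:Bologna", "ex:locatedIn", "ex:Italy"),
    ("ex:Italy", RDF_TYPE, "ex:Country"),
]
print(list(type_paths(triples, "ex:Bologna", 1)))
# [('ex:City', ('ex:locatedIn', 'ex:Country'))]
```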

Relevance:

100.00%

Publisher:

Abstract:

Our generation of computational scientists is living in an exciting time: not only do we get to pioneer important algorithms and computations, we also get to set standards on how computational research should be conducted and published. From Euclid’s reasoning and Galileo’s experiments, it took hundreds of years for the theoretical and experimental branches of science to develop standards for publication and peer review. Computational science, rightly regarded as the third branch, can walk the same road much faster. The success and credibility of science are anchored in the willingness of scientists to expose their ideas and results to independent testing and replication by other scientists. This requires the complete and open exchange of data, procedures and materials. The idea of “replication by other scientists” applied to computations is more commonly known as “reproducible research”. In this context, the journal “EAI Endorsed Transactions on Performance & Modeling, Simulation, Experimentation and Complex Systems” had the exciting and original idea of letting scientists submit, together with the article, the computational materials (software, data, etc.) that were used to produce its contents. The goal of this procedure is to allow the scientific community to verify the content of the paper by reproducing it on the platform independently of the chosen OS, to confirm or invalidate it, and especially to allow its reuse to produce new results. This procedure is of little help, however, without a minimum of methodological support. In fact, the raw data sets and the software are difficult to exploit without the logic that guided their use or their production. This led us to think that, in addition to the data sets and the software, an additional element must be provided: the workflow that ties all of them together.
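
The journal's actual submission format is not described above; purely as a sketch of what providing the workflow alongside the data and software might look like, the snippet below declares the steps that relate raw data to published results and re-runs them in order. The script names and structure are hypothetical.

```python
import subprocess

# Purely illustrative declaration of a computational workflow: each step names
# its command, inputs and outputs, so a reviewer can re-run the chain that
# produced the figures and tables of a paper.  The scripts referenced here
# (clean.py, analyze.py, plot.py) are hypothetical placeholders.
WORKFLOW = [
    {"name": "clean",   "cmd": ["python", "clean.py", "raw.csv", "clean.csv"],
     "inputs": ["raw.csv"], "outputs": ["clean.csv"]},
    {"name": "analyze", "cmd": ["python", "analyze.py", "clean.csv", "results.json"],
     "inputs": ["clean.csv"], "outputs": ["results.json"]},
    {"name": "plot",    "cmd": ["python", "plot.py", "results.json", "figure1.pdf"],
     "inputs": ["results.json"], "outputs": ["figure1.pdf"]},
]

def run(workflow):
    """Execute the steps in order, failing loudly if any step breaks."""
    for step in workflow:
        print(f"running {step['name']}: {' '.join(step['cmd'])}")
        subprocess.run(step["cmd"], check=True)

if __name__ == "__main__":
    run(WORKFLOW)
```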

Relevance:

100.00%

Publisher:

Abstract:

Software evolution research has focused mostly on analyzing the evolution of single software systems. However, it is rarely the case that a project exists in isolation, independent of others. Rather, projects exist in parallel within larger contexts in companies, research groups or even open-source communities. We call these contexts software ecosystems, and in this paper we present The Small Project Observatory, a prototype tool which aims to support the analysis of project ecosystems through interactive visualization and exploration. We present a case study of exploring an ecosystem using our tool, describe the architecture of the tool, and distill the lessons learned during the tool-building experience.
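
The paper's architecture is only summarized above; as a minimal sketch of the kind of ecosystem-level data an interactive visualization like The Small Project Observatory could be fed with, the following code aggregates commit activity per project and per month. The input format and names are assumptions for illustration, not the tool's actual data model.

```python
from collections import defaultdict
from datetime import date

def activity_by_month(commits):
    """Aggregate commit counts per (project, year-month).

    commits: iterable of (project_name, commit_date) pairs.
    Returns {project: {(year, month): count}} -- the kind of ecosystem-level
    activity data a stacked timeline visualization can be built on.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for project, day in commits:
        counts[project][(day.year, day.month)] += 1
    return {p: dict(m) for p, m in counts.items()}

commits = [
    ("libcore", date(2009, 1, 12)),
    ("libcore", date(2009, 1, 30)),
    ("webapp",  date(2009, 2, 3)),
]
print(activity_by_month(commits))
# {'libcore': {(2009, 1): 2}, 'webapp': {(2009, 2): 1}}
```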

Relevance:

100.00%

Publisher:

Abstract:

The biggest challenge facing software developers today is how to gracefully evolve complex software systems in the face of changing requirements. We clearly need software systems to be more dynamic, compositional and model-centric, but instead we continue to build systems that are static, baroque and inflexible. How can we better build change-enabled systems in the future? To answer this question, we propose to look back to one of the most successful systems to support change, namely Smalltalk. We briefly introduce Smalltalk with a few simple examples, and draw some lessons for software evolution. Smalltalk's simplicity, its reflective design, and its highly dynamic nature all go a long way towards enabling change in Smalltalk applications. We then illustrate how these lessons work in practice by reviewing a number of research projects that support software evolution by exploiting Smalltalk's design. We conclude by summarizing open issues and challenges for change-enabled systems of the future.

Relevance:

100.00%

Publisher:

Abstract:

We propose an innovative, integrated, cost-effective health system to combat major non-communicable diseases (NCDs), including cardiovascular, chronic respiratory, metabolic, rheumatologic and neurologic disorders and cancers, which together are the predominant health problem of the 21st century. This proposed holistic strategy involves comprehensive patient-centered integrated care and multi-scale, multi-modal and multi-level systems approaches to tackle NCDs as a common group of diseases. Rather than studying each disease individually, it will take into account their intertwined gene-environment, socio-economic interactions and co-morbidities that lead to individual-specific complex phenotypes. It will implement a road map for predictive, preventive, personalized and participatory (P4) medicine based on a robust and extensive knowledge management infrastructure that contains individual patient information. It will be supported by strategic partnerships involving all stakeholders, including general practitioners associated with patient-centered care. This systems medicine strategy, which will take a holistic approach to disease, is designed to allow the results to be used globally, taking into account the needs and specificities of local economies and health systems.

Relevance:

100.00%

Publisher:

Abstract:

Resilience research has been applied to socioeconomic as well as agroecological studies over the last 20 years. It provides a conceptual and methodological approach for a better understanding of the interrelations between the performance of ecological and social systems. In the research area of Alto Beni, Bolivia, the production of cocoa (Theobroma cacao L.) is one of the main sources of income. Since the 1980s, farmers in the region have formed producers’ associations to enhance organic cocoa cultivation and obtain fair prices. In cooperation with the long-term system comparisons conducted by the Research Institute of Organic Agriculture (FiBL) in Alto Beni, aspects of the field trial are adapted for use in on-farm research: a comparison of soil fertility, biomass and crop diversity is combined with qualitative interviews and participatory observation methods. Fieldwork is carried out together with Bolivian students through the Swiss KFPE programme Echanges Universitaires. For the system comparisons, four different land-use types were classified according to their ecological complexity during a preliminary study in 2009: successional agroforestry systems, simple agroforestry systems (both organically managed and certified), traditional systems and conventional monocultures. The study focuses on the interrelations between different ways of cocoa cultivation, livelihoods and the related socio-cultural rationales behind them. This second aspect in particular is innovative, as it broadens the biophysical perspective into a more comprehensive evaluation including socio-ecological aspects, thereby increasing the relevance of the agronomic field studies for development policy and practice. Moreover, such a socio-ecological baseline makes it possible to assess the potential of organic agriculture for building resilience in the face of socio-environmental stress factors. Among other things, the results of the pre-study illustrate local farmers’ perceptions of climate change and its consequences for the different crop systems: all interviewees mentioned rising temperatures and/or an extended dry season as negative impacts, more with regard to their own working conditions than to their crops. This was the case in particular for conventional monocultures and for plots where slash-and-burn cultivation was practised, whereas for organic agroforestry systems the advantage of working in the shade was stressed, indicating that their relevance rises in the context of climate change.

Relevance:

100.00%

Publisher:

Abstract:

The evolution of Next Generation Networks, especially wireless broadband access technologies such as Long Term Evolution (LTE) and Worldwide Interoperability for Microwave Access (WiMAX), has increased the number of "all-IP" networks across the world. The enhanced capabilities of these access networks have spearheaded the cloud computing paradigm, where end-users aim at having services accessible anytime and anywhere. Service availability is also related to the end-user device, where one of the major constraints is battery lifetime. It is therefore necessary to assess and minimize the energy consumed by end-user devices, given its significance for the user-perceived quality of cloud computing services. In this paper, an empirical methodology to measure the energy consumption of network interfaces is proposed. Employing this methodology, an experimental evaluation of energy consumption in three different cloud computing access scenarios (including WiMAX) was performed. The empirical results show the impact of accurate network interface state management and application-level network design on energy consumption. Additionally, the outcomes can be used in further software-based models to optimize energy consumption and increase the Quality of Experience (QoE) perceived by end-users.
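
The measurement methodology itself is not detailed in the abstract; the sketch below shows a simple state-based energy model of the kind such measurements can feed, where interface energy is the sum of per-state power multiplied by the time spent in each state. The state names and power figures are illustrative placeholders, not values from the paper.

```python
# State-based energy model for a wireless interface:
# energy = sum over states of (state power) x (time spent in that state).
# The power figures below are purely illustrative placeholders.
STATE_POWER_W = {"idle": 0.01, "listen": 0.30, "transmit": 1.20, "receive": 0.90}

def interface_energy_joules(state_log):
    """state_log: iterable of (state_name, duration_seconds) entries."""
    return sum(STATE_POWER_W[state] * duration for state, duration in state_log)

# A short session dominated by receive time, followed by a long idle tail:
session = [("listen", 2.0), ("receive", 10.0), ("idle", 30.0)]
print(f"{interface_energy_joules(session):.2f} J")  # 0.6 + 9.0 + 0.3 = 9.90 J
```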

Relevance:

100.00%

Publisher:

Abstract:

Indoor location-awareness will be an inseparable feature of mobile services and applications in future wireless networks. Its ubiquitous availability is still obstructed by technological challenges and privacy issues. We propose an innovative approach to indoor positioning whose main goal is to develop a system that is self-learning and able to adapt to various radio propagation environments. The approach combines estimation of propagation conditions, subsequent appropriate channel modelling, and optimisation feedback to the positioning algorithm in use. The main advantages of the proposal are decreased system set-up effort, automatic re-calibration and increased precision.
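
The abstract does not specify the channel model; as a hedged illustration of the "estimate propagation conditions, then feed them back to the positioning algorithm" idea, the sketch below uses the common log-distance path-loss model and re-fits its exponent from calibration samples. The reference power, exponent and fitting scheme are assumptions for illustration only, not the authors' method.

```python
import math

def distance_from_rssi(rssi_dbm, rssi_at_1m_dbm=-40.0, path_loss_exponent=2.5):
    """Invert the log-distance path-loss model: rssi = rssi_1m - 10*n*log10(d).

    The reference power and exponent are environment-dependent; a self-learning
    system would re-estimate them on-line instead of using fixed values.
    """
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

def fit_path_loss_exponent(samples, rssi_at_1m_dbm=-40.0):
    """Least-squares fit of the exponent n from (known_distance_m, rssi_dbm) pairs."""
    num = den = 0.0
    for d, rssi in samples:
        x = -10.0 * math.log10(d)    # regressor
        y = rssi - rssi_at_1m_dbm    # observed loss relative to 1 m
        num += x * y
        den += x * x
    return num / den

calibration = [(1.0, -40.0), (2.0, -48.0), (4.0, -55.0)]
n = fit_path_loss_exponent(calibration)
print(round(n, 2), round(distance_from_rssi(-52.0, path_loss_exponent=n), 2))
```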

Relevance:

100.00%

Publisher:

Abstract:

The past decade has seen the energy consumption of servers and Internet Data Centers (IDCs) skyrocket. A recent survey estimated that worldwide spending on server power and cooling has risen above $30 billion and is likely to exceed spending on new server hardware. The rapid rise in energy consumption has posed a serious threat to both energy resources and the environment, which makes green computing not only worthwhile but also necessary. This dissertation tackles the challenges of reducing the energy consumption of server systems and of reducing the costs for Online Service Providers (OSPs). Two distinct subsystems account for most of an IDC’s power: the server system, which accounts for 56% of the total power consumption of an IDC, and the cooling and humidification systems, which account for about 30% of the total power consumption. The server system dominates the energy consumption of an IDC, and its power draw can vary drastically with data center utilization. In this dissertation, we propose three models to achieve energy efficiency in web server clusters: an energy-proportional model, an optimal server allocation and frequency adjustment strategy, and a constrained Markov model. The proposed models combine Dynamic Voltage/Frequency Scaling (DVFS) and Vary-On/Vary-Off (VOVF) mechanisms that work together for greater energy savings, and corresponding strategies are proposed to deal with the transition overheads. We further extend server energy management to the IDC’s cost management, helping OSPs to conserve energy, manage their electricity costs, and lower their carbon emissions. We have developed an optimal energy-aware load dispatching strategy that periodically maps more requests to the locations with lower electricity prices. A carbon emission limit is imposed, and the volatility of the carbon offset market is also considered. Two energy-efficient strategies are applied to the server system and the cooling system, respectively. With the rapid development of cloud services, we also carry out research to reduce server energy consumption in cloud computing environments. In this work, we propose a new live virtual machine (VM) placement scheme that can effectively map VMs to Physical Machines (PMs) with substantial energy savings in a heterogeneous server cluster. A VM/PM mapping probability matrix is constructed, in which each VM request is assigned a probability of running on each PM. The VM/PM mapping probability matrix takes into account resource limitations, VM operation overheads, server reliability as well as energy efficiency. The evolution of Internet Data Centers and the increasing demands of web services raise great challenges for improving the energy efficiency of IDCs, and we also identify several potential areas for future research in each chapter.
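
The optimal dispatching strategy itself is not given in the abstract; as a simplified, greedy stand-in for the idea of mapping more requests to the locations with lower electricity prices, the following sketch assigns a request volume to data centers ordered by price, subject to capacity. Carbon limits and transition overheads are deliberately omitted, and all names and numbers are illustrative.

```python
def dispatch_requests(total_requests, datacenters):
    """Greedy price-aware dispatching sketch.

    datacenters: list of dicts with 'name', 'capacity' (requests servable this
    period) and 'price' ($/kWh).  Requests are mapped to the cheapest locations
    first, subject to capacity.
    """
    plan = {}
    remaining = total_requests
    for dc in sorted(datacenters, key=lambda d: d["price"]):
        share = min(remaining, dc["capacity"])
        plan[dc["name"]] = share
        remaining -= share
        if remaining == 0:
            break
    if remaining > 0:
        raise ValueError(f"insufficient capacity: {remaining} requests unassigned")
    return plan

datacenters = [
    {"name": "dc-east",  "capacity": 4000, "price": 0.12},
    {"name": "dc-west",  "capacity": 6000, "price": 0.08},
    {"name": "dc-north", "capacity": 5000, "price": 0.10},
]
print(dispatch_requests(9000, datacenters))
# {'dc-west': 6000, 'dc-north': 3000}
```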

Relevance:

100.00%

Publisher:

Abstract:

The existence, morphology and dynamics of micro-scale gas-liquid interfaces are investigated numerically and experimentally. These studies can be used to assess liquid management issues in microsystems such as PEMFC gas flow channels, and are meant to open new research perspectives in two-phase flow, particularly in film deposition on non-wetting surfaces. For example, the critical plug volume data can be used to deliver plugs of a desired length or to determine the plug formation frequency. The dynamics of gas-liquid interfaces, of interest for applications involving small passages (e.g. heat exchangers, phase separators and filtration systems), was investigated using high-speed microscopy, a method that also proved useful for the study of film deposition processes. The existence limit for a liquid plug forming in a mixed-wetting channel is determined by numerical simulations using Surface Evolver. The plug model simulates actual conditions in the gas flow channels of PEM fuel cells, the wetting of the gas diffusion layer (GDL) side of the channel being different from the wetting of the bipolar plate walls. The minimum plug volume, denoted as the critical volume, is computed for a series of GDL and bipolar plate wetting properties. The critical volume data are meant to assist in the water management of PEMFCs when corroborated with experimental data. The effect of cross-section geometry is assessed by computing the critical volume in square and trapezoidal channels. Droplet simulations show that water can be passively removed from the GDL surface towards the bipolar plate if we take advantage of the differing wetting properties of the two surfaces, possibly avoiding the blockage of gas transport through the GDL. High-speed microscopy was employed in two-phase and film deposition experiments with water in round and square capillary tubes. Periodic interface destabilization was observed, and the existence of compression waves in the gas phase is discussed by considering a naturally occurring convergent-divergent nozzle formed by the flowing liquid phase. The effect of channel geometry and wetting properties was investigated through two-phase water-air flow in square and round microchannels having static contact angles of 20, 80 and 105 degrees. Four different flow regimes are observed for a fixed flow rate, which is thought to be caused by the wetting behavior of the liquid flowing in the corners as well as by the stability of the liquid film. Film deposition experiments in wetting and non-wetting round microchannels show that a thicker film is deposited under wetting conditions departing from the ideal 0-degree contact angle. A dependence of the film thickness on the contact angle theta as well as on the capillary number, of the form h_R ~ Ca^(2/3)/cos(theta), is inferred from scaling arguments for contact angles smaller than 36 degrees. Non-wetting film deposition experiments reveal that a film significantly thicker than the wetting Bretherton film is deposited. A hydraulic jump occurs if critical conditions are met, as given by a proposed nondimensional parameter similar to the Froude number. Film thickness correlations are also found by matching the measured velocity and the velocity derived from the shock theory. The surface wetting as well as the presence of the shock cause morphological changes in the Taylor bubble flow.
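
As a small numerical illustration of the quoted scaling h_R ~ Ca^(2/3)/cos(theta), the snippet below evaluates it for a few contact angles. The proportionality constant is not given in the abstract, so the classical Bretherton coefficient 1.34 for the perfectly wetting limit is used only as a placeholder.

```python
import math

def film_thickness_ratio(capillary_number, contact_angle_deg, prefactor=1.34):
    """Evaluate the scaling h_R ~ Ca^(2/3) / cos(theta) quoted in the abstract.

    The 1.34 prefactor is the classical Bretherton coefficient for the
    perfectly wetting (theta = 0) limit, used here only as a placeholder;
    the abstract gives the scaling, not the constant.  Stated validity:
    contact angles below 36 degrees.
    """
    if contact_angle_deg >= 36.0:
        raise ValueError("scaling stated only for contact angles below 36 degrees")
    theta = math.radians(contact_angle_deg)
    return prefactor * capillary_number ** (2.0 / 3.0) / math.cos(theta)

# The film thickens as the contact angle departs from the ideal 0 degrees:
for angle in (0.0, 20.0, 35.0):
    print(angle, round(film_thickness_ratio(1e-3, angle), 5))
```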

Relevance:

100.00%

Publisher:

Abstract:

Information management and geoinformation systems (GIS) have become indispensable in a large majority of protected areas all over the world. These tools are used for management purposes as well as for research, and in recent years have become even more important for visitor information, education and communication. This study is divided into two parts: the first part provides a general overview of GIS and information management in a selected number of national park organizations. The second part lists and evaluates the needs of the evolving large protected areas in Switzerland. The results show a wide use of GIS and information management tools in well-established protected areas. The more isolated use of individual GIS tools has increasingly been replaced by integrated geoinformation management. However, interview partners pointed out that human resources for GIS in most parks are limited. The interviews also highlight uneven access to national geodata. The view of integrated geoinformation management is not yet fully developed in the park projects in Switzerland. Short-term needs, such as software and data availability, motivate a large number of the responses collected within an exhaustive questionnaire. Nevertheless, the need for coordinated action has been identified and should be followed up. The park organizations in North America show how effective coordination and cooperation might be organized.

Relevance:

100.00%

Publisher:

Abstract:

Software must be constantly adapted to changing requirements. The time scale, abstraction level and granularity of adaptations may vary from short-term, fine-grained adaptation to long-term, coarse-grained evolution. Fine-grained, dynamic and context-dependent adaptations can be particularly difficult to realize in long-lived, large-scale software systems. We argue that, in order to effectively and efficiently deploy such changes, adaptive applications must be built on an infrastructure that is not just model-driven, but is both model-centric and context-aware. Specifically, this means that high-level, causally-connected models of the application and the software infrastructure itself should be available at run-time, and that changes may need to be scoped to the run-time execution context. We first review the dimensions of software adaptation and evolution, and then we show how model-centric design can address the adaptation needs of a variety of applications that span these dimensions. We demonstrate through concrete examples how model-centric and context-aware designs work at the level of application interface, programming language and runtime. We then propose a research agenda for a model-centric development environment that supports dynamic software adaptation and evolution.
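
The paper's infrastructure is not reproduced here; as a toy analogue of scoping a behavioural change to the run-time execution context, the following sketch activates a method variant only inside an explicitly opened context. The layer mechanism and all names are illustrative, not the authors' design.

```python
import contextvars
from contextlib import contextmanager

# Toy analogue of context-scoped behavioural adaptation: a method variant is
# active only inside a given execution context, instead of being patched
# globally.  This illustrates scoping changes to the run-time context; it is
# not the model-centric infrastructure proposed in the paper.
_active_layers = contextvars.ContextVar("active_layers", default=frozenset())

@contextmanager
def with_layer(name):
    token = _active_layers.set(_active_layers.get() | {name})
    try:
        yield
    finally:
        _active_layers.reset(token)

class Logger:
    def log(self, message):
        if "verbose" in _active_layers.get():        # context-dependent variant
            return f"[verbose] {message} (layers={sorted(_active_layers.get())})"
        return message                               # default behaviour

logger = Logger()
print(logger.log("request served"))                  # default
with with_layer("verbose"):
    print(logger.log("request served"))              # adapted within this scope
```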

Relevance:

100.00%

Publisher:

Abstract:

Many reverse engineering approaches have been developed to analyze software systems written in different languages like C/C++ or Java. These approaches typically rely on a meta-model that is either specific to the language at hand or language-independent (e.g. UML). However, one language that has hardly been addressed is Lisp. While at first sight it can be accommodated by current language-independent meta-models, Lisp has some unique features (e.g. macros, CLOS entities) that are crucial for reverse engineering Lisp systems. In this paper we propose a suite of new visualizations that reveal the special traits of the Lisp language and thus help in understanding complex Lisp systems. To validate our approach we apply these visualizations to several large Lisp case studies, and summarize our experience in terms of a series of recurring visual patterns that we have detected.
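
The paper's meta-model and visualizations are not reproduced here; as a crude stand-in for the kind of Lisp-specific data they are built on, the sketch below counts top-level defining forms (including macros and CLOS classes) in Common Lisp source text with a regular expression. A real tool would parse the code rather than pattern-match it.

```python
import re
from collections import Counter

# Count Lisp defining forms as a rough proxy for the macro/CLOS entities that
# a Lisp-aware model would capture (regex-based, illustration only).
DEFINING_FORMS = ("defun", "defmacro", "defclass", "defmethod", "defgeneric")

def count_defining_forms(source):
    pattern = re.compile(r"\(\s*(" + "|".join(DEFINING_FORMS) + r")\b", re.IGNORECASE)
    return Counter(match.group(1).lower() for match in pattern.finditer(source))

source = """
(defmacro with-logging (&body body) `(progn (log-entry) ,@body))
(defclass sensor () ((id :initarg :id)))
(defun read-sensor (s) (slot-value s 'id))
"""
print(count_defining_forms(source))
# Counter({'defmacro': 1, 'defclass': 1, 'defun': 1})
```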

Relevance:

100.00%

Publisher:

Abstract:

Industrial software systems are large and complex, both in terms of the software entities and their relationships. Consequently, understanding how a software system works requires the ability to pose queries over the design-level entities of the system. Traditionally, this task has been supported by simple tools (e.g., grep) combined with the programmer's intuition and experience. Recently, however, specialized code query technologies have matured to the point where they can be used in industrial situations, providing more intelligent, timely, and efficient responses to developer queries. This working session aims to explore the state of the art in code query technologies, and discover new ways in which these technologies may be useful in program comprehension. The session brings together researchers and practitioners. We survey existing techniques and applications, trying to understand the strengths and weaknesses of the various approaches, and sketch out new frontiers that hold promise.
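
As a minimal, hedged example of the kind of question a code query technology answers ("where is this function called?"), the following sketch runs such a query over Python source using the standard ast module; industrial tools operate on richer, design-level models and on other languages.

```python
import ast

def find_call_sites(source, function_name):
    """Return the line numbers of all direct calls to `function_name`."""
    tree = ast.parse(source)
    sites = []
    for node in ast.walk(tree):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id == function_name):
            sites.append(node.lineno)
    return sorted(sites)

source = """
def connect(): ...
def handler():
    connect()

connect()
"""
print(find_call_sites(source, "connect"))  # [4, 6]
```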