49 results for Computing and software systems
Abstract:
Location-awareness indoors will be an inseparable feature of mobile services/applications in future wireless networks. Its ubiquitous availability is still obstructed by technological challenges and privacy issues. We propose an innovative approach to indoor positioning whose main goal is to develop a system that is self-learning and able to adapt to various radio propagation environments. The approach combines estimation of propagation conditions, subsequent appropriate channel modelling, and optimisation feedback to the positioning algorithm in use. The main advantages of the proposal are decreased system set-up effort, automatic re-calibration and increased precision.
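To make the self-calibration idea concrete, here is a minimal, purely illustrative sketch (not the authors' system; all names and numbers are hypothetical): distances are estimated from RSSI with a standard log-distance path-loss model, and the path-loss exponent is re-estimated from measurements at known reference points, mimicking the automatic re-calibration step.

```python
# Illustrative sketch, not the proposed system: RSSI ranging with a
# log-distance path-loss model whose exponent is re-fitted from reference
# measurements, imitating the "self-learning / re-calibration" idea.
import math

def rssi_to_distance(rssi_dbm, rssi_at_1m=-40.0, path_loss_exp=2.0):
    """Invert the log-distance model: rssi = rssi_at_1m - 10*n*log10(d)."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exp))

def recalibrate_exponent(reference_points, rssi_at_1m=-40.0):
    """Least-squares fit of the path-loss exponent n from (rssi, true_distance)
    pairs collected at known reference positions."""
    num, den = 0.0, 0.0
    for rssi, true_d in reference_points:
        x = -10.0 * math.log10(true_d)   # regressor
        y = rssi - rssi_at_1m            # observation
        num += x * y
        den += x * x
    return num / den if den else 2.0

# Hypothetical measurements at known 2 m and 8 m positions drive re-calibration.
refs = [(-49.0, 2.0), (-67.0, 8.0)]
n = recalibrate_exponent(refs)
print(round(n, 2), round(rssi_to_distance(-58.0, path_loss_exp=n), 1))
```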
Abstract:
Information management and geoinformation systems (GIS) have become indispensable in the large majority of protected areas all over the world. These tools are used for management purposes as well as for research, and in recent years have become even more important for visitor information, education and communication. This study is divided into two parts: the first part provides a general overview of GIS and information management in a selected number of national park organizations. The second part lists and evaluates the needs of evolving large protected areas in Switzerland. The results show a wide use of GIS and information management tools in well-established protected areas. The isolated use of individual GIS tools has increasingly been replaced by integrated geoinformation management. However, interview partners pointed out that human resources for GIS in most parks are limited. The interviews also highlight uneven access to national geodata. The vision of integrated geoinformation management is not yet fully developed in the park projects in Switzerland. Short-term needs, such as software and data availability, motivate a large number of the responses collected within an exhaustive questionnaire. Nevertheless, the need for coordinated action has been identified and should be followed up. The park organizations in North America show how effective coordination and cooperation might be organized.
Abstract:
Software must be constantly adapted to changing requirements. The time scale, abstraction level and granularity of adaptations may vary from short-term, fine-grained adaptation to long-term, coarse-grained evolution. Fine-grained, dynamic and context-dependent adaptations can be particularly difficult to realize in long-lived, large-scale software systems. We argue that, in order to effectively and efficiently deploy such changes, adaptive applications must be built on an infrastructure that is not just model-driven, but is both model-centric and context-aware. Specifically, this means that high-level, causally-connected models of the application and the software infrastructure itself should be available at run-time, and that changes may need to be scoped to the run-time execution context. We first review the dimensions of software adaptation and evolution, and then we show how model-centric design can address the adaptation needs of a variety of applications that span these dimensions. We demonstrate through concrete examples how model-centric and context-aware designs work at the level of application interface, programming language and runtime. We then propose a research agenda for a model-centric development environment that supports dynamic software adaptation and evolution.
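As a rough illustration of scoping behaviour to the run-time execution context, the following sketch (not the authors' infrastructure; all names are invented) keeps a small run-time model of behaviour variants and active contexts and dispatches to the variant selected by the innermost active context.

```python
# Minimal sketch of context-scoped adaptation. Behaviour variants are held in
# a run-time model and selected by the currently active execution context.
from contextlib import contextmanager

class AdaptiveMethod:
    def __init__(self, default):
        self.default = default
        self.variants = {}   # context name -> callable (run-time model)
        self.active = []     # stack of currently active contexts

    def variant(self, context):
        def register(fn):
            self.variants[context] = fn
            return fn
        return register

    @contextmanager
    def activate(self, context):
        self.active.append(context)
        try:
            yield
        finally:
            self.active.pop()

    def __call__(self, *args, **kwargs):
        for ctx in reversed(self.active):   # innermost active context wins
            if ctx in self.variants:
                return self.variants[ctx](*args, **kwargs)
        return self.default(*args, **kwargs)

render = AdaptiveMethod(lambda data: f"full page: {data}")

@render.variant("mobile")
def render_mobile(data):
    return f"compact page: {data}"

print(render("news"))              # full page: news
with render.activate("mobile"):
    print(render("news"))          # compact page: news
```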
Abstract:
Many reverse engineering approaches have been developed to analyze software systems written in different languages such as C/C++ or Java. These approaches typically rely on a meta-model that is either specific to the language at hand or language-independent (e.g., UML). However, one language that has hardly been addressed is Lisp. While at first sight it can be accommodated by current language-independent meta-models, Lisp has some unique features (e.g., macros, CLOS entities) that are crucial for reverse engineering Lisp systems. In this paper we propose a suite of new visualizations that reveal the special traits of the Lisp language and thus help in understanding complex Lisp systems. To validate our approach we apply these visualizations to several large Lisp case studies and summarize our experience in terms of a series of recurring visual patterns that we have detected.
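A hedged sketch of the kind of Lisp-specific data such visualizations could be built on (not the paper's tool; the toy model facts are invented): per-package counts of functions, macros and CLOS entities that a view could then map to visual properties such as node size or shade.

```python
# Illustrative only: tally Lisp-specific entity kinds per package so a
# visualization could map the counts to visual properties.
from collections import Counter, defaultdict

model = [  # (package, entity kind) facts assumed to come from a Lisp parser
    ("UTIL", "function"), ("UTIL", "macro"), ("UTIL", "macro"),
    ("CLOS-UI", "class"), ("CLOS-UI", "method"), ("CLOS-UI", "method"),
]

per_package = defaultdict(Counter)
for package, kind in model:
    per_package[package][kind] += 1

for package, counts in sorted(per_package.items()):
    # e.g. node width = #functions, height = #macros, shade = #CLOS entities
    print(package, dict(counts))
```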
Abstract:
Industrial software systems are large and complex, both in terms of the software entities and their relationships. Consequently, understanding how a software system works requires the ability to pose queries over the design-level entities of the system. Traditionally, this task has been supported by simple tools (e.g., grep) combined with the programmer's intuition and experience. Recently, however, specialized code query technologies have matured to the point where they can be used in industrial situations, providing more intelligent, timely, and efficient responses to developer queries. This working session aims to explore the state of the art in code query technologies, and discover new ways in which these technologies may be useful in program comprehension. The session brings together researchers and practitioners. We survey existing techniques and applications, trying to understand the strengths and weaknesses of the various approaches, and sketch out new frontiers that hold promise.
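As a toy illustration of what a design-level code query offers beyond grep (not any specific tool's API; the model facts are invented), the following sketch answers "who calls X, directly or transitively?" over a small call-graph model.

```python
# Hedged sketch of a design-level code query over a toy program model.
calls = {  # caller -> set of callees (illustrative data only)
    "OrderService.submit": {"PaymentGateway.charge", "AuditLog.write"},
    "AdminConsole.refund": {"PaymentGateway.charge"},
    "PaymentGateway.charge": {"AuditLog.write"},
}

def callers_of(target):
    """Direct callers of a method."""
    return {m for m, callees in calls.items() if target in callees}

def transitive_callers(target, seen=None):
    """All methods from which the target is reachable via the call graph."""
    seen = seen or set()
    for caller in callers_of(target) - seen:
        seen.add(caller)
        transitive_callers(caller, seen)
    return seen

print(sorted(callers_of("AuditLog.write")))
print(sorted(transitive_callers("AuditLog.write")))
```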
Abstract:
Enterprise Applications are complex software systems that manipulate large amounts of persistent data and interact with the user through a vast and complex user interface. In particular, applications written for the Java 2 Platform, Enterprise Edition (J2EE) are composed using various technologies such as Enterprise Java Beans (EJB) or Java Server Pages (JSP) that in turn rely on languages other than Java, such as XML or SQL. In this heterogeneous context, applying existing reverse engineering and quality assurance techniques developed for object-oriented systems is not enough. Because those techniques have been created to measure quality or provide information about a single aspect of J2EE applications, they cannot properly measure the quality of the entire system. We intend to devise techniques and metrics to measure quality in J2EE applications considering all their aspects and to aid their evolution. Using software visualization we also intend to inspect the structure of J2EE applications and all other aspects that can be investigated through this technique. In order to do that we also need to create a unified meta-model including all elements composing a J2EE application.
Abstract:
In rapidly evolving domains such as Computer Assisted Orthopaedic Surgery (CAOS), emphasis is often put first on innovation and new functionality rather than on developing the common infrastructure needed to support integration and reuse of these innovations. In fact, developing such an infrastructure is often considered a high-risk venture given the volatility of such a domain. We present CompAS, a method that exploits the very evolution of innovations in the domain to carry out the necessary quantitative and qualitative commonality and variability analysis, especially in the case of scarce system documentation. We show how our technique applies to the CAOS domain by using conference proceedings as a key source of information about the evolution of features in CAOS systems over a period of several years. We detect and classify evolution patterns to determine functional commonality and variability. We also identify non-functional requirements to help capture domain variability. We have validated our approach by evaluating the degree to which representative test systems can be covered by the common and variable features produced by our analysis.
Abstract:
The IDE used in most Smalltalk dialects such as Pharo, Squeak or Cincom Smalltalk has not evolved significantly over the last years, if not to say decades. For other languages, for instance Java, the available IDEs have made tremendous progress, as Eclipse or NetBeans illustrate. While the Smalltalk IDE served as an exemplar for many years, other IDEs have caught up or even overtaken the erstwhile leader in terms of feature-richness, usability, or code navigation facilities. In this paper we first analyze the difficulty of software navigation in the Smalltalk IDE, and second illustrate with concrete examples the features we added to the Smalltalk IDE to close the gap to modern IDEs and to provide novel, improved means to navigate source space. We show that, thanks to the agility and dynamics of Smalltalk, we are able to extend and enhance the Smalltalk IDE with reasonable effort to better support software navigation, program comprehension, and software maintenance in general. One such support is the integration of dynamic information into the familiar static source views. Other means include easing access to static information (for instance by better arranging important packages) or helping developers re-locate artifacts of interest (for example with a categorization system such as smart groups).
Abstract:
In addition to multi-national Grid infrastructures, several countries operate their own national Grid infrastructures to support science and industry within national borders. These infrastructures have the benefit of better satisfying the needs of local, regional and national user communities. Although Switzerland has strong research groups in several fields of distributed computing, a national Grid effort was only recently kick-started to integrate a truly heterogeneous set of resource providers, middleware pools, and users. In the following article we discuss our efforts to start Grid activities at a national scale that bring together several scientific communities and geographical domains. We make a strong case for the need for standards that have to be built on top of existing software systems in order to support a heterogeneous Grid infrastructure.
Abstract:
The Business and Information Technologies (BIT) project strives to reveal new insights into how modern IT impacts organizational structures and business practices, using empirical methods. Due to its international scope, it allows for inter-country comparison of empirical results. Germany, represented by the European School of Management and Technology (ESMT) and the Institute of Information Systems at Humboldt-Universität zu Berlin, joined the BIT project in 2006. This report presents the results of the first survey conducted in Germany during November–December 2006. The key results are as follows:
• The most widely adopted technologies and systems in Germany are websites, wireless hardware and software, groupware/productivity tools, and enterprise resource planning (ERP) systems. The biggest potential for growth exists for collaboration and portal tools, content management systems, business process modelling, and business intelligence applications. A number of technological solutions have not yet been adopted by many organizations but also bear some potential, in particular identity management solutions, Radio Frequency Identification (RFID), biometrics, and third-party authentication and verification.
• IT security remains at the top of the agenda for most enterprises: budget spending has been increasing over the last three years.
• The workplace and work requirements are changing. IT is used to monitor employees' performance in Germany, but less heavily than in the United States (Karmarkar and Mangal, 2007). The demand for IT skills is increasing at all corporate levels. Executives are asking for more and better structured information, and this, in turn, triggers the appearance of new decision-making tools and online technologies on the market.
• The internal reorganization of companies in Germany is underway: organizations are becoming flatter, even though the trend is not as pronounced as in the United States (Karmarkar and Mangal, 2007), and the geographical scope of their operations is increasing. Modern IT plays an important role in enabling this development; e.g. telecommuting, teleconferencing, and other web-based collaboration formats are becoming increasingly popular in the corporate context.
• The degree to which outsourcing is being pursued is quite limited, with little change expected. IT services, payroll, and market research are the most widely outsourced business functions. This corresponds to the results from other countries.
• Up to now, the adoption of e-business technologies has had a rather limited effect on marketing functions. Companies tend to extract synergies from traditional printed media and online advertising.
• The adoption of e-business has not yet had a major impact on marketing capabilities and strategy. Traditional methods of customer segmentation still dominate. The corporate identity of most organizations does not change significantly when going online.
• Online sales channels are mainly viewed as a complement to traditional distribution means.
• Technology adoption has caused production and organizational costs to decrease. However, the costs of technology acquisition and maintenance as well as consultancy and internal communication costs have increased.
Abstract:
Stata is a general-purpose software package that has become popular in various disciplines such as epidemiology, economics, or the social sciences. Users like Stata for its scientific approach, its robustness and reliability, and the ease with which its functionality can be extended by user-written programs. In this talk I will first give a brief overview of the functionality of Stata and then discuss two specific features: survey estimation and predictive margins/marginal effects. Most surveys are based on complex samples that contain multiple sampling stages, are stratified or clustered, and feature unequal selection probabilities. Standard estimators can produce misleading results in such samples unless the peculiarities of the sampling plan are taken into account. Stata offers survey statistics for complex samples for a wide variety of estimators and supports several variance estimation procedures such as linearization, jackknife, and balanced repeated replication (see Kreuter and Valliant, 2007, Stata Journal 7: 1-21). In the talk I will illustrate these features using applied examples and show how user-written commands can be adapted to support complex samples. The models we fit to our data can also be complex, making them difficult to interpret, especially in the case of nonlinear or non-additive models (Mood, 2010, European Sociological Review 26: 67-82). Stata provides a number of highly useful commands that make the results of such models accessible by computing and displaying predictive margins and marginal effects. In my talk I will discuss these commands and provide various examples demonstrating their use.
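For readers without Stata, here is a rough Python/statsmodels analogue of average marginal effects for a logit model, in the spirit of what Stata's margins command reports (synthetic data; a sketch of the concept, not a reproduction of the talk's examples):

```python
# Sketch: average marginal effects of a logit model with statsmodels,
# roughly analogous to Stata's -margins, dydx(*)-. Data are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
age = rng.uniform(20, 70, n)
income = rng.normal(50, 15, n)
latent = -4 + 0.05 * age + 0.02 * income + rng.logistic(size=n)
y = (latent > 0).astype(int)

X = sm.add_constant(np.column_stack([age, income]))
fit = sm.Logit(y, X).fit(disp=False)
ame = fit.get_margeff(at="overall")   # average marginal effects
print(ame.summary())
```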
Abstract:
The main aim of this article is to shed light on the extent to which differences in higher education participation between people with and without a migrant background, of low or higher social origin, can be explained by two macro-level characteristics of national educational institutions: the stratification of the secondary school system and the provision of alternative access to higher education. The general assumptions are that people with a migrant background and of low social origin benefit more than their native peers in the same social stratum from low-stratified secondary school systems and from systems that provide alternative access to institutions of higher education, owing to primary and secondary effects of migrant background. The database is a pooled dataset of five waves of the European Social Survey. Results of logistic multi-level analyses indicate that a low-stratified secondary school system improves the probability of people with a migrant background and low social origin attaining a higher education degree, whereas a stratified secondary school system reduces their chances at this educational stage. The provision of alternative access to an institution of higher education improves their likelihood of becoming higher education graduates.
Abstract:
Mobile ad-hoc networks (MANETs) and wireless sensor networks (WSNs) have been attracting increasing attention for decades due to their broad civilian and military applications. Basically, a MANET or WSN is a network of nodes connected by wireless communication links. Due to the limited transmission range of the radio, many pairs of nodes in MANETs or WSNs may not be able to communicate directly and hence need intermediate nodes to forward packets for them. Routing in such networks is an important issue and poses great challenges due to the dynamic nature of MANETs and WSNs. On the one hand, the open-air nature of wireless environments brings many difficulties when an efficient routing solution is required. The wireless channel is unreliable due to fading and interference, which makes it difficult to maintain a high-quality path from a source node to a destination node. Additionally, node mobility aggravates network dynamics, which causes frequent topology changes and brings significant overheads for maintaining and recalculating paths. Furthermore, mobile devices and sensors are usually constrained in battery capacity, computing and communication resources, which imposes limitations on the functionality of routing protocols. On the other hand, the wireless medium possesses inherent unique characteristics which can be exploited to enhance transmission reliability and routing performance. Opportunistic routing (OR) is one promising technique that takes advantage of the spatial diversity and broadcast nature of the wireless medium to improve packet forwarding reliability in multihop wireless communication. OR combats unreliable wireless links by involving multiple neighboring nodes (forwarding candidates) in the choice of packet forwarders. In opportunistic routing, a source node does not require an end-to-end path to transmit packets; the packet forwarding decision is made hop by hop in a fully distributed fashion. Motivated by the deficiencies of existing opportunistic routing protocols in dynamic environments such as mobile ad-hoc networks or wireless sensor networks, this thesis proposes a novel context-aware adaptive opportunistic routing scheme. Our proposal selects packet forwarders by simultaneously exploiting multiple types of cross-layer context information about nodes and their environment, and significantly outperforms routing protocols that rely solely on a single metric. The adaptivity of our proposal enables network nodes to adjust their behavior at run-time according to network conditions. To accommodate the strict energy constraints in WSNs, this thesis integrates an adaptive duty-cycling mechanism into opportunistic routing for wireless sensor nodes. Our approach dynamically adjusts the sleeping intervals of sensor nodes according to the monitored traffic load and the estimated energy consumption rate. Through the integration of duty cycling of sensor nodes and opportunistic routing, our protocol is able to provide a satisfactory balance between good routing performance and energy efficiency for WSNs.
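A purely illustrative sketch of candidate selection from multiple cross-layer metrics, in the general spirit described above (not the thesis' protocol; the metrics, weights and values are hypothetical):

```python
# Illustrative sketch: score forwarding candidates from several cross-layer
# metrics instead of a single one, then pick a candidate set for one
# opportunistic transmission. Weights and metric values are hypothetical.
def score(neighbor, weights=(0.5, 0.3, 0.2)):
    """Combine link quality, geographic progress and residual energy
    (all normalised to [0, 1]) into one forwarding score."""
    w_link, w_prog, w_energy = weights
    return (w_link * neighbor["link_quality"]
            + w_prog * neighbor["progress"]
            + w_energy * neighbor["energy"])

def select_candidates(neighbors, max_candidates=3):
    """Rank neighbors by score; the sender broadcasts, and the highest-ranked
    candidate that receives the packet forwards it (decision made hop by hop,
    no end-to-end path is maintained)."""
    ranked = sorted(neighbors, key=score, reverse=True)
    return ranked[:max_candidates]

neighbors = [
    {"id": "A", "link_quality": 0.9, "progress": 0.4, "energy": 0.8},
    {"id": "B", "link_quality": 0.6, "progress": 0.9, "energy": 0.5},
    {"id": "C", "link_quality": 0.3, "progress": 0.7, "energy": 0.9},
]
print([n["id"] for n in select_candidates(neighbors, 2)])
```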
Abstract:
PURPOSE To develop a method for computing and visualizing pressure differences derived from time-resolved velocity-encoded three-dimensional phase-contrast magnetic resonance imaging (4D flow MRI) and to compare pressure difference maps of patients with unrepaired and repaired aortic coarctation to young healthy volunteers. METHODS 4D flow MRI data of four patients with aortic coarctation either before or after repair (mean age 17 years, age range 3-28, one female, three males) and four young healthy volunteers without history of cardiovascular disease (mean age 24 years, age range 20-27, one female, three males) was acquired using a 1.5-T clinical MR scanner. Image analysis was performed with in-house developed image processing software. Relative pressures were computed based on the Navier-Stokes equation. RESULTS A standardized method for intuitive visualization of pressure difference maps was developed and successfully applied to all included patients and volunteers. Young healthy volunteers exhibited smooth and regular distribution of relative pressures in the thoracic aorta at mid systole with very similar distribution in all analyzed volunteers. Patients demonstrated disturbed pressures compared to volunteers. Changes included a pressure drop at the aortic isthmus in all patients, increased relative pressures in the aortic arch in patients with residual narrowing after repair, and increased relative pressures in the descending aorta in a patient after patch aortoplasty. CONCLUSIONS Pressure difference maps derived from 4D flow MRI can depict alterations of spatial pressure distribution in patients with repaired and unrepaired aortic coarctation. The technique might allow identifying pathophysiological conditions underlying complications after aortic coarctation repair.
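For context, relative pressure computations from 4D flow MRI are typically based on the pressure-gradient form of the incompressible Navier-Stokes equation; a generic statement of that relation (not necessarily the exact formulation or discretisation used in this study) is:

```latex
% rho: blood density, mu: dynamic viscosity, v: velocity field from 4D flow MRI
\nabla p = -\rho \left( \frac{\partial \mathbf{v}}{\partial t}
         + (\mathbf{v} \cdot \nabla)\, \mathbf{v} \right)
         + \mu \, \nabla^{2} \mathbf{v}
```

Relative pressure maps are then obtained by spatially integrating the estimated pressure gradient from a chosen reference location.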
Abstract:
Commoditization and virtualization of wireless networks are changing the economics of mobile networks, helping network providers (e.g., MNOs, MVNOs) move from proprietary and bespoke hardware and software platforms toward an open, cost-effective, and flexible cellular ecosystem. In addition, rich and innovative local services can be created efficiently through cloudification by leveraging the existing infrastructure. In this work, we present RANaaS, a cloudified radio access network delivered as a service. RANaaS provides the service life-cycle of an on-demand, elastic, and pay-as-you-go 3GPP RAN instantiated on top of a cloud infrastructure. We demonstrate an example of a real-time cloudified LTE network deployment using the OpenAirInterface LTE implementation and OpenStack running on commodity hardware, as well as the flexibility and performance of the platform developed.