882 results for software management infrastructure
Abstract:
The technological world and wireless networks are evolving rapidly. Electronic devices gain more capabilities and resources every year, which makes users increasingly demanding. The need to stay connected to the global world has led to the deployment of wireless access points in cities, providing internet access so that people can remain in constant interaction with the world. Vehicular networks emerged to support safety-related applications and to improve traffic flow on the roads; nowadays, however, they are also used to provide entertainment to the occupants of the vehicles. The best way to increase the utilization of vehicular networks is to give users what they want: a constant connection to the internet. Despite all the advances in vehicular networks, several issues remained to be solved. Dedicated infrastructure for vehicular networks is not yet widespread, which leads to the need to use the available Wi-Fi hotspots and cellular networks as access networks. A mobility protocol is needed to manage the mobility process and to keep the user's connection and session active. Taking into account the large number of access points within range of a vehicle, for example in a city, it is beneficial to take advantage of all available resources in order to improve the whole vehicular network, both for the users and for the operators. The concept of multihoming makes it possible to exploit all available resources through multiple simultaneous connections.
The objectives of this dissertation are: the integration of a mobility protocol, the Network-Proxy Mobile IPv6 protocol, with a per-packet host-multihoming solution, in order to increase network performance by using more resources simultaneously; the support of multi-hop communications, in both IPv6 and IPv4; the capability of providing internet access to the users of the network; and the integration of the developed protocol in the vehicular environment, with the WAVE, Wi-Fi and cellular technologies. The tests performed focused on the multihoming features implemented in this dissertation and on IPv4 network access for ordinary users. The results obtained show that adding multihoming to the mobility protocol improves network performance and provides better resource management. The results also show the correct operation of the developed protocol in a vehicular environment.
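The per-packet multihoming idea described above can be illustrated with a small scheduler sketch. This is a hypothetical illustration, not the dissertation's actual implementation: the interface names (wave0, wlan0, cell0), the capacity figures and the deficit-weighted policy are all assumptions made here for demonstration.

```python
# Hypothetical sketch only: interface names, capacities and the
# deficit-weighted policy are illustrative assumptions, not the
# dissertation's actual implementation.
from dataclasses import dataclass

@dataclass
class Link:
    name: str             # e.g. "wave0" (WAVE), "wlan0" (Wi-Fi), "cell0" (cellular)
    capacity_mbps: float  # estimated available bandwidth on this interface
    credit: float = 0.0   # running scheduling credit (deficit-style)

def schedule(links, n_packets):
    """Assign each outgoing packet to one link so that, over time, the
    packet split is proportional to the links' capacities."""
    total = sum(l.capacity_mbps for l in links)
    assignment = []
    for _ in range(n_packets):
        for l in links:
            l.credit += l.capacity_mbps   # every link earns credit each round
        best = max(links, key=lambda l: l.credit)
        best.credit -= total              # the chosen link pays the round's total
        assignment.append(best.name)
    return assignment

links = [Link("wave0", 6.0), Link("wlan0", 3.0), Link("cell0", 1.0)]
out = schedule(links, 10)  # 10 packets split 6:3:1 across the three links
```

With capacities in a 6:3:1 ratio, ten consecutive packets are split 6:3:1 across the three links, so traffic aggregates over WAVE, Wi-Fi and cellular simultaneously rather than using one access network at a time.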
Abstract:
Dissertation presented to the Instituto Politécnico de Castelo Branco in fulfilment of the requirements for the degree of Master in Software Development and Interactive Systems, carried out under the scientific supervision of Doctor Fernando Reinaldo Silva Garcia Ribeiro and Doctor José Carlos Meireles Monteiro Metrôlho, Adjunct Professors of the Informatics Technical-Scientific Unit of the Escola Superior de Tecnologia of the Instituto Politécnico de Castelo Branco.
Abstract:
The knowledge-intensive character of software production and its rising demand suggest the need for mechanisms to properly manage the knowledge involved, in order to meet requirements of deadline, cost and quality. Knowledge capitalization is a process that ranges from the identification to the evaluation of the knowledge produced and used. For software development specifically, capitalization enables easier access to knowledge, minimizes its loss, reduces the learning curve, and avoids repeated errors and rework. This thesis therefore presents Know-Cap, a method developed to organize and guide the capitalization of knowledge in software development. Know-Cap facilitates the location, preservation, value addition and updating of knowledge, so that it can be used in the execution of new tasks. The method was derived from a set of methodological procedures: literature review, systematic review and analysis of related work. The feasibility and appropriateness of Know-Cap were analyzed through an application study conducted in a real case and an analytical study of software development companies. The results obtained indicate that Know-Cap supports the capitalization of knowledge in software development.
Abstract:
Forested areas within cities host a large number of species, responsible for many ecosystem services in urban areas. The biodiversity in these areas is influenced by human disturbances such as atmospheric pollution and the urban heat island effect. To ameliorate the effects of these factors, an increase in urban green areas is often considered sufficient. However, this approach assumes that all types of green cover have the same importance for species. Our aim was to show that not all forested green areas are of equal importance for species, and that a multi-taxa and functional diversity approach makes it possible to value green infrastructure in urban environments. After evaluating the diversity of lichens, butterflies and other arthropods, birds and mammals in 31 Mediterranean urban forests in south-west Europe (Almada, Portugal), bird and lichen functional groups responsive to urbanization were found. A community shift (tolerant species replacing sensitive ones) along the urbanization gradient was found, and this must be considered when using these groups as indicators of the effect of urbanization. Bird and lichen functional groups were then analyzed together with the characteristics of the forests and their surroundings. Our results showed that, contrary to previous assumptions, vegetation density and, more importantly, the amount of urban area around the forest (the matrix) are more important for biodiversity than forest quantity alone. This indicates that not all types of forested green areas have the same importance for biodiversity. An index of forest functional diversity was then calculated for all sampled forests of the area. This could help decision-makers improve the management of urban green infrastructure with the goal of increasing functionality and ultimately ecosystem services in urban areas.
Abstract:
Doctoral thesis—Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Elétrica, 2015.
Abstract:
Maintaining accessibility to and understanding of digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organizations. The processes of preservation planning and evaluation are fundamentally implicit and share similar complexity. Both demand comprehensive knowledge and understanding of every aspect of to-be-preserved content and the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in terms of diversity, but are integral to the establishment of classes of risk exposure, and the planning and deployment of appropriate preservation strategies. We explore several research objectives within the course of this thesis. Our main objective is the conception of an ontology for risk management of digital collections. Incorporated within this are our aims to survey the contexts within which preservation has been undertaken successfully, the development of an appropriate methodology for risk management, the evaluation of existing preservation evaluation approaches and metrics, the structuring of best practice knowledge and lastly the demonstration of a range of tools that utilise our findings. We describe a mixed methodology that uses interview and survey, extensive content analysis, practical case study and iterative software and ontology development. We build on a robust foundation, the development of the Digital Repository Audit Method Based on Risk Assessment. 
We summarise the extent of the challenge facing the digital preservation community (and, by extension, users and creators of digital materials from many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, increasing complexity, and the increasing onus on communities with no formal training to offer assurances of data management and sustainability. Collectively they demand an intuitive and adaptable means of evaluating digital preservation efforts. The need for individuals and organisations to validate the legitimacy of their own efforts is a particular priority. We introduce our approach, based on risk management. Risk is an expression of the likelihood of a negative outcome, and of the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity: a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risks can be characterised according to associated goals, activities, responsibilities and policies in terms of both their manifestation and mitigation, and can be deconstructed into their atomic units with responsibility for their resolution delegated appropriately. We go on to describe how the manifestation of risks typically spans an entire organisational environment and how, as the focus of our analysis, risk safeguards against omissions that may occur in functional, departmental or role-based assessment. We discuss the importance of relating risk factors, through the risks themselves or associated system elements. Doing so will yield the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community.
We present as research outcomes an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conduct our analyses in the formal evaluation of memory institutions in the UK, US and continental Europe. Furthermore we showcase a series of applications that use the fruits of this research as their intellectual foundation. Finally we document our results in a range of technical reports and conference and journal articles. We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments. We then aggregate this into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision making and evaluation. The methodology leading to this ontology is outlined, and lessons are exposed by revisiting legacy studies and exposing the resource and associated applications to evaluation by the digital preservation community.
Abstract:
New technologies appear constantly, and their use can bring countless benefits both to those who use them directly and to society as a whole. In this context, the State can also use information and communication technologies to improve the level of service delivered to citizens, to give society more quality of life and to optimize public spending, focusing it on the main necessities. Accordingly, there is much research on Electronic Government (e-Gov) policies and their main effects on the citizen and on society as a whole. This research studies the concept of Electronic Government and seeks to understand the process of implementation of Free Software in the agencies of the Direct Administration of Rio Grande do Norte. Moreover, it deepens the analysis to identify whether this implantation results in cost reductions for the state treasury, and aims to identify the participation of Free Software in the Administration and the bases of the Electronic Government policy in this State. Through qualitative interviews with technology coordinators and managers in three State Secretariats, it was possible to survey the paths the Government has been following in order to endow the State with technological capacity. It was perceived that Rio Grande do Norte is still an immature State with regard to electronic government (e-Gov) practices and Free Software, with few agencies having factual and viable initiatives in this area. It still lacks a strategic definition of the role of Technology, and more investment in staff and equipment infrastructure.
Advances were also observed, such as the creation of the normative agency CETIC (State Council of Information and Communication Technology); the Technology Master Plan, which provides a necessary diagnosis of the state of Technology in the State and proposes diverse goals for the area; the delivery of a postgraduate course for Technology managers; and training in BrOffice (OpenOffice) for 1,120 public servants.
Abstract:
For a long time, electronic data analysis has been associated with quantitative methods. However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) is increasingly being developed. Although CAQDAS has been around for decades, very few qualitative health researchers report using it. This may be due to the difficulties one has to go through to master the software and the misconceptions associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address the misconceptions associated with it. In this paper, the author reflects on his experience of interacting with one of the popular CAQDAS packages (NVivo) in order to provide evidence-based implications of using the software. The key message is that, unlike statistical software, the main function of CAQDAS is not to analyse data but to aid the analysis process, which the researcher must always remain in control of. In other words, researchers must know that no software can analyse qualitative data. CAQDAS are essentially data management packages which support the researcher during analysis.
Abstract:
Regardless of the methodology adopted in software development, two kinds of activities are involved: managerial or project-management activities, and the technical activities inherent to the development of the product itself, such as requirements, analysis, design, implementation, and the testing or trials prior to release. The present work stems from the interest in designing a methodology for managing the testing phase, based on the model of integration of the activities contemplated in the PMBOK guide, which is compatible with the management functions and technical activities of other methodologies, especially in their testing stage. Hence the importance for project managers of obtaining satisfactory results in this phase, given its direct and significant impact on meeting estimated schedules and costs; this makes it possible to prevent or mitigate additional time or cost overruns due to rework, avoiding their transfer to the client or their absorption by the software manufacturer. Likewise, ensuring correct execution of the testing phase guarantees that the project meets quality standards, in accordance with its measurement indicators and with user satisfaction.
Abstract:
A significant amount of Expendable Bathythermograph (XBT) data has been collected in the Mediterranean Sea since 1999 in the framework of operational oceanography activities. The management and storage of such a volume of data pose significant challenges and opportunities. The SeaDataNet project, a pan-European infrastructure for marine data diffusion, provides a convenient way to avoid dispersion of these temperature vertical profiles and to facilitate access for a wider public. The XBT data flow is described, along with the recent improvements in the quality-check procedures and the consistency of the available historical data set. The main features of the SeaDataNet services and the advantages of using this system for long-term data archiving are presented. Finally, a focus on the Ligurian Sea is included in order to provide an example of the kind of information and final products, intended for different users, that can easily be derived from the SeaDataNet web portal.
Abstract:
Part 14: Interoperability and Integration
Abstract:
Over the past 15 years, the number of international development projects aimed at combating global poverty has increased significantly. Within the water and sanitation sector, however, and despite heightened global attention and an increase in the number of infrastructure projects, over 800 million people remain without access to appropriate water and sanitation facilities. The majority of donor aid in the water supply and sanitation sector of developing countries is delivered through standalone projects. The quality of projects at the design and preparation stage is a critical determinant of meeting project objectives. The quality of projects at the early design stage, widely referred to as quality at entry (QAE), nevertheless remains unquantified and largely subjective. This research argues that water and sanitation infrastructure projects in the developing world tend to be designed in the absence of a specific set of actions that ensure high QAE, and consequently have relatively high rates of failure. This research analyzes 32 cases of water and sanitation infrastructure projects implemented with partial or full World Bank financing globally from 2000 to 2010. The research uses categorical data analysis, regression analysis and descriptive analysis to examine perceived linkages between project QAE and project development outcomes, and determines which upstream project design factors are likely to impact the QAE of international development projects in water supply and sanitation. The research proposes a number of specific design-stage actions that can be incorporated into the formal review process of water and sanitation projects financed by the World Bank or other international development partners.
Abstract:
Studies of strategic HRM have dominated HRM research over the last three decades. Focusing on the HRM-organisation performance relationship, researchers take various themes and perspectives in their approach to strategic HRM. Among these themes, two contrasting approaches to strategic HRM continue to flourish: first, the best practice approach suggests that certain HRM practices will have the same effect irrespective of context, and second, the best fit approach suggests that the choice of HRM practices should be designed in accordance with an organisation's specific context. While there is little consensus on what constitutes strategic HRM, the most commonly agreed feature in this field is the notion of strategic integration: aligning HRM practices with organisations' overall strategic objectives (vertical fit) and with each other (horizontal fit). Utilising the best fit approach as its theoretical framework, this study examines how vertical and horizontal fit are practised in the Indonesian civil service and what factors are likely to influence the prevalence of vertical and horizontal fit in the Indonesian civil service context. This study is significant for two important reasons. Firstly, the literature suggests that there are limited studies examining the best fit concept in the civil sector, despite its implementation in the private sector contributing positively to organisational performance improvement. Secondly, the study sheds light on how the best fit approach could contribute to performance improvement in the Indonesian civil service. This is in line with the fact that negative images of the Indonesian civil service continue to be highlighted even though various HRM reform initiatives have been put in place.
To achieve the objectives of the study, a qualitative case study approach with semi-structured interviews was employed, involving 53 senior officials, together with one focus group discussion, from eight Indonesian government agencies: three central agencies mandated to manage human resources, the National Bureaucratic Reform Team, and four line agencies from both central and local governments. Thematic analysis was employed for data analysis and NVivo software was used to manage the data. The study suggests three main findings. First, various HRM initiatives related to the HRM reform have been introduced in the Indonesian civil service, differentiating them from the old HRM practices. However, the findings indicate that some HRM policies are still contradictory and hinder vertical and horizontal fit. Second, despite the contradictory policies, vertical and horizontal fit can be seen in the line agencies which have been acknowledged as 'reformed agencies'. This demonstrates that the line agencies play an important role in aligning HRM practices with their goals and objectives and with one another, although they are bound by HRM policies that are unlikely to support the vertical and horizontal fit concept. Third, factors influencing the prevalence of vertical and horizontal fit include knowledge of contemporary HRM in both central agencies and line agencies, commitment from the line agencies' leaders, devolvement of HRM to the line agencies, and the socio-political and economic environments of the Indonesian civil service. The findings of the study raise policy, practical and theoretical implications. In terms of policy implications, the study highlights the importance of fit in HRM policies to support the achievement of the line agencies' goals. Therefore, when formulating an HRM policy, the central agencies need to ensure that the policy is linked to line agencies' goals and to other HRM policies.
This ensures synchronisation among the policies and thus maximises the achievement of the line agencies' goals. From a practical perspective, the study highlights important points that can be learned: by the central agencies, in carrying out their strategic role with regard to the formulation of HRM policies; by the line agencies, in maximising the contribution of HRM to the achievement of their goals and objectives through the implementation of the best fit concept; and by the leaders of the agencies, in providing continuous support to each of the involved parties in the line agencies and in involving the HRM department in all of the agency's strategic decision-making. In relation to the theoretical implications, it is clear that the best fit approach is not thoroughly applied, due to the factors discussed previously. However, this does not mean that the best fit concept cannot be implemented. As argued by McCourt & Ramgutty-Wong (2003), instead of adopting the whole concept of best fit, a modulated approach reflecting the best fit concept, such as selecting individual HRM practices and experimenting with devolution, is possible for civil service organisations which still embrace centralised HRM systems. As demonstrated in the findings, some of the line agencies studied seem to be ready to adopt the best fit approach, given that they have knowledge of the best fit concept, strong support from the top leader, less political intervention, and fewer practices of corruption, collusion and nepotism in their HRM.
Abstract:
The service of a critical infrastructure, such as a municipal wastewater treatment plant (MWWTP), is taken for granted until a flood or another low-frequency, high-consequence crisis brings its fragility to attention. The unique aspects of the MWWTP call for a method to quantify the flood stage-duration-frequency relationship. By developing a bivariate joint distribution model of flood stage and duration, this study adds a second dimension, time, to flood risk studies. A new parameter, inter-event time, is developed to further illustrate the effect of event separation on the frequency assessment. The method is tested on riverine, estuary and tidal sites in the Mid-Atlantic region. Equipment damage functions are characterized by linear and step damage models. The Expected Annual Damage (EAD) of the underground equipment is then estimated with the parametric joint distribution model, as a function of both flood stage and duration, demonstrating the application of the bivariate model in risk assessment. Flood likelihood may alter due to climate change. A sensitivity analysis method is developed to assess future flood risk by estimating flood frequency under conditions of higher sea level and of streamflow response to increased precipitation intensity. Scenarios based on steady and unsteady flow analysis are generated for the current climate, future climate within this century, and future climate beyond this century, consistent with the MWWTP planning horizons. The spatial extent of flood risk is visualized by inundation mapping and a GIS-Assisted Risk Register (GARR). This research will help stakeholders of this critical infrastructure be aware of the flood risk, vulnerability, and the inherent uncertainty.
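The way a bivariate stage-duration model feeds an Expected Annual Damage (EAD) estimate can be sketched numerically. Everything below is an illustrative assumption, not the study's fitted model: the step and linear damage parameters, the exponential marginals, and even the independence of stage and duration (the study uses a fitted parametric joint distribution) are placeholders chosen only to show the mechanics.

```python
# Illustrative sketch only: damage parameters, exponential marginals and
# the independence assumption stand in for the study's fitted parametric
# joint distribution of flood stage and duration.
import numpy as np

def step_damage(stage, duration, threshold=2.0, cost=1e6):
    # Step model: equipment is written off once stage (m) exceeds a
    # threshold, regardless of duration.
    return np.where(stage > threshold, cost, 0.0)

def linear_damage(stage, duration, rate=1e5):
    # Linear model: loss grows with both stage (m) and duration (h).
    return rate * stage * duration

def expected_annual_damage(damage_fn, stage_pdf, duration_pdf,
                           s_max=10.0, d_max=72.0, n=200):
    # EAD = double integral of damage(s, d) * joint density f(s, d);
    # here f factorises into independent marginals for simplicity.
    s = np.linspace(0.01, s_max, n)
    d = np.linspace(0.01, d_max, n)
    S, D = np.meshgrid(s, d)                 # grids of stage and duration
    integrand = damage_fn(S, D) * stage_pdf(S) * duration_pdf(D)
    ds, dd = s[1] - s[0], d[1] - d[0]
    return float(integrand.sum() * ds * dd)  # simple rectangle-rule quadrature

stage_pdf = lambda s: np.exp(-s / 1.5) / 1.5       # assumed mean stage 1.5 m
duration_pdf = lambda d: np.exp(-d / 12.0) / 12.0  # assumed mean duration 12 h

ead_step = expected_annual_damage(step_damage, stage_pdf, duration_pdf)
ead_linear = expected_annual_damage(linear_damage, stage_pdf, duration_pdf)
```

With these assumed numbers, the step model's EAD is roughly the replacement cost times the exceedance probability of the threshold, while the linear model's EAD approaches rate × E[stage] × E[duration]; the same machinery accepts any fitted joint density in place of the factorised one.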