832 results for Deployment of Federal Institutes
Abstract:
The paper gives an overview of the ongoing FP6-IST INFRAWEBS project and describes the main layers and software components embedded in an application-oriented realisation framework. An important part of INFRAWEBS is the Semantic Web Unit (SWU), a collaboration platform and interoperable middleware for ontology-based handling and maintenance of Semantic Web Services (SWS). The framework provides knowledge about a specific domain and relies on ontologies to structure this knowledge and exchange it with semantic service development modules. The INFRAWEBS Designer and Composer are sub-modules of the SWU responsible for creating Semantic Web Services using a Case-Based Reasoning approach. The Service Access Middleware (SAM) is responsible for building the communication channels between users and the various other modules, and serves as a generic middleware for the deployment of Semantic Web Services. This software toolset provides a development framework for creating and maintaining Semantic Web Services over their full life-cycle, with specific application support.
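The Designer and Composer described above build services with a Case-Based Reasoning approach. As a minimal, self-contained sketch of the CBR "retrieve" step over keyword-weighted service descriptions, the following may help; all names and data are invented for illustration and are not the INFRAWEBS API.

```python
# Toy sketch of the "retrieve" step of Case-Based Reasoning (CBR):
# given a new service request, find the stored service case whose
# description is most similar, so it can be reused and adapted.
# All names here are illustrative; this is not the INFRAWEBS API.
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two sparse keyword-weight dicts."""
    common = set(a) & set(b)
    dot = sum(a[k] * b[k] for k in common)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Case base: previously built service descriptions as weighted keywords.
case_base = {
    "bookFlight":  {"flight": 1.0, "booking": 0.8, "payment": 0.5},
    "bookHotel":   {"hotel": 1.0, "booking": 0.8, "payment": 0.5},
    "trackParcel": {"parcel": 1.0, "tracking": 0.9},
}

def retrieve(request):
    """Return the best-matching case and its similarity score."""
    return max(((name, cosine(request, desc))
                for name, desc in case_base.items()),
               key=lambda pair: pair[1])

print(retrieve({"hotel": 1.0, "booking": 0.6}))  # ('bookHotel', ...)
```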
Abstract:
This paper considers the problem of finding an optimal deployment of information resources on an InfoStation network in order to minimize the overhead and reduce the time needed to satisfy user requests for resources. The RG-Optimization problem and an approach to solving it are also presented.
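The abstract does not reproduce the RG-Optimization formulation, but the flavour of the problem (placing copies of a resource on InfoStations so that requests are served cheaply) can be sketched with a simple greedy placement heuristic. The network, costs, demands and replica budget below are invented, and greedy placement is only one of many possible solution approaches.

```python
# Illustrative greedy sketch of deploying copies of a resource on an
# InfoStation network so that total user access cost is minimised.
# The paper's actual RG-Optimization formulation is not reproduced here;
# node names, costs and the replica budget are invented for the example.

# hop-count cost between stations (symmetric toy network)
cost = {
    ("A", "A"): 0, ("A", "B"): 1, ("A", "C"): 2,
    ("B", "B"): 0, ("B", "C"): 1, ("C", "C"): 0,
}
def d(u, v):
    return cost.get((u, v), cost.get((v, u)))

stations = ["A", "B", "C"]
demand = {"A": 40, "B": 10, "C": 25}   # requests per station

def total_cost(hosts):
    """Each station fetches the resource from its nearest host."""
    return sum(w * min(d(s, h) for h in hosts) for s, w in demand.items())

def greedy_deploy(k):
    """Add replicas one at a time, each time where it helps most."""
    hosts = []
    for _ in range(k):
        best = min((s for s in stations if s not in hosts),
                   key=lambda s: total_cost(hosts + [s]))
        hosts.append(best)
    return hosts, total_cost(hosts)

print(greedy_deploy(2))   # e.g. (['A', 'C'], ...)
```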
Abstract:
The paper explores the functionalities of eight start pages and considers their usefulness as a mashable platform for the deployment of personal learning environments (PLE) for self-organized learners. The Web 2.0 effects and eLearning 2.0 strategies are examined from the point of view of how they influence the methods of gathering and capturing data, information and knowledge, and the learning process. Mashup technology is studied in order to see what kinds of components can be used in a PLE realization. A model of a PLE for self-organized learners is developed and used to prototype a personal learning and research environment on the start pages Netvibes, Pageflakes and iGoogle.
Abstract:
ACM Computing Classification System (1998): H.3.3, H.5.5, J.5.
Abstract:
The present research represents a coherent approach to understanding the root causes of ethnic group differences in ability test performance. Two studies were conducted, each designed to address a key knowledge gap in the ethnic bias literature. In Study 1, both the logistic regression (LR) method of Differential Item Functioning (DIF) detection and Mixture Latent Variable Modelling were used to investigate the degree to which Differential Test Functioning (DTF) could explain ethnic group test performance differences in a large, previously unpublished dataset. Though mean test score differences were observed between a number of ethnic groups, neither technique was able to identify ethnic DTF. This calls into question the practical value of DTF for understanding these group differences. Study 2 investigated whether a number of non-cognitive factors might explain ethnic group test performance differences on a variety of ability tests. Two factors, test familiarity and trait optimism, were able to explain a large proportion of ethnic group test score differences. Furthermore, test familiarity was found to mediate the relationship between socio-economic factors (particularly participant educational level and familial social status) and test performance, suggesting that test familiarity develops over time through exposure to ability testing in other contexts. These findings represent a substantial contribution to the field's understanding of two key issues surrounding ethnic test performance differences. The author calls for a new line of research into these performance-facilitating and performance-debilitating factors, before recommendations are offered for practitioners to ensure fairer deployment of ability testing in high-stakes selection processes.
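The LR method mentioned in Study 1 is commonly implemented by comparing nested logistic regression models for each item. A hedged sketch on simulated data, assuming the usual uniform-DIF setup (the dataset, effect sizes and significance threshold here are illustrative only):

```python
# Minimal sketch of the logistic-regression (LR) method of DIF detection:
# fit nested logistic models for one item and test, via a likelihood-ratio
# chi-square, whether group membership adds predictive power beyond the
# total test score.  Data are simulated purely for illustration.
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(0)
n = 2000
group = rng.integers(0, 2, n)            # 0/1 group indicator
ability = rng.normal(0, 1, n)
score = ability + rng.normal(0, 0.5, n)  # observed total-score proxy
# Simulate an item with uniform DIF: harder for group 1 at equal ability.
logit = 1.2 * ability - 0.6 * group
item = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Model 1: total score only.  Model 2: adds group (uniform DIF).
X1 = sm.add_constant(np.column_stack([score]))
X2 = sm.add_constant(np.column_stack([score, group]))
m1 = sm.Logit(item, X1).fit(disp=0)
m2 = sm.Logit(item, X2).fit(disp=0)

lr_stat = 2 * (m2.llf - m1.llf)          # likelihood-ratio statistic
p = chi2.sf(lr_stat, df=1)
print(f"LR chi-square = {lr_stat:.2f}, p = {p:.4g}")
```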
Abstract:
Radio Frequency Identification (RFID) technology adoption in healthcare settings has the potential to reduce errors, improve patient safety, streamline operational processes and enable the sharing of information throughout supply chains. RFID adoption in the English NHS is limited to isolated pilot studies. Firstly, this study investigates the drivers and inhibitors of RFID adoption in the English NHS from the perspective of the GS1 Healthcare User Group (HUG), tasked with coordinating adoption across the private and public sectors. Secondly, a conceptual model was developed and deployed, combining two of foresight's most popular methods: scenario planning and technology roadmapping. The model addresses the weaknesses of each foresight technique while capitalising on their individual, inherent strengths. Semi-structured interviews, scenario planning workshops and a technology roadmapping exercise were conducted with the members of the HUG over an 18-month period. An action research mode of enquiry was utilised, with a thematic analysis approach for the identification and discussion of the drivers and inhibitors of RFID adoption. The results of the conceptual model are analysed in comparison with other similar models. There are implications for managers responsible for RFID adoption in both the NHS and its commercial partners, and for foresight practitioners. Managers can leverage the insights gained from identifying the drivers and inhibitors of RFID adoption by making efforts to remove inhibitors and to support the continuation of the drivers. The academic contribution of this aspect of the thesis is in the field of RFID adoption in healthcare settings: drivers and inhibitors of RFID adoption in the English NHS are compared with those found in other settings. The implication for technology foresight practitioners is a proof of concept of a model combining scenario planning and technology roadmapping using a novel process. The academic contribution to the field of technology foresight is the conceptual development of a foresight model that combines two popular techniques, followed by the deployment of that model in a healthcare setting to explore the future of RFID technology.
Abstract:
From 1992 to 2012, 4.4 billion people were affected by disasters, with almost 2 trillion USD in damages and 1.3 million people killed worldwide. The increasing threat of disasters stresses the need to provide solutions for the challenges faced by disaster managers, such as the logistical deployment of the resources required to provide relief to victims. The location of emergency facilities, stock prepositioning, evacuation, inventory management, resource allocation and relief distribution have been identified as directly impacting the relief provided to victims during a disaster. Managing these factors appropriately is critical to reduce suffering. Disaster management commonly attracts several organisations working alongside each other and sharing resources to cope with the emergency. Coordinating these agencies is a complex task, but there is little research considering multiple organisations, and none actually optimising the number of actors required to avoid shortages and convergence. The aim of this research is to develop a system for disaster management based on a combination of optimisation techniques and geographical information systems (GIS) to aid multi-organisational decision-making. An integrated decision system was created comprising a cartographic model implemented in GIS to discard floodable facilities, combined with two models focused on optimising the decisions regarding the location of emergency facilities, stock prepositioning, the allocation of resources and relief distribution, along with the number of actors required to perform these activities. Three in-depth case studies in Mexico were conducted, gathering information from different organisations. The cartographic model proved to reduce the risk of selecting unsuitable facilities. The preparedness and response models showed the capacity to optimise the decisions and the number of organisations required for logistical activities, pointing towards an excess of actors involved in all cases. The system as a whole demonstrated its capacity to provide integrated support for disaster preparedness and response, and showed that there is room for improvement for Mexican organisations in flood management.
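Location-and-allocation decisions of the kind described here are commonly cast as small integer programs. Below is a minimal p-median sketch using PuLP, with invented sites, towns, demands and distances; it is not the thesis's actual preparedness or response model.

```python
# Minimal p-median sketch of the location decision the abstract describes:
# open p emergency facilities and assign communities to them so that
# demand-weighted travel distance is minimised.  Distances, demands and
# candidate sites are invented for illustration (PuLP + default CBC).
from pulp import (LpProblem, LpMinimize, LpVariable, lpSum,
                  LpBinary, value)

sites = ["S1", "S2", "S3"]
towns = ["T1", "T2", "T3", "T4"]
demand = {"T1": 120, "T2": 80, "T3": 60, "T4": 150}
dist = {("T1","S1"): 4, ("T1","S2"): 9, ("T1","S3"): 7,
        ("T2","S1"): 6, ("T2","S2"): 3, ("T2","S3"): 8,
        ("T3","S1"): 9, ("T3","S2"): 5, ("T3","S3"): 2,
        ("T4","S1"): 3, ("T4","S2"): 7, ("T4","S3"): 6}
p = 2  # facilities to open

prob = LpProblem("p_median", LpMinimize)
open_ = LpVariable.dicts("open", sites, cat=LpBinary)
assign = LpVariable.dicts("assign", list(dist), cat=LpBinary)

# Objective: demand-weighted assignment distance.
prob += lpSum(demand[t] * dist[t, s] * assign[t, s] for t, s in dist)
for t in towns:                       # every town served by one facility
    prob += lpSum(assign[t, s] for s in sites) == 1
for t, s in dist:                     # only open facilities may serve
    prob += assign[t, s] <= open_[s]
prob += lpSum(open_[s] for s in sites) == p

prob.solve()
print([s for s in sites if value(open_[s]) > 0.5])
```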
Abstract:
Today, databases have become an integral part of information systems. In the past two decades, we have seen different database systems being developed independently and used in different application domains. Today's interconnected networks and advanced applications, such as data warehousing, data mining and knowledge discovery, and intelligent data access to information on the Web, have created a need for integrated access to such heterogeneous, autonomous, distributed database systems. Heterogeneous/multidatabase research has focused on this issue, resulting in many different approaches. However, no single, generally accepted methodology has emerged in academia or industry for providing ubiquitous intelligent data access from heterogeneous, autonomous, distributed information sources. This thesis describes a heterogeneous database system being developed at the High-performance Database Research Center (HPDRC). A major impediment to ubiquitous deployment of multidatabase technology is the difficulty in resolving semantic heterogeneity, that is, identifying related information sources for integration and querying purposes. Our approach considers the semantics of the meta-data constructs in resolving this issue. The major contributions of the thesis include: (i) a scalable, easy-to-implement architecture for developing a heterogeneous multidatabase system, utilizing the Semantic Binary Object-oriented Data Model (Sem-ODM) and the Semantic SQL query language to capture the semantics of the data sources being integrated and to provide an easy-to-use query facility; (ii) a methodology for semantic heterogeneity resolution that investigates the extents of the meta-data constructs of component schemas; this methodology is shown to be correct, complete and unambiguous; (iii) a semi-automated technique for identifying semantic relations, the basis of semantic knowledge for integration and querying, using shared ontologies for context mediation; (iv) resolutions for schematic conflicts and a language for defining global views from a set of component Sem-ODM schemas; (v) the design of a knowledge base for storing and manipulating meta-data and knowledge acquired during the integration process, which acts as the interface between the integration and query processing modules; (vi) techniques for Semantic SQL query processing and optimization based on semantic knowledge in a heterogeneous database environment; and (vii) a framework for intelligent computing and communication on the Internet applying the concepts of this work.
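As a toy illustration of contribution (iii), semantic relations between the constructs of two component schemas can be suggested by mapping construct names onto a shared ontology's synonym sets. Everything below (the ontology, schemas and matching rule) is invented and far simpler than the actual Sem-ODM machinery.

```python
# Toy sketch of ontology-mediated identification of semantic relations
# between two component schemas, in the spirit of the semi-automated
# technique the abstract mentions.  The shared ontology, synonym sets and
# schema names are invented; the real Sem-ODM approach is far richer.

# Shared ontology: concept -> known synonyms across sources.
ontology = {
    "Customer": {"customer", "client", "patron"},
    "Address":  {"address", "location", "addr"},
    "Invoice":  {"invoice", "bill"},
}

schema_a = ["Client", "Addr", "Bill"]          # source A constructs
schema_b = ["Customer", "Location", "Invoice"] # source B constructs

def concept_of(name):
    """Map a schema construct to a shared-ontology concept, if any."""
    low = name.lower()
    for concept, syns in ontology.items():
        if low in syns:
            return concept
    return None

# Constructs mapping to the same concept are candidate semantic relations.
relations = [(a, b) for a in schema_a for b in schema_b
             if concept_of(a) and concept_of(a) == concept_of(b)]
print(relations)  # [('Client', 'Customer'), ('Addr', 'Location'), ...]
```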
Landscape of fear: A social history of the missile during the early years of the Cold War, 1950–1965
Abstract:
The missile's significance has been central to national security since the Soviet launch of Sputnik, and it became increasingly important throughout the years of the Cold War. Much has been written about missile technology, but little has been written about how the development and deployment of this weapon affected Americans. The missile was developed both to deter war and to win war. Its presence, however, was not always reassuring. Three areas of the United States are studied to evaluate the social implications of the missile during these pivotal years: San Francisco, home of multiple Nike installations; Cape Canaveral, Florida, the nation's primary missile test center; and the Great Plains, the location of the largest ICBM concentration in the country. Interviews were conducted, tours of facilities were taken, and local newspapers were reviewed. Together with national newspapers and magazines and public opinion polls, this information provided a local social context for missile history. Nationally and locally, Americans both feared and praised the new technology. They were anxious for government funding in their cities and often felt that the danger the missile brought to their communities, by making them a Soviet target, was justified in the larger cause of national security.
Abstract:
Over the past 200 years, an estimated 53% (about 47 million ha) of the original wetlands in the conterminous United States have been lost, mainly as a result of various human activities. Despite the importance of wetlands (particularly along the coast), and a longstanding federal policy framework meant to protect their integrity, the cumulative impact on these natural systems over large areas is poorly understood. We address this lack of research by mapping and conducting descriptive spatial analyses of federal wetland alteration permits (pursuant to section 404 of the Clean Water Act) across 85 watersheds in Florida and coastal Texas from 1991 to 2003. Results show that more than half of the permits issued in both states (60%) fell under the Nationwide permitting category. Permits issued in Texas were typically located outside of urban areas (78%) and outside 100-year floodplains (61%). More than half of permits issued in Florida were within urban areas (57%) and outside of 100-year floodplains (51%). The most affected wetland types were estuarine in Texas (47%) and palustrine in Florida (55%). We expect that an additional outcome of this work will be an increased awareness of the cumulative depletion of wetlands and loss of ecological services in these urbanized areas, perhaps leading to increased conservation efforts.
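The permit-overlay percentages reported above come from descriptive spatial analysis. A hedged sketch of how such shares could be computed with geopandas follows; the file names and layers are hypothetical, and the study's actual GIS workflow is not described in the abstract.

```python
# Hedged sketch of the descriptive spatial analysis described above:
# overlay permit locations on urban-area and floodplain polygons and
# report the share of permits falling inside each.  File names are
# hypothetical; requires the geopandas package.
import geopandas as gpd

permits = gpd.read_file("permits_404.shp")        # permit points
urban = gpd.read_file("urban_areas.shp")          # urban polygons
floodplain = gpd.read_file("fema_100yr.shp")      # 100-year floodplain

def pct_within(points, polygons):
    """Percentage of points that fall inside any of the polygons."""
    joined = gpd.sjoin(points, polygons, how="inner", predicate="within")
    return 100.0 * joined.index.unique().size / len(points)

print(f"in urban areas: {pct_within(permits, urban):.0f}%")
print(f"in 100-yr floodplain: {pct_within(permits, floodplain):.0f}%")
```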
Abstract:
In this dissertation we propose a Physics Teaching Unit that teaches content through an environmental discussion of the greenhouse effect and global warming. The teaching unit rests on a problem-based methodological intervention applying the Arch of Charles Maguerez method. The methodological foundations of the thesis lie in action research, and the text is structured in five chapters. The first chapter deals with Environmental Physics (FMA) as a subject in Physics degree courses in Brazil, addressing the concern of how this discipline has been taught. We open the chapter by explaining the reasons behind the inclusion of Environmental Physics in Physics degree courses. We then searched the websites of higher education institutions to establish whether or not the discipline appears in their curricula, and analysed the syllabi to see which bibliographies are being adopted, which Physics content is being covered, and how. The courses surveyed were those of the Federal Universities and Federal Institutes. Given the inseparability between studies in Physics teaching and studies on competencies, skills and meaningful learning, the second chapter discusses the challenge of converting information into knowledge. It opens with initial teacher training, because even though this is not our focus, the discipline under study is offered in higher education and therefore to future teachers. We then discuss the culture of knowledge, emphasising a teaching approach in which the content taught carries meaning and makes sense to the student. We close the second chapter with some considerations on skills and competencies, in order to identify which of them were developed during and after the implementation of the teaching unit. The third chapter is the result of a literature review and a study of the radiative Earth-Sun interaction; the topics covered range from energy generation in the Sun to sunspots, coronal mass ejections, the solar wind, black-body radiation, Wien's displacement law, the Stefan-Boltzmann law, the greenhouse effect and global warming. This chapter provides supporting material for the teacher of the aforementioned discipline. The fourth chapter presents the Arch of Maguerez method, explaining the structure of each of its five steps and how to use them in teaching; we also show the version of this method adapted by Bordenave. The fifth and final chapter describes how the Arch method was used in Environmental Physics classes with undergraduate Physics students at IFRN, Campus Santa Cruz. A transcript of the classes shows how a problem-based methodology was applied to teach the content proposed for the Physics Teaching Unit, starting from the environmental discussion of the greenhouse effect and global warming.
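Two of the third chapter's physics results lend themselves to a short worked example: the Stefan-Boltzmann energy balance gives Earth's equilibrium temperature without an atmosphere, and Wien's displacement law locates the solar and terrestrial emission peaks. The sketch below uses standard textbook constants.

```python
# Worked example of two results the third chapter covers: the
# Stefan-Boltzmann law gives Earth's equilibrium temperature without an
# atmosphere, and Wien's displacement law gives the peak wavelength of
# solar and terrestrial radiation.  Standard textbook values are used.
sigma = 5.670e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0            # solar constant, W m^-2
albedo = 0.30         # planetary albedo
b = 2.898e-3          # Wien's displacement constant, m K

# Energy balance: absorbed S(1 - a)/4 = emitted sigma * T^4
T_eq = ((S * (1 - albedo)) / (4 * sigma)) ** 0.25
print(f"equilibrium temperature: {T_eq:.0f} K")   # ~255 K
# The ~33 K gap between 255 K and the observed mean surface temperature
# of about 288 K is the natural greenhouse effect.

# Wien: lambda_max = b / T
print(f"solar peak:       {b / 5778 * 1e9:.0f} nm")   # ~501 nm (visible)
print(f"terrestrial peak: {b / 288 * 1e6:.1f} um")    # ~10.1 um (infrared)
```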
Abstract:
Bioenergy is now accepted as having the potential to provide a major part of projected future renewable energy provision, as biofuels in the form of gas, liquid or solid fuels, or as electricity and heat. There are three main routes to providing these biofuels — thermal conversion, biological conversion and physical conversion — all of which employ a range of chemical reactor configurations and process designs. This paper focuses on fast pyrolysis, from which the liquid product, often referred to as bio-oil, can be used on-site, or stored or transported to centralised and/or remote user facilities for utilisation, for example as a fuel, or for further processing to biofuels and/or chemicals. This offers the potential for system optimisation, much greater economies of scale and exploitation of the concepts of biorefineries. The technology of fast pyrolysis is described, particularly the reactors that have been developed to provide the conditions needed to optimise performance. The primary liquid product is characterised, as well as the secondary products of electricity and/or heat, liquid fuels and a considerable number of chemicals. The main technical and non-technical barriers to the market deployment of the various technologies are identified and briefly discussed.
Abstract:
The use of structural health monitoring of civil structures is ever expanding, and by assessing the dynamical condition of structures, informed maintenance management can be conducted at both the individual and network levels. With the continued growth of information-age technology, the potential arises for smart monitoring systems to be integrated with civil infrastructure to provide efficient information on the condition of a structure. The focus of this thesis is the integration of smart technology with civil infrastructure for the purposes of structural health monitoring. The technology considered in this regard is devices based on energy harvesting materials. While there has been considerable focus on the development and optimisation of such devices under steady-state loading conditions, their applications for civil infrastructure are less well known. Although research is still at an initial stage, studies into such applications are very promising. Using the dynamical response of structures to a variety of loading conditions, the energy harvesting outputs from such devices are established and the potential power output determined. Through a power-variance approach, damage detection of deteriorating structures using the energy harvesting devices is investigated. A further application investigated by this research is the use of the power output as an indicator for control. Four approaches are undertaken to determine the potential applications arising from integrating smart technology with civil infrastructure, namely:
• Theoretical analysis to determine the applications of energy harvesting devices for vibration-based health monitoring of civil infrastructure.
• Laboratory experimentation to verify the performance of different energy harvesting configurations for civil infrastructure applications.
• Scaled model testing as a method to experimentally validate the integration of the energy harvesting devices with civil infrastructure.
• Full-scale deployment of an energy harvesting device on a bridge structure.
These four approaches validate the application of energy harvesting technology to civil infrastructure from a theoretical, experimental and practical perspective.
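As a hedged illustration of the power-variance idea described above, the sketch below simulates the velocity response of a healthy and a stiffness-reduced single-degree-of-freedom structure to the same broadband load, treats harvested power as proportional to squared velocity at the device location, and compares power variances. All parameter values are invented and the harvester model is deliberately simplistic.

```python
# Hedged sketch of a power-variance damage indicator: simulate the
# velocity response of a healthy and a damaged (reduced-stiffness)
# single-degree-of-freedom structure to the same random load, take
# harvested power as proportional to squared velocity at the device,
# and compare the variance of the power signals.  Values are invented.
import numpy as np
from scipy.signal import lsim

def sdof_velocity(m, c, k, force, t):
    """Velocity response of m*x'' + c*x' + k*x = F(t)."""
    A = [[0.0, 1.0], [-k / m, -c / m]]
    B = [[0.0], [1.0 / m]]
    C = [[0.0, 1.0]]          # output = velocity
    D = [[0.0]]
    _, v, _ = lsim((A, B, C, D), force, t)
    return v

rng = np.random.default_rng(1)
t = np.linspace(0, 60, 60_000)
force = rng.normal(0, 1.0, t.size)       # broadband excitation

m, c, k = 1000.0, 200.0, 4.0e6           # healthy structure
v_healthy = sdof_velocity(m, c, k, force, t)
v_damaged = sdof_velocity(m, c, 0.8 * k, force, t)  # 20% stiffness loss

k_e = 0.1                                # harvester coupling constant
p_h, p_d = k_e * v_healthy**2, k_e * v_damaged**2
print(f"power variance healthy: {np.var(p_h):.3e}")
print(f"power variance damaged: {np.var(p_d):.3e}")
```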