820 results for approach to information systems
Abstract:
The elaboration of a generic decision-making strategy to address the evolution of an emergency situation, from the response stage through to recovery, and including a planning stage, can facilitate timely, effective and consistent decision making by the response organisations at every level within the emergency management structure and between countries, helping to ensure optimal protection of health, the environment, and society. The degree of involvement of stakeholders in this process is a key strategic element for strengthening local preparedness and response, and can contribute to a successful countermeasures strategy. Significant progress was made with the multi-national European project EURANOS (2004-2009), which brought together best practice, knowledge and technology to enhance Europe's preparedness to respond to any radiation emergency and long-term contamination. The subsequent establishment of a European Technology Platform and the recent launch of the research project NERIS-TP ("Towards a self-sustaining European Technology Platform (NERIS-TP) on Preparedness for Nuclear and Radiological Emergency Response and Recovery") aim to carry out the remaining tasks needed to achieve appropriate levels of emergency preparedness at the local level in most European countries. One of the objectives of the NERIS-TP project is to strengthen preparedness at the local/national level by setting up dedicated fora and by developing new tools or adapting the tools developed within the EURANOS project (such as the governance framework for preparedness, the handbooks on countermeasures, the RODOS system, and the MOIRA DSS for long-term contamination in catchments) to meet the needs of local communities. Within this project, CIEMAT and UPM, in close interaction with the Nuclear Safety Council, will explore the use and application in Spain of such technical tools, together with other national tools and information and communication strategies, to foster cooperation between local, national and international stakeholders. The aim is to identify and involve relevant stakeholders in emergency preparedness in order to improve the development and implementation of appropriate protection strategies as part of consequence management and the transition to recovery. This paper presents an overview of the state of the art in this area in Spain, together with the methodology and work plan proposed by the Spanish group within the NERIS project to increase stakeholder involvement in preparedness for emergency response and recovery.
Abstract:
One of the main problems in urban areas is the steady growth in car ownership and traffic levels. The challenge of sustainability is therefore focused on shifting the demand for mobility from cars to collective means of transport. To this end, buses are a key element of public transport systems, and Real Time Passenger Information (RTPI) systems help citizens change their travel behaviour towards more sustainable transport modes. This paper provides an assessment methodology which evaluates how RTPI systems improve the quality of bus services in two European cities, Madrid and Bremerhaven. In the case of Madrid, bus punctuality increased by 3%. Regarding travellers' perception, Madrid raised its quality of service by 6%, while Bremerhaven's increased by 13%. In addition, users' perception of the Public Transport (PT) image increased by 14%.
Abstract:
This paper proposes a new approach to anomaly detection, based on the changes undergone by a population of evolving agents under stress. If conditions are appropriate, changes in the population (modelled by bioindicators) are representative of alterations to the environment. This ecologically inspired approach functionally improves on traditional approaches to anomaly detection. To verify this assertion, experiments based on Network Intrusion Detection Systems are presented, and the results are compared with the behaviour of other bio-inspired approaches and machine learning techniques.
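The abstract gives no implementation detail, so the following Python fragment is only a hypothetical sketch of the general idea: a population of adapting agents whose collective mismatch with the observed environment acts as the bioindicator that flags an anomaly. All names, constants and the synthetic traffic are illustrative, not taken from the paper.

```python
import random
import statistics

# Toy sketch: each agent holds a "profile" value that evolves to match the
# observed traffic feature. Under normal traffic the population tracks the
# environment closely; an abrupt distribution shift (the "stress") raises
# the population-wide mismatch, which serves as the anomaly indicator.

def evolve(population, observation, rate=0.2):
    """Move each agent's profile slightly toward the observation."""
    return [p + rate * (observation - p) + random.gauss(0, 0.01)
            for p in population]

def population_stress(population, observation):
    """Mean absolute mismatch between the agents and the environment."""
    return statistics.mean(abs(p - observation) for p in population)

random.seed(0)
population = [random.gauss(0.5, 0.05) for _ in range(50)]
THRESHOLD = 0.15  # hypothetical alarm level, tuned on normal traffic

normal = [random.gauss(0.5, 0.02) for _ in range(200)]
attack = [random.gauss(0.9, 0.02) for _ in range(20)]   # abrupt shift

for t, obs in enumerate(normal + attack):
    stress = population_stress(population, obs)
    if stress > THRESHOLD:
        print(f"t={t}: anomaly flagged (stress={stress:.3f})")
    population = evolve(population, obs)
```

Note that after a few steps the population adapts to the new regime and the alarm clears, which is consistent with the ecological framing: the indicator reacts to changes in the environment rather than to any fixed signature.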
Abstract:
This paper focuses on identifying and analysing the elements of strategic management for infrastructure and engineering assets. These elements are contended to involve an understanding of governance, corporate policy, corporate objectives, corporate strategy and interagency collaboration, which in turn makes it possible to determine a broader and more comprehensive framework for engineering asset management, i.e. a 'staged approach' to understanding how assets are managed within organisations. While the assets themselves have often been the sole concern of good management practices, other social and contextual elements have come into the mix in order to promote strategic asset management. The development of an integrated approach to asset management is at the base of the research question: what are the considerations and implications of adopting and implementing an integrated strategic asset management (ISAM) framework? While operational matters have been given prominence, a subset of corporate governance, asset governance, details the policies and processes needed to acquire, utilise, maintain and account for an organisation's assets. Asset governance stems from the organisation's overarching corporate governance principles; as a result, it defines the management context in which engineering asset management is implemented. This aspect is examined to determine the appropriate relationship between organisational strategic management and strategic asset management, furthering the theoretical engagement with the maturity of strategy, policy and governance for infrastructure and engineered assets. The research proceeds by a document analysis of corporate reports and policy recommendations concerning infrastructure and engineered assets. The paper concludes that incorporating an integrated asset management framework can promote a more robust conceptualisation of public assets and of how they combine to provide a comprehensive system of service outcomes.
Abstract:
Information Technologies are complex, and this is true even of the smallest piece of equipment. But this kind of complexity is nothing compared with the one that arises when the technology interacts with society. Office Automation has traditionally been considered a technical field, but there is no way to find solutions from a technical point of view when the problems are primarily social in origin. Technology management has to change its focus from a purely technical perspective to a sociotechnical point of view. To facilitate this change, we propose a model that allows a better understanding between the managerial and the technical worlds, offering a coherent, complete and integrated perspective of both. The basis for this model is an unfolding of the complexity found in Information Technologies and a matching of these complexities with several levels considered within the Office, Office Automation and Human Factors dimensions. Each one of these domains is studied through a set of distinctions that create a new and powerful understanding of its reality. Using this model we build up a map of Office Automation to be used not only by managers but also by technicians, because the primary advantage of such a framework is that it allows a comprehensive evaluation of technology without requiring extensive technical knowledge. Thus, the model can be seen as a principle for the design and diagnosis of Office Automation and as a common reference for managers and specialists, avoiding the severe limitations arising from the language used by the latter.
Abstract:
One of the most challenging problems that must be solved by any theoretical model purporting to explain the competence of the human brain for relational tasks is the analysis and representation of the internal structure of an extended spatial layout of multiple objects. Some of the associated problems concern specific questions such as how spatial relationships among objects can be extracted and represented, or how the movement of a selected object can be represented. The main objective of this paper is the study of some plausible brain structures that can provide answers to these problems. Moreover, in order to achieve more concrete knowledge, our study focuses on the response of the retinal layers in optical information processing and on how this information can be processed in the first cortical layers. The model reported here is just a first trial, and some major additions are needed to complete the whole vision process.
Abstract:
Bus rapid transit (BRT) systems are mass transport systems with medium/high capacity, high quality of service and low infrastructure and operating costs. TransMilenio is Bogotá's most important mass transportation system and one of the biggest BRT systems in the world, although it has only completed its third construction phase out of a total of eight. In this paper we review the proposals in the literature for optimizing BRT system operation, with special emphasis on TransMilenio, and propose a mathematical model that adapts elements of those proposals and incorporates novel elements accounting for the features of the TransMilenio system.
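As a hedged illustration of the kind of element such models adapt, the sketch below solves a textbook frequency-setting subproblem from the transit optimization literature: for each route, pick a bus frequency that balances operator cost against passenger waiting time under a capacity constraint. It is not TransMilenio's actual model, and all figures are invented.

```python
# Hypothetical, simplified frequency-setting model. For each route, choose a
# frequency f (buses/hour) minimising operator cost plus passenger waiting
# cost, subject to f * capacity >= demand. All numbers are illustrative.

routes = {
    "A": {"demand": 3600, "capacity": 160},   # pax/h, pax/bus (made up)
    "B": {"demand": 1800, "capacity": 160},
}
COST_PER_BUS_HOUR = 80.0    # illustrative operator cost per bus-hour
VALUE_OF_WAIT = 6.0         # illustrative cost of one passenger-hour of waiting

def total_cost(freq, demand):
    wait_hours = demand * (0.5 / freq)   # average wait = half the headway
    return COST_PER_BUS_HOUR * freq + VALUE_OF_WAIT * wait_hours

for name, r in routes.items():
    feasible = [f for f in range(1, 61) if f * r["capacity"] >= r["demand"]]
    best = min(feasible, key=lambda f: total_cost(f, r["demand"]))
    print(f"route {name}: {best} buses/h, headway {60 / best:.1f} min")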
Abstract:
The inherent complexity of modern cloud infrastructures has created the need for innovative monitoring approaches, as the state-of-the-art solutions used for other large-scale environments do not address specific cloud features. Although cloud monitoring is nowadays an active research field, a comprehensive study covering all its aspects has not yet been presented. This paper provides a deep insight into cloud monitoring. It proposes a unified cloud monitoring taxonomy, on the basis of which it defines a layered cloud monitoring architecture. To illustrate it, we have implemented GMonE, a general-purpose cloud monitoring tool which covers all aspects of cloud monitoring by specifically addressing the needs of modern cloud infrastructures. Furthermore, we have evaluated the performance, scalability and overhead of GMonE with the Yahoo Cloud Serving Benchmark (YCSB), using the OpenNebula cloud middleware on the Grid'5000 experimental testbed. The results of this evaluation demonstrate the benefits of our approach, surpassing the monitoring performance and capabilities of alternatives present in state-of-the-art systems such as Amazon EC2 and OpenNebula.
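GMonE's actual interfaces are not given in the abstract; the sketch below only illustrates, under assumed names, the plugin-based per-node agent that a layered, general-purpose monitoring architecture of this kind suggests. Nothing here is GMonE's API.

```python
import random
import time
from abc import ABC, abstractmethod

# Hypothetical plugin-based monitoring agent: one plugin per monitored
# subsystem, and a per-node agent that merges all plugin readings into a
# single timestamped sample for the upper layers of the architecture.

class MetricPlugin(ABC):
    @abstractmethod
    def collect(self) -> dict:
        ...

class CpuPlugin(MetricPlugin):
    def collect(self):
        return {"cpu.util": random.uniform(0, 100)}     # stand-in for a real probe

class MemoryPlugin(MetricPlugin):
    def collect(self):
        return {"mem.used_mb": random.uniform(500, 4000)}  # stand-in values

class MonitoringAgent:
    """Runs on each node; publishes one merged sample per period."""
    def __init__(self, plugins, period=0.5):
        self.plugins, self.period = plugins, period

    def sample(self):
        merged = {"ts": time.time()}
        for plugin in self.plugins:
            merged.update(plugin.collect())
        return merged

agent = MonitoringAgent([CpuPlugin(), MemoryPlugin()])
for _ in range(3):
    print(agent.sample())
    time.sleep(agent.period)
```

New metrics (VM-level, service-level, and so on) would then be added by registering further plugins, which is what makes this style of design attractive for heterogeneous cloud layers.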
Abstract:
Transmission errors are the main cause of degradation in the quality of broadcast video services. Therefore, knowing their impact on the end users' quality of experience is a crucial issue: it would, for instance, help to improve the performance of the distribution systems and to develop monitoring tools that automatically estimate the quality perceived by the end users. In this paper we validate a subjective evaluation approach specifically designed to obtain meaningful results on the effects of degradations caused by transmission errors. This methodology has already been used in our previous work with monoscopic and stereoscopic videos. The validation is done by comparing the subjective ratings obtained for typical transmission errors with the proposed methodology and with the standard Absolute Category Rating method. The results show that the proposed approach can provide more representative evaluations of the quality of experience perceived by end users of conventional and 3D broadcast video services.
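For illustration, subjective studies of this kind typically summarise the ratings gathered per degradation condition as a Mean Opinion Score (MOS) with a confidence interval; a minimal sketch follows, with invented ratings rather than the paper's data.

```python
import statistics

# Per-condition Mean Opinion Score with a normal-approximation 95% CI,
# the usual way subjective ratings (e.g. from ACR on a 1..5 scale) are
# summarised. The ratings below are invented for illustration.

ratings = {                      # condition -> list of 1..5 scores
    "no_errors":     [5, 4, 5, 4, 5, 4, 4, 5, 5, 4],
    "packet_loss_1": [3, 2, 3, 3, 2, 3, 4, 2, 3, 3],
}

for condition, scores in ratings.items():
    mos = statistics.mean(scores)
    ci = 1.96 * statistics.stdev(scores) / len(scores) ** 0.5
    print(f"{condition}: MOS = {mos:.2f} +/- {ci:.2f}")
```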
Abstract:
In this paper we propose an innovative method for the automatic detection and tracking of road traffic signs using an onboard stereo camera. It combines monocular and stereo analysis strategies to increase the reliability of the detections, so that it can boost the performance of any traffic sign recognition scheme. Firstly, an adaptive color- and appearance-based detection is applied at single-camera level to generate a set of traffic sign hypotheses. In turn, stereo information allows for sparse 3D reconstruction of potential traffic signs through a SURF-based matching strategy. Namely, the plane that best fits the cloud of 3D points traced back from feature matches is estimated using a RANSAC-based approach to improve robustness to outliers. Temporal consistency of the 3D information is ensured through a Kalman-based tracking stage. This also allows for the generation of a predicted 3D traffic sign model, which is in turn used to enhance the previously mentioned color-based detector through a feedback loop, thus improving detection accuracy. The proposed solution has been tested on real sequences under several illumination conditions, in both urban areas and highways, achieving very high detection rates in challenging environments that include rapid motion and significant perspective distortion.
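As a rough illustration of the plane-estimation step, the sketch below fits a plane to a synthetic 3D point cloud with a basic RANSAC loop. The threshold and iteration count are assumptions, not the paper's parameters.

```python
import numpy as np

# Minimal RANSAC plane fit: repeatedly sample 3 points, build the plane
# through them, count points within a distance threshold, and keep the
# plane with the most inliers. This mirrors the step in which the plane
# best fitting the 3D points traced back from SURF matches is estimated.

def fit_plane_ransac(points, n_iters=200, threshold=0.05, rng=None):
    rng = rng or np.random.default_rng(0)
    best_inliers, best_plane = None, None
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                       # degenerate (collinear) sample
            continue
        normal /= norm
        dist = np.abs((points - p0) @ normal)  # point-to-plane distances
        inliers = dist < threshold
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (normal, p0)
    return best_plane, best_inliers

# Synthetic test: a noisy plane z ~ 0 plus a handful of gross outliers.
rng = np.random.default_rng(1)
plane_pts = np.column_stack([rng.uniform(-1, 1, (100, 2)),
                             rng.normal(0, 0.01, 100)])
outliers = rng.uniform(-1, 1, (20, 3))
(normal, _), inliers = fit_plane_ransac(np.vstack([plane_pts, outliers]))
print("estimated normal:", np.round(normal, 2), "| inliers:", inliers.sum())
```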
Abstract:
BACKGROUND: Clinical Trials (CTs) are essential for bridging the gap between experimental research on new drugs and their clinical application. Just as CTs for traditional drugs and biologics have helped accelerate the translation of biomedical findings into medical practice, CTs for nanodrugs and nanodevices could advance novel nanomaterials as agents for diagnosis and therapy. Although there is publicly available information about nanomedicine-related CTs, the online archiving of this information is carried out without adhering to criteria that discriminate between studies involving nanomaterials or nanotechnology-based processes (nano) and CTs that do not involve nanotechnology (non-nano). Finding out from CT summaries alone whether nanodrugs and nanodevices were involved in a study is a challenging task. At the time of writing, CTs archived in the well-known online registry ClinicalTrials.gov cannot easily be told apart as nano or non-nano CTs, even by domain experts, due to the lack both of a common definition of nanotechnology and of standards for reporting nanomedical experiments and results. METHODS: We propose a supervised learning approach for classifying CT summaries from ClinicalTrials.gov as either nano or non-nano. Our method involves several stages: i) extraction and manual annotation of CTs as nano vs. non-nano, ii) pre-processing and automatic classification, and iii) performance evaluation using several state-of-the-art classifiers under different transformations of the original dataset. RESULTS AND CONCLUSIONS: The performance of the best automated classifier closely matches that of experts (AUC over 0.95), suggesting that it is feasible to automatically detect the presence of nanotechnology products in CT summaries with a high degree of accuracy. This can significantly speed up the process of finding out whether reports on ClinicalTrials.gov might be relevant to a particular nanoparticle or nanodevice, which is essential for discovering any precedents for nanotoxicity events or advantages for targeted drug therapy.
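A minimal sketch of such a pipeline is shown below, assuming TF-IDF features and a linear classifier scored by cross-validated AUC; the toy summaries and labels stand in for the paper's annotated ClinicalTrials.gov corpus, and the paper's own classifiers and transformations may well differ.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Illustrative nano vs. non-nano text classifier: TF-IDF features over CT
# summaries plus logistic regression, evaluated by cross-validated AUC.
texts = [
    "liposomal doxorubicin nanoparticle formulation for solid tumours",
    "gold nanoshell mediated photothermal ablation device study",
    "behavioural therapy for chronic insomnia in adults",
    "randomised trial of low dose aspirin after stroke",
] * 10                                   # repeated so CV folds are non-trivial
labels = [1, 1, 0, 0] * 10               # 1 = nano, 0 = non-nano

pipeline = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                         LogisticRegression(max_iter=1000))
auc = cross_val_score(pipeline, texts, labels, cv=5, scoring="roc_auc")
print(f"mean AUC: {auc.mean():.3f}")
```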
Abstract:
Topology control is an important technique for improving the connectivity and reliability of Wireless Sensor Networks (WSNs) by adjusting the communication range of wireless sensor nodes. In this paper, a novel Fuzzy-logic Topology Control (FTC) is proposed to achieve any desired average node degree by adaptively changing the communication range, thus improving the network connectivity, which is the main target of FTC. FTC is a fully localized control algorithm and does not rely on the location information of neighbors. Instead of hand-designing membership functions and if-then rules for the fuzzy-logic controller, FTC is constructed from a training data set, which facilitates the design process. FTC is shown to be accurate and stable, with a short settling time. To compare it with other representative localized algorithms (NONE, FLSS, k-Neighbor and LTRT), FTC is evaluated through extensive simulations. The simulation results show that: firstly, like the k-Neighbor algorithm, FTC is the best at achieving the desired average node degree as node density varies; secondly, FTC is comparable to FLSS and k-Neighbor in terms of energy efficiency, but better than LTRT and NONE; thirdly, FTC has the lowest average maximum communication range of all the algorithms, which indicates that the most energy-consuming node in the network consumes the least power.
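The trained fuzzy controller itself is not specified in the abstract; the sketch below substitutes a simple proportional rule for it, purely to illustrate the control loop in which each node adapts its own communication range from its locally observed neighbor count until a target average degree is reached. All constants are illustrative.

```python
import random

# Simplified, localized degree-control loop: each node grows or shrinks its
# own communication range based on the gap between its observed neighbor
# count and the target degree. (A proportional rule stands in here for the
# paper's trained fuzzy-logic controller.)

random.seed(2)
TARGET_DEGREE = 6
nodes = [(random.random(), random.random()) for _ in range(100)]
ranges = [0.05] * len(nodes)             # initial communication ranges

def degree(i):
    """Neighbors node i can reach with its current range."""
    xi, yi = nodes[i]
    return sum(1 for j, (xj, yj) in enumerate(nodes)
               if j != i and (xi - xj) ** 2 + (yi - yj) ** 2 <= ranges[i] ** 2)

for _ in range(30):                      # iterate the local control rule
    for i in range(len(nodes)):
        error = TARGET_DEGREE - degree(i)
        ranges[i] = max(0.01, ranges[i] + 0.002 * error)

avg = sum(degree(i) for i in range(len(nodes))) / len(nodes)
print(f"average node degree after adaptation: {avg:.1f}")
```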
Abstract:
This study suggests a theoretical framework for improving the teaching/learning process of the English employed in aeronautical discourse that brings together cognitive learning strategies, Genre Analysis and the Contemporary Theory of Metaphor (Lakoff and Johnson 1980; Lakoff 1993). It maintains that cognitive strategies such as imagery, deduction, inference and grouping can be enhanced by means of metaphor and genre awareness in the context of a content-based approach to language learning. A list of image metaphors and conceptual metaphors drawn from the terminological database METACITEC is provided. The metaphorical terms from the area of Aeronautics have been taken from specialised dictionaries and have been categorised according to the conceptual metaphors they respond to, by establishing the source domains and the target domains, as well as the semantic networks found. This information reflects the internal mappings underlying the discourse of aeronautics in five aviation accident case studies, which are related to accident reports from the National Transportation Safety Board (NTSB), and provides an important source for designing language teaching tasks.
Abstract:
Data-related properties of the activities involved in a service composition can be used to facilitate several design-time and run-time adaptation tasks, such as service evolution, distributed enactment, and instance-level adaptation. A number of these properties can be expressed using a notion of sharing. We present an approach for automated inference of data properties based on sharing analysis, which is able to handle service compositions with complex control structures, involving loops and sub-workflows. The properties inferred can include data dependencies, information content, domain-defined attributes, privacy or confidentiality levels, among others. The analysis produces characterizations of the data and the activities in the composition in terms of minimal and maximal sharing, which can then be used to verify compliance of potential adaptation actions, or as supporting information in their generation. This sharing analysis approach can be used both at design time and at run time. In the latter case, the results of analysis can be refined using the composition traces (execution logs) at the point of execution, in order to support run-time adaptation.
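As a toy illustration of the may/must characterisation (the paper's analysis is far more general, handling loops, sub-workflows and richer abstract domains), the following propagates maximal (may) and minimal (must) sharing sets through an invented linear workflow in which one input is only read conditionally.

```python
# Toy sharing propagation: for each derived variable, track which source
# data items it MAY depend on (maximal sharing, over-approximation) and
# which it MUST depend on (minimal sharing, under-approximation). The
# workflow and the definite/conditional input split are invented.

workflow = [
    # (activity, definite inputs, conditional inputs, output)
    ("fetch_profile", ["user_id"], [],          "profile"),
    ("fetch_catalog", ["region"],  [],          "catalog"),
    ("build_report",  ["profile"], ["catalog"], "report"),
]

may  = {"user_id": {"user_id"}, "region": {"region"}}
must = {"user_id": {"user_id"}, "region": {"region"}}

for activity, definite, conditional, output in workflow:
    may[output]  = set().union(*(may[i] for i in definite + conditional))
    must[output] = set().union(*(must[i] for i in definite))
    print(f"{activity}: {output} may share {sorted(may[output])}, "
          f"must share {sorted(must[output])}")
```

Characterisations of this kind are what allow an adaptation action to be checked for compliance: for example, a privacy constraint on `region` would rule out relocating `build_report` only in the runs where the conditional input is actually read.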