832 results for Policy-based management systems
Abstract:
This paper addresses the problem of effective in situ measurement of real-time strain for bridge weigh-in-motion in reinforced concrete bridge structures through the use of optical fiber sensor systems. Through a series of tests under dynamic loading, the performance of fiber Bragg grating-based sensor systems with various amplification techniques was investigated. In recent years, structural health monitoring (SHM) systems have been developed to monitor bridge deterioration and assess load levels, and hence to extend bridge life and safety. Conventional SHM systems, based on measuring strain, can be used to improve knowledge of a bridge's capacity to resist loads but generally give no information on the causes of any increase in stresses. It is therefore necessary to find accurate sensors capable of capturing peak strains under dynamic load, together with suitable methods for attaching these strain sensors to existing and new bridge structures. It is also important to ensure accurate strain transfer between the concrete or steel substrate, the adhesive layer, and the strain sensor. The results show the benefits of optical fiber networks under these circumstances and their ability to deliver data when conventional sensors cannot capture accurate strains and/or peak strains.
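The abstract does not state the strain-conversion relation used; purely as a reference sketch, the snippet below applies the standard fibre Bragg grating relation Δλ/λ_B = (1 − p_e)·ε, assuming a typical photo-elastic coefficient of about 0.22 for silica fibre and ignoring temperature compensation (both assumptions of this sketch, not details from the paper).

```python
# Minimal sketch: converting an FBG wavelength shift to mechanical strain.
# Assumes delta_lambda / lambda_B = (1 - p_e) * strain with p_e ~ 0.22 for
# silica fibre; temperature cross-sensitivity is ignored in this sketch.

def fbg_strain(lambda_bragg_nm: float, delta_lambda_nm: float,
               photoelastic_coeff: float = 0.22) -> float:
    """Return strain (dimensionless) from a Bragg wavelength shift."""
    return delta_lambda_nm / (lambda_bragg_nm * (1.0 - photoelastic_coeff))

# Example: a 1550 nm grating shifting by 0.012 nm gives roughly 10 microstrain.
if __name__ == "__main__":
    eps = fbg_strain(1550.0, 0.012)
    print(f"strain = {eps:.2e}  ({eps * 1e6:.1f} microstrain)")
```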
Abstract:
Social work in the United Kingdom remains embroiled in concerns about child protection error. The serious injury or death of vulnerable children continues to evince much consternation in the public and private spheres. Governmental responses to these concerns invariably draw on technocratic solutions involving more procedures, case management systems, information technology and bureaucratic regulation. Such solutions flow from an implicit use of instrumental rationality based on a ‘means-end’ logic. While bringing an important perspective to the problem of child protection error, instrumental rationality has been overused, limiting discretion and other modes of rational inquiry. This paper argues that the social work profession should apply an enlarged form of rationality comprising not only the instrumental-rational mode but also the critical-rational, affective-rational and communicative-rational forms. It is suggested that this combined conceptual arsenal of rational inquiry leads to a gestalt which has been termed the holistic-rational perspective. It is also argued that embracing a more rounded perspective such as this might offer greater opportunities for reducing child protection error.
Abstract:
Thermal comfort is defined as “that condition of mind which expresses satisfaction with the thermal environment” [1] [2]. Field studies have been completed in order to establish the governing conditions for thermal comfort [3]. These studies showed that the internal climate of a room was the strongest factor in establishing thermal comfort. Direct manipulation of the internal climate is necessary to retain an acceptable level of thermal comfort. For Building Energy Management Systems (BEMS) strategies to be efficiently utilised, it is necessary to be able to predict the effect that activating a heating/cooling source (radiators, windows and doors) will have on the room. Numerical modelling of the domain can be challenging due to the need to capture temperature stratification and/or different heat sources (radiators, computers and human beings). Computational Fluid Dynamics (CFD) models are usually utilised for this purpose because they provide the level of detail required. Although they provide the necessary level of accuracy, these models tend to be highly computationally expensive, especially when transient behaviour needs to be analysed. Consequently, they cannot be integrated into BEMS. This paper presents and describes validation of a CFD-ROM method for real-time simulations of building thermal performance. The CFD-ROM method involves the automatic extraction and solution of reduced order models (ROMs) from validated CFD simulations. The test case used in this work is a room of the Environmental Research Institute (ERI) Building at University College Cork (UCC). The ROMs have been shown to be sufficiently accurate, with a total error of less than 1%, while successfully retaining a satisfactory representation of the phenomena modelled. The number of zones in a ROM defines its size and complexity, and ROMs with a higher number of zones have been observed to produce more accurate results. As each ROM has a time to solution of less than 20 seconds, it can be integrated into the BEMS of a building, which opens the potential for real-time physics-based building energy modelling.
Abstract:
Accurate modelling of the internal climate of buildings is essential if Building Energy Management Systems (BEMS) are to efficiently maintain adequate thermal comfort. Computational fluid dynamics (CFD) models are usually utilised to predict the internal climate. Nevertheless, although CFD models provide the necessary level of accuracy, they are highly computationally expensive and cannot practically be integrated into BEMS. This paper presents and describes validation of a CFD-ROM method for real-time simulations of building thermal performance. The CFD-ROM method involves the automatic extraction and solution of reduced order models (ROMs) from validated CFD simulations. The ROMs are shown to be adequately accurate, with a total error below 5%, and to retain a satisfactory representation of the phenomena modelled. Each ROM has a time to solution under 20 seconds, which opens the potential of their integration with BEMS, giving real-time physics-based building energy modelling. A parameter study was conducted to investigate the applicability of an extracted ROM to initial boundary conditions different from those from which it was extracted. The results show that the ROMs retained satisfactory total errors when the initial conditions in the room were varied by ±5°C. This allows the production of a manageable number of ROMs able to rapidly model many possible scenarios.
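Neither abstract details the ROM formulation; as a hedged illustration of the kind of model that can be solved well within the 20-second budget quoted above, the sketch below steps a small zonal thermal model written as a linear update x_{k+1} = A x_k + B u_k. The zone count, matrices and inputs are invented placeholders, not values from the papers; a real ROM would be identified from validated CFD results.

```python
# Illustrative sketch only: stepping a small zonal reduced-order model (ROM)
# of room air temperature, x_{k+1} = A @ x_k + B @ u_k. Each zone temperature
# is blended with its neighbours and with two boundary inputs (radiator
# surface and supply-air temperature). All weights are made-up placeholders.
import numpy as np

n_zones = 4
A = np.full((n_zones, n_zones), 0.02)      # weak coupling between zones
np.fill_diagonal(A, 0.90)                  # strong self-weight per zone
B = np.tile([0.01, 0.03], (n_zones, 1))    # inputs: [radiator, supply air]

def simulate(x0, inputs):
    """Advance the ROM over a sequence of input vectors; return the trajectory."""
    x, states = np.asarray(x0, dtype=float), []
    for u in inputs:
        x = A @ x + B @ np.asarray(u, dtype=float)
        states.append(x.copy())
    return np.array(states)

# One simulated hour at 60 s steps: radiator at 55 °C, supply air at 18 °C.
trajectory = simulate(x0=[20.0] * n_zones, inputs=[[55.0, 18.0]] * 60)
print(trajectory[-1])   # zone temperatures after one simulated hour
```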
Abstract:
This paper proposes an efficient learning mechanism to build fuzzy rule-based systems through the construction of sparse least-squares support vector machines (LS-SVMs). In addition to significantly reduced computational complexity in model training, the resultant LS-SVM-based fuzzy system is sparser while offering satisfactory generalization capability on unseen data. LS-SVMs are well known to have a computational advantage over conventional SVMs in model training; however, model sparseness is lost, which is their main drawback and remains an open problem. To tackle the non-sparseness issue, a new regression alternative to the Lagrangian solution for the LS-SVM is first presented. An efficient learning mechanism is then proposed to extract a sparse set of support vectors for generating fuzzy IF-THEN rules. The mechanism works in a stepwise subset-selection manner, with a forward expansion phase and a backward exclusion phase in each selection step. The implementation is computationally very efficient owing to a few key techniques that avoid matrix inverse operations and accelerate the training process. The computational efficiency is also confirmed by a detailed computational complexity analysis. As a result, the proposed approach not only achieves sparseness in the resultant LS-SVM-based fuzzy systems but also significantly reduces the computational effort of model training. Three experimental examples are presented to demonstrate the effectiveness and efficiency of the proposed learning mechanism and the sparseness of the obtained LS-SVM-based fuzzy systems, in comparison with other SVM-based learning techniques.
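The abstract does not reproduce the forward-expansion/backward-exclusion algorithm itself; the sketch below only illustrates the broader idea of building a sparse kernel regressor by greedy forward selection of support vectors with a ridge-regularised least-squares refit. It omits the backward-exclusion phase and the matrix-inverse-free updates described above, and all hyperparameters are illustrative.

```python
# Simplified sketch of greedy forward selection of support vectors for a
# sparse kernel (LS-SVM-style) regressor. NOT the paper's algorithm: it has
# no backward-exclusion phase and simply refits a ridge-regularised
# least-squares model at every step.
import numpy as np

def rbf(X, C, gamma=1.0):
    """Gaussian kernel matrix between rows of X and centres C."""
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def forward_select(X, y, n_sv=10, gamma=1.0, ridge=1e-3):
    """Greedily pick the support vectors that most reduce the residual."""
    chosen, residual = [], y.copy()
    K_full = rbf(X, X, gamma)
    for _ in range(n_sv):
        scores = np.abs(K_full @ residual)            # correlation with residual
        if chosen:
            scores[chosen] = -np.inf                  # never pick a centre twice
        chosen.append(int(np.argmax(scores)))
        K = K_full[:, chosen]                         # n x m design matrix
        alpha = np.linalg.solve(K.T @ K + ridge * np.eye(len(chosen)), K.T @ y)
        residual = y - K @ alpha
    return np.array(chosen), alpha

# Toy usage: approximate y = sin(x) with 10 support vectors out of 200 samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)
sv_idx, alpha = forward_select(X, y, n_sv=10, gamma=2.0)
print("selected support vectors:", sv_idx)
```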
Abstract:
The problem: Musculoskeletal (MSK) symptoms are common in primary care, but some GPs are not comfortable managing them; waiting times for hospital appointments are a major cause of patient complaints. Current UK healthcare policies emphasise a need for more community-based management. We aimed to pilot an innovative general practice-based clinic to improve the management of MSK and Sport and Exercise Medicine (SEM) symptoms within general practice.
The approach: This project was conducted in an inner-city practice of approximately 9,000 patients and 5 GP partners. The practice commissioned a novel monthly 4-hour clinic staffed by one GP with a specialist interest in MSK and SEM conditions. Each patient was allocated a 20-minute appointment. All primary care staff within the practice could refer any patient for whom they considered hospital referral appropriate, with no specific exclusion criteria. Management plans included injection therapy, exercise prescription and onward referral. After three months (August-October 2014), the numbers of consultations, sources of referral, reasons for referral and management outcomes were described; patient satisfaction was assessed by a questionnaire, offered to 10 randomly selected patients by reception staff and self-completed by patients. Costs of the clinic were compared with current options.
Findings: All patients (14 males; 21 females; aged 35-77 years) were seen within four weeks of referral (one third of orthopaedic referrals in 2013 waited over 9 weeks for an appointment). Most were referred by other GPs; some came from physiotherapy and podiatry. Shoulder problems were the most frequent reason for referral. The commonest management option was steroid injection, with most patients being given advice regarding exercise and analgesia; there were 3 onward referrals (2 physiotherapy; 1 rheumatology).
Comparing August-October data in 2014 and 2013, total, orthopaedic and rheumatology referrals were reduced by 147, 2 and 3, respectively; within the practice, MSK presentations and physiotherapy and x-ray referrals were 60, 47 and 90 fewer, respectively.
The cost per attendance at the clinic was £61; initial orthopaedic-ICAT assessments cost £82 and a consultant appointment £213.
Satisfaction questionnaires were returned by all 10 selected participants and provided positive feedback, expressing preference for community-based, rather than hospital, management.
Consequence: Our pilot study indicates that this novel service model has potential for efficient and effective management of MSK and SEM complaints in primary care, reducing the need for hospital referral and the clinical burden on general practices. The innovation deserves further evaluation in a full-scale trial to determine its generalisability to other practice settings and populations.
Abstract:
Highway structures such as bridges are subject to continuous degradation primarily due to ageing, loading and environmental factors. A rational transport policy must monitor and provide adequate maintenance to this infrastructure to guarantee the required levels of transport service and safety. Increasingly in recent years, bridges are being instrumented and monitored on an ongoing basis due to the implementation of Bridge Management Systems. This is very effective and provides a high level of protection to the public and early warning if the bridge becomes unsafe. However, the process can be expensive and time consuming, requiring the installation of sensors and data acquisition electronics on the bridge. This paper investigates the use of an instrumented 2-axle vehicle fitted with accelerometers to monitor the dynamic behaviour of a bridge network in a simple and cost-effective manner. A simplified half car-beam interaction model is used to simulate the passage of a vehicle over a bridge. This investigation involves the frequency domain analysis of the axle accelerations as the vehicle crosses the bridge. The spectrum of the acceleration record contains noise, vehicle, bridge and road frequency components. Therefore, the bridge dynamic behaviour is monitored in simulations for both smooth and rough road surfaces. The vehicle mass and axle spacing are varied in simulations along with bridge structural damping in order to analyse the sensitivity of the vehicle accelerations to a change in bridge properties. These vehicle accelerations can be obtained for different periods of time and serve as a useful tool to monitor the variation of bridge frequency and damping with time.
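As a hedged illustration of the frequency-domain step described above (not the half-car interaction model itself, whose parameters the abstract does not give), the sketch below extracts the dominant frequencies from a synthetic axle acceleration record with a plain FFT; the signal components and sampling rate are invented for demonstration.

```python
# Illustrative sketch: estimating dominant frequencies from an axle
# acceleration record, as in the frequency-domain analysis described above.
# The synthetic signal (a 4 Hz "bridge-like" component plus a 12 Hz
# "vehicle-like" component and noise) is made up for demonstration only.
import numpy as np

fs = 200.0                                    # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)                  # 10 s crossing record
accel = (0.5 * np.sin(2 * np.pi * 4.0 * t)    # bridge-like component
         + 0.3 * np.sin(2 * np.pi * 12.0 * t) # vehicle-like component
         + 0.05 * np.random.default_rng(1).standard_normal(t.size))

spectrum = np.abs(np.fft.rfft(accel * np.hanning(accel.size)))
freqs = np.fft.rfftfreq(accel.size, d=1 / fs)

# Report the two strongest spectral peaks above 1 Hz (should recover ~4 and ~12 Hz).
mask = freqs > 1.0
top = freqs[mask][np.argsort(spectrum[mask])[-2:]]
print("dominant frequencies (Hz):", np.sort(top))
```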
Abstract:
We consider the problem of segmenting text documents that have a two-part structure, such as a problem part and a solution part. Documents of this genre include incident reports, which typically involve a description of events relating to a problem followed by those pertaining to the solution that was tried. Segmenting such documents into their two component parts would render them usable in knowledge reuse frameworks such as Case-Based Reasoning. This segmentation problem presents a hard case for traditional text segmentation due to the lexical inter-relatedness of the segments. We develop a two-part segmentation technique that can harness a corpus of similar documents to model the behavior of the two segments and their inter-relatedness using language models and translation models, respectively. In particular, we use separate language models for the problem and solution segment types, whereas the inter-relatedness between segment types is modeled using an IBM Model 1 translation model. We model documents as being generated starting from the problem part, which comprises words sampled from the problem language model, followed by the solution part, whose words are sampled either from the solution language model or from a translation model conditioned on the words already chosen in the problem part. We show, through an extensive set of experiments on real-world data, that our approach outperforms state-of-the-art text segmentation algorithms in segmentation accuracy, and that this improved accuracy translates well to improved usability in Case-Based Reasoning systems. We also analyze the robustness of our technique to varying amounts and types of noise and empirically illustrate that it is quite noise tolerant, degrading gracefully as noise increases.
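Purely as an illustration of the generative scoring just described, the sketch below picks the problem/solution boundary that maximises a two-part likelihood built from unigram language models and an IBM Model 1-style lexical translation table. The toy models, the interpolation weight and the smoothing floor are assumptions of this sketch; the paper's corpus-level estimation is not shown.

```python
# Illustrative sketch: words before a candidate boundary are scored by a
# "problem" unigram model; words after it by a mixture of a "solution"
# unigram model and a translation term conditioned on the problem-part words.
import math

def segment(doc, p_problem, p_solution, t_table, lam=0.5, floor=1e-6):
    """Return the boundary index maximising the two-part log-likelihood."""
    best_split, best_score = 1, float("-inf")
    for split in range(1, len(doc)):
        problem, solution = doc[:split], doc[split:]
        score = sum(math.log(p_problem.get(w, floor)) for w in problem)
        for w in solution:
            trans = sum(t_table.get((w, v), floor) for v in problem) / len(problem)
            score += math.log(lam * p_solution.get(w, floor) + (1 - lam) * trans)
        if score > best_score:
            best_split, best_score = split, score
    return best_split

# Toy usage with made-up models; the boundary should fall just before "cleaned".
doc = "printer jams when duplex enabled cleaned rollers and reset driver".split()
p_problem = {"printer": .2, "jams": .2, "when": .1, "duplex": .1, "enabled": .1}
p_solution = {"cleaned": .2, "rollers": .2, "and": .1, "reset": .2, "driver": .2}
t_table = {("cleaned", "jams"): .3, ("rollers", "printer"): .3, ("reset", "jams"): .2}
print("boundary at word index:", segment(doc, p_problem, p_solution, t_table))
```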
Abstract:
The evolution of communication networks over the last decade has translated into a diversification of the services that use the network, higher transfer rates, and the mass adoption of Internet access and cellular communication services. During this decade, several organisations, most notably telecommunications operators, have devoted considerable effort to defining and standardising next-generation network (NGN) architectures. The main characteristic of this type of network is a modular architecture capable of delivering multimedia services to clients of an access network with heterogeneous technological characteristics. Standardisation work on NGN architectures has so far been limited to specifying details of network operation; the management architecture has not yet been defined. In terms of network management technologies, the last two decades have seen the proposal of new management paradigms, new data models, new transport protocols and several languages for defining management information. Data models have been enriched, protocols have become more flexible and powerful, management solutions offer greater interoperability, and the languages allow richer configuration formats to be defined. At the same time, the complexity of management solutions has grown, increasing the overhead on equipment as well as on the computing platforms that support management systems. This work proposes a management solution for NGN networks capable of managing network resources while guaranteeing Quality of Service. The proposed solution includes a policy execution platform that uses events occurring in the network to trigger configuration actions, making the management process autonomous. It also includes an evaluation of the complexity of several management technologies, studying the overhead each technology imposes on both the management process and network operation. The scalability of the various technologies is studied, and their behaviour is analysed in the scenario of a telecommunications operator's network. Finally, the work proposes a methodology for the integrated configuration of the management elements through a configuration interface that is friendly to the system administrator.
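The abstract does not describe the internal design of the policy execution platform; purely as a generic illustration of event-driven, policy-based management, the sketch below evaluates condition-action policies against incoming network events. All event fields, thresholds and actions are hypothetical, not taken from the thesis.

```python
# Illustrative sketch of an event-driven policy engine: each policy pairs a
# condition over a network event with a configuration action. Event fields,
# thresholds and actions are hypothetical placeholders.
from dataclasses import dataclass
from typing import Callable, Dict, List

Event = Dict[str, float]

@dataclass
class Policy:
    name: str
    condition: Callable[[Event], bool]
    action: Callable[[Event], None]

def handle(event: Event, policies: List[Policy]) -> None:
    """Run every policy whose condition matches the incoming event."""
    for p in policies:
        if p.condition(event):
            p.action(event)

policies = [
    Policy("relieve-congestion",
           lambda e: e.get("link_utilisation", 0) > 0.9,
           lambda e: print(f"reroute premium traffic away from link {e['link_id']:.0f}")),
    Policy("guarantee-qos",
           lambda e: e.get("voice_jitter_ms", 0) > 30,
           lambda e: print("raise DiffServ priority for the voice class")),
]

# A monitoring probe would feed events like this one into the engine.
handle({"link_id": 42, "link_utilisation": 0.95, "voice_jitter_ms": 12}, policies)
```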
Abstract:
The main objective of this doctoral thesis is to identify the communication technologies that Portuguese public higher education institutions use to support learning, and to characterise that use from an institutional perspective. The context and specific objectives of the research are presented, followed by a description of the conceptual framework, which includes a review of the specialised literature and the research groups and projects whose work is most relevant to this study. The research questions are identified and discussed, and an original analysis model is proposed, based on two main concepts, institutional framing and use, which underpinned the various phases of the research. The research methodology adopted is described and justified, including the online questionnaire used for data collection (November 2010 to February 2011) and the methods adopted for data processing. The presentation and discussion of the results focuses on the research questions and the research approach adopted. The results show that traditional practices of distributing materials and of communication between teachers and students are migrating to the online environment through the use of learning management platforms and technologies that support interpersonal communication, and that the use of Web 2.0 and 3D environments is limited. The results suggest the need for further research on teacher training, especially on the effective use of communication technologies to support new approaches to teaching and learning practices.
Abstract:
Despite recent technological innovations, the transport sector continues to have significant impacts on the economy and the environment. Indeed, success in reducing emissions in this sector has fallen short of what is desirable. This is due to several factors, such as urban sprawl and the existence of various barriers to the market penetration of cleaner technologies. Consequently, the "Europe 2020" strategy highlights the need to improve the efficiency with which the existing road infrastructure is used. In this context, the main objective of this work is to improve the understanding of how an appropriate route choice can contribute to reducing emissions under different spatial and temporal circumstances. At the same time, it aims to evaluate different traffic management strategies, namely their potential in terms of performance and of energy and environmental efficiency. The integration of empirical and analytical methods to assess the impact of different traffic optimisation strategies on CO2 and local pollutant emissions is one of the main contributions of this work. The thesis is divided into two main components. The first, predominantly empirical, was based on the use of vehicles equipped with a GPS data logger to collect the driving-dynamics data needed to calculate emissions. Approximately 13,200 km were driven on several routes with distinct scales and characteristics: an urban area (Aveiro), a metropolitan area (Hampton Roads, VA) and an interurban corridor (Porto-Aveiro). The second part, predominantly analytical, was based on the application of an integrated traffic and emissions simulation platform. Using this platform, performance functions were developed for several segments of the networks studied, which in turn were applied in traffic assignment models. The results from both perspectives showed that fuel consumption and emissions can be significantly reduced through appropriate route choices and advanced traffic management systems. Empirically, it was shown that selecting an appropriate route can contribute to a significant reduction in emissions: potential reductions of up to 25% in CO2 emissions and up to 60% in local pollutants were identified. The application of traffic models showed that traffic-related environmental costs can be significantly reduced (by up to 30%) by changing the distribution of flows along a corridor with four alternative routes. However, despite the positive results regarding the potential for emission reductions based on appropriate route choices, some trade-offs and/or constraints were identified that should be considered in future eco-routing systems. Among these constraints, it is worth noting that: i) minimising different pollutants may require different routing strategies; ii) minimising pollutant emissions frequently involves choosing urban routes (in densely populated areas); iii) at higher penetration levels of eco-routing devices, the system-wide environmental impacts may be greater than if drivers were guided by traditional devices focused on minimising travel time.
This work showed that traffic management strategies aimed at minimising CO2 emissions are compatible with minimising travel time. On the other hand, minimising local pollutants can lead to a considerable increase in travel time. However, given the downward trend in emission factors for local pollutants, these conflicting objectives are expected to become less pronounced in the medium term. The methodology developed has a high potential for application, whether through mobile devices, infrastructure-to-vehicle communication systems or other advanced traffic management systems.
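As a hedged sketch of the route-choice comparison discussed above (the network, travel times and emission values are invented, not the thesis's measurements), the code below runs the same shortest-path search once with a travel-time cost and once with a CO2 cost, showing how the two objectives can select different routes. In practice the per-link emission values would come from measured driving dynamics and emission models such as those used in the thesis.

```python
# Illustrative sketch: comparing a time-minimising and a CO2-minimising route
# on a toy network. Each link carries (travel_time_min, co2_g) costs; all
# numbers are invented for demonstration.
import heapq

# graph[node] = list of (neighbour, travel_time_min, co2_g)
graph = {
    "A": [("B", 10, 900), ("C", 14, 600)],
    "B": [("D", 12, 1100)],
    "C": [("D", 11, 650)],
    "D": [],
}

def shortest_path(graph, source, target, cost_index):
    """Dijkstra over the chosen cost component (1 = time, 2 = CO2)."""
    queue, settled = [(0, source, [source])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node in settled:
            continue
        settled.add(node)
        if node == target:
            return cost, path
        for nbr, t, co2 in graph[node]:
            edge = (nbr, t, co2)[cost_index]
            heapq.heappush(queue, (cost + edge, nbr, path + [nbr]))
    return None

print("fastest route  :", shortest_path(graph, "A", "D", cost_index=1))
print("lowest-CO2 route:", shortest_path(graph, "A", "D", cost_index=2))
```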
Abstract:
Creativity is nowadays recognized as a basic skill; however, the educational system fails to promote its development. At the same time, there is growing acknowledgement of the importance of geometry. Conceptual renewal, namely on isometries, requires new approaches based on mathematically significant tasks. The digital revolution has brought powerful tools but demands changes in the educational process. The use of Dynamic Geometry Environments (DGE), complementing ‘paper and pencil’, can help provide rich learning environments, enhanced by Classroom Management Systems (CMS) such as iTALC. Indeed, the qualitative case study we carried out suggests that: the creation of an ‘atmosphere’ of cooperation, collaboration and sharing seems to enhance the dimensions of creativity; the use of DGE can facilitate the emergence of more creative productions; and the development of geometrical knowledge and capabilities seems to benefit from a complementary approach combining DGE and ‘paper and pencil’ environments. Approaches of a more technological and exploratory nature also seem to promote more favourable attitudes towards mathematics in general and geometry in particular.
Abstract:
Tourism, owing to its particular characteristics, causes high environmental impact, and its development should be guided by the environmental characteristics of each region. Information systems, particularly those representing geographical information, permit the development of a design that allows the Decision-Maker to consult, manage and present decision schemes based on the measures defined in a region's Tourism Planning. As the information associated with this design should be real and up to date, the Internet should be used as the means of access to the region's information. The design presents the schemes associated with each decision, offering competitive advantages to the Decision-Makers involved in the decision process, since it becomes possible to evaluate, foresee and control the future environmental impacts of Tourism.
Abstract:
The environment is one of the greatest concerns of humankind. Activities which improve or damage it must be assessed and controlled by efficient means that permit control of the environmental impact their development causes. This document presents the implementation of an information system, as a Decision Support System, that allows the Decision-Maker to evaluate, foresee and control the future environmental impact of Tourism through the consultation, management and presentation of decision schemes based on the measures defined in regional tourism planning.
Abstract:
Joint master programmes are systems which, by their nature, demand a proper quality system in order to be sustained and improved. The objective of this thesis is to analyse, and propose solutions to, the difficulties associated with implementing a quality management system (QMS) for joint master programmes, with a focus on international joint master programmes. The application of the analysis to the Erasmus Mundus joint master programme European Master in Quality in Analytical Laboratories (EMQAL) is discussed. The implementation of quality assurance systems in higher education institutions (HEIs) in Europe is an ongoing process, and implementing such systems in joint programmes is a further step towards enhancing quality in European higher education. The central issue discussed in this thesis is: should a QMS be developed independently of the institutions, or should the institutions, when developing their quality management systems, take into account the (future) development of joint courses and prepare their quality procedures accordingly? A quality management system is normally developed for one organization, with the different aspects of cooperation considered within it. A joint master programme is the result of successful cooperation between two or more organizations; the development of its quality management system must therefore be approached in a different manner. This thesis proposes a QMS with emphasis on both the HEI and the consortium. Different processes in the QMS can be managed independently at the level of the HEI or at the level of the consortium, and most processes in joint master programmes should be designed into both the programme's and the institutions' QMSs. The quality of a joint master programme cannot be analysed separately from the higher education institutions organizing it. A comparative analysis of the organization of one Erasmus Mundus master programme against the solutions proposed in the discussion showed that, for all the aspects considered, the processes in EMQAL are organized in harmony with the proposed delegation of QMS processes for a joint master programme. The solutions proposed in the discussion are based on the theoretical application of quality principles and concepts; comparison with the quality processes and procedures in an existing Erasmus Mundus programme showed that the analysis is applicable in practice.