870 results for "social event detection"
Abstract:
Background: Rapid Response Systems (RRS) consist of four interrelated and interdependent components: an event detection and trigger mechanism, a response strategy, a governance structure and a process improvement system. These multiple components pose problems for evaluation, as the intervention is complex and cannot be assessed using a traditional systematic review. Complex interventions in healthcare aimed at changing service delivery and the related behaviour of health professionals require a different approach to summarising the evidence. Realist synthesis is such an approach: it reviews research evidence on complex interventions to provide an explanatory analysis of how and why an intervention works or doesn't work in practice. The core principle is to make explicit the underlying assumptions about how an intervention is supposed to work (i.e. the programme theory) and then use this theory to guide evaluation. Methods: A realist synthesis process was used to explain the factors that enable or constrain the success of RRS programmes. Results: The findings from the review include the articulation of the RRS programme theories, an evaluation of whether these theories are supported or refuted by the research evidence, and an evaluation of evidence explaining the underlying reasons why RRS do or do not work in practice. Rival conjectured RRS programme theories were identified to explain the factors constraining the implementation of RRS in practice. These programme theories are presented using a logic model to highlight all the components which impact or influence the delivery of RRS programmes in the practice setting. The evidence from the realist synthesis provided the foundation for the development of hypotheses to test and refine the theories in the subsequent stages of the Realist Evaluation PhD study [1].
This information will be useful in providing evidence and direction for the strategic and service planning of acute care to improve patient safety in hospitals. References: [1] McGaughey J., Blackwood B., O'Halloran P., Trinder T.J. & Porter S. (2010) Realistic Evaluation of Early Warning Systems and the Acute Life-threatening Events – Recognition and Treatment training course for early recognition and management of deteriorating ward-based patients: research protocol. Journal of Advanced Nursing 66(4), 923–932.
Abstract:
Without human beings and human activities, hazards can strike but disasters cannot occur; disasters are not just natural phenomena but social events (Van Der Zon, 2005). The rapid demand for reconstruction after disastrous events can result in the impacts of projects not being carefully considered from the outset, and in the opportunity to improve long-term physical and social community structures being neglected. The events that struck Banda Aceh in 2004 have been described as a story of 'two tsunamis': the first being the natural hazard that struck, and the second the destruction of social structures that occurred as a result of an unplanned, unregulated and uncoordinated response (Syukrizal et al., 2009). Measures must be in place to ensure that, while reconstruction needs are met as rapidly as possible, the risk of recurring disaster impacts is reduced through both the physical structures and the capacity of the community who inhabit them. The paper explores issues facing reconstruction in a post-disaster scenario, drawing on the connections between physical and social reconstruction in order to address long-term recovery solutions. It draws on a study of relevant literature and a six-week pilot study in Haiti exploring the progress of recovery in the Haitian capital and the limitations still restricting reconstruction efforts. The study highlights the need for recovery management strategies that recognise the link between social and physical reconstruction, and the significance of community-based initiatives that see local residents driving recovery in terms of debris handling and rebuilding. It demonstrates how a community-driven approach to physical reconstruction could also address the social impacts of events that, in places such as Haiti, are still dramatically restricting recovery efforts.
Abstract:
Dissertation submitted for the degree of Master in Informatics and Computer Engineering
Abstract:
IEEE 802.15.4 is the most widely used protocol for Wireless Sensor Networks (WSNs) and serves as a baseline for several higher-layer protocols such as ZigBee, 6LoWPAN and WirelessHART. When operating in beacon-enabled mode, its MAC (Medium Access Control) layer supports both contention-free access (the CFP, based on the reservation of guaranteed time slots, GTS) and contention-based access (the CAP, ruled by CSMA/CA). It thus enables differentiation between real-time and best-effort traffic. However, some WSN applications and higher-layer protocols may strongly benefit from support for more traffic classes; this is the case, for instance, for dense WSNs used in time-sensitive industrial applications. In this context, we propose to differentiate traffic classes within the CAP, enabling lower transmission delays and a higher success probability for time-critical messages, such as those for event detection, GTS reservation and network management. Building upon a previously proposed methodology (TRADIF), in this paper we outline its implementation and experimental validation over a real-time operating system. Importantly, TRADIF is fully backward compatible with the IEEE 802.15.4 standard, creating different traffic classes simply by tuning some MAC parameters.
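The abstract says traffic classes are created purely by tuning MAC parameters, without listing TRADIF's actual values. A minimal sketch of the general idea, assuming per-class settings of the standard's backoff-exponent attributes (the class names and numeric values below are illustrative assumptions, not TRADIF's configuration):

```python
import random

# Illustrative per-class CSMA/CA parameters. macMinBE/macMaxBE are the
# IEEE 802.15.4 backoff-exponent bounds; these particular values are
# assumed for the sketch, not taken from TRADIF.
CLASSES = {
    "time_critical": {"macMinBE": 0, "macMaxBE": 3},
    "best_effort":   {"macMinBE": 3, "macMaxBE": 5},
}

def backoff_slots(cls, attempt):
    """Random backoff (in unit backoff periods) drawn for a retry attempt."""
    p = CLASSES[cls]
    be = min(p["macMinBE"] + attempt, p["macMaxBE"])  # backoff exponent
    return random.randint(0, 2**be - 1)

def mean_first_backoff(cls, trials=10_000):
    """Average initial backoff delay for a traffic class."""
    return sum(backoff_slots(cls, 0) for _ in range(trials)) / trials

# A smaller macMinBE gives time-critical frames a shorter average backoff,
# hence earlier channel access on average than best-effort frames.
```

Because only standard MAC attributes are tuned, a scheme of this shape stays backward compatible with unmodified IEEE 802.15.4 nodes.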
Abstract:
In 1903, the Canadian Association of Amateur Oarsmen had its request granted to make the Old Welland Canal at Port Dalhousie the permanent site of the Royal Canadian Henley Regatta. That same year, organized rowing was established in St. Catharines when the St. Catharines Rowing and Canoe Club was formed. The Henley course was completed in July of 1903, after rowing was well underway. Although the Henley course served as an athletic and social event, rowing itself was slow to grow in the St. Catharines area. In 1915 the Regatta was cancelled for the duration of WWI; it was reinstated in 1919, when public interest in the sport began to grow. Two years later, the Henley Aquatic Association was formed in order to control, maintain and improve the rowing facilities. This association was responsible for building a new clubhouse at Ann Street in 1921 and for completing the grandstands in 1931. Also in the 1930s, the association had the Federal Government approve its appeal to have the Henley waters dredged for the first time. The St. Catharines Rowing Club relocated its headquarters to the Lakeport Road site. The 1940s brought more support from local groups and, with that, more events. In 1945, the St. Catharines Junior Chamber of Commerce began helping to organize and promote rowing locally. One of the new events at the Henley course was the "Schoolboy Championships". The growth of both rowing and the Henley continued through the 1950s. The Henley Aquatic Association acquired Reid's Island, now Henley Island, mainly through the efforts of Ted Nelson. In the 1960s, rowing really took off in St. Catharines. Women began to be recognized in the sport when Brock University created a women's rowing team. The second dredging was completed in 1964, leading to the creation of a world-class rowing course.
The facilities were upgraded to international standards and the Henley rowing course became Canada's first Class A FISA (Fédération Internationale des Sociétés d'Aviron, or International Federation of Rowing Associations) rowing course. The first North American Rowing Championship was held at the Henley course in 1967, and the third in 1970. The Canadian Henley Rowing Corporation, formed in 1972, together with the St. Catharines Parks and Recreation Department, created the first rowing school for youth. Since 1960, St. Catharines has competed at the level of other international rowing courses. The city continues to produce Olympic-level athletes today.
Abstract:
Coeliac disease, or coeliac sprue, is an intolerance to gluten. It is an inflammatory disease of the intestine linked to the ingestion of gluten by genetically susceptible individuals. The disorder is highly prevalent, affecting 1% of the world population. At present, no pharmacological tool exists to treat or alleviate this disease. However, thanks to advances in the understanding of its pathogenesis, new therapeutic targets have been identified. Currently, the only effective treatment is to eliminate consumption of the pathogenic agent, namely gluten. Gluten is a set of cereal storage proteins contained in wheat, barley and rye. Wheat gluten is subdivided into glutenins and gliadins, the latter appearing to be the most involved in coeliac disease. Gliadins and their related proteins (i.e. secalins and hordeins, in rye and barley respectively) are rich in proline and glutamine, making them resistant to degradation by digestive and brush-border enzymes. The peptides resulting from this incomplete digestion can induce acquired and innate immune responses. The main objective of this thesis was to test a new adjunct treatment for coeliac disease, useful during travel or occasional events. In the 1980s, an Italian study showed that some of the effects induced by digested gliadins on cell cultures were inhibited by co-incubation with mannan, a natural polysaccharide composed of mannose units. Unfortunately, this treatment was not applicable in vivo because, owing to its saccharide nature, the polymer is degraded by gastrointestinal enzymes.
Synthetic polymers, thanks to the diversity and controllability of their physico-chemical properties, are an attractive alternative to this natural polymer. The objective of this research was to obtain a gliadin-binding polymer capable of interfering with the genesis of the disease in the digestive tract, in order to abolish the deleterious effects induced by the protein. First, poly(hydroxyethyl methacrylate)-co-(styrene sulfonate) copolymers (P(HEMA-co-SS)) were synthesized by atom transfer radical polymerization (ATRP). A small library of polymers was prepared by varying the molar mass as well as the proportions of each monomer. These polymers were then tested for their ability to complex gliadin at gastric and intestinal pH, and the best candidates were retained for cell assays. This work showed that the P(HEMA-co-SS) copolymer (45:55 mol%, 40 kDa) selectively sequestered gliadin and abolished the effects induced by gliadin on various cell types. Moreover, this compound interfered with the digestion of gliadin, suggesting a decrease in the immunogenic peptides involved in the disease. This candidate was tested in vivo, in a gluten-sensitive mouse model, for its efficacy against pure gliadin and against a mixture of gluten with other food components. P(HEMA-co-SS) reduced the effects on permeability and inflammation parameters, and modulated the immune response triggered by the administration of gliadin and of gluten. Acute and chronic toxicity and biodistribution studies were carried out, showing that the copolymer was well tolerated and poorly absorbed after oral administration.
Finally, studies on tissue samples from patients suffering from coeliac disease showed a therapeutic benefit of the polymer. Taken together, the work presented in this thesis highlights the therapeutic potential of P(HEMA-co-SS) for preventing disorders related to gluten ingestion, indicating that this type of polymer could be exploited in the near future.
Abstract:
Two vertical cosmic ray telescopes for atmospheric cosmic ray ionization event detection are compared. Counter A, designed for low power remote use, was deployed in the Welsh mountains; its event rate increased with altitude as expected from atmospheric cosmic ray absorption. Independently, Counter B’s event rate was found to vary with incoming particle acceptance angle. Simultaneous colocated comparison of both telescopes exposed to atmospheric ionization showed a linear relationship between their event rates.
Abstract:
This paper presents the two datasets (ARENA and P5) and the challenge that form part of the PETS 2015 workshop. The datasets consist of scenarios recorded using multiple visual and thermal sensors. The scenarios in the ARENA dataset involve different staged activities around a parked vehicle in a parking lot in the UK, and those in the P5 dataset involve different staged activities around the perimeter of a nuclear power plant in Sweden. The scenarios of each dataset are grouped into 'Normal', 'Warning' and 'Alarm' categories. The Challenge specifically includes tasks that account for different steps in a video understanding system: Low-Level Video Analysis (object detection and tracking), Mid-Level Video Analysis ('atomic' event detection) and High-Level Video Analysis ('complex' event detection). The evaluation methodology used for the Challenge includes well-established measures.
Abstract:
This paper describes the dataset and vision challenges that form part of the PETS 2014 workshop. The datasets are multisensor sequences containing different activities around a parked vehicle in a parking lot. The dataset scenarios were filmed from multiple cameras mounted on the vehicle itself and involve multiple actors. For the PETS 2014 workshop, 22 acted scenarios of abnormal behaviour around the parked vehicle are provided. The aim of PETS 2014 is to provide a standard benchmark that indicates how detection, tracking, abnormality and behaviour analysis systems perform against a common database. The dataset specifically addresses several vision challenges corresponding to different steps in a video understanding system: Low-Level Video Analysis (object detection and tracking), Mid-Level Video Analysis ('simple' event detection: the behaviour recognition of a single actor) and High-Level Video Analysis ('complex' event detection: the behaviour and interaction recognition of several actors).
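The PETS abstracts refer to "well-established measures" without listing them; a common choice for scoring event detection against an annotated benchmark is frame-level precision/recall/F1. A minimal sketch under that assumption (the function name and frame-set representation are illustrative, not the PETS evaluation protocol):

```python
def event_detection_scores(pred_frames, true_frames):
    """Frame-level precision, recall and F1 for an event detector.

    pred_frames: frame indices the system flagged as containing the event.
    true_frames: frame indices annotated as containing the event.
    """
    pred, true = set(pred_frames), set(true_frames)
    tp = len(pred & true)                        # correctly flagged frames
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(true) if true else 0.0
    denom = precision + recall
    f1 = 2 * precision * recall / denom if denom else 0.0
    return precision, recall, f1

# e.g. a detector that fires on frames 1-4 against ground truth 3-6
# scores precision 0.5 and recall 0.5.
```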
Abstract:
Synoptic wind events in the equatorial Pacific strongly influence the El Niño/Southern Oscillation (ENSO) evolution. This paper characterizes the spatio-temporal distribution of Easterly (EWEs) and Westerly Wind Events (WWEs) and quantifies their relationship with intraseasonal and interannual large-scale climate variability. We unambiguously demonstrate that the Madden–Julian Oscillation (MJO) and Convectively-coupled Rossby Waves (CRW) modulate both WWEs and EWEs occurrence probability. 86 % of WWEs occur within convective MJO and/or CRW phases and 83 % of EWEs occur within the suppressed phase of MJO and/or CRW. 41 % of WWEs and 26 % of EWEs are in particular associated with the combined occurrence of a CRW/MJO, far more than what would be expected from a random distribution (3 %). Wind events embedded within MJO phases also have a stronger impact on the ocean, due to a tendency to have a larger amplitude, zonal extent and longer duration. These findings are robust irrespective of the wind events and MJO/CRW detection methods. While WWEs and EWEs behave rather symmetrically with respect to MJO/CRW activity, the impact of ENSO on wind events is asymmetrical. The WWEs occurrence probability indeed increases when the warm pool is displaced eastward during El Niño events, an increase that can partly be related to interannual modulation of the MJO/CRW activity in the western Pacific. On the other hand, the EWEs modulation by ENSO is less robust, and strongly depends on the wind event detection method. The consequences of these results for ENSO predictability are discussed.
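The comparison above (e.g. 86 % of WWEs inside convective MJO/CRW phases versus 3 % expected at random) reduces to contrasting the observed fraction of events falling inside a phase with the fraction expected under a uniform random spread. A toy sketch of that calculation (all names and day lists are illustrative assumptions, not the paper's data or detection method):

```python
def fraction_within(event_days, phase_days):
    """Fraction of wind events whose day falls inside the given phase."""
    phase = set(phase_days)
    return sum(day in phase for day in event_days) / len(event_days)

def chance_fraction(phase_days, n_days_total):
    """Fraction expected if events were spread uniformly at random."""
    return len(set(phase_days)) / n_days_total

# Toy example: WWE days vs. days flagged as convective MJO phase.
wwe_days = [3, 7, 8, 15, 21, 22]
mjo_convective_days = range(5, 12)   # days 5..11 convective (assumed)
observed = fraction_within(wwe_days, mjo_convective_days)   # 2/6 ≈ 0.33
expected = chance_fraction(mjo_convective_days, 30)         # 7/30 ≈ 0.23
# observed > expected indicates an excess of events inside the phase,
# the same comparison the paper makes against a random distribution.
```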
Abstract:
An indexing policy must consist of strategies that allow the retrieval objectives of the information system to be met. The indexer's primary function is to understand the document by performing a conceptual analysis that adequately represents its content. Using reading as a social event (group verbal protocol), our objective is to contribute to the literature on indexing policy and to present proposals for teaching indexing policy aimed at undergraduate and graduate students, as well as a distance-education experience aimed at training librarians in service. The results obtained showed that the methodology can be used by information systems to gain access to the indexer's knowledge. We conclude that the indexer should be a target of investment by information systems, and suggest that the indexer's experience also be used as a parameter for indexing policy.
Abstract:
Graduate Program in Computer Science - IBILCE
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Public organizations today are constantly developing relationship strategies with their audiences in search of public acceptance. The objective of this work is to present strategies of Governmental Public Relations and Political Marketing that can be implemented by communication professionals in government. To that end, it discusses the principles and instruments of the objects mentioned, conducting a study of actions taken by the Municipality of Botucatu during its anniversary year of 2012, with an emphasis on the social event Food Court Solidarity. The instruments studied contribute to strengthening relations between government and citizens, while the government fulfils the commitments under its plan and meets the demands of the population.
Abstract:
This book will serve as a foundation for a variety of useful applications of graph theory to computer vision, pattern recognition, and related areas. It covers a representative set of novel graph-theoretic methods for complex computer vision and pattern recognition tasks. The first part of the book presents the application of graph theory to low-level processing of digital images, including a new method for partitioning a given image into a hierarchy of homogeneous areas using graph pyramids, and a study of the relationship between graph theory and digital topology. Part II presents graph-theoretic learning algorithms for high-level computer vision and pattern recognition applications, including a survey of graph-based methodologies for pattern recognition and computer vision, a series of computationally efficient algorithms for testing graph isomorphism and related graph matching tasks in pattern recognition, and a new graph distance measure for solving graph matching problems. Finally, Part III provides detailed descriptions of several applications of graph-based methods to real-world pattern recognition tasks. It includes a critical review of the main graph-based and structural methods for fingerprint classification, a new method for visualizing time series of graphs, and potential applications in computer network monitoring and abnormal event detection.
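The closing application, abnormal event detection over a time series of network graphs, rests on a graph distance between consecutive snapshots; the book's specific measure is not given here. A minimal illustrative sketch using a simple Jaccard-style edge-overlap distance (function names, the distance choice, and the threshold are assumptions):

```python
def graph_distance(edges_a, edges_b):
    """Jaccard-style distance in [0, 1] between two undirected edge sets."""
    a = {frozenset(e) for e in edges_a}
    b = {frozenset(e) for e in edges_b}
    union = a | b
    # Share of edges present in one snapshot but not the other.
    return len(a ^ b) / len(union) if union else 0.0

def abnormal_steps(snapshots, threshold=0.5):
    """Indices where a snapshot differs from its predecessor by more
    than `threshold`, flagging candidate abnormal events."""
    return [i for i in range(1, len(snapshots))
            if graph_distance(snapshots[i - 1], snapshots[i]) > threshold]

# Two stable snapshots followed by a drastic topology change:
series = [[(1, 2), (2, 3)], [(1, 2), (2, 3)], [(4, 5), (5, 6)]]
# abnormal_steps(series) flags only the last transition.
```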