911 results for time-to-rehospitalization


Relevance: 90.00%

Publisher:

Abstract:

The Internet of Things (IoT) consists of a worldwide “network of networks,” composed of billions of interconnected heterogeneous devices denoted as things or “Smart Objects” (SOs). Significant research efforts have been dedicated to porting the experience gained in the design of the Internet to the IoT, with the goal of maximizing interoperability, using the Internet Protocol (IP) and designing specific protocols like the Constrained Application Protocol (CoAP), which have been widely accepted as drivers for the effective evolution of the IoT. This first wave of standardization can be considered successfully concluded, and we can assume that communication with and between SOs is no longer an issue. At this time, to favor the widespread adoption of the IoT, it is crucial to provide mechanisms that facilitate IoT data management and the development of services enabling a real interaction with things. Several reference IoT scenarios have real-time or predictable latency requirements and deal with billions of devices collecting and sending an enormous quantity of data. These features create a need for architectures specifically designed to handle this scenario, here denoted as “Big Stream”. In this thesis, a new Big Stream Listener-based Graph architecture is proposed. Another important step is to build more applications around the Web model, bringing about the Web of Things (WoT). Since several IoT testbeds have focused on evaluating lower-layer communication aspects, this thesis proposes a new WoT Testbed that aims to allow developers to work at a high level of abstraction, without worrying about low-level details. Finally, an innovative SO-driven User Interface (UI) generation paradigm for mobile applications in heterogeneous IoT networks is proposed, to simplify interactions between users and things.
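The abstract names CoAP as one of the standard protocols for talking to Smart Objects. Purely as an illustrative sketch (not part of the thesis), the snippet below issues a CoAP GET request with the aiocoap Python library; the device address and resource path are hypothetical.

```python
# Illustrative sketch only: a CoAP GET request to a (hypothetical) Smart Object
# using the aiocoap library. Install with: pip install aiocoap
import asyncio
from aiocoap import Context, Message, GET

async def read_sensor():
    # The client context handles the underlying UDP transport for CoAP.
    protocol = await Context.create_client_context()
    # Hypothetical device address (documentation range) and resource path.
    request = Message(code=GET, uri="coap://192.0.2.10/sensors/temperature")
    response = await protocol.request(request).response
    print(f"{response.code}: {response.payload.decode()}")

if __name__ == "__main__":
    asyncio.run(read_sensor())
```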

Relevance: 90.00%

Publisher:

Abstract:

Liposomes have been imaged using a plethora of techniques. However, few of these methods offer the ability to study these systems in their natural hydrated state without requiring drying, staining, and fixation of the vesicles. Yet the ability to image a liposome in its hydrated state is the ideal scenario for visualization of these dynamic lipid structures, and environmental scanning electron microscopy (ESEM), with its ability to image wet systems without prior sample preparation, offers potential advantages over the above methods. In our studies, we have used ESEM not only to investigate the morphology of liposomes and niosomes but also to dynamically follow the changes in structure of lipid films and liposome suspensions as water condenses on to or evaporates from the sample. In particular, changes in liposome morphology were studied using ESEM in real time to investigate the resistance of liposomes to coalescence during dehydration, thereby providing an alternative assay of liposome formulation and stability. Based on this protocol, we have also studied niosome-based systems and cationic liposome/DNA complexes. Copyright © Informa Healthcare.

Relevance: 90.00%

Publisher:

Abstract:

The suitability of a new plastic supporting medium for biofiltration was tested over a three-year period. Tests were carried out on the stability, surface properties, mechanical strength, and dimensions of the medium. There was no evidence to suggest that the medium was deficient in any of these respects. The specific surface (320 m2 m-3) and the voidage (94%) of the new medium are unlike any other used in biofiltration, and a pilot plant containing two filters was built to observe its effects on ecology and performance. Performance was estimated by chemical analysis and ecology was studied by film examination and fauna counts. A system of removable sampling baskets was designed to enable samples to be obtained from two intermediate depths of filter. One of the major operating problems of percolating filters is excessive accumulation of film. The amount of film is influenced by hydraulic and organic load, and each filter was run at a different loading. One was operated at 1.2 m3 m-3 day-1 (BOD load 0.24 kg m-3 day-1), judged at the time to be the lowest filtration rate to offer advantages over conventional media. The other filter was operated at more than twice this loading (2.4 m3 m-3 day-1; BOD load 0.55 kg m-3 day-1), giving roughly 2.5x and 6x the conventional loadings recommended for a Royal Commission effluent. The amount of film in each filter was normally low (0.05-3 kg m-3 as volatile solids) and did not affect efficiency. The evidence collected during the study indicated that the ecology of the filters was normal when compared with data from the literature relating to filters with mineral media. There were indications that full ecological stability was yet to be reached and that this was affecting the efficiency of the filters. The lower-rate filter produced an average 87% BOD removal, giving a consistent Royal Commission effluent during the summer months. The higher-rate filter produced a mean 83% BOD removal but at no stage a consistent Royal Commission effluent. From the data on ecology and performance, the filters resembled conventional filters rather than high-rate filters.

Relevance: 90.00%

Publisher:

Abstract:

The aim of this thesis is to examine the experience of time of four professional occupational groups working in public sector organisations and the factors affecting this experience. The literature on time and work is examined to delineate the key parameters of research in this area. A broad organisation behaviour approach to the experience of time and work is developed in which individual, occupational, organisational and socio-political factors are inter-related. The experience of secondary school teachers, further education lecturers, general medical practitioners and hospital consultants is then examined. Multiple methods of data collection are used: open-ended interviews, a questionnaire survey and the analysis of key documents relating to the institutional settings in which the four groups work. The research aims to develop our knowledge of working time by considering the dimensions of the experience of time at work, the contexts in which this experience is generated and the constraints these contexts give rise to. By developing our understanding of time as a key feature of work experience we also extend our knowledge of organisation behaviour in general. In conclusion, a model of the factors relating the experience of time to the negotiation of time at work is presented.

Relevance: 90.00%

Publisher:

Abstract:

The aim of this paper, from a study funded by the National Council for Graduate Entrepreneurship (NCGE), is to explore access to finance for ethnic minority graduate entrepreneurs (EMGEs), with a particular focus on comparisons between different ethnic groups, and between men and women. The authors interviewed selected individuals based upon a review of the literature on finance for ethnic minority enterprise. A number of key results emerged from the survey, namely that EMGEs:
• use external finance significantly (more so than non-graduates) and encounter barriers in accessing finance at start-up, in particular those from poor families;
• rely excessively on personal savings and family finance, both at start-up and long after the start-up stage, which has implications for the optimal capital structure;
• start up businesses that are, on average, larger than non-graduate enterprises and have the potential to reduce economic inactivity amongst the ethnic population;
• have, in contrast to general graduate start-ups, a high level of unemployment, take longer to enter employment, and report a higher level of dissatisfaction with career progression.
These findings raise the question of whether the right financial advice is taken and whether this behaviour constrains EMGEs' expansion.

Relevance: 90.00%

Publisher:

Abstract:

Self-adaptation enables software systems to respond to changing environmental contexts that may not be fully understood at design time. Designing a dynamically adaptive system (DAS) to cope with this uncertainty is challenging, as it is impractical during requirements analysis and design to anticipate every environmental condition that the DAS may encounter. Previously, the RELAX language was proposed to make requirements more tolerant to environmental uncertainty, and Claims were applied as markers of uncertainty that document how design assumptions affect goals. This paper integrates these two techniques in order to assess the validity of Claims at run time while tolerating minor and unanticipated environmental conditions that can trigger adaptations. We apply the proposed approach to the dynamic reconfiguration of a remote data mirroring network that must diffuse data while minimizing costs and exposure to data loss. Results show that RELAXing Claims enables a DAS to reduce adaptation costs. © 2012 Springer-Verlag.
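The abstract gives no implementation details; the following is a minimal sketch, under my own assumptions, of how a design-time Claim might be re-evaluated at run time with a RELAX-style tolerance before an adaptation is triggered. The metric, thresholds and class names are hypothetical and not taken from the paper.

```python
# Illustrative sketch (not the paper's implementation): a Claim recorded at
# design time is re-evaluated at run time, and a RELAX-style tolerance turns a
# hard threshold into a graded satisfaction level before triggering adaptation.
from dataclasses import dataclass

@dataclass
class Claim:
    name: str            # design-time assumption being monitored
    threshold: float     # value the assumption expects (hypothetical metric)
    tolerance: float     # RELAX-style slack: deviations inside it are accepted

    def satisfaction(self, observed: float) -> float:
        """1.0 when the observation meets the threshold, falling linearly to
        0.0 once it exceeds the tolerated deviation."""
        deviation = max(0.0, observed - self.threshold)
        return max(0.0, 1.0 - deviation / self.tolerance)

def monitor(claim: Claim, observed: float, trigger_level: float = 0.5) -> str:
    """Decide whether the DAS should adapt, based on graded claim validity."""
    s = claim.satisfaction(observed)
    if s >= trigger_level:
        return f"{claim.name}: satisfied at {s:.2f}, no reconfiguration"
    return f"{claim.name}: violated (satisfaction {s:.2f}), trigger adaptation"

if __name__ == "__main__":
    # Hypothetical claim: each data-mirroring link loses at most 2% of writes.
    link_loss = Claim(name="link-loss-below-2pct", threshold=0.02, tolerance=0.03)
    print(monitor(link_loss, observed=0.025))  # deviation within tolerance
    print(monitor(link_loss, observed=0.08))   # outside tolerance -> adapt
```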

Relevance: 90.00%

Publisher:

Abstract:

Background: Food allergy is associated with psychological distress in both child and parent. It is unknown whether parental distress is present prior to clinical diagnosis or whether experiences at clinic can reduce any distress present. This study aimed to assess anxiety and depression in parents, and the impact of suspected food allergy on the lives of families, before and after a visit to an allergy clinic. Methods: One hundred and twenty-four parents visiting an allergy clinic for the first time to have their child assessed for food allergy completed a study-specific questionnaire and the Hospital Anxiety and Depression Scale; 50 parents completed these again 4-6 weeks later in their own home. Results: Most parents (86.4%) reported that suspected food allergy had an impact on their family life prior to clinic attendance; 76% had made changes to their child's diet. Before the clinic visit, 32.5% of parents had mild-to-severe anxiety and 17.5% had mild-to-moderate depression. Post-clinic, 40% had mild-to-severe anxiety and 13.1% had mild-to-moderate depression. There were no significant differences in anxiety (p = 0.34) or depression scores (p = 0.09) before and after the clinic visit. Conclusions: Anxiety and depression are present in a small proportion of parents prior to diagnosis of food allergy in their child, and this does not reduce in the short term after the clinic visit. Identification of parents at risk of suffering from distress is needed, and the ways in which we communicate allergy information before and at clinic should be investigated to see if we can reduce distress. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

Relevance: 90.00%

Publisher:

Abstract:

Objective - We tested the hypothesis that patients with difficult asthma have an increased frequency of certain genotypes that predispose them to asthma exacerbations and poor asthma control. Methods - A total of 180 Caucasian children with a confirmed asthma diagnosis were selected from two phenotypic groups: difficult asthma (n = 112) versus mild/moderate asthma (n = 68). All patients were screened for 19 polymorphisms in 9 candidate genes to evaluate their association with difficult asthma. Key Results - The results indicated that the LTA4H A-9188>G, TNFα G-308>A and IL-4Rα A1727>G polymorphisms were significantly associated with the development of difficult asthma in paediatric patients (p<0.001, p = 0.019 and p = 0.037, respectively). Haplotype analysis also revealed two haplotypes (the ATA haplotype of IL-4Rα A1199>C, IL-4Rα T1570>C and IL-4Rα A1727>G, and the CA haplotype of TNFα C-863>A and TNFα G-308>A) which were significantly associated with difficult asthma in children (p = 0.04 and p = 0.018, respectively). Conclusions and Clinical Relevance - The study revealed multiple SNPs and haplotypes in the LTA4H, TNFα and IL-4Rα genes which constitute risk factors for the development of difficult asthma in children. Of particular interest is the LTA4H A-9188>G polymorphism, which has been reported, for the first time, to have a strong association with severe asthma in children. Our results suggest that screening for patients with this genetic marker could help characterise the heterogeneity of responses to leukotriene-modifying medications and, hence, facilitate targeting these therapies to the subset of patients who are most likely to benefit.
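The abstract does not state which association test was used; purely as an illustration of how a case-control SNP association of this kind is commonly assessed, the sketch below runs a chi-square test on a genotype contingency table. The genotype counts are invented and are not data from the study.

```python
# Illustrative only: a generic case-control SNP association test of the kind
# used in candidate-gene studies. The genotype counts below are invented and
# do not come from the study.
from scipy.stats import chi2_contingency

# Rows: difficult asthma vs mild/moderate asthma; columns: genotype counts
# (AA, AG, GG) for a hypothetical biallelic polymorphism.
table = [
    [40, 52, 20],   # difficult asthma group (n = 112)
    [38, 24, 6],    # mild/moderate asthma group (n = 68)
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
```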

Relevance: 90.00%

Publisher:

Abstract:

Purpose: Short product life cycles and/or mass customization necessitate reconfiguration of the operational enablers of a supply chain (SC) from time to time in order to harness high levels of performance. The purpose of this paper is to identify the key operational enablers under a stochastic environment on which practitioners should focus while reconfiguring a SC network. Design/methodology/approach: The paper uses an interpretive structural modeling (ISM) approach that presents a hierarchy-based model and the mutual relationships among the enablers. The contextual relationships needed for developing the structural self-interaction matrix (SSIM) among the various enablers are realized by conducting experiments through simulation of a hypothetical SC network. Findings: The research identifies various operational enablers having a high driving power towards the assumed performance measures. These enablers therefore require maximum attention and are of strategic importance while reconfiguring the SC. Practical implications: ISM provides a useful tool for SC managers to strategically adopt and focus on the key enablers which have comparatively greater potential for enhancing SC performance under the given operational settings. Originality/value: The present research recognizes the importance of SC flexibility under the premise of reconfiguration of the operational units in order to harness a high level of SC performance. Given the resulting digraph from ISM, the decision maker can focus on the key enablers for effective reconfiguration. The study is one of the first efforts to develop contextual relations among operational enablers for the SSIM through the integration of discrete-event simulation with ISM. © Emerald Group Publishing Limited.
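The paper's ISM procedure is not spelled out in the abstract; as a minimal sketch under my own assumptions, the core ISM step of expanding an initial reachability matrix (derived from an SSIM) into its transitive closure, and reading off driving power and dependence, can be illustrated as follows. The enabler names and the relation matrix are hypothetical.

```python
# Illustrative sketch of the core ISM step: turning a binary initial
# reachability matrix (derived from an SSIM) into its transitive closure.
# The enablers and relations below are hypothetical, not the paper's data.
import numpy as np

enablers = ["capacity flexibility", "routing flexibility",
            "information sharing", "supplier responsiveness"]

# initial[i][j] = 1 means enabler i is judged to drive enabler j.
initial = np.array([
    [1, 1, 0, 0],
    [0, 1, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
])

# Warshall-style closure: add indirect influences (the ISM transitivity check).
final = initial.copy()
n = len(enablers)
for k in range(n):
    for i in range(n):
        for j in range(n):
            final[i, j] |= final[i, k] & final[k, j]

# Driving power = row sum; dependence = column sum (used to classify enablers).
for i, name in enumerate(enablers):
    print(f"{name}: driving power {final[i].sum()}, dependence {final[:, i].sum()}")
```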

Relevance: 90.00%

Publisher:

Abstract:

It is a great pleasure to be Guest Editor for this issue; I hope that the papers included will be stimulating and support you in your ongoing research activities. A number of guiding principles were adopted in selecting the papers for inclusion in this issue. Firstly, the papers cover a wide range of logistics and supply chain management (SCM) topics. This is a reflection of the evolution of the field in recent years. In terms of the “buy-make-store-move-sell” model of SCM, all the main constituent areas are addressed. Secondly, it is important that the conference issue of this Journal reflects the emphasis and content of the conference itself. I have tried to achieve this in terms of the papers included. One interesting point to note is that outsourcing is a major theme in a number of papers. This reflects the increasing importance of this issue to organisations of all kinds and sizes. Economic globalisation and the trend towards vertical disintegration of supply chain architectures have sharpened the focus on outsourcing as a key element of supply chain strategy. The need to move beyond the notion that sourcing of certain activities can be some kind of panacea is evident from the relevant contributions. Thirdly, the LRN Annual Conference has become a more international event in recent years: the number of delegates and papers presented from outside the UK has continued to grow. The papers collected in this issue reflect this internationalization. Two papers are worthy of particular comment from an LRN perspective. The contribution by Jaafar and Rafiq has been developed from the submission which won the best paper prize at the LRN 2004 event. The paper by Pettit and Beresford is based on research which was supported by LRN seed corn funding; it was developed from the final report on this work submitted to CITL (UK) via the LRN. The seed corn funding is an important mechanism whereby the LRN supports research into innovative aspects of logistics in UK universities. In many ways, the LRN 2004 event in Dublin seems like a long time ago. From my point of view it was one of the most professionally rewarding activities in which I have been involved in my career. It was a time to meet old friends and new, and to keep abreast of the multitude of interesting projects being undertaken in over 20 countries. There are too many people to thank for the smooth running of the event. However, my colleague John Mee does warrant a special mention. His logistical skills were seriously put to the test in the weeks and months leading up to September 9th, 2004, and I want to acknowledge his particular contribution to the success of the event. Since then we have had the 2005 event at the University of Plymouth. This was again a great opportunity to network with colleagues, and many congratulations are due to John Dinwoodie and his team. We now look forward to LRN 2006 in Newcastle. For my part, I hope and trust that this issue provides some useful perspectives and insights into the range of topics addressed.

Relevance: 90.00%

Publisher:

Abstract:

Large-scale evacuations are a recurring theme on news channels, whether in response to major natural or man-made disasters. Warning dissemination plays a key part in the success of such large-scale evacuations, and its inadequacy in certain cases has been a 'primary contribution to deaths and injuries' (Hayden et al., 2007). Along with technology-driven 'official' warning channels (e.g. sirens, mass media), the role of unofficial channels (e.g. neighbours, personal contacts, volunteer wardens) has proven to be significant in warning the public of the need to evacuate. Although post-evacuation studies identify evacuees themselves as disseminators of the warning message, there has not been a detailed study that quantifies the effects of such behaviour on warning dissemination. This paper develops an Agent-Based Simulation (ABS) model of multiple agents (evacuee households) in a hypothetical community to investigate the impact of this behaviour, as an unofficial channel, on the overall warning dissemination. Parameters studied include the percentage of people who warn their neighbours, the efficiency of different official warning channels, and the delay time to warn neighbours. Even with a low proportion of people willing to warn their neighbours, the results showed a considerable impact on the overall warning dissemination. © 2012 Elsevier B.V. All rights reserved.
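The abstract does not describe the model's internals; the following is a minimal sketch, under my own assumptions, of an agent-based warning-diffusion loop with an official channel, a neighbour-relay channel and a relay delay. All parameter values and the ring topology are hypothetical.

```python
# Illustrative sketch (not the paper's model): a minimal agent-based simulation
# of warning dissemination in a ring of households. Parameter values are
# hypothetical and chosen only to show the mechanism.
import random

N_HOUSEHOLDS = 500
P_OFFICIAL = 0.02      # per-step chance a household hears an official warning
P_RELAY = 0.4          # per-step chance a warned neighbour relays the warning
RELAY_DELAY = 3        # time steps before a warned household starts relaying
STEPS = 120

random.seed(1)
warned_at = {}          # household index -> step at which it was warned

for step in range(STEPS):
    newly_warned = []
    for h in range(N_HOUSEHOLDS):
        if h in warned_at:
            continue
        # Official channel (e.g. siren, mass media) reaches households directly.
        if random.random() < P_OFFICIAL:
            newly_warned.append(h)
            continue
        # Unofficial channel: a previously warned neighbour relays the warning
        # with probability P_RELAY per step once the delay has elapsed.
        for nb in ((h - 1) % N_HOUSEHOLDS, (h + 1) % N_HOUSEHOLDS):
            if nb in warned_at and step - warned_at[nb] >= RELAY_DELAY:
                if random.random() < P_RELAY:
                    newly_warned.append(h)
                    break
    for h in newly_warned:
        warned_at[h] = step

print(f"warned after {STEPS} steps: {len(warned_at)}/{N_HOUSEHOLDS}")
```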

Relevance: 90.00%

Publisher:

Abstract:

Although fiber Bragg gratings (FBGs) have been widely used as advanced optical sensors, the cross-sensitivity between temperature and strain has complicated independent measurement procedures for these two measurands. We report here, for the first time to our knowledge, the results of a systematic investigation of the dependence of both temperature and strain sensitivities on the grating type, including the well-known Type I, Type IIA, and a new type which we have designated Type IA, using both hydrogen-free and hydrogenated B/Ge codoped fibers. We have identified distinct sensitivity characteristics for each grating type, and we have utilised them to implement a novel dual-grating, dual-parameter sensor device with performance superior to that of previously reported grating-based structures.
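The abstract does not give the sensing equations; as an illustrative sketch of the standard dual-grating decoupling idea (two gratings with distinct sensitivities let temperature and strain be recovered simultaneously), the snippet below inverts a 2x2 sensitivity matrix. The coefficient values are invented placeholders, not the reported device's calibration.

```python
# Illustrative sketch of the standard dual-grating decoupling scheme, not the
# device reported in the abstract. The sensitivity coefficients are invented
# placeholders: K[i] = [dλ_i/dT (pm/°C), dλ_i/dε (pm/με)] for grating i.
import numpy as np

K = np.array([
    [10.0, 1.20],   # grating 1 (e.g. Type I): temperature and strain response
    [ 8.5, 0.95],   # grating 2 (e.g. Type IA): deliberately different response
])

# Measured Bragg wavelength shifts for the two gratings, in pm (hypothetical).
d_lambda = np.array([62.0, 48.0])

# Solve K @ [ΔT, Δε] = Δλ for the two measurands simultaneously.
delta_T, delta_eps = np.linalg.solve(K, d_lambda)
print(f"ΔT ≈ {delta_T:.1f} °C, Δε ≈ {delta_eps:.1f} με")
```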

Relevance: 90.00%

Publisher:

Abstract:

We consider the problems of finding two optimal triangulations of a convex polygon: MaxMin area and MinMax area. These are the triangulations that, over all possible triangulations, maximize the area of the smallest-area triangle and, respectively, minimize the area of the largest-area triangle. The problem was originally solved by Klincsek by dynamic programming in cubic time [2]. Later, Keil and Vassilev devised an algorithm that runs in O(n^2 log n) time [1]. In this paper we describe new geometric findings on the structure of MaxMin and MinMax area triangulations of convex polygons in two dimensions and their algorithmic implications. We improve the algorithm's running time to quadratic for large classes of convex polygons. We also present experimental results on MaxMin area triangulation.
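The paper's quadratic-time improvement is not detailed in the abstract; as a baseline illustration only, the cubic-time dynamic program in the spirit of Klincsek [2] for the MaxMin area objective can be sketched as follows. The function and variable names are my own.

```python
# Illustrative sketch of the classic cubic-time dynamic program (in the spirit
# of Klincsek [2]) for the MaxMin area triangulation of a convex polygon.
# Vertices must be given in convex position, in order around the polygon.
from math import inf

def triangle_area(a, b, c):
    """Unsigned area of triangle abc."""
    return abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])) / 2

def maxmin_area(points):
    """Largest achievable value of the smallest triangle area, over all
    triangulations of the convex polygon `points`."""
    n = len(points)
    # dp[i][j]: best MaxMin value for the sub-polygon points[i..j].
    dp = [[inf] * n for _ in range(n)]          # a bare edge imposes no bound
    for length in range(2, n):                  # j - i = length
        for i in range(0, n - length):
            j = i + length
            best = -inf
            for k in range(i + 1, j):           # apex of the triangle on edge (i, j)
                candidate = min(dp[i][k], dp[k][j],
                                triangle_area(points[i], points[k], points[j]))
                best = max(best, candidate)
            dp[i][j] = best
    return dp[0][n - 1]

# Example: a convex quadrilateral (hypothetical coordinates).
square = [(0, 0), (4, 0), (4, 3), (0, 3)]
print(maxmin_area(square))   # either diagonal gives two triangles of area 6.0
```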

Relevance: 90.00%

Publisher:

Abstract:

As an indicator of global change and shifting balances of power, the World Economic Forum meets every September in Dalian, China. The subject in 2011: Mastering Quality Growth. On the agenda is pursuing new frontiers of growth linked to embracing disruptive innovation. With growth coming from emerging markets, and European and North American economies treading water, many firms in the West are facing the reality of having not just to downsize but actually to close manufacturing operations and re-open them elsewhere, where costs are lower, to remain competitive. There are thousands of books on “change management”, yet very few of these devote much time to downsizing, preferring to talk about re-engineering or restructuring. What lessons are available from the past to achieve a positive outcome from what will inevitably be something of a human, as well as an economic, tragedy? The authors reached three fundamental conclusions from their experience and research in facility closure management within Vauxhall, UK: put your people first, make sure you keep running the business, and manage your legacy. They develop these ideas into a new business model linked to the emotions of change.

Relevance: 90.00%

Publisher:

Abstract:

A heuristic for batching orders in a manual order-picking warehouse has been developed. It prioritizes orders based on due time to prevent mixing of orders of different priority levels. The order density of aisles criterion is used to form batches. The heuristic also determines the number of pickers required and assigns batches to pickers such that there is a uniform workload per unit of time. The effectiveness of the heuristic was studied by observing computational time and aisle congestion for various numbers of total orders and numbers of orders that form a batch. The initial heuristic performed well for a small number of orders, but for larger numbers of orders a partitioning technique is computationally more efficient, needing only minutes to solve for thousands of orders while preserving 90% of the batch quality obtained with the original heuristic. Comparative studies between this heuristic and other published heuristics are still needed.
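The thesis's heuristic is not specified in detail here; the sketch below is a generic due-time-first, aisle-density batching heuristic written to illustrate the idea under my own assumptions. The Order structure, the batch capacity and the scoring rule are hypothetical, not the thesis's method.

```python
# Illustrative sketch only: a generic due-time-first, aisle-density batching
# heuristic in the spirit described by the abstract. The order structure,
# capacity, and scoring rule are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Order:
    order_id: int
    due_time: float          # earlier due time = higher priority
    aisles: set = field(default_factory=set)   # aisles the order's items touch

def batch_orders(orders, batch_size=4):
    """Group orders into batches of at most `batch_size`, never mixing
    priority classes, preferring orders that add the fewest new aisles."""
    batches = []
    pending = sorted(orders, key=lambda o: o.due_time)
    while pending:
        seed = pending.pop(0)
        batch, aisles = [seed], set(seed.aisles)
        while len(batch) < batch_size and pending:
            # Candidates must share the seed's priority (here: same due time).
            candidates = [o for o in pending if o.due_time == seed.due_time]
            if not candidates:
                break
            # Aisle-density criterion: fewest aisles not already visited.
            best = min(candidates, key=lambda o: len(o.aisles - aisles))
            pending.remove(best)
            batch.append(best)
            aisles |= best.aisles
        batches.append(batch)
    return batches

if __name__ == "__main__":
    demo = [Order(1, 9.0, {1, 2}), Order(2, 9.0, {2, 3}),
            Order(3, 9.0, {7}), Order(4, 11.0, {1, 5})]
    for b in batch_orders(demo, batch_size=2):
        print([o.order_id for o in b])
```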