942 results for requests
Abstract:
This paper presents the evaluation of a Standards-based Interoperability Framework (SIF) deployed between the Virgen del Rocío University Hospital (VRUH) Haemodialysis (HD) Unit and 5 outsourced HD centres, aimed at improving integrated care by automatically sharing patients' Electronic Health Records (EHR) and lab test reports. A pre-post study was conducted over fourteen months. The number of emergency and routine lab test reports for 379 outpatients was computed before and after the integration of the SIF. Before SIF deployment, 19.38 lab tests per patient were shared between VRUH and the HD centres, 5.52 of them emergency and 13.85 routine. After integrating the SIF, 17.98 lab tests per patient were shared, 3.82 of them emergency and 14.16 routine. The inclusion of the SIF in the HD Integrated Care Process led to an average reduction of 1.39 (p=0.775) lab test requests per patient, including a reduction of 1.70 (p=0.084) in emergency requests, whereas an increase of 0.31 (p=0.062) was observed in routine lab tests. This strategy reduced emergency lab test requests, which implies a potential improvement in integrated care.
Abstract:
Lung cancer screening has been the focus of intense interest since the 2011 publication of the NLST (National Lung Screening Trial), which showed a mortality reduction in smokers undergoing 3 years of screening by chest computed tomography. Although these data appear promising, many issues remain to be resolved, such as the high rate of false-positive cases, the risk of overdiagnosis, optimal intervals between screens, the duration of the screening process, feasibility, and cost. Structured screening programs appear crucial to guarantee patient information, technical quality, and multidisciplinary management. Despite these uncertainties, several guidelines already state that screening should be performed in patients at risk, whereas investigators stress that more data are needed. How should the primary care physician deal with individual patients' requests? This review provides some clues on this complex issue.
Abstract:
This paper proposes a heuristic for scheduling capacity requests and periodically assigning radio resources in geostationary (GEO) satellite networks with a star topology, using the Demand Assigned Multiple Access (DAMA) protocol at the link layer, and Multi-Frequency Time Division Multiple Access (MF-TDMA) and Adaptive Coding and Modulation (ACM) at the physical layer.
Abstract:
OBJECTIVE: To assess age- and nationality-specific trends in abortion rates over the last decade, and to describe women's characteristics, identifying risk factors for repeated abortion. METHODS: From 1990-1999, the Health Department of Canton Vaud (Switzerland) received 13,857 abortion requests from residents aged 14-49. Population data were obtained to compute rates. RESULTS: Both the number of abortions (1400 annually) and their rate (8.9 per thousand women [95% confidence interval (CI) 7.3-10.5]) were stable over the decade in question. The rate of abortion for foreign women, especially those from ex-Yugoslavia and Africa, was twice that for Swiss women. Half of the requests came from single women, 43% had a low education level, and half were childless. The main reason for requesting termination of pregnancy was psychosocial (93%). The mean gestational age was 7.7 weeks (SD +/- 2.3), and 96% of requests were submitted before 12 weeks. Sixty-three percent of women reported having used no contraception, 36% the condom, and 17% the pill. Among requests, the adjusted risk of repeated abortion (22% of abortion candidates) was greater among divorced/separated/widowed women (odds ratio [OR] 1.9 [95% CI 1.5-2.4]), unemployed women (OR 1.8 [95% CI 1.5-2.1]), and those who had not attended university (OR 1.6 [95% CI 1.1-2.2]). CONCLUSIONS: Although Swiss law permitted abortion only under strict conditions, the procedure was widely available in Vaud, which nevertheless has one of the lowest rates worldwide. Efforts must be intensified to ensure universal access to family planning services, especially for foreign women and adolescents. Professionals should also target "repeaters" to provide personalised counselling.
Abstract:
Abstract This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed application running on top of P2P networks; typical examples are video streaming and file sharing. While attractive because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they run. Indeed, defining an application on top of a P2P network often means defining an application where peers contribute resources in exchange for their ability to use the application. For example, in a P2P file-sharing application, while the user is downloading a file, the application is in parallel serving that file to other users. Such peers may have limited hardware resources, e.g., CPU, bandwidth and memory, or the end user may decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network is typically immersed in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively. To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can serve as the basis of several P2P applications. Our data replication solution increases availability and reduces communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. They typically aim at offering reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer.
Our contributions are organized in a protocol stack of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment, and each protocol is evaluated through a set of simulations. The adaptiveness of our solutions lies in the fact that they take the constraints of the underlying system into account in a proactive manner. To model these constraints, we define an environment approximation algorithm that yields an approximated view of the system or part of it. This approximated view includes the topology and the reliability of components, expressed in probabilistic terms. To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays so as to maximize broadcast reliability. Here, broadcast reliability is expressed as a function of the reliability of the selected paths and of the use of available resources. These resources are modeled as message quotas reflecting the receiving and sending capacities of each node. To allow deployment in a large-scale system, we take the memory available at processes into account by limiting the view they must maintain of the system. Using this partial view, we propose three scalable broadcast algorithms based on a propagation overlay that tends towards the global tree overlay and adapts to constraints of the underlying system. At a higher level, this thesis also proposes a data replication solution that is adaptive both in replica placement and in request routing. At the routing level, the solution takes the unreliability of the environment into account in order to maximize reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes communication cost.
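The idea of routing along maximally reliable paths can be illustrated with a standard trick: when each link succeeds independently with some probability, the most reliable path maximizes the product of link probabilities, which is equivalent to a shortest path under −log weights. A minimal sketch of this reduction (the toy graph, probabilities, and function name are illustrative, not taken from the thesis):

```python
import heapq
import math

def most_reliable_path(graph, src, dst):
    """Dijkstra on -log(p) edge weights: minimizing the sum of -log(p)
    maximizes the product of link success probabilities."""
    dist = {src: 0.0}          # accumulated -log reliability
    prev = {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, math.inf):
            continue
        for v, p in graph.get(u, []):
            nd = d - math.log(p)
            if nd < dist.get(v, math.inf):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], math.exp(-dist[dst])

# Links annotated with independent success probabilities.
graph = {
    "s": [("a", 0.9), ("t", 0.8)],
    "a": [("t", 0.9)],
}
path, reliability = most_reliable_path(graph, "s", "t")
print(path, round(reliability, 2))   # two-hop path wins: 0.9 * 0.9 = 0.81
```

The same transformation underlies building reliability-maximizing tree overlays: a shortest-path tree under −log weights is a most-reliable-paths tree rooted at the broadcast source.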
Abstract:
Report for the scientific sojourn carried out at the Institut National d'Histoire de l'Art (INHA), France, from 2010 to 2012. It focused on the analysis and editing of tales of human apparitions from the other world belonging to Catalan culture or referring to it. We studied and edited different versions of the process of Esperança Alegre (Lleida, 1500) and the Peregrinació del Venturós Pelegrí. These medieval works have been preserved in sources of the late sixteenth century or later. We located a manuscript of the Esperança Alegre tale, unknown to us at the beginning of this research (Biblioteca Nacional de España, ms. 1701), which differs from the version in ms. Baluze 238 of the Bibliothèque Nationale de France; the scribe of ms. 1701 adds several paragraphs in which he considers the case a diabolical phantasmagoria. Regarding the Venturós Pelegrí, we have tried to establish firm criteria for classifying its many editions from the seventeenth to the nineteenth centuries. We searched for printed copies in libraries around the world and made several requests for photographic reproductions, in order to classify undated editions by comparing woodcuts and other decorative elements. In the legend of Prince Charles of Viana (1421-1461), the appearances of his ghost are accompanied by rumors of his poisoning and of his sanctity. In addition, we studied the cycles of masses for the souls in Purgatory linked to the hagiographies of St. Amadour and St. Vincent Ferrer, as well as the appearances described in L'Ànima d'Oliver by Francesc Moner, in the Carmelite chronicles of Father John of St. Joseph (1642-1718), and in folktales collected from the eighteenth to the twentieth century. All this has allowed us to trace the evolution of certain cultural paradigms from the Middle Ages to the present.
Abstract:
A VO (Virtual Observatory) service has been implemented at the facilities of the TFRM telescope, allowing the images taken with the telescope to be distributed remotely and automatically to any user of the service. The service consists of an image archive, an application that ingests the images into the archive, and an application that communicates with VO clients, receiving requests and responding as specified by the SIAP (Simple Image Access Protocol).
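A SIAP v1 query is a plain HTTP GET whose required parameters are POS (a position in decimal degrees) and SIZE (the angular extent of the search region); the service answers with a VOTable listing matching images. A minimal sketch of building such a request URL — the endpoint below is a placeholder, not the real TFRM service address:

```python
from urllib.parse import urlencode

def siap_query_url(base_url, ra_deg, dec_deg, size_deg):
    """Build a SIAP v1 positional query URL; the service answers such a
    GET request with a VOTable describing the matching images."""
    params = {
        "POS": f"{ra_deg},{dec_deg}",   # ICRS position, decimal degrees
        "SIZE": str(size_deg),          # angular size of the search region
        "FORMAT": "image/fits",         # restrict results to FITS images
    }
    return f"{base_url}?{urlencode(params)}"

# Hypothetical endpoint used only for illustration.
url = siap_query_url("https://example.org/tfrm/siap", 83.63, 22.01, 0.2)
print(url)
```

Note that `urlencode` percent-escapes the comma in POS and the slash in FORMAT, which compliant SIAP services decode transparently.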
Abstract:
Objectives Medical futility at the end of life is a growing challenge to medicine. The goals of the authors were to elucidate how clinicians define futility, when they perceive life-sustaining treatment (LST) to be futile, how they communicate this situation and why LST is sometimes continued despite being recognised as futile. Methods The authors reviewed ethics case consultation protocols and conducted semi-structured interviews with 18 physicians and 11 nurses from adult intensive and palliative care units at a tertiary hospital in Germany. The transcripts were subjected to qualitative content analysis. Results Futility was identified in the majority of case consultations. Interviewees associated futility with the failure to achieve goals of care that offer a benefit to the patient's quality of life and are proportionate to the risks, harms and costs. Prototypic examples mentioned are situations of irreversible dependence on LST, advanced metastatic malignancies and extensive brain injury. Participants agreed that futility should be assessed by physicians after consultation with the care team. Intensivists favoured an indirect and stepwise disclosure of the prognosis. Palliative care clinicians focused on a candid and empathetic information strategy. The reasons for continuing futile LST are primarily emotional, such as guilt, grief, fear of legal consequences and concerns about the family's reaction. Other obstacles are organisational routines, insufficient legal and palliative knowledge and treatment requests by patients or families. Conclusion Managing futility could be improved by communication training, knowledge transfer, organisational improvements and emotional and ethical support systems. The authors propose an algorithm for end-of-life decision making focusing on goals of treatment.
Abstract:
In this paper we present the theoretical and methodological foundations for the development of a multi-agent Selective Dissemination of Information (SDI) service model that applies Semantic Web technologies for specialized digital libraries. These technologies make it possible to achieve more efficient information management, improving agent–user communication processes and facilitating accurate access to relevant resources. Other tools used are fuzzy linguistic modelling techniques (which ease the interaction between users and the system) and natural language processing (NLP) techniques for semiautomatic thesaurus generation. Also, RSS feeds are used as "current awareness bulletins" to generate personalized bibliographic alerts.
Abstract:
Public library statistics are taken from the annual survey. The statistics are used at the local, regional, state, and national levels to compare library performance, justify budget requests, track library data over time, assist in planning and evaluation, and provide valuable information for grants and other library programs. The annual survey collects current information from 543 public libraries about public service outlets, holdings, staffing, income, expenditures, circulation, services, and hours open. Furthermore, it helps provide a total picture of libraries on a state and nationwide basis. This report is authorized by law (Iowa Code 256.51 (H)). Each of the 50 states collects public library information according to guidelines established by the Federal State Cooperative System for public library data (FSCS). The information contained in the Iowa Public Library Statistics is based on definitions approved by FSCS. For additional information, contact Gerry Rowland, State Library, gerry.rowland@lib.state.ia.us; 1-800-248-4483.
Abstract:
Abstract OBJECTIVE Understanding the conceptions of premature children's caregivers on child development and associated factors. METHOD An exploratory-descriptive qualitative study of 12 families with children under three years of age. Interviews were submitted to thematic content analysis, systematized into the categories of the Bioecological Theory of Human Development: Process, Person, Context and Time, and into the Functional Development category. RESULTS There are concerns about impairment in the current and future development of a Person/child perceived as fragile as a result of premature birth (Time dimension), minimized by the attainment of observable competencies such as motor skills. The Context, especially family and health services, and the Proximal Processes, described as one-way caregiver interactions, are considered determinants of development. Functional Development is considered a natural consequence and a result of education. The support network is crucial, supporting or limiting care. CONCLUSION Concerns about development mobilize caregivers to stimulate the premature child/person and to request family and healthcare assistance.
Abstract:
Revenue management (RM) is a complicated business process that can best be described as control of sales (using prices, restrictions, or capacity), usually using software as a tool to aid decisions. RM software can play a merely informative role, supplying analysts with formatted and summarized data which they use to make control decisions (setting a price or allocating capacity for a price point), or, at the other extreme, play a deeper role, automating the decision process completely. The RM models and algorithms in the academic literature by and large concentrate on the latter, completely automated, level of functionality.
A firm considering a new RM model or RM system needs to evaluate its performance. Academic papers justify the performance of their models using simulations, where customer booking requests are simulated according to some process and model, and the revenue performance of the algorithm is compared to an alternate set of algorithms. Such simulations, while an accepted part of the academic literature, and indeed providing research insight, often lack credibility with management. Even methodologically, they are usually flawed, as the simulations only test "within-model" performance and say nothing as to the appropriateness of the model in the first place. Even simulations that test against alternate models or competition are limited by their inherent need to fix some model as the universe for their testing. These problems are exacerbated with RM models that attempt to model customer purchase behavior or competition, as the right models for competitive actions or customer purchases remain somewhat of a mystery, or at least lack consensus on their validity.
How, then, to validate a model? Putting it another way, we want to show that a particular model or algorithm is the cause of a certain improvement to the RM process compared to the existing process. We take care to emphasize that we want to prove the said model is the cause of performance, and to compare against an (incumbent) process rather than against an alternate model.
In this paper we describe a "live" testing experiment that we conducted at Iberia Airlines on a set of flights. A set of competing algorithms controlled a set of flights during adjacent weeks, and their behavior and results were observed over a relatively long period of time (9 months). In parallel, a group of control flights was managed using the traditional mix of manual and algorithmic control (the incumbent system). Such "sandbox" testing, while common at many large internet search and e-commerce companies, is relatively rare in the revenue management area. Sandbox testing has an indisputable model of customer behavior, but the experimental design and analysis of results is less clear. In this paper we describe the philosophy behind the experiment, the organizational challenges, the design and setup of the experiment, and outline the analysis of the results. This paper is a complement to a (more technical) related paper that describes the econometrics and statistical analysis of the results.
Abstract:
The Network Revenue Management problem can be formulated as a stochastic dynamic programming problem (DP, with "optimal" solution value V*) whose exact solution is computationally intractable. Consequently, a number of heuristics have been proposed in the literature, the most popular of which are the deterministic linear programming (DLP) model and a simulation-based method, the randomized linear programming (RLP) model. Both methods give upper bounds on the optimal solution value (the DLP and PHLP bounds, respectively). These bounds are used to derive control values that can be used in practice to make accept/deny decisions for booking requests. Recently Adelman [1] and Topaloglu [18] have proposed alternate upper bounds, the affine relaxation (AR) bound and the Lagrangian relaxation (LR) bound respectively, and showed that their bounds are tighter than the DLP bound. Tight bounds are of great interest, as it appears from empirical studies and practical experience that models giving tighter bounds also lead to better controls (better in the sense that they lead to more revenue). In this paper we give tightened versions of three bounds, calling them sAR (strong Affine Relaxation), sLR (strong Lagrangian Relaxation) and sPHLP (strong Perfect Hindsight LP), and show relations between them. Specifically, we show that the sPHLP bound is tighter than the sLR bound, and the sAR bound is tighter than the LR bound. The techniques for deriving the sLR and sPHLP bounds can potentially be applied to other instances of weakly-coupled dynamic programming.
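The DLP model mentioned above replaces stochastic demand with its expectation: maximize the total fare revenue sum_j r_j x_j subject to the leg-capacity constraints A x <= c and 0 <= x_j <= E[D_j], where x_j is the number of product-j bookings accepted and A is the product-leg incidence matrix. A toy sketch of this bound (the two-leg network, fares, and demands are invented for illustration; a real implementation would call an LP solver, but this tiny instance has an integral LP optimum, so brute-force enumeration over integer allocations recovers the same value):

```python
from itertools import product

# Two-leg network: product 0 uses leg A, product 1 uses leg B,
# product 2 (the connecting itinerary) uses both legs.
fares = [100, 120, 180]
mean_demand = [8, 6, 5]            # E[D_j], the DLP demand cap per product
uses_leg = [[1, 0, 1],             # leg A incidence row of A
            [0, 1, 1]]             # leg B incidence row of A
capacity = [10, 10]                # seats available on each leg

def dlp_bound():
    """Enumerate feasible integer allocations x (0 <= x_j <= E[D_j] and
    A x <= c) and return the revenue-maximizing one; on this instance
    it equals the DLP upper bound on optimal expected revenue."""
    best_rev, best_x = -1, None
    for x in product(*(range(d + 1) for d in mean_demand)):
        if all(sum(a * xi for a, xi in zip(row, x)) <= cap
               for row, cap in zip(uses_leg, capacity)):
            rev = sum(f * xi for f, xi in zip(fares, x))
            if rev > best_rev:
                best_rev, best_x = rev, x
    return best_rev, best_x

bound, allocation = dlp_bound()
print(bound, allocation)   # 2040 with x = (6, 6, 4)
```

Note how the bound trades off locals against the connecting product: the connecting fare (180) beats one local fare but not the sum of both (220), so the DLP fills local demand first and gives the connection only the residual capacity.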
Abstract:
In this paper we address the issue of locating hierarchical facilities in the presence of congestion. Two hierarchical models are presented, in which lower-level servers attend requests first and some of the served customers are then referred to higher-level servers. In the first model, the objective is to find the minimum number of servers, and their locations, that will cover a given region within a distance or time standard. The second model is cast as a Maximal Covering Location formulation. A heuristic procedure is then presented, together with computational experience. Finally, some extensions of these models that address other types of spatial configurations are offered.
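The Maximal Covering Location objective — place p servers so as to cover the largest possible demand within the distance standard — admits a classic greedy heuristic: repeatedly open the candidate site that adds the most uncovered demand. A small sketch under invented data (the sites, demand weights, and coverage sets are illustrative; the paper's own heuristic additionally handles congestion and hierarchical referrals, which this omits):

```python
def greedy_max_cover(coverage, demand, p):
    """Pick p sites, each step opening the site whose coverage set adds
    the most uncovered demand weight (standard greedy heuristic)."""
    covered, chosen = set(), []
    for _ in range(p):
        best_site, best_gain = None, 0
        for site, pts in coverage.items():
            if site in chosen:
                continue
            gain = sum(demand[q] for q in pts - covered)
            if gain > best_gain:
                best_site, best_gain = site, gain
        if best_site is None:       # no site adds uncovered demand
            break
        chosen.append(best_site)
        covered |= coverage[best_site]
    return chosen, sum(demand[q] for q in covered)

# Demand points lying within the distance standard of each candidate site.
coverage = {"A": {1, 2}, "B": {2, 3}, "C": {3, 4}}
demand = {1: 10, 2: 20, 3: 30, 4: 40}
sites, covered_demand = greedy_max_cover(coverage, demand, p=2)
print(sites, covered_demand)   # ['C', 'A'] covering 100 demand units
```

Greedy is not optimal in general for maximal covering, but it carries the well-known (1 - 1/e) approximation guarantee for this kind of coverage objective, which is why it is a common starting point for heuristics like the one in the paper.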