986 results for Ruelle-Takens scenario
Abstract:
Background With the emergence of influenza H1N1v the world is facing its first 21st-century global pandemic. Severe Acute Respiratory Syndrome (SARS) and avian influenza H5N1 prompted the development of pandemic preparedness plans. National systems of public health law are essential for public health stewardship and for the implementation of public health policy[1]. International coherence will contribute to effective regional and global responses. However, little research has been undertaken on how law works as a tool for disease control in Europe. With co-funding from the European Union, we investigated the extent to which laws across Europe support or constrain pandemic preparedness planning, and whether national differences are likely to constrain control efforts. Methods We undertook a survey of national public health laws across 32 European states using a questionnaire designed around a disease scenario based on pandemic influenza. Questionnaire results were reviewed in workshops, analysing how differences between national laws might support or hinder regional responses to pandemic influenza. Respondents examined the impact of national laws on the movement of information, goods, services and people across borders in a time of pandemic; the capacity for surveillance, case detection, case management and community control; the deployment of strategies of prevention, containment, mitigation and recovery; and commonalities and disconnects across states. Results Results of this study show differences across Europe in the extent to which national pandemic policy and pandemic plans have been integrated with public health laws. We found significant differences in legislation and in the legitimacy of strategic plans. States differ in the range and nature of intervention measures authorized by law, the extent to which borders could be closed to the movement of persons and goods during a pandemic, and access to healthcare for non-resident persons.
Some states propose the use of emergency powers that might override human rights protections, while others propose to limit interventions to those authorized by public health laws. Conclusion These differences could create problems for European strategies if an evolving influenza pandemic results in more serious public health challenges or, indeed, if a novel disease other than influenza emerges with pandemic potential. There is insufficient understanding across Europe of the role and importance of law in pandemic planning. States need to build capacity in public health law to support disease prevention and control policies. Our research suggests that states would welcome further guidance from the EU on the management of a pandemic, and guidance to assist in greater commonality of legal approaches across states.
Abstract:
The relationship between health professionals and those who need their services has been a topic of interest for health psychology since its beginnings as a discipline. In the context of the information and knowledge society, a new scenario of interaction between these two groups has appeared that needs to be understood. Several initiatives have been launched to this end, but the one presented by the Psychology, Health and the Net research group (PSINET) of the Universitat Oberta de Catalunya aims to promote the creation of virtual meeting spaces for the two groups (health professionals and users of health services). Establishing digital platforms of health services for the citizens of the 21st century first requires understanding the reality of the different groups involved in the relationship between health and the Net. The objective of this study centres on the first group and on discovering what health content exists on the Internet. To this end, following an exhaustive Internet search methodology, health websites in Catalan and Spanish were collected, and a textual data analysis was performed on the information contained in the Catalan-language websites. This analysis made it possible to identify and describe the prototypical health website on the Net at the time of the study.
Abstract:
Contemporary international migration is embedded in a process of global interconnection driven by the revolutions in transport and in information and communication technologies. One consequence of this global interconnection is that migrants have a greater capacity to process information both before and after leaving. These changes could have unexpected implications for contemporary migration in terms of migrants' capacity to make better-informed decisions, the reduction of uncertainty in migratory contexts, the blurring of the concept of distance, or the decision to emigrate to more distant places. This research matters because the lack of knowledge on this question could widen the gap between the goals of migration policies and their outcomes. The role played by information agents in migratory contexts could also change. In this scenario, for migration policies to be more effective, they will have to take into account the migrant population's greater capacity to process information and the information sources it trusts. This article shows that the equation "more information equals better informed" does not always hold. Even in the information age, unreliable sources, false expectations, information overload and rumours are still present in migratory contexts. Nevertheless, we argue that these unintended effects could be reduced by meeting four requirements of reliable information: that it be exhaustive, relevant, trusted and up to date.
Abstract:
Balanced lethal systems are more than biological curiosities: as theory predicts, they should quickly be eliminated through the joint forces of recombination and selection. That such systems might become fixed in natural populations poses a challenge to evolutionary theory. Here we address the case of a balanced lethal system fixed in crested newts and related species, which makes 50% of offspring die early in development. All adults are heteromorphic for chromosome pair 1. The two homologues (1A and 1B) have different recessive deleterious alleles fixed on a nonrecombining segment, so that heterozygotes are viable, while homozygotes are lethal. Given such a strong segregation load, how could autosomes stop recombining? We propose a role for a sex-chromosome turnover from pair 1 (putative ancestral sex chromosome) to pair 4 (currently active sex chromosome). Accordingly, 1A and 1B represent two variants (Y(A) and Y(B)) of the Y chromosome from an ancestral male-heterogametic system. We formalize a scenario in which turnovers are driven by sex ratio selection stemming from gene-environment interactions on sex determination. Individual-based simulations show that a balanced lethal system can be fixed with significant likelihood, provided the masculinizing allele on chromosome 4 appears after the elimination of the feminizing allele on chromosome 1. Our study illustrates how strikingly maladaptive traits might evolve through natural selection.
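The 50% offspring mortality described above follows directly from Mendelian segregation of the two chromosome-1 homologues. A minimal sketch in Python of that segregation argument (illustrative names and structure, not the authors' individual-based simulation):

```python
import random

def offspring_survival(n_offspring=100_000, seed=1):
    """Cross two 1A/1B heterozygotes and return the surviving fraction.

    Each parent transmits homologue 1A or 1B with probability 1/2.
    Homozygotes (1A/1A and 1B/1B) die, because each homologue carries
    recessive lethal alleles in the non-recombining segment; only the
    1A/1B heterozygote is viable.
    """
    rng = random.Random(seed)
    survivors = 0
    for _ in range(n_offspring):
        child = (rng.choice("AB"), rng.choice("AB"))
        if child[0] != child[1]:  # heterozygote is viable
            survivors += 1
    return survivors / n_offspring

print(offspring_survival())  # close to 0.5: half the offspring die
```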
Abstract:
One of the unresolved questions of modern physics is the nature of Dark Matter. Strong experimental evidence suggests that the presence of this elusive component in the energy budget of the Universe is quite significant, without, however, providing conclusive information about its nature. The most plausible scenario is that of weakly interacting massive particles (WIMPs), which comprise a large class of non-baryonic Dark Matter candidates with a mass typically between a few tens of GeV and a few TeV, and a cross section of the order of weak interactions. The search for Dark Matter particles using very high energy gamma-ray Cherenkov telescopes is based on the model that WIMPs can self-annihilate, leading to the production of detectable species, such as photons. These photons are very energetic and, since they are undeflected by the Universe's magnetic fields, they can be traced straight back to the source of their creation. The downside of the approach is the large amount of background radiation from conventional astrophysical objects that usually hides clear signals of Dark Matter particle interactions. That is why a good choice of observational candidates is the crucial factor in the search for Dark Matter. With MAGIC (Major Atmospheric Gamma-ray Imaging Cherenkov Telescopes), a two-telescope ground-based system located on La Palma, Canary Islands, we choose objects such as dwarf spheroidal satellite galaxies of the Milky Way and galaxy clusters for our search. Our idea is to increase the chances of WIMP detection by pointing at objects that are relatively close, rich in Dark Matter and with as little stellar contamination as possible. At the moment, several observation projects are ongoing and analyses are being performed.
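The detectability argument above is conventionally quantified by factorizing the expected gamma-ray flux from self-annihilation into a particle-physics term and an astrophysical "J-factor" (the line-of-sight integral of the squared Dark Matter density). A hedged sketch of that standard factorisation, with all numerical values purely illustrative:

```python
import math

def annihilation_flux(sigma_v, m_wimp_gev, n_gamma, j_factor):
    """Standard WIMP annihilation flux factorisation:

        Phi = <sigma v> / (8 * pi * m^2) * N_gamma * J

    sigma_v    : velocity-averaged annihilation cross section [cm^3 s^-1]
    m_wimp_gev : WIMP mass [GeV]
    n_gamma    : photons per annihilation above the energy threshold
    j_factor   : line-of-sight integral of rho^2 [GeV^2 cm^-5]
    Returns the flux in photons cm^-2 s^-1.
    """
    return sigma_v / (8 * math.pi * m_wimp_gev**2) * n_gamma * j_factor

# Illustrative inputs: thermal relic cross section, 1 TeV WIMP, and a
# J-factor of the order quoted for nearby dwarf spheroidals.
phi = annihilation_flux(3e-26, 1000.0, 10, 1e19)
```

Note the 1/m^2 scaling: heavier WIMPs give a fainter flux for the same cross section, which is part of why target selection (large J, low background) matters so much.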
Abstract:
Necrotising pneumonia in young, previously healthy patients due to Panton–Valentine leucocidin (PVL)-producing Staphylococcus aureus has been increasingly recognised. PVL pneumonia is often associated with influenza co-infection and high mortality. This case report describes the successful management of the first documented paediatric case of a previously healthy adolescent who developed necrotising pneumonia due to community-acquired methicillin-resistant Staphylococcus aureus (CA-MRSA) clone USA300 with pandemic influenza A (H1N1) co-infection, and highlights the importance of early recognition and initiation of appropriate therapy for this potentially fatal co-infection. PCR remains the gold standard for diagnosing pandemic H1N1, since it may not be detected by rapid antigen tests. Bacterial necrotising pneumonia should be suspected in those presenting with worsening flu-like symptoms and clinical and/or radiological evidence of PVL infection (multifocal infiltrates, effusion and cavitation). These patients may benefit from the administration of toxin-neutralising agents. In light of the current H1N1 pandemic, healthcare professionals will be increasingly confronted with this clinical scenario.
Abstract:
Imaging plays a key role in lung infections. A CT scan must be carried out when there is a strong clinical suspicion of pneumonia accompanied by normal, ambiguous, or nonspecific radiography, a scenario that occurs most commonly in immunocompromised patients. CT allows clinicians to detect associated abnormalities or an underlying condition, and it can guide bronchoalveolar lavage or a percutaneous or transbronchial lung biopsy. The radiological expression of a given organism can vary with the degree to which the patient is immunocompromised, as is seen with tuberculosis in patients with AIDS. The infective agents vary with the type of immune deficiency, and some infections can quickly become life-threatening. Clinicians should be aware of the complex radiological spectrum of pulmonary aspergillosis, given that this diagnosis must be considered in specific settings.
Abstract:
In order to successfully deploy multicast services in QoS-aware networks, pricing architectures must take into account the particular characteristics of multicast sessions. With this objective, we propose a charging scheme for QoS multicast services, assuming that the unicast cost of each interconnecting link has been determined and that this cost is expressed in terms of quality of service (QoS) parameters. Our scheme makes it possible to determine the cost distribution of a multicast session along a cost distribution tree (CDT), basing that distribution on the pre-existing unicast cost functions. The paper discusses in detail the main characteristics of the problem in a realistic interdomain scenario and how the proposed scheme would contribute to its solution.
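One common way to realise a cost distribution over a CDT is to split each link's unicast-derived cost equally among the receivers downstream of it. A minimal sketch of that sharing rule (illustrative names and structure; not necessarily the paper's exact formula):

```python
def receiver_charges(parent, link_cost, receivers):
    """Split each link's cost equally among the receivers downstream
    of it on the distribution tree.

    parent    : child -> parent map describing the tree (root omitted)
    link_cost : (child, parent) -> unicast-derived cost of that link
    receivers : nodes receiving the multicast session
    """
    # Count how many receivers use each link.
    usage = {}
    for r in receivers:
        node = r
        while node in parent:
            link = (node, parent[node])
            usage[link] = usage.get(link, 0) + 1
            node = parent[node]
    # Each receiver pays its share of every link on its path to the root.
    charges = {}
    for r in receivers:
        total, node = 0.0, r
        while node in parent:
            link = (node, parent[node])
            total += link_cost[link] / usage[link]
            node = parent[node]
        charges[r] = total
    return charges

# Source s feeds branch point b; receivers r1 and r2 share link (b, s).
parent = {"b": "s", "r1": "b", "r2": "b"}
cost = {("b", "s"): 4.0, ("r1", "b"): 1.0, ("r2", "b"): 2.0}
print(receiver_charges(parent, cost, ["r1", "r2"]))
# r1 pays 1 + 4/2 = 3.0 and r2 pays 2 + 4/2 = 4.0: the shared link's
# cost is divided between them instead of being billed twice.
```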
Abstract:
Intraspecific coalitional aggression between groups of individuals is a widespread trait in the animal world. It occurs in invertebrates and vertebrates, and is prevalent in humans. What are the conditions under which coalitional aggression evolves in natural populations? In this article, I develop a mathematical model delineating conditions where natural selection can favor the coevolution of belligerence and bravery between small-scale societies. Belligerence increases the probability that an actor's group tries to conquer another group, and bravery increases the probability that the actor's group defeats a group it attacks. The model takes into account two different demographic scenarios that may lead to the coevolution of belligerence and bravery. Under the first, the fitness benefits driving the coevolution of belligerence and bravery come through the repopulation of defeated groups by fission of victorious ones. Under the second, the fitness benefits come through a temporary increase in the local carrying capacity of victorious groups, after the transfer of resources from defeated groups to victorious ones. The analysis of the model suggests that the selective pressures on belligerence and bravery are stronger when defeated groups can be repopulated by victorious ones. The analysis also suggests that, depending on the shape of the contest success function, costly bravery can evolve in groups of any size.
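The "shape of the contest success function" on which the analysis turns can be illustrated with the standard Tullock form, a common choice in such models (the article's exact functional form may differ):

```python
def contest_success(x_attack, y_defend, gamma=1.0):
    """Tullock-type contest success function: probability that the
    attacking group wins, given aggregate efforts x and y.

    gamma (the "decisiveness" parameter) controls how strongly effort
    differences translate into victory odds; it is this shape parameter
    that determines whether costly bravery can pay off.
    """
    if x_attack == 0 and y_defend == 0:
        return 0.5  # no effort on either side: a coin flip
    return x_attack**gamma / (x_attack**gamma + y_defend**gamma)

print(contest_success(1.0, 1.0))           # equal efforts -> 0.5
print(contest_success(2.0, 1.0, gamma=4))  # high gamma sharpens the contest
```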
Abstract:
In this paper a novel methodology is introduced, aimed at minimizing both the probability of network failure and the failure impact (in terms of QoS degradation) while optimizing resource consumption. A detailed study of MPLS recovery techniques and their GMPLS extensions is also presented. In this scenario, some features for reducing the failure impact while offering minimum failure probabilities are also analyzed. Novel two-step routing algorithms using this methodology are proposed. Results show that these methods offer high protection levels with optimal resource consumption.
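The failure-probability side of such routing reduces to a shortest-path problem: with link weight -log(1 - p_fail), minimizing the weight sum maximizes the product of the links' survival probabilities. A sketch of this standard reduction (not the paper's two-step algorithm itself):

```python
import heapq
import math

def most_reliable_path(links, src, dst):
    """Dijkstra with additive weight -log(1 - p_fail) per link, which
    minimizes the end-to-end failure probability of the chosen path.

    links: (u, v) -> failure probability in [0, 1); treated as undirected.
    Returns (path, end-to-end failure probability).
    """
    graph = {}
    for (u, v), p in links.items():
        w = -math.log(1.0 - p)
        graph.setdefault(u, []).append((v, w))
        graph.setdefault(v, []).append((u, w))
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, math.inf):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, math.inf):
                dist[v], prev[v] = d + w, u
                heapq.heappush(heap, (d + w, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], 1.0 - math.exp(-dist[dst])

# The two-hop path (0.99 * 0.99 survival) beats the direct 5%-failure link.
links = {("a", "b"): 0.01, ("b", "c"): 0.01, ("a", "c"): 0.05}
print(most_reliable_path(links, "a", "c"))
```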
Abstract:
TCP flows from applications such as the web or FTP are well supported by a Guaranteed Minimum Throughput Service (GMTS), which provides a minimum network throughput to the flow and, if possible, extra throughput. We propose a scheme for a GMTS using Admission Control (AC) that is able to provide different minimum throughputs to different users and that is suitable for "standard" TCP flows. Moreover, we consider a multidomain scenario where the scheme is used in one of the domains, and we propose some mechanisms for interconnection with neighbouring domains. The whole scheme uses a small set of packet classes in a core-stateless network, where each class is assigned a different discarding priority in queues. The AC method involves only edge nodes and uses a special probing packet flow (marked with the highest discarding priority class) that is sent continuously from ingress to egress along a path. The available throughput on the path is obtained at the egress using measurements of flow aggregates, and is then sent back to the ingress. At the ingress, each flow is detected implicitly and then admission controlled. If it is accepted, it receives the GMTS and its packets are marked with the lowest discarding priority classes; otherwise, it receives a best-effort service. The scheme is evaluated through simulation in a simple "bottleneck" topology using different traffic loads consisting of "standard" TCP flows carrying files of varying sizes.
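The admission decision at the ingress can be sketched as follows: the egress reports the throughput left over on the path (measured from the surviving probe traffic), and a new flow is admitted only if that spare capacity covers its guaranteed minimum. Names and structure are illustrative, not the paper's exact algorithm:

```python
class EdgeAdmissionControl:
    """Minimal sketch of measurement-based admission control at an
    ingress edge node, assuming the egress periodically reports the
    spare throughput seen by the probe flow."""

    def __init__(self):
        self.available = 0.0  # Mb/s of spare path capacity

    def egress_report(self, surviving_probe_rate):
        """Update with the probe throughput measured at the egress."""
        self.available = surviving_probe_rate

    def admit(self, min_throughput):
        """Admit a flow if its guaranteed minimum fits in the spare
        capacity; admitted flows get the GMTS (lowest discard priority),
        rejected flows fall back to best effort."""
        if min_throughput <= self.available:
            self.available -= min_throughput  # provisionally reserve it
            return True
        return False

ac = EdgeAdmissionControl()
ac.egress_report(10.0)  # probes indicate 10 Mb/s spare on the path
print(ac.admit(4.0), ac.admit(4.0), ac.admit(4.0))
# → True True False (only 2 Mb/s remain for the third flow)
```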
Abstract:
A survey of MPLS protection methods and their use in combination with online routing methods is presented in this article. Fault management methods usually pre-establish backup paths to recover traffic after a failure. In addition, MPLS allows the creation of different backup types, making it a suitable method for supporting traffic-engineered networks. In this article, several label switched path (LSP) backup types are introduced and their pros and cons pointed out. The creation of an LSP involves a routing phase, which should include QoS aspects. Similarly, to achieve a reliable network, the LSP backups must also be routed by a QoS routing method. When LSP creation requests arrive one by one (a dynamic network scenario), online routing methods are applied. The relationship between MPLS fault management and QoS online routing methods is unavoidable, in particular during the creation of LSP backups. Both aspects are discussed in this article. Several ideas on how these technologies could be applied together are presented and compared.
Abstract:
Geochemical and petrographical studies of lavas and ignimbrites from the Quaternary Nisyros-Yali volcanic system in the easternmost part of the Hellenic arc (Greece) provide insight into magma-generating processes. A compositional gap between 61 and 68 wt.% SiO2 is recognized that coincides with the stratigraphic distinction between pre-caldera and post-caldera volcanic units. Trace element systematics support the subdivision of Nisyros and Yali volcanic units into two distinct suites of rocks. The variation of present-day Nd and Hf isotope data, and the fact that they are distinct from the isotope compositions of MORB, rule out an origin by pure differentiation and require assimilation of a crustal component. Lead isotope ratios of Nisyros and Yali volcanic rocks support mixing of mantle material with a lower crust equivalent. However, Sr-87/Sr-86 ratios of 0.7036-0.7048 are incompatible with a simple binary mixing scenario and give low depleted mantle extraction ages (< 0.1 Ga), in contrast with Pb model ages of 0.3 Ga and Hf and Nd model ages of ca. 0.8 Ga. The budget of the fluid-mobile elements Sr and Pb is likely to be dominated by abundant hydrous fluids characterised by mantle-like Sr isotope ratios. Late-stage fluids were probably enriched in CO2, needed to explain the high Th concentrations. The occurrence of hydrated minerals (e.g., amphibole) in the first post-caldera unit, with the lowermost Sr-87/Sr-86 ratio of 0.7036 +/- 2, can be interpreted as the result of increased water activity in the source. The presence of two different plagioclase phenocryst generations in the first lava subsequent to the caldera-causing event is indicative of a longer storage time of this magma at a shallower level. A model capable of explaining these observations involves three evolutionary stages. In the first stage, a primitive magma of mantle origin assimilates lower crustal material (as modelled by Nd-Hf isotope systematics).
This stage ended with an interruption in replenishment that led to an increase in crystallization and, hence, an increase in viscosity, suppressing eruption. During this time gap, the second stage, differentiation by fractional crystallization led to enrichment of incompatible species, especially aqueous fluids, to silica depolymerisation and to a decrease in viscosity, finally enabling eruption again in the third stage. (c) 2005 Elsevier B.V. All rights reserved.
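The incompatibility of the measured Sr ratios with simple binary mixing can be seen from the standard two-component mixing equation, in which the mixture's isotope ratio is weighted by each end-member's Sr concentration. A minimal sketch with purely illustrative (not measured) end-member values:

```python
def binary_mix_ratio(f, c1, r1, c2, r2):
    """Isotope ratio of a two-component mixture.

        R_mix = (f*c1*r1 + (1-f)*c2*r2) / (f*c1 + (1-f)*c2)

    f      : mass fraction of component 1
    c1, c2 : element (Sr) concentrations of the two end members
    r1, r2 : isotope ratios (87Sr/86Sr) of the two end members
    """
    return (f * c1 * r1 + (1 - f) * c2 * r2) / (f * c1 + (1 - f) * c2)

# Illustrative end members: a depleted-mantle melt vs. Sr-rich crust.
mantle_c, mantle_r = 20.0, 0.7030   # Sr ppm, 87Sr/86Sr
crust_c, crust_r = 300.0, 0.7100
r_mix = binary_mix_ratio(0.9, mantle_c, mantle_r, crust_c, crust_r)
# Even 10% of Sr-rich crust pulls the mixture well above 0.7048, the
# upper end of the measured Nisyros-Yali range, illustrating why the
# Sr budget must instead be dominated by mantle-like hydrous fluids.
```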
Abstract:
In this paper, different recovery methods applied at different network layers and time scales are used to enhance network reliability. Each layer deploys its own fault management methods; however, current recovery methods are applied only to a specific layer. New protection schemes, based on the proposed partial disjoint path algorithm, are defined in order to avoid protection duplication in a multi-layer scenario. The new protection schemes also encompass shared segment backup computation and shared risk link group identification. A complete set of experiments demonstrates the efficiency of the proposed methods relative to previous ones, in terms of the resources used to protect the network, the failure recovery time and the request rejection ratio.
Abstract:
This paper focuses on QoS routing with protection in an MPLS network over an optical layer. In this multi-layer scenario, each layer deploys its own fault management methods. A partially protected optical layer is proposed, with the rest of the network protected at the MPLS layer. New protection schemes that avoid protection duplication are proposed. Moreover, this paper also introduces a new traffic classification based on the level of reliability. The failure impact is evaluated in terms of recovery time depending on the traffic class. The proposed schemes also include a novel variation of minimum interference routing and shared segment backup computation. A complete set of experiments demonstrates that the proposed schemes are more efficient than previous ones, in terms of the resources used to protect the network, failure impact and the request rejection ratio.