868 results for Distributed mobility management
Abstract:
Access control is a fundamental concern in any system that manages resources, e.g., operating systems, file systems, databases and communications systems. The problem we address is how to specify, enforce, and implement access control in distributed environments. This problem occurs in many applications, such as management of distributed project resources, e-newspaper and pay-TV subscription services. Starting from an access relation between users and resources, we derive a user hierarchy, a resource hierarchy, and a unified hierarchy. The unified hierarchy is then used to specify the access relation in a way that is compact and that allows efficient queries. It is also used in cryptographic schemes that enforce the access relation. We introduce three specific cryptography-based hierarchical schemes, which can effectively enforce and implement access control and are designed for distributed environments because they do not need the presence of a central authority (except perhaps for set-up).
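The top-down key derivation that such hierarchical schemes rely on can be sketched as follows. This is a minimal, hypothetical illustration using a hash chain over the unified hierarchy, not one of the paper's three specific schemes; the labels `projects` and `project_a` are invented.

```python
import hashlib

def derive_key(parent_key: bytes, child_label: str) -> bytes:
    # Child key = H(parent_key || child_label): anyone holding a node's
    # key can recompute every descendant key, but cannot ascend the
    # hierarchy, which enforces the access relation cryptographically.
    return hashlib.sha256(parent_key + child_label.encode()).digest()

# Hypothetical unified hierarchy: root -> projects -> project_a
root_key = b"\x00" * 32
projects_key = derive_key(root_key, "projects")
project_a_key = derive_key(projects_key, "project_a")

# A user granted projects_key derives project_a_key locally, with no
# central authority involved after the initial setup.
assert derive_key(projects_key, "project_a") == project_a_key
```

Because derivation is purely local, only key distribution at set-up needs any trusted party, matching the decentralization goal stated above.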
Abstract:
In the mid-1820s, Anglo-American fur trappers, known as "mountain men," entered Arizona and began trapping beaver (Castor canadensis). In Arizona there have been a number of famous mountain men, such as Sylvester and James Pattie, Ewing Young, Jedediah Smith, and Bill Williams, who trapped along the waterways of northern and southern Arizona. Although the heyday of the mountain men lasted only a few decades due to a decline in the beaver population, management of these animals continues to this day. The purpose of managing beavers shifted from monetary gain to controlling wildlife damage. During the late 1900s, beaver were still widely distributed in limited numbers throughout much of the state. We provide a historical overview of beaver management in Arizona, with emphasis on the mountain men, recreational trapping, wildlife damage management, and beaver research in Arizona.
Abstract:
This NebGuide describes the life cycle of the army cutworm and pale western cutworm, and provides recommendations for management. The army cutworm, Euxoa auxiliaris, and the pale western cutworm, Agrotis orthogonia, are sporadic pests that are distributed throughout the Great Plains. The army cutworm can be found throughout Nebraska, but is more common in the western half of the state. Because of the drier environment, the pale western cutworm is found only in the western third of Nebraska. Both cutworms can feed on a vast array of crops and weeds. Their major economic impact is limited to winter wheat and alfalfa, because these are the vulnerable crops growing in the early spring when larval feeding activity occurs. However, they can also cause substantial damage to early spring row crops (sugarbeets and corn), especially in areas where winter cereal cover crops are used.
Abstract:
Judo competitions are divided into weight classes. However, most athletes reduce their body weight in the few days before competition in order to obtain a competitive advantage over lighter opponents. To achieve fast weight reduction, athletes use a number of aggressive nutritional strategies, and many of them thereby place themselves at high risk of health injury. In collegiate wrestling, a similar problem has been observed, and three wrestlers died in 1997 due to rapid weight loss regimens. After these deaths, the National Collegiate Athletic Association implemented a successful weight management program which was proven to improve weight management behavior. No similar program has ever been discussed by judo federations, even though judo competitors present a comparably inappropriate pattern of weight control. In view of this, the basis for a weight control program is provided in this manuscript, as follows: competition should begin within 1 hour after weigh-in, at the latest; each athlete is allowed to be weighed in only once; rapid weight loss as well as artificial rehydration (i.e., saline infusion) methods are prohibited during the entire competition day; athletes should pass a hydration test to have their weigh-in validated; an individual minimum competitive weight (male athletes competing at no less than 7% and females at no less than 12% of body fat) should be determined at the beginning of each season; athletes are not allowed to compete in any weight class that requires weight reductions greater than 1.5% of body weight per week. In parallel, educational programs should aim at increasing athletes', coaches', and parents' awareness of the risks of aggressive nutritional strategies as well as of healthier ways to properly manage body weight.
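For illustration, the two quantitative rules above (a minimum-body-fat floor and a 1.5%-of-body-weight weekly reduction limit) reduce to a simple computation; the athlete's numbers below are hypothetical.

```python
def minimum_competitive_weight(weight_kg: float, body_fat_pct: float,
                               min_fat_pct: float = 7.0) -> float:
    # Fat-free mass is held fixed; the floor is the weight at which
    # body fat would equal the minimum allowed percentage (7% for
    # male athletes, per the proposed program).
    fat_free_mass = weight_kg * (1 - body_fat_pct / 100)
    return fat_free_mass / (1 - min_fat_pct / 100)

# Hypothetical male judoka: 80 kg at 12% body fat.
w_min = minimum_competitive_weight(80.0, 12.0)   # about 75.7 kg
max_weekly_loss = 80.0 * 0.015                   # 1.5%/week cap: 1.2 kg
```

Under these assumptions, reaching the floor from 80 kg would take several weeks of gradual reduction, which is precisely the behavior the program is designed to encourage over last-minute cuts.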
Abstract:
DKA is a severe metabolic derangement characterized by dehydration, loss of electrolytes, hyperglycemia, hyperketonemia, acidosis and progressive loss of consciousness that results from severe insulin deficiency combined with the effects of increased levels of counterregulatory hormones (catecholamines, glucagon, cortisol, growth hormone). The biochemical criteria for diagnosis are: blood glucose > 200 mg/dl, venous pH < 7.3 or bicarbonate < 15 mEq/L, ketonemia > 3 mmol/L and presence of ketonuria. A patient with DKA must be managed in an emergency ward by experienced staff or in an intensive care unit (ICU), in order to provide intensive monitoring of vital and neurological signs and of the patient's clinical and biochemical response to treatment. DKA treatment guidelines include: restoration of circulating volume and electrolyte replacement; correction of insulin deficiency aiming at the resolution of metabolic acidosis and ketosis; reduction of the risk of cerebral edema; avoidance of other complications of therapy (hypoglycemia, hypokalemia, hyperkalemia, hyperchloremic acidosis); and identification and treatment of precipitating events. In Brazil, there are few pediatric ICU beds in public hospitals, so an alternative protocol was designed to shorten the time on intravenous infusion lines in order to facilitate DKA management in general emergency wards. The main differences between this protocol and the international guidelines are: intravenous fluids are stopped when oral fluids are well tolerated, and the total deficit is replaced orally; if potassium analysis still indicates a need for replacement, it is given orally; subcutaneous rapid-acting insulin analog is administered at a dose of 0.15 U/kg every 2-3 hours until resolution of metabolic acidosis; approximately 12 hours after treatment initiation, intermediate-acting (NPH) insulin is initiated at a dose of 0.6-1 U/kg/day, and it is lowered to 0.4-0.7 U/kg/day at discharge from hospital.
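The biochemical criteria listed above can be encoded as a simple predicate; this is purely an illustrative restatement of the stated thresholds, not clinical software.

```python
def meets_dka_criteria(glucose_mg_dl: float, venous_ph: float,
                       bicarbonate_meq_l: float, ketonemia_mmol_l: float,
                       ketonuria_present: bool) -> bool:
    # Criteria from the text: hyperglycemia, plus acidosis (by pH or
    # bicarbonate), plus ketosis (ketonemia with ketonuria).
    hyperglycemia = glucose_mg_dl > 200
    acidosis = venous_ph < 7.3 or bicarbonate_meq_l < 15
    ketosis = ketonemia_mmol_l > 3 and ketonuria_present
    return hyperglycemia and acidosis and ketosis

# Hypothetical lab panels:
assert meets_dka_criteria(450, 7.1, 10, 5, True)        # meets criteria
assert not meets_dka_criteria(150, 7.4, 22, 1, False)   # does not
```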
Abstract:
Abstract Introduction Several studies have shown that maximizing stroke volume (or increasing it until a plateau is reached) by volume loading during high-risk surgery may improve postoperative outcome. This goal could be achieved simply by minimizing the variation in arterial pulse pressure (ΔPP) induced by mechanical ventilation. We tested this hypothesis in a prospective, randomized, single-centre study. The primary endpoint was the length of postoperative stay in hospital. Methods Thirty-three patients undergoing high-risk surgery were randomized either to a control group (group C, n = 16) or to an intervention group (group I, n = 17). In group I, ΔPP was continuously monitored during surgery by a multiparameter bedside monitor and minimized to 10% or less by volume loading. Results Both groups were comparable in terms of demographic data, American Society of Anesthesiology score, and type and duration of surgery. During surgery, group I received more fluid than group C (4,618 ± 1,557 versus 1,694 ± 705 ml (mean ± SD), P < 0.0001), and ΔPP decreased from 22 ± 7% to 9 ± 1% (P < 0.05) in group I. The median duration of postoperative stay in hospital (7 versus 17 days, P < 0.01) was lower in group I than in group C. The number of postoperative complications per patient (1.4 ± 2.1 versus 3.9 ± 2.8, P < 0.05), as well as the median duration of mechanical ventilation (1 versus 5 days, P < 0.05) and of stay in the intensive care unit (3 versus 9 days, P < 0.01), were also lower in group I. Conclusion Monitoring and minimizing ΔPP by volume loading during high-risk surgery improves postoperative outcome and decreases the length of stay in hospital. Trial registration NCT00479011
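ΔPP itself is conventionally computed per respiratory cycle from the maximal and minimal beat-to-beat pulse pressures; the formula below is the standard one for pulse pressure variation, and the pressure values are hypothetical.

```python
def delta_pp(pp_max: float, pp_min: float) -> float:
    # DeltaPP(%) = 100 * (PPmax - PPmin) / mean(PPmax, PPmin),
    # measured over one mechanical-ventilation respiratory cycle.
    return 100.0 * (pp_max - pp_min) / ((pp_max + pp_min) / 2.0)

# Hypothetical pulse pressures (mmHg) within one respiratory cycle:
assert round(delta_pp(50.0, 40.0), 1) == 22.2   # above the 10% target
assert round(delta_pp(42.0, 38.0), 1) == 10.0   # at the 10% target
```

In the protocol above, volume loading is repeated until this index falls to 10% or less.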
Abstract:
Abstract Background Recent advances in medical and biological technology have stimulated the development of new testing systems that provide huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability, achieved through a relational database implementation, and 2) correctness of processes, ensured using process algebra. Furthermore, the software allows end users to define genetic tests without requiring any knowledge of business process notation or process algebra. Conclusions This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have proved the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using simple end-user interfaces.
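To give a flavor of how algebra-style control-flow specifications can validate test executions, here is a toy trace checker over sequential-composition and alternative operators; the workflow and action names are invented, and this is far simpler than ACP as used by CEGH.

```python
# Minimal trace checker for process terms built from seq(...) and
# alt(...) combinators over atomic action names (illustrative only).
def accepts(term, trace):
    def match(t, i):
        if isinstance(t, str):                   # atomic action
            yield from ([i + 1] if i < len(trace) and trace[i] == t else [])
        elif t[0] == "seq":                      # sequential composition
            positions = [i]
            for sub in t[1:]:
                positions = [j for p in positions for j in match(sub, p)]
            yield from positions
        elif t[0] == "alt":                      # alternative composition
            for sub in t[1:]:
                yield from match(sub, i)
    return any(j == len(trace) for j in match(term, 0))

# Hypothetical genetic-testing workflow: extract DNA, then either
# sequence or genotype, then report the result.
workflow = ("seq", "extract", ("alt", "sequence", "genotype"), "report")
assert accepts(workflow, ["extract", "sequence", "report"])
assert not accepts(workflow, ["extract", "report"])   # skipped a step
```

A checker like this rejects executions that deviate from the specified control flow, which is the kind of correctness guarantee process algebra brings on top of a plain relational encoding.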
Abstract:
Background Falling in older age is a major public health concern due to its costly and disabling consequences. However, very few randomised controlled trials (RCTs) have been conducted in developing countries, in which population ageing is expected to be particularly substantial in coming years. This article describes the design of an RCT to evaluate the effectiveness of a multifactorial falls prevention program in reducing the rate of falls in community-dwelling older people. Methods/design Multicentre parallel-group RCT involving 612 community-dwelling men and women aged 60 years and over, who have fallen at least once in the previous year. Participants will be recruited in multiple settings in Sao Paulo, Brazil and will be randomly allocated to a control group or an intervention group. The usual care control group will undergo a fall risk factor assessment and be referred to their clinicians with the risk assessment report so that individual modifiable risk factors can be managed without any specific guidance. The intervention group will receive a 12-week Multifactorial Falls Prevention Program consisting of: individualised medical management of modifiable risk factors; a group-based, supervised balance training exercise program plus an unsupervised home-based exercise program; and an educational/behavioral intervention. Both groups will receive a leaflet containing general information about fall prevention strategies. Primary outcome measures will be the rate of falls and the proportion of fallers recorded by monthly falls diaries and telephone calls over a 12 month period. Secondary outcome measures will include risk of falling, fall-related self-efficacy score, measures of balance, mobility and strength, fall-related health services use and independence with daily tasks.
Data will be analysed using the intention-to-treat principle. The incidence of falls in the intervention and control groups will be calculated and compared using negative binomial regression analysis. Discussion This study is the first trial to be conducted in Brazil to evaluate the effectiveness of an intervention to prevent falls. If proven to reduce falls, this study has the potential to benefit older adults and assist health care practitioners and policy makers in implementing and promoting effective falls prevention interventions. Trial registration ClinicalTrials.gov (NCT01698580)
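As a back-of-the-envelope illustration of the primary outcome, per-group fall rates reduce to an incidence rate ratio; the counts and follow-up below are hypothetical, and the trial itself plans a negative binomial regression (which additionally models overdispersion) rather than this crude comparison.

```python
def incidence_rate(falls: int, person_years: float) -> float:
    # Falls per person-year of follow-up in one trial arm.
    return falls / person_years

# Hypothetical 12-month follow-up of ~300 participants per arm:
control_rate = incidence_rate(240, 300.0)        # 0.8 falls/person-year
intervention_rate = incidence_rate(150, 300.0)   # 0.5 falls/person-year

# Incidence rate ratio < 1 favors the intervention.
irr = intervention_rate / control_rate
```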
Abstract:
Abstract Background This article aims to discuss the incorporation of traditional time in the construction of a management scenario for pink shrimp in the Patos Lagoon estuary (RS), Brazil. To meet this objective, two procedures were adopted: one at a conceptual level and another at a methodological level. At the conceptual level, the concept of traditional time as a form of traditional ecological knowledge (TEK) was adopted. Method At the methodological level, we conducted a wide review of the scientific knowledge (SK) literature that guides recommendations for pink shrimp management by restricting the fishing season in the Patos Lagoon estuary; in addition, we reviewed the ethno-scientific literature, which describes traditional calendars as a management base for artisanal fishers in the Patos Lagoon estuary. Results The results demonstrate that TEK and SK describe similar estuarine biological processes but are incommensurable at the resource management level. On the other hand, the construction of a “management scenario” for pink shrimp is possible through the development of “criteria for hierarchies of validity” which arise from a productive dialog between SK and TEK. Conclusions The commensurable and incommensurable levels reveal different bases of time-space perception between traditional ecological knowledge and scientific knowledge. Despite incommensurability at the management level, it is possible to establish guidelines for the construction of “management scenarios” and to support a co-management process.
Abstract:
Abstract Background The main objective of the Brazilian Study on the Practice of Diabetes Care was to provide an epidemiological profile of individuals with type 1 and 2 diabetes mellitus (DM) in Brazil, concerning therapy and adherence to international guidelines in medical practice. Methods This observational, cross-sectional, multicenter study collected and analyzed data from individuals with type 1 and 2 DM attending public or private clinics in Brazil. Each investigator included the first 10 patients with type 2 DM who visited his/her office, and the first 5 patients with type 1 DM. Results A total of 1,358 patients were analyzed; 375 (27.6%) had type 1 and 983 (72.4%) had type 2 DM. Most individuals were women, Caucasian, and private health care users. High prevalence rates of hypertension, dyslipidemia and central obesity were observed, particularly in type 2 DM. Only 7.3% and 5.1% of the individuals with types 1 and 2 DM, respectively, had optimal control of blood pressure, plasma glucose and lipids. The absence of hypertension and female sex were associated with better control of type 1 DM and other cardiovascular risk factors. In type 2 DM, older age was also associated with better control. Conclusions Female sex, older age, and absence of hypertension were associated with better metabolic control. Optimal control of plasma glucose and other cardiovascular risk factors is achieved only in a minority of individuals with diabetes. Local numbers are worse than those reported from other countries.
Abstract:
Large-scale wireless ad hoc networks of computers, sensors, PDAs etc. (i.e. nodes) are revolutionizing connectivity and leading to a paradigm shift from centralized systems to highly distributed and dynamic environments. An example of ad hoc networks is sensor networks, which are usually composed of small units able to sense and transmit to a sink elementary data that are successively processed by an external machine. Recent improvements in the memory and computational power of sensors, together with the reduction of energy consumption, are rapidly changing the potential of such systems, moving the attention towards data-centric sensor networks. A plethora of routing and data management algorithms have been proposed for network path discovery, ranging from broadcasting/flooding-based approaches to those using global positioning systems (GPS). We studied WGrid, a novel decentralized infrastructure that organizes wireless devices in an ad hoc manner, where each node has one or more virtual coordinates through which both message routing and data management occur without reliance on either flooding/broadcasting operations or GPS. The resulting ad hoc network does not suffer from the dead-end problem, which happens in geographic-based routing when a node is unable to locate a neighbor closer to the destination than itself. WGrid allows multidimensional data management, since the nodes' virtual coordinates can act as a distributed database without needing any special implementation or reorganization. Any kind of data (both single- and multidimensional) can be distributed, stored and managed. We will show how a location service can be easily implemented so that any search is reduced to a simple query, as for any other data type. WGrid has then been extended by adopting a replication methodology. We called the resulting algorithm WRGrid.
Just like WGrid, WRGrid acts as a distributed database without needing any special implementation or reorganization, and any kind of data can be distributed, stored and managed. We have evaluated the benefits of replication on data management, finding, from experimental results, that it can halve the average number of hops in the network. The direct consequences of this fact are a significant improvement in energy consumption and better workload balancing among sensors (number of messages routed by each node). Finally, thanks to the replicas, whose number can be arbitrarily chosen, the resulting sensor network can withstand sensor disconnections/connections, due to failures of sensors, without data loss. Another extension of WGrid is W*Grid, which strongly improves network recovery performance after link and/or device failures that may happen due to crashes or battery exhaustion of devices or to temporary obstacles. W*Grid guarantees, by construction, at least two disjoint paths between each pair of nodes. This implies that recovery in W*Grid occurs without broadcast transmissions, guaranteeing robustness while drastically reducing energy consumption. An extensive set of simulations shows the efficiency, robustness and traffic load of the resulting networks under several scenarios of device density and number of coordinates. The performance analysis has been compared with existing algorithms in order to validate the results.
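Routing over virtual coordinates as described for WGrid can be sketched with tree-shaped binary coordinates and greedy forwarding; the coordinate assignment and the four-node network below are simplified, hypothetical examples rather than WGrid's actual construction.

```python
# Greedy routing over tree-shaped binary virtual coordinates: each hop
# forwards to the neighbor closest (in the coordinate tree) to the
# destination, with no flooding and no GPS involved.
def common_prefix_len(a: str, b: str) -> int:
    n = 0
    while n < min(len(a), len(b)) and a[n] == b[n]:
        n += 1
    return n

def tree_distance(a: str, b: str) -> int:
    # Hops between two positions in the binary coordinate tree.
    n = common_prefix_len(a, b)
    return (len(a) - n) + (len(b) - n)

def route(coords, neighbors, src, dst):
    path = [src]
    while path[-1] != dst:
        nxt = min(neighbors[path[-1]],
                  key=lambda n: tree_distance(coords[n], coords[dst]))
        if tree_distance(coords[nxt], coords[dst]) >= \
           tree_distance(coords[path[-1]], coords[dst]):
            return None   # dead end; WGrid's construction guarantees progress
        path.append(nxt)
    return path

# Hypothetical 4-node network with invented coordinates:
coords = {"A": "0", "B": "00", "C": "01", "D": "010"}
neighbors = {"A": ["B", "C"], "B": ["A"], "C": ["A", "D"], "D": ["C"]}
assert route(coords, neighbors, "B", "D") == ["B", "A", "C", "D"]
```

Replication, as in WRGrid, would add extra coordinate copies of each datum so that the nearest replica is fewer hops away on average, which is how the halved hop count reported above comes about.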
Abstract:
Technology advances in recent years have dramatically changed the way users exploit contents and services available on the Internet, by enabling pervasive and mobile computing scenarios and access to networked resources from almost everywhere, at any time, and independently of the device in use. In addition, people increasingly expect to customize their experience, by exploiting specific device capabilities and limitations, inherent features of the communication channel in use, and interaction paradigms that significantly differ from the traditional request/response one. The so-called Ubiquitous Internet scenario calls for solutions that address many different challenges, such as device mobility, session management, content adaptation, context-awareness and the provisioning of multimodal interfaces. Moreover, new service opportunities demand simple and effective ways to integrate existing resources into new, value-added applications that can also undergo run-time modifications according to ever-changing execution conditions. Although service-oriented architectural models are gaining momentum to tame the increasing complexity of composing and orchestrating distributed and heterogeneous functionalities, existing solutions generally lack a unified approach and only provide support for specific Ubiquitous Internet aspects. Moreover, they usually target rather static scenarios and scarcely support the dynamic nature of pervasive access to Internet resources, which can quickly make existing compositions obsolete or inadequate, and hence in need of reconfiguration. This thesis proposes a novel middleware approach to deal comprehensively with the facets of the Ubiquitous Internet and to assist in establishing innovative application scenarios.
We claim that a truly viable ubiquity support infrastructure must neatly decouple the distributed resources to be integrated and push any kind of content-related logic outside its core layers, keeping only management and coordination responsibilities. Furthermore, we promote an innovative, open, and dynamic resource composition model that makes it easy to describe and enforce complex scenario requirements, and to react suitably to changes in the execution conditions.
Abstract:
The wide diffusion of cheap, small, and portable sensors integrated in an unprecedentedly large variety of devices, together with the availability of almost ubiquitous Internet connectivity, makes it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if properly and promptly analyzed, can be exploited to build new intelligent and pervasive services that have the potential of improving people's quality of life in a variety of cross-cutting domains such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality of service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and present Quasit, its prototype implementation, offering a scalable and extensible platform that can be used by researchers to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing, performing a large experimental study on the prototype of our novel LAAR dynamic replication technique.
Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.
Abstract:
A Smart City is a high-performance urban context where citizens live independently and are more aware of the surrounding opportunities, thanks to forward-looking development of economy, politics, governance, mobility and environment. ICT infrastructures play a key role in this new research field, also serving as a means for society to allow new ideas to prosper and new, more efficient approaches to be developed. The aim of this work is to research and develop novel solutions, here called smart services, in order to solve several upcoming problems and known issues in urban areas and, more generally, in the context of modern society. A specific focus is placed on smart governance and on privacy issues that have arisen in the cellular age.
Abstract:
In the last 10 years the number of mobile devices has grown rapidly. Each person usually carries at least two personal devices, and researchers say that in the near future this number could rise to as many as ten devices per person. Moreover, all these devices are becoming more integrated into our lives than in the past, so the amount of data exchanged increases along with improvements in people's lifestyles. This is what researchers call the Internet of Things. Thus, in the future there will be more than 60 billion nodes, and the current infrastructure is not ready to keep track of all the data exchanged between them. Therefore, infrastructure improvements have been proposed in recent years, such as Mobile IP and HIP, to facilitate the exchange of packets in mobility; however, none of them has been optimized for this purpose. In recent years, researchers from Mid Sweden University created the MediaSense Framework. Initially, this framework was based on the Chord protocol to route packets in a large network, but the most important change has been the introduction of P-Grid to create the overlay and provide persistence. Thanks to this technology, a lookup in the trie takes up to 0.5*log(N) hops, where N is the total number of nodes in the network. This result could be improved by further optimizations in the management of nodes, for example by the dynamic creation of groups of nodes. Moreover, since the nodes move, underlying support for connectivity management is needed. SCTP has been selected as one of the most promising upcoming standards for managing multiple simultaneous connections.
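The cited lookup cost can be made concrete: with N nodes, a P-Grid-style search averages about 0.5*log2(N) overlay hops, which stays small even at Internet-of-Things scale. The deployment sizes below are hypothetical.

```python
import math

def expected_pgrid_hops(n_nodes: float) -> float:
    # Average search cost cited above for a P-Grid overlay:
    # roughly 0.5 * log2(N) hops for N nodes.
    return 0.5 * math.log2(n_nodes)

assert expected_pgrid_hops(1024) == 5.0   # 2^10 nodes -> 5 hops on average
assert expected_pgrid_hops(60e9) < 18     # ~60 billion nodes -> under 18 hops
```

The logarithmic growth is what makes a trie-based overlay plausible for the 60-billion-node scale mentioned above, and why shaving a constant factor (e.g., through dynamic node grouping) still matters in practice.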