Abstract:
Seriously ill infants often display protein-calorie malnutrition due to the metabolic demands of sepsis and respiratory failure. Glutamine has been classified as a conditionally essential amino acid, with special usefulness in critically ill patients. Immunomodulation, gut protection, and prevention of protein depletion are among its reported positive effects in such circumstances. To evaluate the tolerance and clinical impact of a glutamine supplement in seriously ill infants, a prospective randomized study was conducted with nine patients. Anthropometric and biochemical determinations were made, and length of stay in the intensive care unit (ICU), in the hospital, and under artificial ventilation, as well as septic morbidity and mortality, were tabulated. Infants in the treatment group (n=5) were enterally administered 0.3 g/kg of glutamine, whereas controls received 0.3 g/kg of casein, during a standard period of five days. Septic complications occurred in 75% of the controls (3/4) versus 20% of the glutamine-treated group (1/5, p<=0.10), and two patients in the control group died of bacterial infections (50% vs. 0%, p<=0.10). Days in the ICU, in the hospital, and on ventilation numerically favored glutamine therapy, although without statistical significance. The supplements were generally well tolerated, and no patient required discontinuation of the program. The conclusion was that glutamine supplementation was safe and tended to be associated with lower infectious morbidity and mortality in this high-risk population.
Abstract:
This article describes the main approaches adopted in a study focused on planning industrial estates at a sub-regional scale. The study was supported by an agent-based model, using firms as agents to assess the attractiveness of industrial estates. The simulation was implemented in the NetLogo toolkit, with an environment representing a geographical space. Three scenarios and four hypotheses were used in the simulation to test the impact of different policies on the attractiveness of industrial estates. Policies were distinguished by the level of municipal coordination at which they were implemented and by the type of intervention. In the model, the attractiveness of an industrial estate was based on its level of facilities, amenities, and accessibility, and on the price of its land. Firms are able to move and relocate whenever they find a more attractive estate. Relocating firms were selected according to their size, location, and distance to an industrial estate. Results show that a policy coordinated among municipalities is the most efficient way to promote advanced, qualified estates. In these scenarios, more industrial estates became attractive, more firms relocated, and more vacant lots were occupied. Furthermore, the results also indicate that promoting widespread industrial estates with poor-quality infrastructure and amenities is an inefficient policy for attracting firms.
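A minimal Python sketch of the firm-relocation rule described in this abstract; the attribute names, weights, and linear attractiveness function below are illustrative assumptions, since the original model was implemented in NetLogo:

```python
# Sketch of the relocation rule: attractiveness grows with facilities,
# amenities, and accessibility, and falls with land price (assumed weights).
from dataclasses import dataclass

@dataclass
class Estate:
    name: str
    facilities: float      # 0..1 level of infrastructure
    amenities: float       # 0..1 level of amenities
    accessibility: float   # 0..1 accessibility score
    land_price: float      # normalized land price

    def attractiveness(self) -> float:
        return self.facilities + self.amenities + self.accessibility - self.land_price

@dataclass
class Firm:
    name: str
    estate: Estate

    def maybe_relocate(self, estates: list[Estate]) -> None:
        # A firm moves whenever it finds a strictly more attractive estate.
        best = max(estates, key=Estate.attractiveness)
        if best.attractiveness() > self.estate.attractiveness():
            self.estate = best

estates = [Estate("A", 0.9, 0.8, 0.7, 0.6), Estate("B", 0.4, 0.3, 0.5, 0.2)]
firm = Firm("f1", estates[1])
firm.maybe_relocate(estates)
print(firm.estate.name)  # -> "A"
```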
Abstract:
RoboCup was created in 1996 by a group of Japanese, American, and European Artificial Intelligence and Robotics researchers with a formidable, visionary long-term challenge: “By 2050 a team of robot soccer players will beat the human World Cup champion team.” At that time, in the mid 90s, when there were very few effective mobile robots and the Honda P2 humanoid robot had just been presented to a stunned public for the first time (also in 1996), the RoboCup challenge, set as an adversarial game between teams of autonomous robots, was fascinating and exciting. RoboCup enthusiastically and concretely introduced three robot soccer leagues, namely “Simulation,” “Small-Size,” and “Middle-Size,” as we explain below, and organized its first competitions at IJCAI’97 in Nagoya, with a surprising number of 100 participants [RC97]. It was the beginning of what became a continuously growing research community. RoboCup established itself as a structured organization (the RoboCup Federation, www.RoboCup.org). RoboCup fosters annual competition events, where the scientific challenges faced by the researchers are addressed in a setting that is also attractive to the general public, and the RoboCup events are among the most popular and well-attended in the research fields of AI and Robotics. RoboCup further includes a technical symposium with contributions relevant to the RoboCup competitions and, beyond them, to AI and Robotics in general.
Abstract:
Earthworks tasks are often regarded as some of the most demanding processes in transportation projects. In fact, sequential tasks such as excavation, transportation, spreading, and compaction rely heavily on mechanical equipment and repetitive processes, thus becoming as economically demanding as they are time-consuming. Moreover, current construction requirements impose higher demands for productivity and safety in earthwork constructions. Given the share of infrastructure construction costs and duration attributable to earthworks, the optimal usage of every resource in these tasks is paramount. Considering its characteristics, an earthwork construction can be regarded as a production line based on resources (mechanical equipment) and on dependency relations between sequential tasks, hence being susceptible to optimization. To date, the steady development of Information Technology areas such as databases, artificial intelligence, and operations research has resulted in the emergence of several technologies with potential application to that purpose. Among these, modern optimization methods (also known as metaheuristics), such as evolutionary computation, have the potential to find high-quality solutions with a reasonable use of computational resources. In this context, this work describes an optimization algorithm for earthworks equipment allocation based on a modern optimization approach, which takes advantage of the notion that an earthwork construction can be regarded as a production line.
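A deliberately small evolutionary-computation sketch in the spirit of the approach described above; the task workloads, chromosome encoding, and makespan-style fitness are illustrative assumptions, not the paper's actual model:

```python
# Toy evolutionary algorithm: assign each sequential earthwork task to one of
# several equipment units so that the most loaded unit finishes as early as
# possible (a makespan proxy).
import random

TASKS = 6        # sequential tasks (excavate, haul, spread, compact, ...)
MACHINES = 3     # available equipment units
WORK = [40, 25, 60, 30, 45, 20]  # workload per task (arbitrary units)

def fitness(alloc):
    load = [0.0] * MACHINES
    for task, machine in enumerate(alloc):
        load[machine] += WORK[task]
    return max(load)  # lower is better

def evolve(pop_size=30, generations=200, mutation=0.2):
    pop = [[random.randrange(MACHINES) for _ in range(TASKS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, TASKS)       # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation:         # point mutation
                child[random.randrange(TASKS)] = random.randrange(MACHINES)
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```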
Abstract:
This research aims to advance blink detection in the context of work activity. Rather than patients having to attend a clinic, blinking videos can be acquired in a work environment and automatically analyzed afterwards. Accordingly, this paper presents a methodology for the automatic detection of eye blinks in consumer videos acquired with low-cost web cameras. This methodology includes the detection of the face and eyes of the recorded person, and then analyzes low-level features of the eye region to create a quantitative vector. Finally, this vector is classified into one of the two categories considered (open or closed eyes) using machine learning algorithms. The effectiveness of the proposed methodology was demonstrated, as it provides unbiased results with classification errors under 5%.
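A minimal sketch of the pipeline described above (face/eye detection, a low-level feature vector, then a binary open/closed classifier); the Haar cascades, histogram features, and SVM below are illustrative assumptions, since the paper does not spell out its exact choices here:

```python
import cv2
import numpy as np
from sklearn.svm import SVC

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def eye_feature_vector(frame_gray):
    """Locate the first detected eye and return a coarse intensity histogram."""
    faces = face_cascade.detectMultiScale(frame_gray, 1.3, 5)
    for (x, y, w, h) in faces:
        roi = frame_gray[y:y + h, x:x + w]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            eye = cv2.resize(roi[ey:ey + eh, ex:ex + ew], (24, 24))
            hist = cv2.calcHist([eye], [0], None, [16], [0, 256]).ravel()
            return hist / (hist.sum() + 1e-9)  # normalized low-level features
    return None  # no face/eye found in this frame

# Training (hypothetical data): X holds feature vectors, y holds labels
# (0 = open, 1 = closed), e.g.:
# clf = SVC(kernel="rbf").fit(X, y)
# clf.predict([eye_feature_vector(frame)])
```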
Abstract:
Text Mining has opened a vast array of possibilities for automatic information retrieval from large amounts of text documents. A variety of themes and types of documents can be easily analyzed. More complex features, such as those used in Forensic Linguistics, can extract deeper understanding from the documents, making it possible to perform difficult tasks such as author identification. In this work we explore the capability of simpler Text Mining approaches to identify the authors of unstructured documents, in particular the ability to distinguish poetic works by two of Fernando Pessoa's heteronyms: Álvaro de Campos and Ricardo Reis. Several processing options were tested, and accuracies of 97% were reached, which encourages further developments.
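A simple sketch of the two-author classification task described above; the toy corpus and the particular vectorizer/classifier pairing are illustrative assumptions, as the paper tested several processing options:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

poems = ["Ode Triunfal ...", "Vem sentar-te comigo ..."]   # poem texts
labels = ["campos", "reis"]                                # heteronym per poem

# Character n-grams are a common stylometric feature for author attribution.
model = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),
    MultinomialNB(),
)
model.fit(poems, labels)
print(model.predict(["Grandes são os desertos ..."]))

# With a real corpus, accuracy can be estimated by cross-validation:
# print(cross_val_score(model, poems, labels, cv=5).mean())
```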
Abstract:
Large-scale distributed data stores rely on optimistic replication to scale and remain highly available in the face of network partitions. Managing data without coordination results in eventually consistent data stores that allow for concurrent data updates. These systems often use anti-entropy mechanisms (like Merkle Trees) to detect and repair divergent data versions across nodes. However, in practice hash-based data structures are too expensive for large amounts of data and create too many false conflicts. Another aspect of eventual consistency is detecting write conflicts. Logical clocks are often used to track data causality, necessary to detect causally concurrent writes on the same key. However, there is a non-negligible metadata overhead per key, which also keeps growing with time, proportionally to the node churn rate. Another challenge is deleting keys while respecting causality: while the values can be deleted, per-key metadata cannot be permanently removed without coordination. We introduce a new causality management framework for eventually consistent data stores that leverages node logical clocks (Bitmapped Version Vectors) and a new key logical clock (Dotted Causal Container) to provide advantages on multiple fronts: 1) a new efficient and lightweight anti-entropy mechanism; 2) greatly reduced per-key causality metadata size; 3) accurate key deletes without permanent metadata.
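A toy sketch of the causality check that per-key logical clocks support: given two version vectors for the same key, decide whether one write causally dominates the other or whether they are concurrent (a write conflict). This is the classic version-vector test, not the paper's more compact Bitmapped Version Vector / Dotted Causal Container structures:

```python
def compare(vv_a: dict, vv_b: dict) -> str:
    nodes = set(vv_a) | set(vv_b)
    a_ge = all(vv_a.get(n, 0) >= vv_b.get(n, 0) for n in nodes)
    b_ge = all(vv_b.get(n, 0) >= vv_a.get(n, 0) for n in nodes)
    if a_ge and b_ge:
        return "equal"
    if a_ge:
        return "a dominates b"      # b's write is causally older
    if b_ge:
        return "b dominates a"
    return "concurrent"             # true conflict: keep both versions

print(compare({"n1": 2, "n2": 1}, {"n1": 1, "n2": 1}))  # a dominates b
print(compare({"n1": 2, "n2": 0}, {"n1": 1, "n2": 3}))  # concurrent
```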
Abstract:
The development of ubiquitous computing (ubicomp) environments raises several challenges in terms of their evaluation. Ubicomp virtual reality prototyping tools enable users to experience the system to be developed and are of great help in facing those challenges, as they support developers in assessing the consequences of a design decision in the early phases of development. Given the situated nature of ubicomp environments, a particular issue to consider is the level of realism provided by the prototypes. This work presents a case study in which two ubicomp prototypes, featuring different levels of immersion (desktop-based versus CAVE-based), were developed and compared. The goal was to determine the cost/benefit relation of both solutions, which one provided better user experience results, and whether or not simpler solutions provide the same user experience results as more elaborate ones.
Abstract:
Model finders are very popular for exploring scenarios, helping users validate specifications by navigating through conforming model instances. To be practical, the semantics of such scenario exploration operations should be formally defined and, ideally, controlled by the users, so that they are able to quickly reach interesting scenarios. This paper explores the landscape of scenario exploration operations, by formalizing them with a relational model finder. Several scenario exploration operations provided by existing tools are formalized, and new ones are proposed, namely to allow the user to easily explore very similar (or different) scenarios, by attaching preferences to model elements. As a proof-of-concept, such operations were implemented in the popular Alloy Analyzer, further increasing its usefulness for (user-guided) scenario exploration.
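An illustrative sketch of one scenario-exploration operation discussed above: given the current scenario, find the conforming scenario most similar to it, with user preferences expressed as weights on model elements. The Alloy Analyzer realizes such operations with a relational model finder; the brute-force enumeration below only conveys the semantics on a hypothetical toy universe:

```python
from itertools import combinations

ATOMS = {"a", "b", "c", "d"}
WEIGHT = {"a": 3, "b": 1, "c": 1, "d": 1}   # preference: changing "a" is costly

def conforms(scenario: frozenset) -> bool:
    # Stand-in for the specification: scenarios must contain "a" or "b".
    return bool(scenario & {"a", "b"})

def next_similar(current: frozenset) -> frozenset:
    def distance(s):
        # Weighted symmetric difference with the current scenario.
        return sum(WEIGHT[x] for x in s ^ current)
    candidates = [frozenset(c) for r in range(len(ATOMS) + 1)
                  for c in combinations(ATOMS, r)
                  if conforms(frozenset(c)) and frozenset(c) != current]
    return min(candidates, key=distance)

print(sorted(next_similar(frozenset({"a", "c"}))))  # nearest distinct scenario
```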
Abstract:
Temporal logics targeting real-time systems are traditionally undecidable. Based on a restricted fragment of MTL-R, we propose a new approach for the runtime verification of hard real-time systems. The novelty of our technique is that it is based on incremental evaluation, allowing us to effectively treat duration properties (which play a crucial role in real-time systems). We describe the two levels of operation of our approach: offline simplification by quantifier removal techniques; and online evaluation of a three-valued interpretation for formulas of our fragment. Our experiments show the applicability of this mechanism as well as the validity of the provided complexity results.
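A tiny sketch of three-valued (Kleene) connectives of the kind an online monitor can use: True/False are definitive verdicts, None means "unknown so far". The paper's fragment, duration operators, and offline simplification step are not reproduced here; this only illustrates the incremental evaluation style:

```python
def k_not(a):
    return None if a is None else (not a)

def k_and(a, b):
    if a is False or b is False:
        return False            # definitive regardless of the unknown side
    if a is True and b is True:
        return True
    return None                 # verdict still open; keep monitoring

def k_or(a, b):
    return k_not(k_and(k_not(a), k_not(b)))

# Incremental use: the verdict is refined as new observations arrive.
state = None
for observation in [None, None, True]:
    state = k_or(state, observation)
print(state)  # -> True once a satisfying observation is seen
```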
Abstract:
This paper introduces the metaphorism pattern of relational specification and addresses how specifications following this pattern can be refined into recursive programs. Metaphorisms express input-output relationships that preserve relevant information while some intended optimization takes place. Text processing, sorting, representation changers, etc., are examples of metaphorisms. The kind of metaphorism refinement proposed in this paper is a strategy known as change of virtual data structure. The paper gives sufficient conditions for such implementations to be calculated using relation algebra and illustrates the strategy with the derivation of quicksort as an example.
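A sketch of the change-of-virtual-data-structure idea using the paper's quicksort example: the intermediate binary search tree never materializes, but its shape is visible in the recursion (partitioning builds a virtual node, concatenation flattens it in order). This Python rendering is only illustrative; the paper carries out the derivation in relation algebra:

```python
def quicksort(xs):
    if not xs:
        return []                             # empty virtual tree flattens to []
    pivot, *rest = xs
    left = [x for x in rest if x <= pivot]    # virtual left subtree
    right = [x for x in rest if x > pivot]    # virtual right subtree
    # In-order traversal of the virtual node (left, pivot, right):
    return quicksort(left) + [pivot] + quicksort(right)

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # -> [1, 1, 2, 3, 4, 5, 6, 9]
```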
Abstract:
Doctoral thesis in public legal sciences
Abstract:
Body and brain undergo several changes with aging. One of the domains in which these changes are most remarkable relates to cognitive performance. In the present work, electroencephalogram (EEG) markers (power spectral density and spectral coherence) of age-related cognitive decline were sought while the subjects performed the Wisconsin Card Sorting Test (WCST). Considering the expected age-related cognitive deficits, the WCST was applied to young, mid-age, and elderly participants, and the theta and alpha frequency bands were analyzed. From the results herein presented, higher theta and alpha power were found to be associated with good performance in the WCST by younger subjects. Additionally, higher theta and alpha coherence were also associated with good performance and were shown to decline with age; a decrease in alpha peak frequency also seems to be associated with aging. Furthermore, inter-hemispheric long-range coherences and parietal theta power were identified as age-independent EEG correlates of cognitive performance. In summary, these data reveal age-dependent as well as age-independent EEG correlates of cognitive performance that contribute to the understanding of brain aging and related cognitive deficits.
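A short sketch of the two EEG markers named above, computed with SciPy on a synthetic two-channel signal; the sampling rate, band limits, and signals are illustrative assumptions, not the study's acquisition parameters:

```python
import numpy as np
from scipy.signal import welch, coherence

fs = 256                                   # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
ch1 = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)        # alpha-rich
ch2 = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * np.random.randn(t.size)

# Power spectral density per channel (Welch's method), then band powers.
f, psd = welch(ch1, fs=fs, nperseg=fs * 2)
df = f[1] - f[0]
print("alpha power:", psd[(f >= 8) & (f <= 13)].sum() * df)
print("theta power:", psd[(f >= 4) & (f < 8)].sum() * df)

# Magnitude-squared spectral coherence between the two channels.
f, coh = coherence(ch1, ch2, fs=fs, nperseg=fs * 2)
print("mean alpha coherence:", coh[(f >= 8) & (f <= 13)].mean())
```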
Abstract:
Dissertation for the Integrated Master's degree in Veterinary Medicine
Abstract:
The results of larval collections of mosquitoes from artificial containers and natural breeding sites in urban and rural areas, carried out at Sertaneja, northern Paraná State, Brazil, from February to April 1995, are presented. Among the 4534 immature forms collected, belonging to 21 species or species-groups, the species with the highest densities were Aedes aegypti (Linnaeus, 1762), Aedes albopictus (Skuse, 1894), Culex quinquefasciatus Say, 1823 and Limatus durhami Theobald, 1901.