908 results for "Database As A Tool For Hospitality Management"


Relevance: 30.00%

Abstract:

The neural control of the cardiovascular system is a complex process that involves many structures at different levels of the nervous system. Several cortical areas are involved in the control of systemic blood pressure, such as the sensorimotor cortex, the medial prefrontal cortex and the insular cortex. Non-invasive brain stimulation techniques - repetitive transcranial magnetic stimulation (rTMS) and transcranial direct current stimulation (tDCS) - induce sustained and prolonged functional changes in the human cerebral cortex. rTMS and tDCS have led to positive results in the treatment of some neurological and psychiatric disorders. Because experiments in animals show that cortical modulation can be an effective method to regulate the cardiovascular system, non-invasive brain stimulation might be a novel tool in the therapeutics of human arterial hypertension. Here we review the experimental evidence that non-invasive brain stimulation can influence the autonomic nervous system and discuss the hypothesis that focal modulation of cortical excitability by rTMS or tDCS can influence sympathetic outflow and, eventually, blood pressure, thus providing a novel therapeutic tool for human arterial hypertension. (C) 2009 Published by Elsevier Ltd.

Relevance: 30.00%

Abstract:

The aim of this work is to use GIS data integration to characterize sedimentary processes in a subtropical lagoon environment. The study area was the Cananéia Inlet estuary in the southeastern section of the Cananéia Lagoon Estuarine System (CLES), state of São Paulo, Brazil (25°03'S/47°53'W). The area is formed by the confluence of two estuarine channels forming a bay-shaped water body locally called "Trapandé Bay". The region is surrounded by one of the most preserved tracts of Atlantic Rain Forest in Southwestern Brazil and presents well-developed mangroves and marshes. In this study a methodology was developed integrating a GIS database of bottom sediment parameters, geomorphological data, remote sensing images, hydrodynamic modelling data and geophysical parameters. The sediment grain-size parameters and the bottom morphology of the lagoon were also used to develop models of net sediment transport pathways. It was possible to observe that the sediment transport vectors based on the grain-size model had a good correlation with the transport model based on the bottom topography features and the hydrodynamic model, especially in areas with stronger energetic conditions and a minor contribution of finer sediments. This relation is somewhat less evident near shallower banks and depositional features. In these regions the organic matter content of the sediments was a good complementary tool for inferring the hydrodynamic and depositional conditions (i.e. primary productivity, sedimentation rates, sources, oxidation-reduction rates).

Relevance: 30.00%

Abstract:

The CIPESC® is a tool that informs the work of nurses in public health and assists in prioritizing their care in practice, management and research. It is also a powerful pedagogical instrument for the qualification of nurses within the Brazilian healthcare system. In the teaching of infectious diseases, using the CIPESC® assists in analyzing interventions by encouraging clinical and epidemiological thinking about the health-illness process. With the purpose of developing resources for teaching undergraduate nursing students and encouraging reflection on the nursing work process, this article presents an experimental application of CIPESC®, using meningococcal meningitis as an example.

Relevance: 30.00%

Abstract:

This research analyzed the adoption process of the "Leadership in Energy and Environmental Design" (LEED) green certification by hotel-sector establishments that have already adopted it. The study comprised a bibliographical review, secondary data collection from journals, institutional websites and documents, and primary data collection through semi-structured interviews with the staff responsible for the certified hotels and with the entity responsible for the certification in Brazil (Green Building Council Brazil). There were 21 interviewees: two from GBC Brazil and 19 from lodging establishments (31% of those certified). Data were analyzed with the content analysis technique, aided by the ATLAS.ti software. The results made it possible to identify the chronology of the certification processes and the profile of the hotel categories that adopt the LEED program. In addition, the interviews enabled a discussion of the initial motivations for seeking the certification, as well as the advantages and obstacles perceived regarding its adoption.

Relevance: 30.00%

Abstract:

According to recent research carried out in the foundry sector, one of the most important concerns of these industries is to improve their production planning. A foundry production plan involves two dependent stages: (1) determining the alloys to be melted and (2) determining the lots that will be produced. The purpose of this study is to draw up minimum-cost production plans for the lot-sizing problem of small foundries. As suggested in the literature, the proposed heuristic addresses the problem stages hierarchically: first the alloys are determined and, subsequently, the items to be produced from them. In this study, a knapsack problem was proposed as a tool to determine the items to be produced from each furnace load. Moreover, we proposed a genetic algorithm to explore possible sets of alloys and to determine the production planning for a small foundry. Our method attempts to overcome the difficulties in finding good production plans exhibited by the method proposed in the literature. The computational experiments show that the proposed methods presented better results than those from the literature. Furthermore, the proposed methods do not require commercial software, which is favorable for small foundries. (C) 2010 Elsevier Ltd. All rights reserved.
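The second stage the abstract describes - choosing which item lots to cast from a given furnace load - can be framed as a 0/1 knapsack, as the authors suggest. The sketch below is a generic illustration under assumed data (capacity, weights and values are invented), not the paper's exact formulation:

```python
# Illustrative 0/1 knapsack for the furnace-loading stage: the furnace
# capacity is the weight limit, each candidate lot has a metal requirement
# (weight) and a value (e.g. revenue or backlog priority). All numbers
# below are invented for the example.

def knapsack(capacity, weights, values):
    """Classic dynamic program over capacity; returns (best value, chosen indices)."""
    n = len(weights)
    best = [[0] * (capacity + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for c in range(capacity + 1):
            best[i][c] = best[i - 1][c]
            if weights[i - 1] <= c:
                cand = best[i - 1][c - weights[i - 1]] + values[i - 1]
                if cand > best[i][c]:
                    best[i][c] = cand
    chosen, c = [], capacity          # backtrack to recover the chosen lots
    for i in range(n, 0, -1):
        if best[i][c] != best[i - 1][c]:
            chosen.append(i - 1)
            c -= weights[i - 1]
    return best[n][capacity], sorted(chosen)

print(knapsack(10, [4, 3, 5, 2], [10, 7, 12, 3]))  # → (22, [0, 2])
```

A genetic algorithm along the lines the abstract mentions would then search over candidate alloy sets, calling a routine like this to evaluate each furnace load.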

Relevance: 30.00%

Abstract:

Modern sugarcane cultivars are complex hybrids resulting from crosses among several Saccharum species. Traditional breeding methods have been employed extensively in different countries over the past decades to develop varieties with increased sucrose yield and resistance to pests and diseases. Conventional variety improvement, however, may be limited by the narrow pool of suitable genes. Thus, molecular genetics is seen as a promising tool to assist in the process of developing improved varieties. The SUCEST-FUN Project (http://sucest-fun.org) aims to associate function with sugarcane genes using a variety of tools, in particular those that enable the study of the sugarcane transcriptome. An extensive analysis has been conducted to characterise, phenotypically, sugarcane genotypes with regard to their sucrose content, biomass and drought responses. Through the analysis of different cultivars, genes associated with sucrose content, yield, lignin and drought have been identified. Currently, tools are being developed to determine signalling and regulatory networks in grasses, and to sequence the sugarcane genome, as well as to identify sugarcane promoters. This is being implemented through the SUCEST-FUN (http://sucest-fun.org) and GRASSIUS databases (http://grassius.org), the cloning of sugarcane promoters, the identification of cis-regulatory elements (CRE) using Chromatin Immunoprecipitation-sequencing (ChIP-Seq) and the generation of a comprehensive Signal Transduction and Transcription gene catalogue (SUCAST Catalogue).

Relevance: 30.00%

Abstract:

The objective of this study was to undertake a critical reflection on assessment as a managerial tool that promotes the inclusion of nurses in the health-system management process. Nurses, because of their education and training, which encompass knowledge in both the clinical and managerial fields and are centered on care, have the potential to assume a differentiated attitude in management, making decisions and proposing health policies. Nevertheless, it is first necessary to create and consolidate an expressive presence at decisive levels of management. Assessment is a component of management whose results may contribute to making decisions that are more objective, improving healthcare interventions and reorganizing health practice within a political, economic, social and professional context; it is also an area for the application of knowledge with the potential to change the current panorama of nurses' inclusion in management.

Relevance: 30.00%

Abstract:

Background: Handling Totally Implantable Access Ports (TIAP) is a nursing procedure that requires skill and knowledge to avoid adverse events. No studies addressing this procedure with undergraduate students were identified prior to this study. Communication technologies, such as videos, have been increasingly adopted in the teaching of nursing and have contributed to the acquisition of competencies for clinical performance. Objective: To evaluate the effect of a video on the puncture and heparinization of TIAP on the development of cognitive and technical competencies of undergraduate nursing students. Method: Quasi-experimental study with a pretest-posttest design. Results: 24 individuals participated in the study. Anxiety scores were kept at levels 1 and 2 in the pretest and posttest. Regarding cognitive knowledge of the procedure, the proportion of correct answers was 0.14 (SD=0.12) in the pretest and 0.90 (SD=0.05) in the posttest. After watching the video, the average score obtained by the participants in the mock session was 27.20. Conclusion: The use of an educational video with a simulation of the puncture and heparinization of TIAP proved to be a strategy that increased both cognitive and technical knowledge. This strategy is viable in the teaching-learning process and is useful as a support tool for professors and for the development of undergraduate nursing students. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

These updated guidelines are based on the first edition of the World Federation of Societies of Biological Psychiatry Guidelines for Biological Treatment of Schizophrenia, published in 2005. For this 2012 revision, all available publications pertaining to the biological treatment of schizophrenia were reviewed systematically to allow for an evidence-based update. These guidelines provide evidence-based practice recommendations that are clinically and scientifically meaningful, and are intended to be used by all physicians diagnosing and treating people suffering from schizophrenia. Based on the first version of these guidelines, a systematic review of the MEDLINE/PubMed database and the Cochrane Library, in addition to data extraction from national treatment guidelines, was performed for this update. The identified literature was evaluated with respect to the strength of evidence for its efficacy and then categorised into six levels of evidence (A-F; Bandelow et al. 2008b, World J Biol Psychiatry 9:242). This first part of the updated guidelines covers the general description of antipsychotics and their side effects, the biological treatment of acute schizophrenia and the management of treatment-resistant schizophrenia.

Relevance: 30.00%

Abstract:

Fraud is a global problem that has demanded more attention due to the accelerated expansion of modern technology and communication. When statistical techniques are used to detect fraud, a critical factor is whether the fraud-detection model is accurate enough to classify a case correctly as fraudulent or legitimate. In this context the concept of bootstrap aggregating (bagging) arises. The basic idea is to generate multiple classifiers by obtaining the predicted values from models fitted to several replicated datasets, and then to combine them into a single predictive classification in order to improve classification accuracy. In this paper we aim to present a pioneering study of the performance of discrete and continuous k-dependence probabilistic networks within the context of bagging predictors for classification. Via a large simulation study and various real datasets, we found that the probabilistic networks are a strong modeling option, with high predictive capacity and a marked improvement from the bagging procedure when compared with traditional techniques. (C) 2012 Elsevier Ltd. All rights reserved.
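The bagging procedure itself is straightforward to sketch. The example below is a self-contained illustration only: it uses a simple nearest-centroid classifier as the base learner instead of the paper's k-dependence probabilistic networks, and the toy dataset is invented.

```python
import random

# Illustrative sketch of bootstrap aggregating (bagging): train one base
# learner per bootstrap replicate of the training set, then classify by
# majority vote. A nearest-centroid classifier stands in for the paper's
# k-dependence probabilistic networks so the example stays self-contained.

def nearest_centroid_fit(X, y):
    """Return per-class mean feature vectors."""
    sums, counts = {}, {}
    for xi, yi in zip(X, y):
        acc = sums.setdefault(yi, [0.0] * len(xi))
        for j, v in enumerate(xi):
            acc[j] += v
        counts[yi] = counts.get(yi, 0) + 1
    return {c: [s / counts[c] for s in acc] for c, acc in sums.items()}

def nearest_centroid_predict(model, x):
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(model, key=lambda c: dist(model[c], x))

def bagging_fit(X, y, n_models=25, seed=0):
    """Train base learners on bootstrap replicates (sampling with replacement)."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        models.append(nearest_centroid_fit([X[i] for i in idx],
                                           [y[i] for i in idx]))
    return models

def bagging_predict(models, x):
    """Majority vote over the ensemble's predictions."""
    votes = [nearest_centroid_predict(m, x) for m in models]
    return max(set(votes), key=votes.count)

X = [[0, 0], [1, 0], [0, 1], [5, 5], [6, 5], [5, 6]]
y = ["legit", "legit", "legit", "fraud", "fraud", "fraud"]
models = bagging_fit(X, y)
print(bagging_predict(models, [0.5, 0.5]))   # → legit
print(bagging_predict(models, [5.5, 5.5]))   # → fraud
```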

Relevance: 30.00%

Abstract:

Background: The use of the knowledge produced by the sciences to promote human health is the main goal of translational medicine. To make it feasible we need computational methods to handle the large amount of information that arises from bench to bedside and to deal with its heterogeneity. A computational challenge that must be faced is to promote the integration of clinical, socio-demographic and biological data. In this effort, ontologies play an essential role as a powerful artifact for knowledge representation. Chado is a modular, ontology-oriented database model that gained popularity due to its robustness and flexibility as a generic platform to store biological data; however, it lacks support for the representation of clinical and socio-demographic information. Results: We have implemented an extension of Chado - the Clinical Module - to allow the representation of this kind of information. Our approach consists of a framework for data integration through the use of a common reference ontology. The design of this framework has four levels: the data level, to store the data; the semantic level, to integrate and standardize the data through the use of ontologies; the application level, to manage clinical databases, ontologies and the data-integration process; and the web-interface level, to allow interaction between the user and the system. The Clinical Module was built on the Entity-Attribute-Value (EAV) model. We also proposed a methodology to migrate data from legacy clinical databases to the integrative framework. A Chado instance was initialized using a relational database management system. The Clinical Module was implemented and the framework was loaded using data from a factual clinical research database. Clinical and demographic data, as well as biomaterial data, were obtained from patients with tumors of the head and neck.
We implemented the IPTrans tool, a complete environment for data migration that comprises: the construction of a model to describe the legacy clinical data, based on an ontology; the Extraction, Transformation and Load (ETL) process to extract the data from the source clinical database and load it into the Clinical Module of Chado; and the development of a web tool and a Bridge Layer to adapt the web tool, as well as other applications, to Chado. Conclusions: Open-source computational solutions currently available for translational science do not have a model to represent biomolecular information and are not integrated with the existing bioinformatics tools. On the other hand, existing genomic data models do not represent clinical patient data. A framework was developed to support translational research by integrating biomolecular information coming from different "omics" technologies with patients' clinical and socio-demographic data. This framework should present some key features: flexibility, compression and robustness. The experiments accomplished with a use case demonstrated that the proposed system meets the requirements of flexibility and robustness, leading to the desired integration. The Clinical Module can be accessed at http://dcm.ffclrp.usp.br/caib/pg=iptrans.
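The Entity-Attribute-Value model the Clinical Module is built on stores each clinical fact as its own row, so new attributes require no schema change. The snippet below is a minimal, generic EAV illustration with hypothetical table and column names - it is not Chado's actual schema:

```python
import sqlite3

# Minimal illustration of the Entity-Attribute-Value (EAV) pattern: each
# clinical fact is one (entity, attribute, value) row. Table and column
# names are hypothetical, not Chado's actual schema.

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE clinical_eav (
    entity_id  INTEGER,   -- e.g. a patient or sample
    attribute  TEXT,      -- ideally a term from a reference ontology
    value      TEXT
)""")

rows = [
    (1, "diagnosis", "head and neck tumor"),
    (1, "age_at_diagnosis", "58"),
    (1, "smoking_status", "former"),
    (2, "diagnosis", "head and neck tumor"),
]
conn.executemany("INSERT INTO clinical_eav VALUES (?, ?, ?)", rows)

# Pivot one entity back into a record-like dict.
patient = dict(conn.execute(
    "SELECT attribute, value FROM clinical_eav WHERE entity_id = 1"))
print(patient["age_at_diagnosis"])  # → 58
```

The trade-off is that queries must pivot rows back into records, which is what the semantic level and reference ontology help keep consistent.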

Relevance: 30.00%

Abstract:

Background: Recent advances in medical and biological technology have stimulated the development of new testing systems that provide huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information-processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results: This paper describes a formal approach to address this challenge through the implementation of a genetic-testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability, achieved through the relational database implementation, and 2) correctness of processes, using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge of business-process notation or process algebra. Conclusions: This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic-testing management for Mendelian disorder studies. We have demonstrated the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate and perform genetic testing using easy end-user interfaces.
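The ACP-style control flow mentioned above composes atomic actions with sequential composition ("·") and alternative composition ("+"). The toy interpreter below illustrates that idea only - it is not the CEGH engine, and the action names are invented:

```python
# Toy interpreter for two ACP operators: sequencing ("·") and choice ("+").
# Enumerating a process's traces shows every valid execution order of a
# workflow, which is the kind of correctness check process algebra enables.
# Action names below are made up, not CEGH's actual test steps.

class Atom:
    def __init__(self, name): self.name = name
    def traces(self): yield [self.name]

class Seq:  # sequential composition: p · q
    def __init__(self, p, q): self.p, self.q = p, q
    def traces(self):
        for t1 in self.p.traces():
            for t2 in self.q.traces():
                yield t1 + t2

class Alt:  # alternative composition: p + q
    def __init__(self, p, q): self.p, self.q = p, q
    def traces(self):
        yield from self.p.traces()
        yield from self.q.traces()

# A hypothetical test plan: extract DNA, then either sequence or genotype, then report.
workflow = Seq(Atom("extract_dna"),
               Seq(Alt(Atom("sequence"), Atom("genotype")),
                   Atom("report")))
for trace in workflow.traces():
    print(" -> ".join(trace))
```

Printing the traces yields the two valid runs: extract_dna -> sequence -> report and extract_dna -> genotype -> report.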

Relevance: 30.00%

Abstract:

Universitat de Barcelona

Relevance: 30.00%

Abstract:

The relation between intercepted light and orchard productivity was long considered linear, although this dependence seems to be subordinate more to the planting system than to light intensity. At the whole-plant level, an increase in irradiance does not always improve productivity. One of the reasons can be the plant's intrinsic inefficiency in using energy: generally, in full light only 5-10% of the total incoming energy is allocated to net photosynthesis. Therefore preserving or improving this efficiency becomes pivotal for scientists and fruit growers. Even though a conspicuous amount of energy is reflected or transmitted, plants cannot avoid absorbing photons in excess. Chlorophyll over-excitation promotes the production of reactive species, increasing the risk of photoinhibition. The dangerous consequences of photoinhibition forced plants to evolve a complex, multilevel machinery able to dissipate the energy excess by quenching it as heat (non-photochemical quenching, NPQ), moving electrons (the water-water cycle, cyclic transport around PSI, the glutathione-ascorbate cycle and photorespiration) and scavenging the generated reactive species. The price plants must pay for this equipment is the use of CO2 and reducing power, with a consequent decrease of photosynthetic efficiency, both because some photons are not used for carboxylation and because an effective loss of CO2 and reducing power occurs. Net photosynthesis increases with light until the saturation point; additional PPFD does not improve carboxylation but raises the share of energy dissipated through the alternative pathways, along with ROS production and photoinhibition risks. Even this wide photo-protective apparatus is not able to cope with all the excessive incoming energy, so photodamage occurs. Any event that increases the photon pressure and/or decreases the efficiency of the described photo-protective mechanisms (e.g. thermal stress, water or nutritional deficiency) can amplify photoinhibition.
In nature, only a small fraction of photosystems is likely found damaged, because of the effective, efficient but energy-consuming recovery system. Since damaged PSII is quickly repaired at an energy expense, it would be interesting to investigate how much PSII recovery costs plant productivity. This PhD dissertation aims to improve the knowledge of the several strategies used to manage the incoming energy, and of the implications of excess light for photodamage in peach. The thesis is organized in three scientific units. In the first section a new rapid, non-intrusive, whole-tissue and universal technique for determining functional PSII was implemented and validated on different kinds of plants: C3 and C4 species, woody and herbaceous plants, a wild type and a chlorophyll b-less mutant, monocots and dicots. In the second unit, using a singular experimental orchard named the "Asymmetric orchard", the relation between light environment and photosynthetic performance, water use and photoinhibition was investigated in peach at the whole-plant level; furthermore, the effect of variation in photon pressure on energy management was considered at the single-leaf level. In the third section the quenching-analysis method suggested by Kornyeyev and Hendrickson (2007) was validated on peach. Afterwards it was applied in the field, where the influence of moderate light and water reduction on peach photosynthetic performance, water requirements, energy management and photoinhibition was studied. Using solar energy as fuel for life is intrinsically risky for a plant, given the constantly high risk of photodamage. This dissertation tries to highlight the complex relation existing between plants, in particular peach, and light, analysing the principal strategies plants have developed to manage the incoming light so as to derive the maximum possible benefit while minimizing the risks.
In the first instance, the new method proposed for determining functional PSII, based on P700 redox kinetics, seems to be a valid, non-intrusive, universal and field-applicable technique, in part because it measures the whole leaf tissue in depth rather than only the first leaf layers, as fluorescence does. The fluorescence Fv/Fm parameter gives a good estimate of functional PSII, but only when data obtained from the adaxial and abaxial leaf surfaces are averaged. In addition, the energy-quenching analysis proposed by Kornyeyev and Hendrickson (2007), combined with the photosynthesis model proposed by von Caemmerer (2000), is a powerful tool to analyse and study, even in the field, the relation between the plant and environmental factors such as water, temperature and, above all, light. The "Asymmetric" training system is a good way to study the relations among light energy, photosynthetic performance and water use in the field. At the whole-plant level, net carboxylation increases with PPFD up to a saturation point. Excess light, rather than improving photosynthesis, may amplify water and thermal stress, leading to stomatal limitation. Furthermore, too much light does not promote an improvement in net carboxylation but PSII damage; in fact, in the most light-exposed plants about 50-60% of total PSII is inactivated. At the single-leaf level, net carboxylation increases until the saturation point (1000-1200 μmol m-2 s-1) and the excess light is dissipated by non-photochemical quenching and by non-net-carboxylative electron transports. The latter follow a pattern quite similar to the Pn/PPFD curve, reaching saturation at almost the same photon flux density. At middle-low irradiance NPQ seems to be lumen-pH limited, because the incoming photon pressure is not enough to generate the optimum lumen pH for full activation of violaxanthin de-epoxidase (VDE). Peach leaves try to cope with the excess light by increasing the non-net-carboxylative transports.
As PPFD rises, the xanthophyll cycle is increasingly activated and the rate of non-net-carboxylative transports is reduced. Some of these alternative transports, such as the water-water cycle, the cyclic transport around PSI and the glutathione-ascorbate cycle, are able to generate additional H+ in the lumen to support VDE activation when light would otherwise be limiting. Moreover, the alternative transports seem to act as an important dissipative route when high temperature and sub-optimal conductance amplify the photoinhibition risks. In peach, a moderate reduction of water and light does not reduce net carboxylation; rather, by diminishing the incoming light and the environmental evapo-transpiration demand, stomatal conductance decreases, improving water-use efficiency. Therefore, by lowering light intensity to still non-limiting levels, water could be saved without compromising net photosynthesis. The quenching analysis is able to partition the absorbed energy among the several utilization, photoprotection and photo-oxidation pathways. When recovery is permitted, only a few PSII remain un-repaired, although more net PSII damage is recorded in plants placed in full light. Even in this experiment, at over-saturating light the main dissipation pathway is non-photochemical quenching; at middle-low irradiance it seems to be pH-limited, and other transports, such as photorespiration and the alternative transports, are used to support photoprotection and to help create the optimal trans-thylakoidal ΔpH for violaxanthin de-epoxidase. These alternative pathways become the main quenching mechanisms in very low-light environments. Another aspect pointed out by this study is the role of NPQ as a dissipative pathway when conductance becomes severely limiting. The evidence that in nature only a small amount of damaged PSII is seen indicates the presence of an effective and efficient recovery mechanism that masks the real photodamage occurring during the day.
At the single-leaf level, when repair is not allowed, leaves in full light are twofold more photoinhibited than shaded ones. Therefore, light in excess of the photosynthetic optimum does not promote net carboxylation but increases water loss and PSII damage. The greater the photoinhibition, the more photosystems must be repaired, and consequently the more energy and dry matter must be allocated to this essential activity. Since above the saturation point net photosynthesis is constant while photoinhibition increases, it would be interesting to investigate what photodamage costs in terms of tree productivity. Another aspect of pivotal importance to be further explored is the combined influence of light and other environmental parameters, such as water status, temperature and nutrition, on the management of light, water and photosynthate in peach.
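The saturating light response described throughout (net carboxylation rising with PPFD up to roughly 1000-1200 μmol m-2 s-1, then flattening) is conventionally modeled with a non-rectangular hyperbola. The sketch below uses that standard model with made-up parameter values, not the thesis data:

```python
import math

# Non-rectangular hyperbola light-response curve: net photosynthesis rises
# with PPFD, then plateaus near light saturation, as described above. The
# parameter values are invented illustrations, not the thesis measurements.

def net_photosynthesis(ppfd, phi=0.05, pmax=20.0, theta=0.9, rd=1.5):
    """Net photosynthesis (Pn) as a function of light.

    ppfd  : photon flux density (umol m-2 s-1)
    phi   : apparent quantum yield
    pmax  : light-saturated gross photosynthesis
    theta : curvature factor (0 < theta <= 1)
    rd    : dark respiration
    """
    s = phi * ppfd + pmax
    gross = (s - math.sqrt(s * s - 4.0 * theta * phi * ppfd * pmax)) / (2.0 * theta)
    return gross - rd

# Pn climbs steeply at low light and flattens toward pmax - rd at high light.
for i in (0, 200, 600, 1200, 2000):
    print(i, round(net_photosynthesis(i), 2))
```

In this framing, "excess light" is simply the PPFD beyond the plateau: it adds nothing to Pn, which is why the thesis attributes it to dissipation and photodamage instead.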

Relevance: 30.00%

Abstract:

Large-scale wireless ad-hoc networks of computers, sensors, PDAs etc. (i.e. nodes) are revolutionizing connectivity and leading to a paradigm shift from centralized systems to highly distributed and dynamic environments. An example of ad-hoc networks are sensor networks, usually composed of small units able to sense and transmit to a sink elementary data that are subsequently processed by an external machine. Recent improvements in the memory and computational power of sensors, together with the reduction of energy consumption, are rapidly changing the potential of such systems, moving the attention towards data-centric sensor networks. A plethora of routing and data-management algorithms have been proposed for network path discovery, ranging from broadcasting/flooding-based approaches to those using global positioning systems (GPS). We studied WGrid, a novel decentralized infrastructure that organizes wireless devices in an ad-hoc manner, in which each node has one or more virtual coordinates through which both message routing and data management occur without reliance on either flooding/broadcasting operations or GPS. The resulting ad-hoc network does not suffer from the dead-end problem, which happens in geographic-based routing when a node is unable to locate a neighbor closer to the destination than itself. WGrid allows multidimensional data management, since the nodes' virtual coordinates can act as a distributed database without needing special implementation or reorganization. Any kind of data (both single- and multidimensional) can be distributed, stored and managed. We show how a location service can be easily implemented so that any search is reduced to a simple query, as for any other data type. WGrid was then extended by adopting a replication methodology; we called the resulting algorithm WRGrid.
Just like WGrid, WRGrid acts as a distributed database without needing special implementation or reorganization, and any kind of data can be distributed, stored and managed. We evaluated the benefits of replication on data management, finding from experimental results that it can halve the average number of hops in the network. The direct consequences are a significant improvement in energy consumption and a workload balancing among sensors (the number of messages routed by each node). Finally, thanks to the replicas, whose number can be arbitrarily chosen, the resulting sensor network can cope with sensor disconnections/connections due to sensor failures, without data loss. Another extension of WGrid is W*Grid, which strongly improves network recovery performance after link and/or device failures that may happen due to crashes, battery exhaustion of devices or temporary obstacles. W*Grid guarantees, by construction, at least two disjoint paths between each pair of nodes. This implies that recovery in W*Grid occurs without broadcast transmissions, guaranteeing robustness while drastically reducing energy consumption. An extensive number of simulations shows the efficiency, robustness and traffic load of the resulting networks under several scenarios of device density and number of coordinates. The performance analysis was compared with existing algorithms in order to validate the results.
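The core idea behind WGrid - greedy forwarding over virtual coordinates, with no GPS and no flooding - can be sketched generically. The binary-string coordinates, tree overlay and distance function below are simplified illustrations, not WGrid's actual scheme:

```python
# Illustrative sketch of routing over virtual coordinates: each node forwards
# a message to the neighbor whose coordinate is closest to the destination's.
# Coordinates here are binary strings on a tree overlay; the distance counts
# tree hops via the common prefix. This is a simplified stand-in, not WGrid.

def prefix_distance(a, b):
    """Tree-hop distance between two binary-string coordinates:
    total digits beyond their common prefix."""
    common = 0
    for ca, cb in zip(a, b):
        if ca != cb:
            break
        common += 1
    return (len(a) - common) + (len(b) - common)

def greedy_route(neighbors, src, dst):
    """Forward greedily toward dst; neighbors maps a coordinate to its adjacent coordinates."""
    path, current = [src], src
    while current != dst:
        nxt = min(neighbors[current], key=lambda n: prefix_distance(n, dst))
        if prefix_distance(nxt, dst) >= prefix_distance(current, dst):
            return path, False   # dead end (cannot occur on a connected tree overlay)
        path.append(nxt)
        current = nxt
    return path, True

# A small tree overlay: "0" is the root; each child appends one bit.
neighbors = {
    "0":   ["00", "01"],
    "00":  ["0", "000", "001"],
    "01":  ["0"],
    "000": ["00"],
    "001": ["00"],
}
path, ok = greedy_route(neighbors, "01", "001")
print(path)   # → ['01', '0', '00', '001']
```

Because the overlay coordinates form a tree, the greedy distance decreases at every hop, which is one way to see why such schemes avoid the dead-end problem of purely geographic routing.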