Abstract:
Purpose. Health promotion policy frameworks, recent theorizing, and research all emphasize understanding and mobilizing environmental influences to change particular health-related behaviors in specific settings. The workplace is a key environmental setting. The Checklist of Health Promotion Environments at Worksites (CHEW) was designed as a direct observation instrument to assess characteristics of worksite environments that are known to influence health-related behaviors. Methods. The CHEW is a 112-item checklist of workplace environment features hypothesized to be associated, both positively and negatively, with physical activity, healthy eating, alcohol consumption, and smoking. The three environmental domains assessed are (1) physical characteristics of the worksite, (2) features of the information environment, and (3) characteristics of the immediate neighborhood around the workplace. The conceptual rationale and development studies for the CHEW are described, and data from observational studies of 20 worksites are reported. Results. The data on CHEW-derived environmental attributes showed generally good reliability and identified meaningful sets of variables that may plausibly influence health-related behaviors. With the exception of one information environment attribute, intraclass correlation coefficients ranged from 0.80 to 1.00. Descriptive statistics on selected physical and information environment characteristics indicated that vending machines, showers, bulletin boards, and signs prohibiting smoking were common across worksites. Bicycle racks, visible stairways, and signs related to alcohol consumption, nutrition, and health promotion were relatively uncommon. Conclusions. These findings illustrate the types of data on environmental attributes that can be derived, their relevance for program planning, and how they can characterize variability across worksites. The CHEW is a promising observational measure that has the potential to assess environmental influences on health behaviors and to evaluate workplace health promotion programs.
Abstract:
Purpose: This study measured the reliability between stroke patients' and significant others' scores on the items of the Reintegration to Normal Living (RNL) Index and examined whether there were any scoring biases. Method: The 11-item RNL Index was administered to 57 pairs of patients and significant others six months after stroke rehabilitation. The index was scored using a 10-point visual analogue scale. Patient and significant other demographic information and data on patients' clinical, functional and cognitive status were collected. Reliability was measured using the intra-class correlation coefficient (ICC) and percent agreement. Results: Overall, poor reliability was found for the RNL Index total score (ICC = .36, 95% CI .07 to .59) and the daily functioning subscale (ICC = .24, 95% CI -.003 to .46), and moderate reliability was found for the perception of self subscale (ICC = .55, 95% CI .28 to .73). There was a moderate bias for patients to rate themselves as achieving better reintegration than was indicated by significant others, although no demographic or clinical factors were associated with this bias. Exact match agreement was best for the subjective items and worst for items reflecting mobility around the community and participation in a work activity. Conclusions: Caution is needed when interpreting patient information reported by significant others on the RNL Index. The use of a shorter scale to rate the RNL Index requires investigation.
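Rater-agreement statistics of the kind reported above can be reproduced with a short script. The sketch below computes ICC(2,1) (two-way random effects, absolute agreement, single measure) from a subjects-by-raters matrix; the score pairs are invented for illustration and are not data from the study.

```python
import numpy as np

def icc_2_1(ratings: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single measure."""
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()   # between raters
    ss_total = ((ratings - grand) ** 2).sum()
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Hypothetical (patient, significant other) total scores for five pairs.
scores = np.array([[7.0, 5.5], [8.0, 8.0], [4.5, 3.0], [9.0, 7.5], [6.0, 6.5]])
print(round(icc_2_1(scores), 2))
```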
Abstract:
Fiber meshes of poly(hydroxybutyrate) (PHB) and poly(hydroxybutyrate)/poly(ethylene oxide) (PHB/PEO) with different concentrations of chlorhexidine (CHX) were prepared by electrospinning, for assessment as a polymer-based drug delivery system. The electrospun fibers were characterized at the morphological, molecular and mechanical levels. The bactericidal potential of PHB and PHB/PEO electrospun fibers with and without CHX was investigated against Escherichia coli (E. coli) and Staphylococcus aureus (S. aureus) by disk diffusion susceptibility tests. Electrospun fibers containing CHX exhibited bactericidal activity. PHB/PEO-1%CHX displayed higher CHX release levels and equivalent antibacterial activity when compared to PHB/PEO with 5 and 10 wt% CHX. The bactericidal performance of samples with 1 wt% CHX was assessed by Colony Forming Units (CFU), where reductions of 100% and 99.69% against E. coli and S. aureus were achieved, respectively.
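For reference, the percentage reduction reported from colony counts follows directly from comparing treated and control plates. A minimal sketch with assumed counts (not data from the study):

```python
def cfu_reduction_percent(cfu_control: float, cfu_treated: float) -> float:
    """Percent reduction in colony forming units relative to an untreated control."""
    return (cfu_control - cfu_treated) / cfu_control * 100.0

# Assumed example counts chosen only to reproduce the reported percentages.
print(cfu_reduction_percent(1.0e6, 0.0))    # 100.0
print(cfu_reduction_percent(1.0e6, 3.1e3))  # 99.69
```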
Abstract:
The objective of this study was to analyze the environmental performance of aquaculture in the city of Colorado do Oeste, Rondônia State, Brazil. Fifteen fish farmers were interviewed. For data collection, structured interviews were carried out using a questionnaire based on information supplied by the United Nations Food and Agriculture Organization (FAO). The questionnaire considered 12 items, organized into three main topics: a) social and legal standards; b) environmental standards; c) standards of food safety and hygiene. Aquaculture in Colorado do Oeste, Rondônia presents two fish production systems: extensive and semi-intensive. In the semi-intensive system, the stocking rate was one fish per m³ on average; tambaqui (Colossoma macropomum), tilapias (Oreochromis spp.), pirarucu (Arapaima gigas) and pintado (Pseudoplatystoma spp.) were the species farmed in the largest numbers. The rate of water renewal was due to the greater availability of natural food in this system, and water renewal in the ponds was constant (1,500 liters per minute). In the semi-intensive system using dug ponds, fingerlings were stocked and fed throughout the entire rearing period with natural and exogenous food. The extensive system relied on the natural production of the pond, with stocking density limited by the production of natural food. The limited water renewal meant that the cultivation tank itself acted as a decantation lake, with oxidation and sedimentation of residual organic matter consisting of feces, debris and organic fertilizer. The extensive system produced a reduced effluent volume relative to the cultivation area. In addition, there was high water turbidity, caused by a high concentration of planktonic organisms, and low concentrations of dissolved oxygen in the water. The data showed that nine of the interviewed fish farmers' properties had critical environmental performance (less than 30.0%) and six had poor environmental performance (between 30.0 and 50.0%) (coefficient of sustainability = green squares × 100 ÷ (total questions − yellow squares)).
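The sustainability coefficient cited at the end of the abstract can be written out directly. A minimal sketch, with the worked numbers below being assumptions for illustration only:

```python
def sustainability_coefficient(green_squares: int, yellow_squares: int, total_questions: int) -> float:
    """Coefficient of sustainability = green squares x 100 / (total questions - yellow squares)."""
    return green_squares * 100.0 / (total_questions - yellow_squares)

# Hypothetical example: 3 "green" answers and 2 "yellow" (excluded) answers out of 12 questions.
print(sustainability_coefficient(green_squares=3, yellow_squares=2, total_questions=12))  # 30.0
```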
Abstract:
In this article we argue that digital simulations promote and explore complex relations between the player and the machine's cybernetic system, with which the player relates through gameplay, that is, the actual application of tactics and strategies used by participants as they play the game. We aim to show that the realism of simulation, together with the merging of artificial objects with the real world, can generate interactive empathy between players and their avatars. In this text, we intend to explore augmented reality as a means to visualise interactive communication projects. With the ARToolkit, Virtools and 3ds Max applications, we aim to show how to create a portable interactive platform that relies on the environment and on markers to construct the game's scenario. Many of the conventional functions of the human eye are being replaced by techniques in which images are no longer positioned in the traditional manner in which we observe them (Crary, 1998), or in the way we perceive the real world. The digitization of the real world into a new informational layer over objects, people or environments needs to be processed and mediated by tools that amplify the natural human senses.
Abstract:
In recent years there has been substantial growth and consolidation of the Data Mining field. Several efforts are being made to establish standards in the area; among them, SEMMA and CRISP-DM can be enumerated. Both have grown as industrial standards and define a set of sequential steps intended to guide the implementation of data mining applications. The question arose of whether there are substantial differences between them and the traditional KDD process. In this paper we aim to establish a parallel between these approaches and the KDD process, as well as an understanding of the similarities between them.
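One way to make such a parallel concrete is as a step-by-step alignment. The mapping below is a rough correspondence often drawn between the KDD steps and the SEMMA and CRISP-DM phases, written as a small data structure for illustration; it is an approximation, not a verbatim reproduction of the paper's comparison.

```python
# Rough alignment of KDD steps with SEMMA and CRISP-DM phases (illustrative only).
KDD_PARALLEL = {
    "Selection":                 {"SEMMA": "Sample",  "CRISP-DM": "Data Understanding"},
    "Pre-processing":            {"SEMMA": "Explore", "CRISP-DM": "Data Understanding"},
    "Transformation":            {"SEMMA": "Modify",  "CRISP-DM": "Data Preparation"},
    "Data Mining":               {"SEMMA": "Model",   "CRISP-DM": "Modeling"},
    "Interpretation/Evaluation": {"SEMMA": "Assess",  "CRISP-DM": "Evaluation"},
}
# CRISP-DM's Business Understanding and Deployment phases fall outside the
# classical KDD step sequence, which is part of what such comparisons highlight.

for kdd_step, parallel in KDD_PARALLEL.items():
    print(f"{kdd_step:27s} ~ {parallel['SEMMA']:8s} ~ {parallel['CRISP-DM']}")
```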
Abstract:
In practical applications of optimization it is common to have several conflicting objective functions to optimize. Frequently, these functions are subject to noise or can be of black-box type, preventing the use of derivative-based techniques. We propose a novel multiobjective derivative-free methodology, calling it direct multisearch (DMS), which does not aggregate any of the objective functions. Our framework is inspired by the search/poll paradigm of direct-search methods of directional type and uses the concept of Pareto dominance to maintain a list of nondominated points (from which the new iterates or poll centers are chosen). The aim of our method is to generate as many points in the Pareto front as possible from the polling procedure itself, while keeping the whole framework general enough to accommodate other disseminating strategies, in particular, when using the (here also) optional search step. DMS generalizes to multiobjective optimization (MOO) all direct-search methods of directional type. We prove under the common assumptions used in direct search for single objective optimization that at least one limit point of the sequence of iterates generated by DMS lies in (a stationary form of) the Pareto front. However, extensive computational experience has shown that our methodology has an impressive capability of generating the whole Pareto front, even without using a search step. Two by-products of this paper are (i) the development of a collection of test problems for MOO and (ii) the extension of performance and data profiles to MOO, allowing a comparison of several solvers on a large set of test problems, in terms of their efficiency and robustness to determine Pareto fronts.
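The core bookkeeping behind such a framework, maintaining a list of nondominated points under Pareto dominance, can be sketched compactly. The snippet below illustrates only that idea for minimization problems; it is not an implementation of the DMS poll and search steps.

```python
import numpy as np

def dominates(a: np.ndarray, b: np.ndarray) -> bool:
    """True if point `a` Pareto-dominates `b` (all objectives minimized)."""
    return bool(np.all(a <= b) and np.any(a < b))

def update_nondominated(archive: list, candidate: np.ndarray) -> list:
    """Insert `candidate` unless it is dominated; drop archive points it dominates."""
    if any(dominates(p, candidate) for p in archive):
        return archive
    archive = [p for p in archive if not dominates(candidate, p)]
    archive.append(candidate)
    return archive

# Toy bi-objective evaluations (e.g. produced by successive poll steps).
points = [np.array([1.0, 5.0]), np.array([3.0, 3.0]),
          np.array([2.0, 4.0]), np.array([2.0, 2.0])]
front = []
for p in points:
    front = update_nondominated(front, p)
print([p.tolist() for p in front])  # [[1.0, 5.0], [2.0, 2.0]]
```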
Abstract:
The operation of power systems in a Smart Grid (SG) context brings new opportunities to consumers as active players, in order to fully realize the SG advantages. In this context, concepts such as smart homes or smart buildings are promising approaches to optimize consumption while reducing electricity costs. This paper proposes an intelligent methodology to support the consumption optimization of an industrial consumer that has a Combined Heat and Power (CHP) facility. A SCADA (Supervisory Control and Data Acquisition) system developed by the authors is used to support the implementation of the proposed methodology. An optimization algorithm implemented in the system determines the optimal consumption and CHP levels at each instant, according to the Demand Response (DR) opportunities. The paper includes a case study with several scenarios of consumption and heat demand in the context of a DR event that specifies a maximum demand level for the consumer.
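As a toy illustration of choosing grid consumption and CHP output under a DR-imposed demand cap, a single-period linear program can be posed with SciPy. All numbers and variable names below are assumptions made for the sketch, not the paper's case-study data.

```python
from scipy.optimize import linprog

# Assumed single-period data: 500 kW load, grid at 0.12 EUR/kWh, CHP marginal
# cost 0.09 EUR/kWh, 300 kW CHP capacity, and a DR cap of 250 kW on grid demand.
load, grid_price, chp_cost, chp_max, dr_cap = 500.0, 0.12, 0.09, 300.0, 250.0

c = [grid_price, chp_cost]      # minimize cost of x = [grid_power, chp_power]
A_eq = [[1.0, 1.0]]             # grid + CHP must supply the load
b_eq = [load]
bounds = [(0.0, dr_cap),        # DR event limits grid demand
          (0.0, chp_max)]       # CHP capacity limit

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(res.x, res.fun)           # -> [200. 300.] at a cost of about 51.0
```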
Abstract:
Many common human reasoning capabilities, such as temporal and non-monotonic reasoning, have not yet been fully reproduced in deployed systems, even though some theoretical breakthroughs have already been accomplished. This is mainly due to the inherent computational complexity of the theoretical approaches. In the particular area of fault diagnosis in power systems, however, some systems that attempt to solve the problem have been deployed, using methodologies such as production-rule-based expert systems, neural networks, recognition of chronicles, fuzzy expert systems, etc. SPARSE (from the Portuguese acronym, meaning expert system for incident analysis and restoration support) was one of these systems and, in the course of its development, came the need to cope with incomplete and/or incorrect information, as well as with the traditional problems of power system fault diagnosis based on SCADA (supervisory control and data acquisition) information retrieval, namely real-time operation, huge amounts of information, etc. This paper presents an architecture for a decision support system that addresses these problems using a symbiosis of the event calculus and default-reasoning rule-based system paradigms, ensuring soft real-time operation with the ability to handle incomplete, incorrect or domain-incoherent information. A prototype implementation of this system is already at work in the control centre of the Portuguese Transmission Network.
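A flavor of default reasoning over an incomplete SCADA message stream can be sketched in a few lines: a default conclusion is held unless a blocking message is observed, and is withdrawn when one arrives. The rule and message names below are purely illustrative and are not taken from SPARSE.

```python
from dataclasses import dataclass, field

@dataclass
class DefaultRule:
    """A defeasible rule: the conclusion holds unless a blocking message is seen."""
    conclusion: str
    blocked_by: set = field(default_factory=set)

    def fires(self, observed_messages: set) -> bool:
        return not (self.blocked_by & observed_messages)

# Hypothetical default: a breaker trip indicates a fault on its line, unless the
# trip can be explained away by a test or scheduled maintenance message.
rule = DefaultRule(conclusion="fault_on_line_L1",
                   blocked_by={"breaker_test_L1", "maintenance_scheduled_L1"})

messages = {"breaker_trip_L1"}                        # incomplete information
print(rule.fires(messages))                           # True: tentative diagnosis
print(rule.fires(messages | {"breaker_test_L1"}))     # False: default withdrawn
```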
Abstract:
Smart grids are envisaged as infrastructures able to accommodate all centralized and distributed energy resources (DER), including intensive use of renewable and distributed generation (DG), storage, demand response (DR), and also electric vehicles (EV), of which plug-in vehicles, i.e. gridable vehicles, are especially relevant. Moreover, smart grids must accommodate a large number of diverse types of players in the context of a competitive business environment. Smart grids should also provide the means required to efficiently manage all these resources, which is especially important in order to make the best possible use of renewable-based power generation, namely to minimize wind curtailment. An integrated approach, considering all the available energy resources, including demand response and storage, is crucial to attain these goals. This paper proposes a methodology for energy resource management that considers several Virtual Power Players (VPPs) managing a network with high penetration of distributed generation, demand response, storage units and network reconfiguration. The resources are controlled through a flexible SCADA (Supervisory Control And Data Acquisition) system that can be accessed by the involved entities (VPPs) under contracted use conditions. A case study evidences the advantages of the proposed methodology in supporting a Virtual Power Player (VPP) managing the energy resources that it can access in an incident situation.
Abstract:
Currently, power systems (PS) already accommodate a substantial penetration of distributed generation (DG) and operate in competitive environments. In the future, as a result of liberalisation and political regulations, PS will have to deal with large-scale integration of DG and other distributed energy resources (DER), such as storage, and provide market agents the means to ensure flexible and secure operation. This cannot be done with the traditional PS operational tools used today, such as the rather restricted Supervisory Control and Data Acquisition (SCADA) information systems [1]. The trend towards using local generation in the active operation of the power system requires new solutions for the data management system. The relevant standards have been developed separately in the last few years, so there is a need to unify them in order to obtain a common and interoperable solution. For distribution operation, the CIM models described in IEC 61968/70 are especially relevant. In Europe, dispersed and renewable energy resources (D&RER) are mostly operated without remote control mechanisms and feed the maximum amount of available power into the grid. To improve network operation performance, the idea of virtual power plants (VPP) will become a reality. In the future, the power generation of D&RER will be scheduled with high accuracy. In order to realize decentralized VPP energy management, communication facilities with standardized interfaces and protocols are needed. IEC 61850 is suitable to serve as a general standard for all communication tasks in power systems [2]. The paper deals with international activities and experiences in the implementation of a new data management and communication concept in the distribution system. The difficulties in coordinating the inconsistent communication and data management standards, which were developed in parallel, are addressed first in the paper. The upcoming unification work, taking into account the growing role of D&RER in the PS, is then presented. It is possible to overcome the lag in current practical experience by using new tools for creating and maintaining the CIM data and for simulating the IEC 61850 protocol; a prototype of such tools is presented in the paper. Since the origin and accuracy of the data requirements depend on how the data are used (e.g. operation or planning), some remarks concerning the definition of the digital interface incorporated in the merging unit concept, from the power utility point of view, are also presented. To summarize, some required future work has been identified.