949 results for Requirements management
Abstract:
This descriptive study addresses the job satisfaction of nurse managers and clinical nurses working at the Hematology and Hemotherapy Services of a public hospital in Sao Paulo. The study objectives were to identify the factors that caused job satisfaction among nurse managers and clinical nurses, and to use the results to support the development of indicators for evaluating the quality of nursing human resource management. The components of the study were: autonomy, interaction, professional status, job requirements, organizational norms and remuneration. Participants were 44 nurses. Data were collected using a Job Satisfaction Index (JSI) questionnaire. In conclusion, the study identified the clinical nurse group as the most satisfied, with a JSI of 10.5; the managerial group scored 10.0. Regarding satisfaction with their current activity, 88.9% of the nurse managers reported feeling satisfied, as did 90.9% of clinical nurses. For both groups, autonomy was the component with the highest level of professional satisfaction.
Abstract:
The relation between intercepted light and orchard productivity has been considered linear, although this dependence seems to be governed more by the planting system than by light intensity. At the whole-plant level, an increase in irradiance does not always improve productivity. One reason may be the plant's intrinsic inefficiency in using energy: in full light, generally only 5-10% of the total incoming energy is allocated to net photosynthesis. Preserving or improving this efficiency therefore becomes pivotal for scientists and fruit growers. Even though a conspicuous amount of energy is reflected or transmitted, plants cannot avoid absorbing photons in excess. Chlorophyll over-excitation promotes the production of reactive species, increasing the risk of photoinhibition. The dangerous consequences of photoinhibition have forced plants to evolve a complex, multilevel machinery able to dissipate excess energy as heat (non-photochemical quenching), to move electrons (the water-water cycle, cyclic transport around PSI, the glutathione-ascorbate cycle and photorespiration) and to scavenge the reactive species generated. The price plants pay for this equipment is the consumption of CO2 and reducing power, with a consequent decrease in photosynthetic efficiency, both because some photons are not used for carboxylation and because an effective loss of CO2 and reducing power occurs. Net photosynthesis increases with light up to the saturation point; additional PPFD does not improve carboxylation but raises the contribution of the alternative energy-dissipation pathways as well as ROS production and photoinhibition risks. The wide photo-protective apparatus, however, is not always able to cope with the excessive incoming energy, and photodamage occurs. Any event that increases the photon pressure and/or decreases the efficiency of these photo-protective mechanisms (e.g. thermal stress, water or nutritional deficiency) can exacerbate photoinhibition. In nature only a small fraction of damaged photosystems is usually found, thanks to an effective, efficient and energy-consuming recovery system. Since damaged PSII is quickly repaired at an energy expense, it would be interesting to investigate how much PSII recovery costs in terms of plant productivity. This PhD dissertation aims to improve knowledge of the several strategies adopted for managing incoming energy and of the implications of excess light for photodamage in peach. The thesis is organized in three scientific units. In the first section, a new rapid, non-intrusive, whole-tissue and universal technique for the determination of functional PSII was implemented and validated on different kinds of plants: C3 and C4 species, woody and herbaceous plants, wild type and a chlorophyll b-less mutant, and monocot and dicot plants. In the second unit, using a singular experimental orchard named the "Asymmetric orchard", the relation between light environment, photosynthetic performance, water use and photoinhibition was investigated in peach at the whole-plant level; furthermore, the effect of variation in photon pressure on energy management was considered at the single-leaf level. In the third section, the quenching analysis method suggested by Kornyeyev and Hendrickson (2007) was validated on peach. It was then applied in the field, where the influence of a moderate reduction in light and water on peach photosynthetic performance, water requirements, energy management and photoinhibition was studied.
Using solar energy as the fuel of life is intrinsically risky for plants because of the constant, high risk of photodamage. This dissertation tries to highlight the complex relation between plants, in particular peach, and light, analysing the principal strategies plants have developed to manage incoming light so as to derive the maximum possible benefit while minimizing the risks. First, the new method proposed for the determination of functional PSII, based on P700 redox kinetics, appears to be a valid, non-intrusive, universal and field-applicable technique, also because it probes the whole depth of the leaf tissue rather than only the first leaf layers, as fluorescence does. The fluorescence Fv/Fm parameter gives a good estimate of functional PSII, but only when data obtained from the adaxial and abaxial leaf surfaces are averaged. In addition to this method, the energy quenching analysis proposed by Kornyeyev and Hendrickson (2007), combined with the photosynthesis model proposed by von Caemmerer (2000), is a powerful tool to analyse and study, even in the field, the relation between the plant and environmental factors such as water and temperature, but above all light. The "Asymmetric" training system is a good way to study the relations between light energy, photosynthetic performance and water use in the field. At the whole-plant level, net carboxylation increases with PPFD up to a saturation point. Excess light, rather than improving photosynthesis, may exacerbate water and thermal stress, leading to stomatal limitation. Furthermore, too much light does not improve net carboxylation but damages PSII; in the most light-exposed plants about 50-60% of total PSII is inactivated. At the single-leaf level, net carboxylation increases up to the saturation point (1000-1200 μmol m-2 s-1), and excess light is dissipated by non-photochemical quenching and by non-net-carboxylative electron transport. The latter follows a pattern quite similar to the Pn/PPFD curve, reaching saturation at almost the same photon flux density. At middle-low irradiance, NPQ seems to be lumen-pH limited, because the incoming photon pressure is not sufficient to generate the optimal lumen pH for full activation of violaxanthin de-epoxidase (VDE). Peach leaves try to cope with excess light by increasing the non-net-carboxylative transports. As PPFD rises, the xanthophyll cycle is increasingly activated and the rate of the non-net-carboxylative transports is reduced. Some of these alternative transports, such as the water-water cycle, cyclic transport around PSI and the glutathione-ascorbate cycle, can generate additional H+ in the lumen to support VDE activation when light is limiting. Moreover, the alternative transports seem to act as an important dissipative pathway when high temperature and sub-optimal conductance increase the photoinhibition risk. In peach, a moderate reduction of water and light does not decrease net carboxylation; instead, by diminishing the incoming light and the evapo-transpiration demand of the environment, stomatal conductance decreases and water use efficiency improves. Therefore, by lowering light intensity to levels that are still non-limiting, water could be saved without compromising net photosynthesis. The quenching analysis is able to partition the absorbed energy among the various utilization, photoprotection and photo-oxidation pathways. When recovery is permitted, only a few PSII remain unrepaired, although more net PSII damage is recorded in plants placed in full light.
In this experiment too, under over-saturating light the main dissipation pathway is non-photochemical quenching; at middle-low irradiance it seems to be pH-limited, and other transports, such as photorespiration and the alternative transports, are used to support photoprotection and to contribute to creating the optimal trans-thylakoid ΔpH for violaxanthin de-epoxidase. These alternative pathways become the main quenching mechanisms in very low-light environments. Another aspect pointed out by this study is the role of NPQ as a dissipative pathway when conductance becomes severely limiting. The evidence that in nature only a small amount of damaged PSII is seen indicates the presence of an effective and efficient recovery mechanism that masks the real photodamage occurring during the day. At the single-leaf level, when repair is not allowed, leaves in full light are twofold more photoinhibited than shaded ones. Therefore, light in excess of the photosynthetic optimum does not promote net carboxylation but increases water loss and PSII damage. The greater the photoinhibition, the more photosystems must be repaired and, consequently, the more energy and dry matter must be allocated to this essential activity. Since above the saturation point net photosynthesis is constant while photoinhibition increases, it would be interesting to investigate what photodamage costs in terms of tree productivity. Another aspect of pivotal importance to be further explored is the combined influence of light and other environmental parameters, such as water status, temperature and nutrition, on the management of light, water and photosynthates in peach.
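For reference, the two chlorophyll-fluorescence indices used throughout this abstract, Fv/Fm and NPQ, are conventionally computed from the dark-adapted minimal and maximal fluorescence levels (F0, Fm) and the light-adapted maximal fluorescence (Fm'); the full Kornyeyev and Hendrickson (2007) partitioning is not reproduced in the abstract, so the following shows only the textbook definitions of these indices:

\[
\frac{F_v}{F_m} \;=\; \frac{F_m - F_0}{F_m},
\qquad
\mathrm{NPQ} \;=\; \frac{F_m - F_m'}{F_m'}
\]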
Abstract:
This thesis presents several data processing and compression techniques capable of addressing the strict requirements of wireless sensor networks. After a general overview of sensor networks, the energy problem is introduced, dividing the different energy reduction approaches according to the subsystem they try to optimize. To manage the complexity brought by these techniques, a quick overview of the most common middlewares for WSNs is given, describing in detail SPINE2, a framework for data processing in the node environment. The focus is then shifted to in-network aggregation techniques, used to reduce the data sent by the network nodes in order to prolong the network lifetime as long as possible. Among the several techniques, the most promising approach is Compressive Sensing (CS). To investigate this technique, a practical implementation of the algorithm is compared against a simpler aggregation scheme, deriving a mixed algorithm able to successfully reduce the power consumption. The analysis then moves from compression implemented on single nodes to CS for signal ensembles, trying to exploit the correlations among sensors and nodes to improve compression and reconstruction quality. The two main techniques for signal ensembles, Distributed CS (DCS) and Kronecker CS (KCS), are introduced and compared against a common set of data gathered from real deployments. The best trade-off between reconstruction quality and power consumption is then investigated. The use of CS is also addressed when the signal of interest is sampled at a sub-Nyquist rate, and the reconstruction performance is evaluated. Finally, group-sparsity CS (GS-CS) is compared to another well-known technique for the reconstruction of signals from a highly sub-sampled version. These two frameworks are again compared on a real data set, and an insightful analysis of the trade-off between reconstruction quality and lifetime is given.
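To make the compressive sensing step more concrete, the sketch below shows the basic recover-from-few-measurements idea on a toy sparse signal, using a random Gaussian measurement matrix and orthogonal matching pursuit; the signal, matrix sizes and sparsity level are invented for illustration, and this is not the thesis's actual implementation or data set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sparse signal: n samples, only k non-zero coefficients (hypothetical values).
n, m, k = 256, 64, 5
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)

# Random Gaussian measurement matrix: a node would transmit only m << n measurements.
Phi = rng.normal(size=(m, n)) / np.sqrt(m)
y = Phi @ x  # compressed measurements sent over the WSN

def omp(Phi, y, k):
    """Orthogonal Matching Pursuit: greedy recovery of a k-sparse signal."""
    residual, support = y.copy(), []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        # Least-squares fit restricted to the selected support.
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(Phi, y, k)
print("relative reconstruction error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```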
Abstract:
With advances in pediatric cardiology and cardiac surgery, the population of adults with congenital heart disease (CHD) has increased. In the current era, there are more adults with CHD than children. This population has many unique issues and needs. They have distinctive forms of heart failure and their cardiac disease can be associated with pulmonary hypertension, thromboemboli, complex arrhythmias and sudden death. Medical aspects that need to be considered relate to the long-term and multisystemic effects of single ventricle physiology, cyanosis, systemic right ventricles, complex intracardiac baffles and failing subpulmonary right ventricles. Since the 2001 Canadian Cardiovascular Society Consensus Conference report on the management of adults with CHD, there have been significant advances in the field of adult CHD. Therefore, new clinical guidelines have been written by Canadian adult CHD physicians in collaboration with an international panel of experts in the field. Part III of the guidelines includes recommendations for the care of patients with complete transposition of the great arteries, congenitally corrected transposition of the great arteries, Fontan operations and single ventricles, Eisenmenger's syndrome, and cyanotic heart disease. Topics addressed include genetics, clinical outcomes, recommended diagnostic workup, surgical and interventional options, treatment of arrhythmias, assessment of pregnancy risk and follow-up requirements. The complete document consists of four manuscripts, which are published online in the present issue of The Canadian Journal of Cardiology. The complete document and references can also be found at www.ccs.ca or www.cachnet.org.
Abstract:
With advances in pediatric cardiology and cardiac surgery, the population of adults with congenital heart disease (CHD) has increased. In the current era, there are more adults with CHD than children. This population has many unique issues and needs. They have distinctive forms of heart failure, and their cardiac disease can be associated with pulmonary hypertension, thromboemboli, complex arrhythmias and sudden death. Medical aspects that need to be considered relate to the long-term and multisystemic effects of single-ventricle physiology, cyanosis, systemic right ventricles, complex intracardiac baffles and failing subpulmonary right ventricles. Since the 2001 Canadian Cardiovascular Society Consensus Conference report on the management of adults with CHD, there have been significant advances in the understanding of the late outcomes, genetics, medical therapy and interventional approaches in the field of adult CHD. Therefore, new clinical guidelines have been written by Canadian adult CHD physicians in collaboration with an international panel of experts in the field. The present executive summary is a brief overview of the new guidelines and includes the recommendations for interventions. The complete document consists of four manuscripts that are published online in the present issue of The Canadian Journal of Cardiology, including sections on genetics, clinical outcomes, recommended diagnostic workup, surgical and interventional options, treatment of arrhythmias, assessment of pregnancy and contraception risks, and follow-up requirements. The complete document and references can also be found at www.ccs.ca or www.cachnet.org.
Abstract:
Electric power grids throughout the world suffer from serious inefficiencies associated with under-utilization due to demand patterns, engineering design and the load-following approaches in use today. These grids consume much of the world's energy and represent a large carbon footprint. From a material utilization perspective, significant hardware is manufactured and installed for this infrastructure, often to be used at less than 20-40% of its operational capacity for most of its lifetime. These inefficiencies lead engineers to require additional grid support and conventional generation capacity additions when renewable technologies (such as solar and wind) and electric vehicles are added to the utility demand/supply mix. Using actual data from PJM [PJM 2009], this work shows that consumer load management, real-time price signals, sensors and intelligent demand/supply control offer a compelling path toward increasing the utilization efficiency of the world's grids and reducing their carbon footprint. Under-utilization figures from many distribution companies indicate that distribution feeders are often operated at only 70-80% of their peak capacity for a few hours per year, and on average are loaded to less than 30-40% of their capability. By creating strong societal connections between consumers and energy providers, technology can radically change this situation. With intelligent deployment of smart sensors, smart electric vehicles and consumer-based load management technology, very high saturations of intermittent renewable energy supplies can be effectively controlled and dispatched to increase the utilization of existing utility distribution, substation, transmission and generation equipment. Strengthening these technology, society and consumer relationships requires rapid dissemination of knowledge (real-time prices, cost and benefit sharing, demand response requirements) in order to incentivize behaviors that can increase the effective use of the technological equipment that represents one of the largest capital assets modern society has created.
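As a small illustration of the utilization metrics discussed above, the following sketch computes peak and average utilization for a hypothetical hourly feeder load profile; the profile, the 40 MW rating and the 80% threshold are made-up values for the example, not PJM data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical hourly feeder load for one year (MW), with a daily cycle plus noise.
hours = 8760
load = 20 + 8 * np.sin(np.arange(hours) * 2 * np.pi / 24) + rng.normal(0, 2, hours)
rated_capacity = 40.0  # MW, assumed feeder rating

peak_utilization = load.max() / rated_capacity        # how close the feeder ever gets to its rating
average_utilization = load.mean() / rated_capacity    # typical loading over the year
hours_above_80pct = int((load > 0.8 * rated_capacity).sum())

print(f"peak utilization:          {peak_utilization:.0%}")
print(f"average utilization:       {average_utilization:.0%}")
print(f"hours above 80% of rating: {hours_above_80pct}")
```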
Abstract:
We appreciate the thorough discussion provided by Professor Yuan Ding. His comments raise legitimate issues. In this response, we offer clarifications and suggest avenues for future research. Our response follows the structure of the discussant’s paper and elaborates on each point separately.
Abstract:
We test for differences in financial reporting quality between companies that are required to file periodically with the SEC and those that are exempted from filing reports with the SEC under Rule 12g3-2(b). We examine three earnings quality measures: conservatism, abnormal accruals, and the predictability of earnings. For all three measures, our results show that financial reporting quality differs between companies that file with the SEC and companies exempt from the filing requirements. This paper provides empirical evidence of a link between filing with the SEC and financial reporting quality for foreign firms.
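The abstract does not say how abnormal accruals are measured; a common proxy in this literature is a Jones-type regression, sketched below on synthetic data purely to illustrate the idea. The variable values and coefficients are invented, and this is not necessarily the authors' specification.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical firm-year panel, all variables scaled by lagged total assets.
n = 200
inv_assets = rng.uniform(1e-4, 1e-2, n)      # 1 / lagged total assets
delta_rev = rng.normal(0.05, 0.10, n)        # change in revenues
ppe = rng.uniform(0.2, 0.8, n)               # gross property, plant & equipment
total_accruals = 0.01 * inv_assets - 0.05 * delta_rev - 0.03 * ppe + rng.normal(0, 0.02, n)

# Jones (1991)-style regression: fitted values approximate "normal" accruals,
# residuals are treated as abnormal (discretionary) accruals.
X = np.column_stack([inv_assets, delta_rev, ppe])
coef, *_ = np.linalg.lstsq(X, total_accruals, rcond=None)
abnormal_accruals = total_accruals - X @ coef
print("mean |abnormal accruals|:", np.abs(abnormal_accruals).mean())
```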
Abstract:
With research on Wireless Sensor Networks (WSNs) having become more and more mature in the past five years, researchers from universities all over the world have set up testbeds of wireless sensor networks, in most cases to test and evaluate the real-world behavior of newly developed WSN protocol mechanisms. Although these testbeds differ heavily in the employed sensor node types and the general architectural set-up, they all have similar requirements with respect to management and scheduling functionality: like every shared resource, a testbed requires a notion of users, resource reservation features, support for reprogramming and reconfiguration of the nodes, provisions to debug and remotely reset sensor nodes in case of node failures, as well as a solution for collecting and storing experimental data. The TARWIS management architecture presented in this paper aims to provide these functionalities independently of node type and node operating system. TARWIS has been designed as a re-usable management solution for research- and/or education-oriented wireless sensor network testbeds, relieving researchers intending to deploy a testbed of the burden of implementing their own scheduling and testbed management solutions from scratch.
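TARWIS's internal data model is not described in this abstract; purely as an illustration of the kind of scheduling information such a testbed management system has to track (user, reserved nodes, firmware image, time slot), a reservation record might be sketched as follows, with all field names hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Reservation:
    """Illustrative testbed reservation record (hypothetical fields, not TARWIS's schema)."""
    user: str
    node_ids: list[str]
    firmware_image: str          # image to (re)program on the reserved nodes
    start: datetime
    end: datetime
    results_dir: str = "results/"

    def overlaps(self, other: "Reservation") -> bool:
        """Two reservations conflict if they share a node and their time slots overlap."""
        shares_node = bool(set(self.node_ids) & set(other.node_ids))
        return shares_node and self.start < other.end and other.start < self.end

r1 = Reservation("alice", ["n1", "n2"], "collect-v1.hex",
                 datetime(2024, 5, 1, 9), datetime(2024, 5, 1, 12))
r2 = Reservation("bob", ["n2"], "rssi-scan.hex",
                 datetime(2024, 5, 1, 11), datetime(2024, 5, 1, 14))
print(r1.overlaps(r2))  # True: node n2 would be double-booked
```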
Abstract:
Because Additive Manufacturing (AM) differs from traditional production processes, new possibilities arise in product design and in the supply chain setup. The consequences of lifting traditional restrictions in product design are being discussed intensively under the term "Design for Additive Manufacturing". In the same way, AM removes restrictions in the traditional supply chain setup. In particular, the following improvements become possible: reduction of lot sizes and lead times, on-demand production, decentralized production, customization at the component level, and continuous further development of components. Many companies do not invest in AM technologies themselves but instead buy components from suppliers. To realize the potential of the AM supply chain with suppliers, the following requirements arise for AM purchasing processes. First, the effort per order must be reduced. Second, AM users need direct access to the suppliers without a detour via the purchasing department. Third, it must be easy to identify suitable AM suppliers. Fourth, switching suppliers must be possible with as little effort as possible. AM-specific e-procurement systems are one possible solution for meeting these requirements.
Abstract:
Temporal data are a core element of a reservation. In this paper we formulate 10 requirements and 14 sub-requirements for handling temporal data in online hotel reservation systems (OHRS) from a usability viewpoint. We test the fulfillment of these requirements for city and resort hotels in Austria and Switzerland. Some of the requirements are widely met; however, many are fulfilled only by a surprisingly small number of hotels. In particular, numerous systems offer options for selecting dates that lead to error messages in the next step. A few screenshots illustrate flaws of the systems. We also draw conclusions on the state of applying software engineering principles in the development of Web pages.
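As a hedged illustration of the kind of temporal-data check whose absence produces the error messages mentioned above (for example a departure date on or before the arrival date, or an arrival date in the past), a reservation front end might validate the selected dates as sketched below; the function name and message texts are invented for the example.

```python
from datetime import date

def validate_stay(arrival: date, departure: date, today: date) -> list[str]:
    """Return a list of problems with the requested stay; an empty list means the dates are selectable."""
    problems = []
    if arrival < today:
        problems.append("arrival date lies in the past")
    if departure <= arrival:
        problems.append("departure must be after arrival")
    return problems

print(validate_stay(date(2024, 7, 10), date(2024, 7, 9), today=date(2024, 7, 1)))
# ['departure must be after arrival'] -- ideally the UI prevents this selection
# instead of reporting an error on the next page.
```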
Abstract:
Two experiments were conducted to evaluate the effects of body condition scores of beef calves on performance efficiency and carcass characteristics. In Experiment 1, 111 steer calves were stratified by breed and condition score (CS) and randomly allotted to 14 pens. The study was analyzed as a 2 x 3 factorial design, with two breeds (Angus and Simmental) and three initial CS (4.4, 5.1, and 5.6). In Experiment 2, 76 steer calves were allotted to six pens by CS. The resultant pens averaged 3.9, 4.5, 4.7, 5.0, 5.1, and 5.6 in CS. Calves in both studies were fed a corn-based finishing diet formulated to 13.5% crude protein. All calves were implanted with Synovex-S® initially and reimplanted with Revalor-S®. In Experiment 1, 29-day dry matter intake (lb/day) increased with CS (17.9, 18.1, and 19.1 for 4.4, 5.1, and 5.6, respectively; p < .04). Daily gain (29 days) tended to decrease with increasing CS (4.19, 3.71, and 3.26; p < .13). Days on feed decreased with increasing CS (185, 180, and 178 d; p < .07). In Experiment 2, daily gains also increased with decreasing initial CS for the first 114 days (p < .05) and tended to increase overall (p < .20). In Experiment 1, calves with lower initial CS had less external fat at slaughter (.48, .53, and .61 in. for CS 4.4, 5.1, and 5.6, respectively; p < .05). This effect was also noted at slaughter (p < .10), as well as at 57 days (p < .06) and at 148 days (p < .06) as measured by real-time ultrasound. Measurements of intramuscular fat and marbling were not different in either study. These data suggest that CS of feeder calves may be a useful tool for adjusting energy requirements of calves based on body condition. Also, feeder cattle may be sorted into outcome or management groups earlier than currently practiced using body condition and/or real-time ultrasound.
Abstract:
Various applications for event detection, localization, and monitoring can benefit from the use of wireless sensor networks (WSNs). Wireless sensor networks are generally easy to deploy, have a flexible topology, and can support a diversity of tasks thanks to the large variety of sensors that can be attached to the wireless sensor nodes. To guarantee the efficient operation of such heterogeneous wireless sensor networks during their lifetime, appropriate management is necessary. Typically, there are three management tasks, namely monitoring, (re)configuration, and code updating. On the one hand, status information, such as battery state and node connectivity, of both the wireless sensor network and the sensor nodes has to be monitored. On the other hand, sensor nodes have to be (re)configured, e.g., by setting the sensing interval. Most importantly, new applications have to be deployed and bug fixes applied during the network lifetime. All management tasks have to be performed in a reliable, time- and energy-efficient manner. The ability to disseminate data from one sender to multiple receivers in a reliable, time- and energy-efficient manner is critical for the execution of the management tasks, especially for code updating. Using multicast communication in wireless sensor networks is an efficient way to handle such traffic patterns. Due to the nature of code updates, a multicast protocol has to support bulky traffic and end-to-end reliability. Further, the limited resources of wireless sensor nodes demand an energy-efficient operation of the multicast protocol. Current data dissemination schemes do not fulfil all of the above requirements. In order to close this gap, we designed the Sensor Node Overlay Multicast (SNOMC) protocol to support reliable, time-efficient and energy-efficient dissemination of data from one sender node to multiple receivers. In contrast to other multicast transport protocols, which do not support reliability mechanisms, SNOMC supports end-to-end reliability using a NACK-based reliability mechanism. The mechanism is simple, easy to implement, and can significantly reduce the number of transmissions. It is complemented by a data acknowledgement after successful reception of all data fragments by the receiver nodes. In SNOMC, three different caching strategies are integrated for efficient handling of the necessary retransmissions, namely caching on each intermediate node, caching on branching nodes, or caching only on the sender node. Moreover, an option was included to pro-actively request missing fragments. SNOMC was evaluated both in the OMNeT++ simulator and in our in-house real-world testbed, and compared to a number of common data dissemination protocols, such as Flooding, MPR, TinyCubus, PSFQ, and both UDP and TCP. The results showed that SNOMC outperforms the selected protocols in terms of transmission time, number of transmitted packets, and energy consumption. Moreover, we showed that SNOMC performs well with different underlying MAC protocols, which support different levels of reliability and energy efficiency. Thus, SNOMC can offer a robust, high-performing solution for the efficient distribution of code updates and management information in a wireless sensor network. To address the three management tasks, in this thesis we developed the Management Architecture for Wireless Sensor Networks (MARWIS). MARWIS is specifically designed for the management of heterogeneous wireless sensor networks.
A distinguishing feature of its design is the use of wireless mesh nodes as a backbone, which enables diverse communication platforms and the offloading of functionality from the sensor nodes to the mesh nodes. This hierarchical architecture allows for efficient operation of the management tasks, due to the organisation of the sensor nodes into small sub-networks, each managed by a mesh node. Furthermore, we developed an intuitive web-based graphical user interface, which allows non-expert users to easily perform management tasks in the network. In contrast to other management frameworks, such as Mate, MANNA, and TinyCubus, or code dissemination protocols, such as Impala, Trickle, and Deluge, MARWIS offers an integrated solution for monitoring, configuration and code updating of sensor nodes. Integration of SNOMC into MARWIS further increases the performance efficiency of the management tasks. To our knowledge, our approach is the first to offer a combination of a management architecture with an efficient overlay multicast transport protocol. This combination of SNOMC and MARWIS supports reliable, time- and energy-efficient operation of a heterogeneous wireless sensor network.
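SNOMC's packet formats and timers are not given in the abstract; the sketch below only schematizes the NACK-based fragment recovery idea described above (a receiver tracks which fragments of a code image have arrived, requests the missing ones in a NACK, and sends a data acknowledgement once the set is complete), with all names and sizes invented for illustration.

```python
class FragmentReceiver:
    """Schematic NACK-based reliability at a receiver node (illustrative, not SNOMC's implementation)."""

    def __init__(self, total_fragments: int):
        self.total = total_fragments
        self.received = {}  # fragment number -> payload bytes

    def on_fragment(self, seq: int, payload: bytes) -> None:
        self.received[seq] = payload

    def missing(self) -> list:
        """Fragments to request in a NACK; an empty list means the transfer is complete."""
        return [i for i in range(self.total) if i not in self.received]

    def build_control_message(self) -> dict:
        gaps = self.missing()
        if gaps:
            return {"type": "NACK", "missing": gaps}   # ask the sender (or a caching node) to retransmit
        return {"type": "DATA_ACK"}                    # all fragments received successfully

rx = FragmentReceiver(total_fragments=5)
for seq in (0, 1, 3, 4):            # fragment 2 is lost on the wireless link
    rx.on_fragment(seq, b"...")
print(rx.build_control_message())   # {'type': 'NACK', 'missing': [2]}
```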
Abstract:
The identification of plausible causes of water body status deterioration will be much easier if it can build on available, reliable, extensive and comprehensive biogeochemical monitoring data (preferably aggregated in a database). A plausible identification of such causes is a prerequisite for well-informed decisions on which mitigation or remediation measures to take. In this chapter, a rationale for an extended monitoring programme is first provided and then compared to the one required by the Water Framework Directive (WFD). This proposal includes a list of relevant parameters that are needed for an integrated, a priori status assessment. Secondly, a few sophisticated statistical tools are described that subsequently allow for the estimation of the magnitude of impairment as well as the likely relative importance of different stressors in a multiply stressed environment. The advantages and restrictions of these rather complicated analytical methods are discussed. Finally, the use of Decision Support Systems (DSS) is advocated with regard to the specific WFD implementation requirements.
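The statistical tools themselves are only named in this chapter summary; as one hedged example of how the relative importance of several stressors might be screened from monitoring data, a simple standardized multiple regression on synthetic data could look like the sketch below (the chapter's actual methods may be considerably more sophisticated, and the stressor names and coefficients are invented).

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic monitoring data: three candidate stressors and one biological status metric.
n = 150
nutrients = rng.normal(size=n)      # e.g. nutrient load
hydromorph = rng.normal(size=n)     # e.g. hydromorphological alteration
toxicants = rng.normal(size=n)      # e.g. toxicant pressure
status = -0.6 * nutrients - 0.2 * hydromorph + 0.05 * toxicants + rng.normal(0, 0.5, n)

# Standardized regression coefficients as a crude ranking of stressor importance.
X = np.column_stack([nutrients, hydromorph, toxicants])
Xz = (X - X.mean(0)) / X.std(0)
yz = (status - status.mean()) / status.std()
beta, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
for name, b in zip(["nutrients", "hydromorphology", "toxicants"], beta):
    print(f"{name:16s} standardized effect: {b:+.2f}")
```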