Abstract:
Increased professionalism in rugby has elicited rapid changes in the fitness profile of elite players. Recent research focusing on the physiological and anthropometric characteristics of rugby players, and on the demands of competition, is reviewed. The paucity of research on contemporary elite rugby players is highlighted, along with the need for standardised testing protocols. Recent data reinforce the pronounced differences in the anthropometric and physical characteristics of the forwards and backs. Forwards are typically heavier, taller, and have a greater proportion of body fat than backs. These characteristics are changing, with forwards developing greater total mass and higher muscularity. The forwards demonstrate superior absolute aerobic and anaerobic power, and muscular strength. Results favour the backs when body mass is taken into account. The scaling of results to body mass can be problematic and future investigations should present results using power function ratios. Recommended tests for elite players include body mass and skinfolds, vertical jump, speed, and the multi-stage shuttle run. Repeat sprint testing is a possible avenue for more specific evaluation of players. During competition, high-intensity efforts are often followed by periods of incomplete recovery. The total work over the duration of a game is lower in the backs compared with the forwards; forwards spend greater time in physical contact with the opposition while the backs spend more time in free running, allowing them to cover greater distances. The intense efforts undertaken by rugby players place considerable stress on anaerobic energy sources, while the aerobic system provides energy during repeated efforts and for recovery. Training should focus on repeated brief high-intensity efforts with short rest intervals to condition players to the demands of the game.
Training for the forwards should emphasise the higher work rates of the game, while extended rest periods can be provided to the backs. Players should be prepared not only for the demands of competition, but also for the stress of travel and extreme environmental conditions. The greater professionalism of rugby union has increased scientific research in the sport; however, there is scope for significant refinement of investigations on the physiological demands of the game, and sports-specific testing procedures.
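The power-function ratios recommended above scale a raw test result by body mass raised to an exponent. A minimal Python sketch of the idea, using an illustrative exponent of 0.67 (a commonly cited allometric value; the appropriate exponent for any given measure would need to be established empirically, and the scores below are hypothetical):

```python
def power_function_ratio(score, body_mass_kg, b=0.67):
    """Scale a raw performance score by body mass using a power-function ratio.

    The exponent b is measure-specific; 0.67 is used here purely for
    illustration.
    """
    return score / body_mass_kg ** b

# Hypothetical scores: a heavier forward with a higher absolute result can
# rank below a lighter back once body mass is accounted for.
forward = power_function_ratio(200.0, 110.0)
back = power_function_ratio(180.0, 85.0)
print(forward < back)
```

This mirrors the pattern noted in the abstract: absolute results favour the forwards, but mass-scaled results can favour the backs.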
Abstract:
Primary objective: To investigate the speed and accuracy of tongue movements exhibited by a sample of children with dysarthria following severe traumatic brain injury (TBI) during speech using electromagnetic articulography (EMA). Methods and procedures: Four children, aged between 12.75 and 17.17 years with dysarthria following TBI, were assessed using the AG-100 electromagnetic articulography system (Carstens Medizinelektronik). The movement trajectories of receiver coils affixed to each child's tongue were examined during consonant productions, together with a range of quantitative kinematic parameters. The children's results were individually compared against the mean values obtained by a group of eight control children (mean age of 14.67 years, SD 1.60). Main outcomes and results: All four TBI children were perceived to exhibit reduced rates of speech and increased word durations. Objective EMA analysis revealed that two of the TBI children exhibited significantly longer consonant durations compared to the control group, resulting from different underlying mechanisms relating to speed generation capabilities and distances travelled. The other two TBI children did not exhibit increased initial consonant movement durations, suggesting that the vowels and/or final consonants may have been contributing to the increased word durations. Conclusions and clinical implications: The finding of different underlying articulatory kinematic profiles has important implications for the treatment of speech rate disturbances in children with dysarthria following TBI.
Abstract:
An increasing number of studies show that glycogen-accumulating organisms (GAOs) can survive and may indeed proliferate under the alternating anaerobic/aerobic conditions found in EBPR systems, making them strong competitors of the polyphosphate-accumulating organisms (PAOs). Understanding their behaviour in a mixed PAO and GAO culture under various operational conditions is essential for developing operating strategies that disadvantage the growth of this group of unwanted organisms. A model-based data analysis method is developed in this paper for the study of the anaerobic PAO and GAO activities in a mixed PAO and GAO culture. The method primarily makes use of the hydrogen ion production rate and the carbon dioxide transfer rate resulting from the acetate uptake processes by PAOs and GAOs, measured with a recently developed titration and off-gas analysis (TOGA) sensor. The method is demonstrated using data from a laboratory-scale sequencing batch reactor (SBR) operated under alternating anaerobic and aerobic conditions. The data analysis using the proposed method strongly indicates a coexistence of PAOs and GAOs in the system, which was independently confirmed by fluorescence in situ hybridization (FISH) measurement. The model-based analysis also allowed the identification of the respective acetate uptake rates by PAOs and GAOs, along with a number of kinetic and stoichiometric parameters involved in the PAO and GAO models. The excellent fit between the model predictions and the experimental data not involved in parameter identification shows that the parameter values found are reliable and accurate. It also demonstrates that the current anaerobic PAO and GAO models are able to accurately characterize the PAO/GAO mixed culture obtained in this study.
This is of major importance as no pure culture of either PAOs or GAOs has been reported to date, and hence the current PAO and GAO models were developed for the interpretation of experimental results of mixed cultures. The proposed method is readily applicable for detailed investigations of the competition between PAOs and GAOs in enriched cultures. However, the fermentation of organic substrates carried out by ordinary heterotrophs needs to be accounted for when the method is applied to the study of PAO and GAO competition in full-scale sludges. (C) 2003 Wiley Periodicals, Inc.
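The core of the model-based analysis is attributing two measured signals (hydrogen ion production and CO2 transfer) to the two unknown acetate uptake rates. A toy Python sketch of that idea as a 2x2 linear system; the stoichiometric coefficients and measurements below are entirely hypothetical placeholders, not the values identified in the study:

```python
# Two measured rates are expressed as linear combinations of the unknown
# acetate uptake rates of PAOs and GAOs, then the system is solved directly.

def split_acetate_uptake(hp_rate, ctr, coeffs):
    """Solve [hp_rate, ctr] = A @ [r_PAO, r_GAO] for the two uptake rates."""
    (a11, a12), (a21, a22) = coeffs  # rows: H+ production, CO2 transfer
    det = a11 * a22 - a12 * a21
    if det == 0:
        raise ValueError("stoichiometric matrix is singular")
    r_pao = (hp_rate * a22 - a12 * ctr) / det
    r_gao = (a11 * ctr - hp_rate * a21) / det
    return r_pao, r_gao

# Hypothetical coefficients and measured rates:
A = ((0.25, 0.10), (0.05, 0.30))
r_pao, r_gao = split_acetate_uptake(0.45, 0.65, A)
print(r_pao, r_gao)
```

In practice the paper fits full kinetic models to time-series sensor data rather than solving a single algebraic system, but the attribution principle is the same.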
Abstract:
Biological nitrogen removal via the nitrite pathway in wastewater treatment is very important, particularly for reducing the cost of aeration and the electron donor demand for denitrification. Wastewater nitrification and nitrite accumulation were carried out in a biofilm reactor. The biofilm reactor showed almost complete nitrification, and most of the oxidized ammonium was present as nitrite at an ammonium load of 1.2 kg N/m3/d. Nitrite accumulation was achieved by the selective inhibition of nitrite oxidizers through free ammonia and oxygen limitation. Nitrite oxidation activity recovered as soon as the inhibition factor was removed. Fluorescence in situ hybridization (FISH) studies of the nitrite-accumulating biofilm system showed that the genus Nitrosomonas, specifically hybridized with probe NSM 156, was the dominant nitrifier, while Nitrospira was less abundant than in normal nitrification systems. Further FISH analysis identified combinations of Nitrosomonas and Nitrospira cells as important populations of nitrifying bacteria in an autotrophic nitrifying biofilm system.
Abstract:
This paper discusses a document discovery tool based on Conceptual Clustering by Formal Concept Analysis. The program allows users to navigate e-mail using a visual lattice metaphor rather than a tree. It implements a virtual file structure over e-mail where files and entire directories can appear in multiple positions. The content and shape of the lattice formed by the conceptual ontology can assist in e-mail discovery. The system described provides more flexibility in retrieving stored e-mails than what is normally available in e-mail clients. The paper discusses how conceptual ontologies can leverage traditional document retrieval systems and aid knowledge discovery in document collections.
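In Formal Concept Analysis, a formal concept is a maximal pair (extent, intent) such that the extent is exactly the set of objects sharing the intent, and vice versa. A brute-force Python sketch on a made-up e-mail context (the abstract's tool is far more sophisticated; this only illustrates the underlying definition):

```python
from itertools import combinations

# Toy context: objects are e-mails, attributes are terms/labels.
context = {
    "mail1": {"project", "budget"},
    "mail2": {"project", "meeting"},
    "mail3": {"project", "budget", "meeting"},
}
objects = set(context)
attributes = set().union(*context.values())

def extent(attrs):
    """Objects having every attribute in attrs."""
    return {o for o in objects if attrs <= context[o]}

def intent(objs):
    """Attributes shared by every object in objs."""
    if not objs:
        return set(attributes)
    return set.intersection(*(context[o] for o in objs))

def concepts():
    """Enumerate all formal concepts by closing every object subset."""
    found = set()
    for r in range(len(objects) + 1):
        for objs in combinations(sorted(objects), r):
            i = intent(set(objs))
            found.add((frozenset(extent(i)), frozenset(i)))
    return found

for e, i in sorted(concepts(), key=lambda c: (len(c[0]), sorted(c[1]))):
    print(sorted(e), sorted(i))
```

The resulting concepts, ordered by extent inclusion, form the lattice the tool uses as its navigation metaphor; an e-mail can then sit under several concepts at once, which is why files appear in multiple positions.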
Abstract:
The utility of 16S rDNA restriction fragment length polymorphism (RFLP) analysis for the partial genomovar differentiation of Burkholderia cepacia complex bacteria is well documented. We compared the 16S rDNA RFLP signatures for a number of non-fermenting gram-negative bacilli (NF GNB) LMG control strains and clinical isolates pertaining to the genera Burkholderia, Pseudomonas, Achromobacter (Alcaligenes), Ralstonia, Stenotrophomonas and Pandoraea. A collection of 24 control strains (LMG) and 25 clinical isolates was included in the study. Using conventional PCR, a 1.2 kbp 16S rDNA fragment was generated for each organism. Following restriction digestion and electrophoresis, each clinical isolate RFLP signature was compared to those of the control strain panel. Nineteen different RFLP signatures were detected from the 28 control strains included in the study. Twenty-one of the 25 clinical isolates could be classified by RFLP analysis into a single genus and species when compared to the patterns produced by the control strain panel. Four clinical B. pseudomallei isolates produced RFLP signatures which were indistinguishable from B. cepacia genomovars I, III and VIII. The identity of these four isolates was confirmed using B. pseudomallei-specific PCR. 16S rDNA RFLP analysis can be a useful identification strategy when applied to NF GNB, particularly for those which exhibit colistin sulfate resistance. The use of this molecular-based methodology has proved very useful in the setting of a CF referral laboratory, particularly when utilised in conjunction with B. cepacia complex and genomovar-specific PCR techniques. Species-specific PCR or sequence analysis should be considered for selected isolates, especially where discrepancies between epidemiological, phenotypic and genotypic characteristics occur.
Abstract:
Regional commodity forecasts are being used increasingly in agricultural industries to enhance their risk management and decision-making processes. These commodity forecasts are probabilistic in nature and are often integrated with a seasonal climate forecast system. The climate forecast system is based on a subset of analogue years drawn from the full climatological distribution. In this study we sought to measure forecast quality for such an integrated system. We investigated the quality of a commodity (i.e. wheat and sugar) forecast based on a subset of analogue years in relation to a standard reference forecast based on the full climatological set. We derived three key dimensions of forecast quality for such probabilistic forecasts: reliability, distribution shift, and change in dispersion. A measure of reliability was required to ensure no bias in the forecast distribution. This was assessed via the slope of the reliability plot, which was derived from examination of probability levels of forecasts and associated frequencies of realizations. The other two dimensions related to changes in features of the forecast distribution relative to the reference distribution. The relationship of 13 published accuracy/skill measures to these dimensions of forecast quality was assessed using principal component analysis in case studies of commodity forecasting using seasonal climate forecasting for the wheat and sugar industries in Australia. There were two orthogonal dimensions of forecast quality: one associated with distribution shift relative to the reference distribution and the other associated with relative distribution dispersion. Although the conventional quality measures aligned with these dimensions, none measured both adequately. We conclude that a multi-dimensional approach to assessment of forecast quality is required and that simple measures of reliability, distribution shift, and change in dispersion provide a means for such assessment. 
The analysis presented was also relevant to measuring the quality of probabilistic seasonal climate forecasting systems. The importance of retaining a focus on the probabilistic nature of the forecast and avoiding simplifying, but erroneous, distortions was discussed in relation to applying this new forecast quality assessment paradigm to seasonal climate forecasts. Copyright (C) 2003 Royal Meteorological Society.
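The three quality dimensions identified above lend themselves to simple computations. A hedged Python sketch with made-up numbers: distribution shift and change in dispersion compare an analogue-year forecast distribution against the full climatological reference, and the reliability slope is a least-squares fit of observed relative frequencies against stated forecast probabilities (a slope of 1.0 indicating a reliable system):

```python
import statistics

def distribution_shift(forecast, reference):
    """Shift of the forecast distribution relative to the reference."""
    return statistics.mean(forecast) - statistics.mean(reference)

def dispersion_ratio(forecast, reference):
    """Relative dispersion; < 1 means the forecast is sharper."""
    return statistics.stdev(forecast) / statistics.stdev(reference)

def reliability_slope(probs, freqs):
    """Least-squares slope of observed frequencies vs forecast probabilities."""
    n = len(probs)
    mp, mf = sum(probs) / n, sum(freqs) / n
    num = sum((p - mp) * (f - mf) for p, f in zip(probs, freqs))
    den = sum((p - mp) ** 2 for p in probs)
    return num / den

# Hypothetical yields (t/ha): full climatology vs an analogue-year subset.
climatology = [2.1, 2.8, 3.0, 3.4, 3.9, 4.2, 4.8]
analogue_years = [3.0, 3.4, 3.9, 4.2]
print(distribution_shift(analogue_years, climatology))
print(dispersion_ratio(analogue_years, climatology))
print(reliability_slope([0.2, 0.4, 0.6, 0.8], [0.25, 0.35, 0.65, 0.75]))
```

Here the analogue subset is shifted upward and is narrower than climatology, which is exactly the kind of useful forecast signal the single conventional skill scores fail to capture jointly.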
Abstract:
This work discusses the use of optical flow to generate the sensorial information a mobile robot needs to react to the presence of obstacles when navigating in a non-structured environment. A sensing system based on optical flow and time-to-collision calculation is proposed and experimentally evaluated here, and it satisfies two important requirements. The first is that all computations are performed onboard the robot, in spite of the limited computational capability available. The second is that the algorithms for optical flow and time-to-collision calculations are fast enough to give the mobile robot the capability of reacting to any environmental change in real time. Results of real experiments in which the proposed sensing system is used as the only source of sensorial data to guide a mobile robot to avoid obstacles while wandering around are presented, and the analysis of these results validates the proposed sensing system.
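A key property of time-to-collision is that it can be estimated from image measurements alone, without knowing distance or speed: for an approaching surface, TTC is approximately the image size of a tracked feature divided by its rate of expansion ("looming"). A minimal sketch of that classical relation, with hypothetical pixel measurements (the paper derives TTC from dense optical flow, which this does not reproduce):

```python
def time_to_collision(size_now, size_prev, dt):
    """TTC ~= s / (ds/dt), using the image-plane size of a tracked feature."""
    growth = (size_now - size_prev) / dt
    if growth <= 0:
        return float("inf")  # feature not expanding: no approach detected
    return size_now / growth

# A feature growing from 40 to 44 pixels over 0.1 s of video:
print(time_to_collision(44.0, 40.0, 0.1))
```

Because the estimate needs only ratios of image quantities, it suits the onboard, real-time constraint emphasised in the abstract.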
Abstract:
Purpose Achieving sustainability by rethinking products, services and strategies is an enormous challenge currently laid upon the economic sector, in which materials selection plays a critical role. In this context, the present work describes an environmental and economic life cycle analysis of a structural product, comparing two possible material alternatives. The product chosen is a storage tank, presently manufactured in stainless steel (SST) or in a glass fibre reinforced polymer composite (CST). The overall goal of the study is to identify environmental and economic strengths and weaknesses related to the life cycle of the two material alternatives. The consequential win-win or trade-off situations will be identified via a Life Cycle Assessment/Life Cycle Costing (LCA/LCC) integrated model. Methods The LCA/LCC integrated model used consists of applying the LCA methodology to the product system, incorporating, in parallel, its results into the LCC study, namely those of the Life Cycle Inventory (LCI) and the Life Cycle Impact Assessment (LCIA). Results In both the SST and CST systems the most significant life cycle phase is the raw materials production, in which the most significant environmental burdens correspond to the Fossil fuels and Respiratory inorganics categories. The LCA/LCC integrated analysis shows that the CST has globally a preferable environmental and economic profile, as its impacts are lower than those of the SST in all life cycle stages. Both the internal and external costs are lower, the former resulting mainly from the composite material being significantly less expensive than stainless steel. This therefore represents a full win-win situation. As a consequence, the study clearly indicates that using a thermoset composite material to manufacture storage tanks is environmentally and economically desirable. However, it was also evident that the environmental performance of the CST could be improved by altering its End-of-Life stage.
Conclusions The results of the present work provide enlightening insights into the synergies between the environmental and the economic performance of a structural product made with alternative materials. Further, they provide conclusive evidence to support the integration of environmental and economic life cycle analysis in the product development processes of a manufacturing company, or in some cases even in its procurement practices.
Abstract:
Graphical user interfaces (GUIs) are critical components of today's software. Given their increased relevance, the correctness and usability of GUIs are becoming essential. This paper describes the latest results in the development of our tool to reverse engineer the GUI layer of interactive computing systems. We use static analysis techniques to generate models of the user interface behaviour from source code. Models help in graphical user interface inspection by allowing designers to concentrate on its more important aspects. One particular type of model that the tool is able to generate is state machines. The paper shows how graph theory can be useful when applied to these models. A number of metrics and algorithms are used in the analysis of aspects of the user interface's quality. The ultimate goal of the tool is to enable analysis of interactive systems through inspection of GUI source code.
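One simple graph-theoretic check of the kind the abstract alludes to is finding GUI states that can never be reached from the initial screen. A Python sketch over a made-up state machine (not one produced by the authors' tool):

```python
from collections import deque

def unreachable_states(transitions, start):
    """Breadth-first search from the start state; return unvisited states."""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        for nxt in transitions.get(state, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return set(transitions) - seen

# Hypothetical state machine recovered from GUI code:
gui = {
    "login": ["main"],
    "main": ["settings", "login"],
    "settings": ["main"],
    "debug_panel": ["main"],  # has no incoming edge: dead UI state
}
print(unreachable_states(gui, "login"))
```

Other metrics mentioned in the abstract (e.g. path lengths between screens) reduce to equally standard graph algorithms over the same model.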
Abstract:
Current software development often relies on non-trivial coordination logic for combining autonomous services, possibly running on different platforms. As a rule, however, such a coordination layer is strongly woven into the application at the source code level. Therefore, its precise identification becomes a major methodological (and technical) problem and a challenge to any program understanding or refactoring process. The approach introduced in this paper resorts to slicing techniques to extract coordination data from source code. Such data are captured in a specific dependency graph structure from which a coordination model can be recovered, either in the form of an Orc specification or as a collection of code fragments corresponding to the identification of typical coordination patterns in the system. Tool support is also discussed.
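The slicing idea at the heart of this approach can be illustrated compactly: starting from a coordination point (say, a service call), walk the dependency graph backwards to collect every statement the call depends on. A toy Python sketch with a hypothetical graph (the paper's dependency graph structure is richer than this plain adjacency map):

```python
def backward_slice(deps, criterion):
    """Collect all statements the criterion transitively depends on.

    deps maps each statement to the statements it directly depends on.
    """
    sliced, stack = set(), [criterion]
    while stack:
        node = stack.pop()
        if node in sliced:
            continue
        sliced.add(node)
        stack.extend(deps.get(node, ()))
    return sliced

# Hypothetical dependencies around a service invocation:
deps = {
    "call_service": ["build_request", "endpoint"],
    "build_request": ["user_input"],
    "endpoint": ["config"],
    "log_metrics": ["call_service"],  # depends on the call, not vice versa
}
print(sorted(backward_slice(deps, "call_service")))
```

Everything outside the slice (here, `log_metrics`) is irrelevant to the coordination behaviour of the chosen criterion, which is what lets the technique isolate the coordination layer from the rest of the application.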
Abstract:
This paper explores the main determinants of the use of the cost accounting system (CAS) in Portuguese local government (PLG). Regression analysis is used to study the fit of a model of accounting changes in PLG, focused on cost accounting systems oriented to activities and outputs. Based on survey data gathered from PLG, we find that the use of this information in decision-making and external reporting is still a mirage. We obtain evidence of the influence of the internal organizational context (especially the lack of support and difficulties in the CAS implementation) on use for internal purposes, while the institutional environment (such as external pressures to implement the CAS) appears to be a stronger determinant of external use. The results strengthen the function of external reporting in legitimating the organization's activities to external stakeholders. On the other hand, some control variables (such as political competition, usefulness and experience) also show some explanatory power in the model. Some mixed results were found that call for further research. Our empirical results contribute to understanding the importance of interconnecting the contingency and institutional approaches to gain a clear picture of cost accounting changes in the public sector.
Abstract:
Graphical user interfaces (GUIs) are critical components of today's open source software. Given their increased relevance, the correctness and usability of GUIs are becoming essential. This paper describes the latest results in the development of our tool to reverse engineer the GUI layer of interactive computing open source systems. We use static analysis techniques to generate models of the user interface behavior from source code. Models help in graphical user interface inspection by allowing designers to concentrate on its more important aspects. One particular type of model that the tool is able to generate is state machines. The paper shows how graph theory can be useful when applied to these models. A number of metrics and algorithms are used in the analysis of aspects of the user interface's quality. The ultimate goal of the tool is to enable analysis of interactive systems through inspection of GUI source code.