977 results for operational modal analysis
Abstract:
Objective: To assess the accuracy of the defining characteristics (DC) of the nursing diagnosis Sedentary Lifestyle (SL) in people with hypertension. Method: A cross-sectional study carried out in a referral center for the outpatient care of people with hypertension and diabetes, with a sample of 285 individuals. The form used in the study was designed from operational definitions constructed for each DC of the diagnosis. Four nurses trained to carry out diagnostic inferences performed the clinical assessment for the presence of SL. Results: The prevalence of SL was 55.8%. Regarding measures of accuracy, the main DC for SL was "chooses a daily routine lacking physical exercise", with a sensitivity of 100% and a specificity of 84.13%. Two DC stood out in the logistic regression: reports preference for activities low in physical activity, and poor performance in instrumental activities of daily living (IADL). Conclusion: The results allowed the identification of the best clinical indicators of SL in hypertensive adults.
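For illustration, a minimal sketch of how the reported accuracy measures are computed from a 2x2 table of a DC against the nurses' reference diagnosis. The counts below are not the study's data; they are hypothetical values back-calculated from the reported sample size, prevalence, sensitivity and specificity.

```python
# Illustrative only: sensitivity and specificity of one defining
# characteristic (DC) against the reference diagnosis of SL.
# Counts are hypothetical, back-calculated from the reported figures.

def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Sensitivity = TP / (TP + FN); Specificity = TN / (TN + FP)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical 2x2 table consistent with n=285 and 55.8% prevalence:
sens, spec = sensitivity_specificity(tp=159, fn=0, tn=106, fp=20)
print(f"sensitivity = {sens:.2%}, specificity = {spec:.2%}")
```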
Abstract:
Amplified ribosomal DNA restriction analysis (ARDRA) is a simple method based on restriction endonuclease digestion of amplified bacterial 16S rDNA. In this study we evaluated the suitability of this method for detecting differences in activated sludge bacterial communities fed on domestic or industrial wastewater and subjected to different operational conditions. The ability of ARDRA to detect these differences was tested in modified Ludzack-Ettinger (MLE) configurations. Samples from three activated sludge wastewater treatment plants (WWTPs) with the MLE configuration were collected from both the oxic and anoxic reactors, and ARDRA patterns were obtained using double enzyme digestion with AluI+MspI. A matrix of Dice similarity coefficients was calculated and used to compare these restriction patterns. Differences in community structure due to influent characteristics and temperature could be observed, but not between the oxic and anoxic reactors of each of the three MLE configurations. Other possible applications of ARDRA for detecting and monitoring changes in activated sludge systems are also discussed.
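A minimal sketch of the Dice similarity coefficient applied to presence/absence restriction patterns, with patterns modelled as sets of fragment sizes. The band values below are hypothetical, chosen only to illustrate the computation.

```python
# Dice similarity between two ARDRA restriction patterns, each modelled
# as a set of band positions (fragment sizes in bp). Example values are
# hypothetical.

def dice_coefficient(bands_a: set[int], bands_b: set[int]) -> float:
    """Dice coefficient: 2 * |A intersect B| / (|A| + |B|)."""
    if not bands_a and not bands_b:
        return 1.0
    shared = len(bands_a & bands_b)
    return 2 * shared / (len(bands_a) + len(bands_b))

oxic = {90, 120, 180, 240, 310}    # hypothetical pattern, oxic reactor
anoxic = {90, 120, 180, 250, 310}  # hypothetical pattern, anoxic reactor
print(f"Dice similarity: {dice_coefficient(oxic, anoxic):.2f}")
```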
Abstract:
Ground clutter caused by anomalous propagation (anaprop) can seriously affect radar rain-rate estimates, particularly in fully automatic radar processing systems, and, if not filtered, can produce frequent false alarms. A statistical study of anomalous propagation detected from two operational C-band radars in the northern Italian region of Emilia Romagna is discussed, paying particular attention to its diurnal and seasonal variability. The analysis shows a high incidence of anaprop in summer, mainly in the morning and evening, due to the humid and hot summer climate of the Po Valley, particularly in the coastal zone. A comparison between different techniques and datasets for retrieving the vertical profile of the refractive index gradient in the boundary layer is then presented; in particular, their capability to detect anomalous propagation conditions is compared. Furthermore, beam path trajectories are simulated using a multilayer ray-tracing model, and the influence of the propagation conditions on the beam trajectory and shape is examined. High-resolution radiosounding data are identified as the best available dataset for reproducing the local propagation conditions accurately, while lower-resolution standard TEMP data suffer from interpolation degradation, and Numerical Weather Prediction model data (Lokal Modell) can capture a tendency towards superrefraction but cannot detect ducting conditions. From ray tracing of the centre and of the lower and upper limits of the radar antenna's 3-dB half-power main beam lobe, it is concluded that ducting layers produce a change in the measured volume and in the power distribution that can lead to an additional error in the reflectivity estimate and, subsequently, in the estimated rainfall rate.
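A short sketch of how propagation conditions can be classified from the vertical refractivity gradient dN/dh, using the threshold values conventionally quoted in the radar-propagation literature (ducting below -157 N-units/km); the paper's exact criteria may differ, and the example gradients are hypothetical.

```python
# Classify propagation conditions from the vertical refractivity
# gradient dN/dh (N-units per km), using conventional literature
# thresholds; these are assumptions, not necessarily the paper's values.

def propagation_condition(dn_dh: float) -> str:
    """Classify propagation from the refractivity gradient (N-units/km)."""
    if dn_dh <= -157:
        return "ducting"          # beam bends more than Earth's curvature
    if dn_dh <= -79:
        return "superrefraction"  # beam bends strongly towards the surface
    if dn_dh < 0:
        return "normal"
    return "subrefraction"        # beam bends away from the surface

for g in (20, -40, -100, -180):   # hypothetical sounding-derived gradients
    print(f"dN/dh = {g:+} N/km -> {propagation_condition(g)}")
```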
Abstract:
Monitoring thunderstorm activity is an essential part of operational weather surveillance given the potential hazards involved, including lightning, hail, heavy rainfall, strong winds or even tornadoes. This study has two main objectives: first, the description of a methodology, based on radar and total lightning data, to characterise thunderstorms in real time; second, the application of this methodology to 66 thunderstorms that affected Catalonia (NE Spain) in the summer of 2006. An object-oriented tracking procedure is employed, in which different observation data types generate four different types of objects (radar 1-km CAPPI reflectivity composites, radar reflectivity volumetric data, cloud-to-ground lightning data and intra-cloud lightning data). In the framework proposed, these objects are the building blocks of a higher-level object, the thunderstorm. The methodology is demonstrated on a dataset of thunderstorms whose main characteristics, over the complete life cycle of the convective structures (development, maturity and dissipation), are described statistically. The development and dissipation stages present similar durations in most cases examined. By contrast, the duration of the maturity phase is much more variable and related to the thunderstorm intensity, defined here in terms of lightning flash rate. Most of the IC and CG flash activity is registered in the maturity stage. In the development stage few CG flashes are observed (2% to 5%), while in the dissipation phase somewhat more CG flashes can be observed (10% to 15%). Additionally, a selection of thunderstorms is used to examine general life cycle patterns, obtained from the analysis of thunderstorm parameters normalized with respect to the thunderstorm's total duration and the maximum value of the variables considered. Among other findings, the study indicates that the normalized duration of the three stages of the thunderstorm life cycle is similar in most thunderstorms, with the longest duration corresponding to the maturity stage (approximately 80% of the total time).
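One way to picture the object-oriented framework is as four observation-derived object types composing a higher-level thunderstorm object. The sketch below is an assumption-laden illustration; the class names, fields and intensity proxy are not taken from the paper.

```python
# Illustrative sketch (names and fields are assumptions, not the paper's
# code): four observation-derived object types compose a Thunderstorm.

from dataclasses import dataclass, field

@dataclass
class Cappi2DObject:          # from radar 1-km CAPPI reflectivity composites
    max_reflectivity_dbz: float

@dataclass
class Radar3DObject:          # from radar reflectivity volumetric data
    echo_top_km: float

@dataclass
class CgLightningObject:      # from cloud-to-ground lightning data
    flash_count: int

@dataclass
class IcLightningObject:      # from intra-cloud lightning data
    flash_count: int

@dataclass
class Thunderstorm:
    cappi: list[Cappi2DObject] = field(default_factory=list)
    volumes: list[Radar3DObject] = field(default_factory=list)
    cg: list[CgLightningObject] = field(default_factory=list)
    ic: list[IcLightningObject] = field(default_factory=list)

    def total_flash_rate(self, duration_min: float) -> float:
        """Flashes per minute over the storm's lifetime (intensity proxy)."""
        flashes = sum(o.flash_count for o in self.cg + self.ic)
        return flashes / duration_min
```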
Abstract:
At the request of the Iowa State Highway Commission, the Engineering Research Institute observed the traffic operations at the Interstate 29 (I-29) and Interstate 80 (I-80) interchange in the southwest part of Council Bluffs. The general location of the site is shown in Figure 1. Before limiting the analysis to the diverging area, the project staff drove the entire Council Bluffs freeway system and consulted with Mr. Philip Hassenstab (Iowa State Highway Commission, District 4, Resident Maintenance Engineer at Council Bluffs). The final study scope was delineated as encompassing only the operational characteristics of the diverge area where I-29 South and I-80 East divide and the ramp-to-merge area where I-80 West joins I-29 North (both areas being contained within the aforementioned interchange). Supplementing the traffic operations scope was an effort to delineate and document the applicability of video-tape techniques to traffic engineering studies and analyses. Documentation was primarily in the form of a demonstration video-tape.
Abstract:
The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples are analysed in an accurate and reproducible way and that they are compared in an objective and automated way. This latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin-layer chromatography, despite its reputation of lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
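The authors' actual comparison algorithms are described in Part II of the series; as a generic stand-in, the sketch below scores two HPTLC densitometric profiles with a Pearson correlation, one plausible similarity measure for automated library searching. Profiles and names are hypothetical.

```python
# Generic sketch: score two HPTLC densitometric intensity profiles with
# a Pearson correlation. This is one plausible similarity measure, not
# the algorithms of Neumann & Margot (see Part II of the series).

import numpy as np

def profile_similarity(profile_a: np.ndarray, profile_b: np.ndarray) -> float:
    """Pearson correlation between two equally sampled intensity profiles."""
    a = (profile_a - profile_a.mean()) / profile_a.std()
    b = (profile_b - profile_b.mean()) / profile_b.std()
    return float(np.mean(a * b))

# Hypothetical profiles: intensity sampled along the chromatographic axis.
rng = np.random.default_rng(0)
questioned = rng.random(200)
reference = questioned + 0.1 * rng.random(200)   # a near match
print(f"similarity = {profile_similarity(questioned, reference):.3f}")
```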
Abstract:
One of the key challenges in the field of nanoparticle (NP) analysis is producing reliable and reproducible characterisation data for nanomaterials. This study examines the reproducibility of a relatively new but rapidly adopted technique, Nanoparticle Tracking Analysis (NTA), applied to a range of particle sizes and materials in several different media. It describes the protocol development and presents both the data and the analysis of results obtained from 12 laboratories, mostly based in Europe, which are primarily QualityNano members. QualityNano is an EU FP7-funded Research Infrastructure that integrates 28 European analytical and experimental facilities in nanotechnology, medicine and natural sciences with the goal of developing and implementing best practice and quality in all aspects of nanosafety assessment. The study covers both the development of the protocol and how it leads to highly reproducible results among participants. The parameter being measured is the modal particle size.
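A minimal sketch of the quantity compared across laboratories: the modal particle size, i.e. the peak of the measured size distribution. The sample sizes and the 1-nm binning below are assumptions for illustration, not the study's protocol settings.

```python
# Extract the modal particle size from a measured size distribution.
# The hypothetical 100-nm sample and 1-nm bin width are assumptions.

import numpy as np

def modal_size(diameters_nm: np.ndarray, bin_width_nm: float = 1.0) -> float:
    """Return the centre of the most populated size bin."""
    bins = np.arange(diameters_nm.min(), diameters_nm.max() + bin_width_nm,
                     bin_width_nm)
    counts, edges = np.histogram(diameters_nm, bins=bins)
    i = int(np.argmax(counts))
    return float((edges[i] + edges[i + 1]) / 2)

rng = np.random.default_rng(1)
sizes = rng.normal(loc=100.0, scale=10.0, size=5000)  # hypothetical sample
print(f"modal size = {modal_size(sizes):.1f} nm")
```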
Abstract:
CE is a powerful analytical tool used to separate intact biomolecules such as proteins. The coupling of CE with TOF/MS produces a very promising method that can be used to detect and identify proteins in different matrices. This paper describes an efficient, rapid, and simple CE-ESI-TOF/MS procedure for the analysis of endogenous human growth hormone and recombinant human growth hormone without sample preparation. Operational factors were optimized using an experimental design, and the method was successfully applied to distinguish human growth hormone and recombinant human growth hormone in unknown samples.
Abstract:
The purpose of this project is to develop an investment analysis model that integrates the capabilities of four types of analysis for use in evaluating interurban transportation system improvements. The project will also explore the use of new data warehousing and mining techniques to design the types of databases required for supporting such a comprehensive transportation model. The project consists of four phases. The first phase, which is documented in this report, involves development of the conceptual foundation for the model. Prior research is reviewed in Chapter 1, which is composed of three major sections providing demand modeling background information for passenger transportation, transportation of freight (manufactured products and supplies), and transportation of natural resources and agricultural commodities. Material from the literature on geographic information systems makes up Chapter 2. Database models for the national and regional economies and for the transportation and logistics network are conceptualized in Chapter 3. Demand forecasting of transportation service requirements is introduced in Chapter 4, with separate sections for passenger transportation, freight transportation, and transportation of natural resources and commodities. Characteristics and capacities of the different modes, modal choices, and route assignments are discussed in Chapter 5. Chapter 6 concludes with a general discussion of the economic impacts and feedback of multimodal transportation activities and facilities.
Abstract:
The research problem of the thesis deals with improving the responsiveness and efficiency of logistics service processes between a supplier and its customers. The improvement can be sought by customizing the services and increasing the coordination of activities between the different parties in the supply chain. It is argued that to achieve coordination the parties have to have connections on several levels. In the framework employed in this research, three contexts are conceptualized at which the linkages can be planned: 1) the service policy context, 2) the process coordination context, and 3) the relationship management context. The service policy context consists of the planning methods by which a supplier analyzes its customers' logistics requirements and matches them with its own operational environment and efficiency requirements. The main conclusion related to the service policy context is that it is important to have a balanced selection of both customer-related and supplier-related factors in the analysis. This way, while operational efficiency is planned, a sufficient level of service for the most important customers is assured. This kind of policy planning involves taking multiple variables into the analysis, and there is a need to develop better tools for this purpose. Some new approaches to deal with this are presented in the thesis.

The process coordination context and the relationship management context deal with the issues of how the implementation of the planned service policies can be facilitated in an inter-organizational environment. Process coordination typically includes such mechanisms as control rules, standard procedures and programs, but in highly demanding circumstances more integrative coordination mechanisms may be necessary. In the thesis, the coordination problems in a third-party logistics relationship are used as an example of such an environment. Relationship management deals with the issues of how separate companies organize their relationships to improve the coordination of their common processes. The main implication for logistics planning is that by integrating further at the relationship level, companies can facilitate the use of the most efficient coordination mechanisms and thereby improve the implementation of the selected logistics service policies. In the thesis, a case of a logistics outsourcing relationship is used to demonstrate the need to address the relationship issues between the service provider and the service buyer before the outsourcing can be done.

The dissertation consists of eight research articles and a summarizing report. The principal emphasis in the articles is on the service policy planning context, which is the main theme of six articles. Coordination and relationship issues are specifically addressed in two of the papers.
Abstract:
OBJECTIVES: This is the first meta-analysis on the efficacy of composite resin restorations in anterior teeth. The objective of the present meta-analysis was to verify whether specific material classes, tooth conditioning methods and operational procedures influence the result for Class III and Class IV restorations. MATERIAL AND METHODS: The SCOPUS and PubMed databases were searched for clinical trials on anterior resin composites without restricting the search by year of publication. The inclusion criteria were: (1) prospective clinical trial with at least 2 years of observation; (2) a minimum of 20 restorations at the last recall; (3) report of the drop-out rate; (4) report of the operative technique and materials used in the trial; and (5) use of Ryge or modified Ryge evaluation criteria. For the statistical analysis, a linear mixed model with random effects was used to account for the heterogeneity between the studies. p-values smaller than 0.05 were considered significant. RESULTS: Of the 84 clinical trials, 21 studies met the inclusion criteria, 14 of them for Class III restorations, 6 for Class IV restorations and 1 for closure of diastemata; the latter was included in the Class IV group. Twelve of the 21 studies started before 1991 and 18 before 2001. The estimated median overall success rate (without replacement) after 10 years was 95% for Class III composite resin restorations and 90% for Class IV restorations. The main reason for the replacement of Class IV restorations was bulk fracture, which occurred significantly more frequently with microfilled composites than with hybrid and macrofilled composites. Caries adjacent to restorations was infrequent in most studies and accounted for only about 2.5% of all replaced restorations after 10 years, irrespective of the cavity class. Class III restorations with glass ionomer derivates suffered significantly more loss of anatomical form than did fillings with other types of material. When the enamel was acid-etched and no bonding agent was applied, significantly more restorations showed marginal staining and detectable margins than with enamel etching combined with enamel bonding or the total etch technique; fillings with self-etching systems fell between these two outcomes. Bevelling of the enamel was associated with significantly less deterioration of the anatomical form than no bevelling, but not with less marginal staining or fewer detectable margins. The type of isolation (absolute/relative) had a statistically significant influence on marginal caries which, however, might be a random finding.
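The study pooled heterogeneous trials with a linear mixed model; as a simpler stand-in illustrating the same idea of random-effects pooling, the sketch below applies the DerSimonian-Laird estimator to hypothetical study-level success rates. This is plainly a different (simpler) technique than the authors' model, and all numbers are invented.

```python
# Random-effects pooling of hypothetical study-level success rates with
# the DerSimonian-Laird estimator; a simpler stand-in for the linear
# mixed model actually used in the meta-analysis.

import numpy as np

def dersimonian_laird(effects: np.ndarray, variances: np.ndarray) -> float:
    """Random-effects pooled estimate accounting for heterogeneity."""
    w = 1.0 / variances                       # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)    # Cochran's Q
    k = len(effects)
    denom = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / denom)    # between-study variance
    w_star = 1.0 / (variances + tau2)         # random-effects weights
    return float(np.sum(w_star * effects) / np.sum(w_star))

# Hypothetical 10-year success proportions and their variances:
y = np.array([0.97, 0.93, 0.95, 0.90, 0.96])
v = np.array([0.0004, 0.0009, 0.0006, 0.0012, 0.0005])
print(f"pooled success rate = {dersimonian_laird(y, v):.3f}")
```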
Abstract:
The topic of this study is the language of the educational policies of the British Labour Party in the General Election manifestos between the years 1983 and 2005. The twenty-year period studied has been a period of significant changes in world politics and in British politics, especially for the Labour Party. The emergence of educational policy as a vote-winner in the manifestos of the nineties has been noteworthy. The aim of the thesis is two-fold: to look at the structure of the political manifesto as an example of genre writing and to analyze the content using the approach of critical discourse analysis. Furthermore, the aim of this study is not to pinpoint policy positions but to look at the image that the Labour Party creates of itself through these manifestos. The analysis of the content is done by a method of close reading. Based on the findings, the methodology for the analysis of the content was created. This study utilized methodological triangulation, which means that the material is analyzed from several methodological aspects. The aspects used in this study are lexical features (collocation, coordination, euphemisms, metaphors and naming), grammatical features (thematic roles, tense, aspect, voice and modal auxiliaries) and rhetoric (Burke, Toulmin and Perelman). From the analysis of the content a generic description is built. By looking at the lexical, grammatical and rhetorical features, a clear change in the language of the Labour Party can be detected. This change is foreshadowed already in the 1992 manifesto but culminates in the 1997 manifesto, which would lead Labour to a landslide victory in the General Election. During this twenty-year period Labour has moved away from its old commitments and into the new sphere of “something for everybody”. The pervasiveness of promotional language and market-inspired vocabulary in the sphere of manifesto writing is clear. The use of metaphors seemed to be the tool for creating the image of the party represented through the manifestos. A limited generic description can be constructed from the findings based on the content and structure of the manifestos: especially the more generic findings, such as the use of the exclusive "we", the lack of certain anatomical parts of argument structure, and the use of the future tense and the present progressive aspect, can shed light on the description of the genre of manifesto writing. While this study is only a beginning, it shows that combining the study of lexical, grammatical and rhetorical features in the analysis of manifestos is a promising approach.
Abstract:
This thesis analyses the calculations performed by the FanSave and PumpSave energy saving tools. With these programs, the energy consumption of variable speed drive control for fans and pumps can be compared with that of other control methods. FanSave covers centrifugal and axial fans, while PumpSave deals with centrifugal pumps. The programs can also be used to choose a suitable frequency converter from the ABB range. As initial values, the programs need information about the appliances, such as the flow rate and efficiencies. Operation time is an important factor when calculating the annual energy consumption, and the required information about it consists of its length and profile. The basic theory related to fans and pumps is introduced, without detailed dimensioning instructions. FanSave and PumpSave contain various methods of flow control, and these control methods are introduced in the thesis in terms of their operational principles and suitability. The squirrel cage motor and the frequency converter are also introduced because of their close involvement with fans and pumps. The second part of the thesis compares the results of FanSave's and PumpSave's calculations with calculations based on performance curves. Laboratory tests were also made with a centrifugal fan, an axial fan and a centrifugal pump. With the results of this thesis, the calculation of these programs can be adjusted to be more accurate, and some new features can be added.
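A simplified sketch of the kind of comparison such tools make, based on the ideal affinity laws (flow scales with speed, shaft power with the cube of speed) for variable speed drive control. Motor and frequency converter losses are neglected, the throttling model is a crude assumption, and the duty profile is invented; FanSave/PumpSave use more detailed models.

```python
# Compare annual energy use of variable speed drive (VSD) control vs.
# throttling, using ideal affinity laws. All models and numbers here
# are simplifying assumptions, not FanSave/PumpSave internals.

def vsd_power_kw(rated_power_kw: float, flow_fraction: float) -> float:
    """Ideal affinity laws: shaft power scales with the cube of flow."""
    return rated_power_kw * flow_fraction ** 3

def throttle_power_kw(rated_power_kw: float, flow_fraction: float) -> float:
    """Crude throttling model: power falls only weakly with reduced flow."""
    return rated_power_kw * (0.6 + 0.4 * flow_fraction)

rated_kw = 75.0
hours_per_year = 6000
# Hypothetical duty profile: (flow fraction, share of operating time)
profile = [(1.0, 0.2), (0.8, 0.4), (0.6, 0.3), (0.4, 0.1)]

for label, model in (("VSD", vsd_power_kw), ("throttle", throttle_power_kw)):
    energy = sum(model(rated_kw, q) * share * hours_per_year
                 for q, share in profile)
    print(f"{label:8s}: {energy / 1000:.1f} MWh/year")
```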