922 results for ASSESSMENT SCALE


Relevance: 30.00%

Abstract:

Among the experimental methods commonly used to characterise the behaviour of a full-scale system, dynamic tests are the most complete and efficient procedures. A dynamic test is an experimental process that defines a set of characteristic parameters of the dynamic behaviour of the system, such as the natural frequencies of the structure, the mode shapes and the associated modal damping values. An assessment of these modal characteristics can be used both to verify the theoretical assumptions of the design and to monitor the performance of the structural system during its operational use. The thesis is structured in the following chapters. The first, introductory chapter recalls some basic notions of structural dynamics, focusing the discussion on systems with multiple degrees of freedom (MDOF), which can represent a generic real system under study when it is excited by a harmonic force or in free vibration. The second chapter is entirely centred on the dynamic identification of a structure subjected to an experimental test in forced vibration. It first describes the construction of the FRF through the classical FFT of the recorded signal. A different method, also in the frequency domain, is subsequently introduced; it allows the FRF to be computed accurately using the geometric characteristics of the ellipse that represents the direct input-output comparison. The two methods are compared, and attention is then focused on some advantages of the proposed methodology. The third chapter addresses the study of real structures subjected to experimental tests in which the force is not known, as in an ambient or impact test. For this analysis the CWT was chosen, since it allows a simultaneous investigation of a generic signal x(t) in the time and frequency domains. The CWT is first applied to free oscillations, with excellent results in terms of frequencies, damping values and vibration modes.
Its application to ambient vibrations yields accurate modal parameters of the system, although some important caveats apply to the damping estimates. The fourth chapter again concerns the post-processing of data acquired after a vibration test, this time through the application of the discrete wavelet transform (DWT). In the first part, the results obtained with the DWT are compared with those obtained with the CWT. Particular attention is given to the use of the DWT as a tool for filtering the recorded signal, since in the case of ambient vibrations the signals are often affected by a significant level of noise. The fifth chapter focuses on another important aspect of the identification process: model updating. Starting from the modal parameters obtained from ambient vibration tests performed on the Humber Bridge in England by the University of Porto in 2008 and by the University of Sheffield, an FE model of the bridge is built in order to establish which type of model captures the real dynamic behaviour of the bridge most accurately. The sixth chapter draws the conclusions of the presented research. They concern the application of a frequency-domain method for evaluating the modal parameters of a structure and its advantages, the advantages of a procedure based on wavelet transforms for identification in tests with unknown input, and finally the problem of 3D modelling of systems with many degrees of freedom and different types of uncertainty.
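As a toy illustration of the modal identification described above, the sketch below (not from the thesis) estimates the damped natural frequency and the damping ratio of a single-degree-of-freedom free-decay record via the logarithmic decrement of its successive peaks; all signal parameters are invented:

```python
import math

def modal_params_from_free_decay(signal, dt):
    """Estimate the damped natural frequency and the damping ratio of a
    free-decay record from its successive positive peaks, using the
    logarithmic decrement."""
    peaks = [(i, signal[i]) for i in range(1, len(signal) - 1)
             if signal[i - 1] < signal[i] > signal[i + 1] and signal[i] > 0]
    (i0, a0), (i1, a1) = peaks[0], peaks[-1]
    n = len(peaks) - 1                                       # full cycles spanned
    delta = math.log(a0 / a1) / n                            # logarithmic decrement
    zeta = delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)  # damping ratio
    period = (i1 - i0) * dt / n                              # mean damped period
    return 1.0 / period, zeta

# synthetic free decay of a 2 Hz mode with 2% damping (invented values)
fn, zeta_true, dt = 2.0, 0.02, 0.001
wd = 2 * math.pi * fn * math.sqrt(1 - zeta_true ** 2)        # damped pulsation
x = [math.exp(-zeta_true * 2 * math.pi * fn * k * dt) * math.cos(wd * k * dt)
     for k in range(5000)]
f_est, z_est = modal_params_from_free_decay(x, dt)
```

For this clean synthetic record the estimates recover the 2 Hz frequency and the 2% damping closely; real ambient-vibration records would first require the filtering and wavelet tools the thesis discusses.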

Relevance: 30.00%

Abstract:

Bioinformatics has, in the last few decades, played a fundamental role in making sense of the huge amount of data produced. Once the complete sequence of a genome has been obtained, the major problem of learning as much as possible about its coding regions becomes crucial. Protein sequence annotation is challenging and, given the size of the problem, only computational approaches can provide a feasible solution. As recently pointed out by the Critical Assessment of Function Annotations (CAFA), the most accurate methods are those based on the transfer-by-homology approach, and the most incisive contribution comes from cross-genome comparisons. This thesis describes a non-hierarchical sequence clustering method for automatic large-scale protein annotation, called “The Bologna Annotation Resource Plus” (BAR+). The method is based on an all-against-all alignment of more than 13 million protein sequences constrained by a very stringent metric. BAR+ can safely transfer functional features (Gene Ontology and Pfam terms) inside clusters by means of a statistical validation, even in the case of multi-domain proteins. Within BAR+ clusters it is also possible to transfer the three-dimensional structure (when a template is available). This is made possible by cluster-specific HMM profiles, which can be used to compute reliable template-to-target alignments even for distantly related proteins (sequence identity < 30%). Other BAR+-based applications were developed during my doctorate, including the prediction of magnesium-binding sites in human proteins, the classification of the ABC transporter superfamily, and the functional prediction (GO terms) of the CAFA targets. Remarkably, in the CAFA assessment, BAR+ placed among the ten most accurate methods. At present, BAR+ is freely available as a web server for functional and structural protein sequence annotation at http://bar.biocomp.unibo.it/bar2.0.
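The transfer-by-homology idea behind cluster-based annotation can be sketched in a few lines. The toy below is not BAR+'s actual algorithm or metric: it single-linkage clusters sequences on pairwise identity (via union-find) and shares GO terms within each cluster; the identifiers, identities, threshold and GO term are invented:

```python
def cluster_and_annotate(pairs, annotations, threshold=0.4):
    """Single-linkage clustering over pairwise sequence identities
    (union-find), then transfer of GO terms to every cluster member."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    for a, b, identity in pairs:
        if identity >= threshold:           # alignment deemed stringent enough
            parent[find(a)] = find(b)       # merge the two clusters

    members_of = {}
    for seq in {s for p in pairs for s in p[:2]} | set(annotations):
        members_of.setdefault(find(seq), set()).add(seq)

    transferred = {}
    for members in members_of.values():
        terms = set().union(*(annotations.get(m, set()) for m in members))
        for m in members:
            transferred[m] = terms          # every member inherits the terms
    return transferred

# invented sequences, identities and GO term
pairs = [("P1", "P2", 0.85), ("P2", "P3", 0.55), ("P3", "P4", 0.10)]
annotations = {"P1": {"GO:0016787"}}
result = cluster_and_annotate(pairs, annotations)
```

Here P1, P2 and P3 fall into one cluster and inherit P1's GO term, while P4 stays unannotated; BAR+ additionally applies statistical validation before transferring a term.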

Relevance: 30.00%

Abstract:

The thesis analyses the relationships between agricultural development processes and the use of natural resources, in particular energy resources, at the international (developing and developed countries), national (Italy), regional (Emilia-Romagna) and farm level, with the aim of evaluating the eco-efficiency of agricultural development processes, its evolution over time and its main dynamics, also in relation to the problems of dependence on fossil resources, of food security, and of the substitution between agricultural areas devoted to human food and to animal feed. For the two macroeconomic case studies, the methodology known as SUMMA, SUstainability Multi-method, multi-scale Assessment (Ulgiati et al., 2006), was adopted; it integrates a set of impact categories from life cycle assessment (LCA), cost-benefit evaluations and the global analytical perspective of emergy accounting. The large-scale analysis was further enriched by a local-scale case study of a farm producing milk and renewable electricity (photovoltaics and biogas). This study, conducted by means of LCA and contingent valuation, assessed the environmental, economic and social effects of scenarios reducing dependence on fossil sources. The macroeconomic case studies show that, despite policies supporting greater efficiency and “green” forms of production, agriculture worldwide continues to evolve with a growing dependence on fossil energy sources. The first effects of EU agricultural policies aimed at greater sustainability nevertheless seem to be emerging in the European countries. Overall, the energy footprint remains high, since the continuing mechanisation of agricultural processes must necessarily draw on energy sources that substitute for human labour. Agricultural land is decreasing in the European countries analysed and in Italy, increasing the risks of food insecurity, since the national population is instead growing.

Relevance: 30.00%

Abstract:

Environmental decay of porous masonry materials, such as brick and mortar, is a widespread problem concerning both new and historic masonry structures. The decay mechanisms are quite complex, depending on several interconnected parameters and on the interaction with the specific micro-climate. Materials undergo aesthetic and substantial changes in character, but while many studies have been carried out, the mechanical aspect has remained largely understudied despite its true importance from the structural viewpoint. A quantitative assessment of masonry material degradation, and of how it affects the load-bearing capacity of masonry structures, appears to be missing. The research work carried out, limiting attention to brick masonry, addresses this issue through an experimental laboratory approach based on different integrated testing procedures, both non-destructive and mechanical, together with monitoring methods. Attention was focused on the transport of moisture and salts and on the damaging effects caused by the crystallization of two different salts, sodium chloride and sodium sulphate. Many series of masonry specimens, very different in size and purpose, were used to track the damage process from its onset and to monitor its evolution over a number of years. At the same time, suitable testing techniques (non-destructive, mini-invasive, analytical and monitoring techniques) were validated for these purposes. The specimens were exposed to different aggressive agents (in terms of type of salt, brine concentration, artificial vs. open-air natural ageing, …), were tested by different means (qualitative vs. quantitative, non-destructive vs. mechanical testing, point measurements vs. wide areas, …), and had different sizes (1-, 2-, 3-header-thick walls, full-scale walls vs. small-size specimens, brick columns and triplets vs. small walls, masonry specimens vs. single units of brick and mortar prisms, …).
Different advanced testing methods and novel monitoring techniques were applied in an integrated, holistic approach for the quantitative assessment of the masonry health state.

Relevance: 30.00%

Abstract:

Spatial prediction of hourly rainfall via radar calibration is addressed. The change-of-support problem (COSP), which arises when the spatial supports of different data sources do not coincide, is faced in a non-Gaussian setting; in fact, hourly rainfall in the Emilia-Romagna region, in Italy, is characterized by an abundance of zero values and by the right-skewness of the distribution of positive amounts. Direct rain gauge measurements at sparsely distributed locations and hourly cumulated radar grids are provided by ARPA-SIMC Emilia-Romagna. We propose a three-stage Bayesian hierarchical model for radar calibration, exploiting rain gauges as the reference measure. Rain probability and amounts are modeled via linear relationships with radar on the log scale; spatially correlated Gaussian effects capture the residual information. We employ a probit link for rainfall probability and a Gamma distribution for positive rainfall amounts; the two steps are joined in a two-part semicontinuous model. Three model specifications, each addressing the COSP differently, are presented; in particular, a stochastic weighting of all radar pixels, driven by a latent Gaussian process defined on the grid, is employed. Estimation is performed via MCMC procedures implemented in C and linked to the R software. The communication and evaluation of probabilistic, point and interval predictions are investigated. A non-randomized PIT histogram is proposed for correctly assessing the calibration and coverage of two-part semicontinuous models. Predictions obtained with the different model specifications are evaluated via graphical tools (reliability plot, sharpness histogram, PIT histogram, Brier score plot and quantile decomposition plot), proper scoring rules (Brier score, continuous ranked probability score) and consistent scoring functions (root mean square error and mean absolute error, addressing the predictive mean and median, respectively).
Calibration is achieved, and the inclusion of neighbouring information slightly improves predictions. All specifications outperform a benchmark model with uncorrelated effects, confirming the relevance of spatial correlation for modeling rainfall probability and accumulation.
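The two-part semicontinuous structure can be sketched as a generative simulation. The coefficients, the radar offset and the Gamma shape below are invented for illustration, and the real model additionally includes spatially correlated Gaussian effects and the COSP-aware radar weighting:

```python
import math
import random

def norm_cdf(z):
    """Standard normal CDF (the probit link's inverse)."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def simulate_hour(radar, beta_p=(-0.8, 0.9), beta_mu=(0.1, 0.8),
                  shape=0.7, rng=random):
    """One draw from the two-part model: a probit-linked rain indicator,
    then a Gamma amount with log-linear mean. All coefficients, the radar
    offset and the Gamma shape are invented."""
    lr = math.log(radar + 0.01)                    # radar on the log scale
    p_rain = norm_cdf(beta_p[0] + beta_p[1] * lr)  # rain probability
    if rng.random() >= p_rain:
        return 0.0                                 # dry hour
    mu = math.exp(beta_mu[0] + beta_mu[1] * lr)    # conditional mean amount
    return rng.gammavariate(shape, mu / shape)     # Gamma with mean mu

def predictive_mean(radar, beta_p=(-0.8, 0.9), beta_mu=(0.1, 0.8)):
    """Closed-form predictive mean: P(rain) times the conditional mean."""
    lr = math.log(radar + 0.01)
    return norm_cdf(beta_p[0] + beta_p[1] * lr) * math.exp(beta_mu[0] + beta_mu[1] * lr)

rng = random.Random(42)
draws = [simulate_hour(5.0, rng=rng) for _ in range(20000)]
```

The point mass at zero and the skewed positive part emerge directly from the two stages; with many draws, the empirical mean approaches the closed-form predictive mean.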

Relevance: 30.00%

Abstract:

Urban centers significantly contribute to anthropogenic air pollution, although they cover only a minor fraction of the Earth's land surface. Since the worldwide degree of urbanization is steadily increasing, the anthropogenic contribution of urban centers to air pollution is expected to become more substantial in future air quality assessments. The main objective of this thesis was to obtain a more profound insight into the dispersion and deposition of aerosol particles from 46 individual major population centers (MPCs), as well as into their regional and global influence on the atmospheric distribution of several aerosol types. For the first time, this was assessed in one model framework, for which the global model EMAC was applied with different representations of aerosol particles. First, in an approach with passive tracers and a setup in which the results depend only on the source location and on the size and solubility of the tracers, several metrics and a regional climate classification were used to quantify the major outflow pathways, both vertically and horizontally, and to compare the balance between pollution export away from, and pollution build-up around, the source points. Then, in a more comprehensive approach, the anthropogenic emissions of key trace species were changed at the MPC locations to determine the cumulative impact of the MPC emissions on the atmospheric aerosol burdens of black carbon, particulate organic matter, sulfate, and nitrate. Ten different mono-modal passive aerosol tracers were continuously released at the same constant rate at each emission point. The results clearly showed that on average about five times more mass is advected quasi-horizontally at low levels than is exported into the upper troposphere. The strength of the low-level export is mainly determined by the location of the source, while the vertical transport is mainly governed by the lifting potential and the solubility of the tracers.
Similar to insoluble gas phase tracers, the low-level export of aerosol tracers is strongest at middle and high latitudes, while the regions of strongest vertical export differ between aerosol (temperate winter dry) and gas phase (tropics) tracers. The emitted mass fraction that is kept around MPCs is largest in regions where aerosol tracers have short lifetimes; this mass is also critical for assessing the impact on humans. However, the number of people who live in a strongly polluted region around urban centers depends more on the population density than on the size of the area which is affected by strong air pollution. Another major result was that fine aerosol particles (diameters smaller than 2.5 micrometers) from MPCs undergo substantial long-range transport, with about half of the emitted mass being deposited beyond 1000 km from the source. In contrast to this diluted remote deposition, there are areas around the MPCs which experience high deposition rates, especially regions which are frequently affected by heavy precipitation or are situated in poorly ventilated locations. Moreover, most MPC aerosol emissions are removed over land surfaces. In particular, forests experience more deposition from MPC pollutants than other land ecosystems. In addition, it was found that the generic treatment of aerosols has no substantial influence on the major conclusions drawn in this thesis. Moreover, in the more comprehensive approach, it was found that emissions of black carbon, particulate organic matter, sulfur dioxide, and nitrogen oxides from MPCs influence the atmospheric burden of various aerosol types very differently, with impacts generally being larger for the secondary species, sulfate and nitrate, than for the primary species, black carbon and particulate organic matter.
While the changes in the burdens of sulfate, black carbon, and particulate organic matter show an almost linear response to changes in the emission strength, the formation of nitrate was found to depend on many more factors (e.g., the abundance of sulfuric acid) than just the strength of the nitrogen oxide emissions. The generic tracer experiments were further extended to conduct the first risk assessment of the cumulative risk of contamination from multiple nuclear reactor accidents on the global scale. For this, several factors had to be taken into account: the probability of major accidents, the cumulative deposition field of the radionuclide cesium-137, and a threshold value that defines contamination. By collecting the necessary data and after accounting for uncertainties, it was found that the risk is highest in western Europe, the eastern US, and Japan, where on average contamination by a major accident is expected about every 50 years.

Relevance: 30.00%

Abstract:

Coastal flooding poses serious threats to coastal areas around the world, causing billions of dollars in damage to property and infrastructure and threatening the lives of millions of people. Disaster management and risk assessment therefore aim at detecting vulnerabilities and capacities in order to reduce coastal flood disaster risk. In particular, non-specialized researchers, emergency management personnel and land use planners require an accurate, inexpensive method to determine and map the risk associated with storm surge events and with the long-term sea level rise associated with climate change. This study contributes to the spatial evaluation and mapping of social, economic and environmental vulnerability and risk at the sub-national scale through the development of appropriate tools and methods, successfully embedded in a Web-GIS Decision Support System. A new set of raster-based models was designed and developed to be easily implemented in the Web-GIS framework, with the purpose of quickly assessing and mapping flood hazard characteristics, damage and vulnerability in a multi-criteria approach. The Web-GIS DSS is developed using open-source software and programming languages, and its main peculiarity is that it is available and usable by coastal managers and land use planners without requiring a strong scientific background in hydraulic engineering. The effectiveness of the system for coastal risk assessment is evaluated through its application to a real case study.
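The raster-based logic can be illustrated with a minimal "bathtub" sketch, far simpler than the thesis' models: each cell's flood depth is the surge level minus the ground elevation, and a toy depth-damage curve converts depth into a damage fraction. The elevations and the curve are invented:

```python
def flood_depth_grid(dem, surge_level):
    """Raster water depth per cell: surge level minus ground elevation,
    floored at zero (a simple 'bathtub' inundation model)."""
    return [[max(0.0, surge_level - z) for z in row] for row in dem]

def damage_fraction(depth, max_depth=3.0):
    """Toy linear depth-damage curve saturating at max_depth metres."""
    return min(depth / max_depth, 1.0)

# a 2x3 digital elevation model in metres (invented values)
dem = [[0.2, 0.8, 1.5],
       [0.5, 1.2, 2.4]]
depth = flood_depth_grid(dem, surge_level=1.0)
damage = [[damage_fraction(d) for d in row] for row in depth]
```

Because every step is a cell-wise map over the grid, models of this shape translate directly into the raster operations a Web-GIS backend can run quickly.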

Relevance: 30.00%

Abstract:

The growing need to assess the environmental status of Mediterranean coastal marine habitats and the large availability of data collected by Reef Check Italia onlus (RCI) volunteers suggest the possibility of developing innovative and reliable indices that may support decision makers in applying conservation strategies. The aims of this study were to check the reliability of the data collected by RCI volunteers, to analyse the spatial and temporal distribution of the available RCI data, to summarise the knowledge on the biology and ecology of the monitored species, and to develop innovative indices to assess the ecological quality of Mediterranean subtidal rocky shores and coralligenous habitats. Subtidal rocky shores and coralligenous habitats were chosen because they are the habitats most attractive to divers, so most data refer to them; moreover, subtidal rocky bottoms are strongly affected by coastal urbanisation, land use, fishing and tourist activities, which increase pollution, turbidity and sedimentation. Non-indigenous species (NIS) have been recognized as a major threat to the integrity of native Mediterranean communities because of their proliferation, spread and impact on resident communities. Monitoring the spreading dynamics of NIS at the basin spatial scale is difficult but urgent. According to a field test, the training provided by RCI appears adequate to obtain reliable data from volunteers. Based on the data collected by RCI volunteers, three main categories of indices were developed: indices based on species diversity, indices based on the occurrence of non-indigenous species, and indices based on species sensitive to physical, chemical and biological disturbances. As case studies, the indices were applied to stretches of coastline defined according to management criteria (province territories and marine protected areas).
The assessments of ecological quality in the Tavolara Marine Protected Area obtained with the species sensitivity index were consistent with those previously obtained with traditional methods.

Relevance: 30.00%

Abstract:

Introduction The survival of patients admitted to an emergency department is determined by the severity of acute illness and the quality of care provided. The high number and the wide spectrum of severity of illness of admitted patients make an immediate assessment of all patients unrealistic. The aim of this study is to evaluate a scoring system based on physiological parameters readily available immediately after admission to an emergency department (ED) for the purpose of identifying at-risk patients. Methods This prospective observational cohort study includes 4,388 consecutive adult patients admitted via the ED of a 960-bed tertiary referral hospital over a period of six months. The occurrence of each of seven potential vital sign abnormalities (threat to airway, abnormal respiratory rate, oxygen saturation, systolic blood pressure, heart rate, low Glasgow Coma Scale and seizures) was recorded and added up to generate the vital sign score (VSS). VSSinitial was defined as the VSS in the first 15 minutes after admission, VSSmax as the maximum VSS throughout the stay in the ED. The occurrence of single vital sign abnormalities in the first 15 minutes, VSSinitial and VSSmax were evaluated as potential predictors of hospital mortality. Results Logistic regression analysis identified all evaluated single vital sign abnormalities except seizures and abnormal respiratory rate as independent predictors of hospital mortality. Increasing VSSinitial and VSSmax were significantly correlated with hospital mortality (odds ratio (OR) 2.80, 95% confidence interval (CI) 2.50 to 3.14, P < 0.0001 for VSSinitial; OR 2.36, 95% CI 2.15 to 2.60, P < 0.0001 for VSSmax). The predictive power of the VSS was highest when collected in the first 15 minutes after ED admission (log-rank Chi-square 468.1, P < 0.0001 for VSSinitial; log-rank Chi-square 361.5, P < 0.0001 for VSSmax).
Conclusions Vital sign abnormalities and VSS collected in the first minutes after ED admission can identify patients at risk of an unfavourable outcome.
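The VSS construction described above is simply a count of dichotomous abnormalities. The sketch below illustrates it with placeholder cut-offs (the study's exact criteria are not given in the abstract):

```python
def vital_sign_score(obs):
    """Sum of the seven dichotomous vital sign abnormalities that make up
    the VSS. The cut-offs below are illustrative placeholders, not the
    study's exact criteria."""
    flags = [
        obs.get("airway_threatened", False),         # threat to airway
        not 8 <= obs.get("resp_rate", 14) <= 25,     # abnormal respiratory rate
        obs.get("spo2", 98) < 90,                    # abnormal oxygen saturation
        obs.get("sys_bp", 120) < 90,                 # abnormal systolic blood pressure
        not 50 <= obs.get("heart_rate", 75) <= 120,  # abnormal heart rate
        obs.get("gcs", 15) < 15,                     # low Glasgow Coma Scale
        obs.get("seizure", False),                   # seizures
    ]
    return sum(bool(f) for f in flags)

# a hypothetical patient with four abnormalities in the first 15 minutes
patient = {"resp_rate": 32, "spo2": 86, "sys_bp": 85, "gcs": 12}
score = vital_sign_score(patient)
```

Under the study's reported odds ratio of 2.80 per point for VSSinitial, each additional abnormality multiplies the odds of hospital mortality accordingly, which is why a quick count in the first minutes is informative.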

Relevance: 30.00%

Abstract:

The aim of the study was to examine the economic performance as well as the perceived social and environmental impacts of organic cotton in Southern Kyrgyzstan on the basis of a comparative field study (44 certified organic farmers and 33 conventional farmers) carried out in 2009. It also investigated farmers' motivation for, and assessment of, conversion to organic farming. Cotton yields on organic farms were found to be 10% lower, whereas input costs per unit were 42% lower; organic farmers nevertheless obtained a 20% higher revenue from cotton. Due to the lower input costs and the organic and fair trade price premiums, the average gross margin from organic cotton was 27% higher. In addition to the direct economic benefits, organic farmers enjoy a number of further advantages, such as easy access to credit on favourable terms, provision of uncontaminated cotton cooking oil and seed cake as animal feed, marketing support, and extension and training services provided by the newly established organic service provider. A large majority of organic farmers perceive an improvement in soil quality and in health conditions, and positively assess their earlier decision to convert to organic farming. The major disadvantage of organic farming is the high manual labour input required. In the study area, where manual farm work is mainly women's work and male labour migration is widespread, women are the most affected by this negative aspect of organic farming. Altogether, the results suggest that despite the inconvenience of a higher workload the advantages of organic farming outweigh the disadvantages, and that conversion to organic farming can improve the livelihoods of small-scale farmers.
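The comparison rests on simple farm-budget arithmetic: gross margin = revenue minus variable input costs. The sketch below uses invented absolute figures that merely echo the abstract's relative differences (10% lower yield, a price premium, 42% lower input costs):

```python
def gross_margin(yield_kg_ha, price_per_kg, input_cost_ha):
    """Gross margin per hectare: revenue minus variable input costs."""
    return yield_kg_ha * price_per_kg - input_cost_ha

# Conventional baseline with invented absolute figures (kg/ha, $/kg, $/ha).
conv = gross_margin(2500, 0.50, 400)
# Organic farm echoing the abstract's relative differences: 10% lower
# yield, a hypothetical organic/fair-trade price premium, and 42% lower
# input costs.
org = gross_margin(2500 * 0.90, 0.50 * 1.33, 400 * 0.58)
```

With these illustrative numbers the organic margin exceeds the conventional one despite the lower yield, which is the mechanism the abstract describes.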

Relevance: 30.00%

Abstract:

Although sustainable land management (SLM) is widely promoted to prevent and mitigate land degradation and desertification, its monitoring and assessment (M&A) has received much less attention. This paper compiles methodological approaches which to date have been little reported in the literature. It draws lessons from these experiences and identifies common elements and future pathways as a basis for a global approach. The paper starts with local level methods where the World Overview of Conservation Approaches and Technologies (WOCAT) framework catalogues SLM case studies. This tool has been included in the local level assessment of Land Degradation Assessment in Drylands (LADA) and in the EU-DESIRE project. Complementary site-based approaches can enhance an ecological process-based understanding of SLM variation. At national and sub-national levels, a joint WOCAT/LADA/DESIRE spatial assessment based on land use systems identifies the status and trends of degradation and SLM, including causes, drivers and impacts on ecosystem services. Expert consultation is combined with scientific evidence and enhanced where necessary with secondary data and indicator databases. At the global level, the Global Environment Facility (GEF) knowledge from the land (KM:Land) initiative uses indicators to demonstrate impacts of SLM investments. Key lessons learnt include the need for a multi-scale approach, making use of common indicators and a variety of information sources, including scientific data and local knowledge through participatory methods. Methodological consistencies allow cross-scale analyses, and findings are analysed and documented for use by decision-makers at various levels. Effective M&A of SLM [e.g. for United Nations Convention to Combat Desertification (UNCCD)] requires a comprehensive methodological framework agreed by the major players.

Relevance: 30.00%

Abstract:

Biotic and abiotic phenological observations can be collected from the continental to the local spatial scale. Plant phenological observations can only be recorded where there is vegetation, while fog, snow and ice are available as phenological parameters wherever they appear. A distinctive feature of phenological observations is the possibility of spatial intensification down to a microclimatic scale at which meteorological instrumentation would be too expensive for intensive campaigning. The omnipresence of region-specific phenological parameters allows monitoring for a spatially much more detailed assessment of climate change than is possible with weather data. We demonstrate this concept with phenological observations from a special network in the Canton of Berne, Switzerland, with up to 600 observation sites (more than one per 10 km² of the inhabited area). Classic cartography, gridding, integration into a Geographic Information System (GIS) and large-scale analysis are the steps towards a detailed knowledge of the topoclimatic conditions of a mountainous area. Examples of urban phenology provide other types of spatially detailed applications. Great potential for future phenological mapping lies in combining traditionally observed species-specific phenology with remotely sensed and modelled phenology, which provide strong spatial information. This is a long journey from cartographic intuition to algorithm-based representations of phenology.
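Gridding scattered phenological observations can be illustrated with a simple inverse-distance-weighted interpolation; this is a generic stand-in, not the network's actual mapping method, and the coordinates and dates are invented:

```python
def idw(x, y, stations, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) of a phenological
    date (day of year) observed at scattered sites."""
    num = den = 0.0
    for sx, sy, doy in stations:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return float(doy)      # grid point coincides with a site
        w = d2 ** (-power / 2.0)   # weight decays with distance
        num += w * doy
        den += w
    return num / den

# three observation sites: (x_km, y_km, day of year of the phenophase)
sites = [(0.0, 0.0, 110), (10.0, 0.0, 118), (0.0, 10.0, 114)]
grid_value = idw(5.0, 5.0, sites)
```

Evaluating such an estimator over a regular grid is exactly the gridding step that precedes the GIS integration and the topoclimatic mapping described above; real phenological mapping would also account for elevation and exposure.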

Relevance: 30.00%

Abstract:

With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in the mechanical and electrical industries. However, adapting the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for the occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The applications of such a biopharmaceutical process FMEA are widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance, and important variables for process development, characterization, or validation can be identified. LAY ABSTRACT: Health authorities around the world ask pharmaceutical companies to manage risk during the development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in the mechanical and electrical industries. However, adapting the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in the mechanical and electrical industries.
The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for the occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, the different steps of the manufacturing process can be ranked according to their potential risks, which can help pharmaceutical companies to identify the aspects carrying the highest risk and to react accordingly to improve the safety of medicines.
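In classic FMEA practice, the three 1-to-10 ratings are combined into a risk priority number (RPN = occurrence × severity × detectability) to rank failure modes; the guideline's own aggregation may differ, and the failure modes and ratings below are invented for illustration:

```python
def risk_priority_number(occurrence, severity, detectability):
    """Product of the three 1-to-10 FMEA ratings; for detectability,
    10 conventionally means 'hardest to detect'."""
    for rating in (occurrence, severity, detectability):
        if not 1 <= rating <= 10:
            raise ValueError("FMEA ratings must lie on the 1-to-10 scale")
    return occurrence * severity * detectability

# hypothetical failure modes of a biopharmaceutical process, rated
# (occurrence, severity, detectability) with invented values
failure_modes = {
    "wrong buffer pH at chromatography step": (3, 8, 4),
    "bioreactor temperature excursion": (2, 9, 2),
    "filter integrity failure": (4, 7, 6),
}
ranked = sorted(failure_modes.items(),
                key=lambda item: risk_priority_number(*item[1]),
                reverse=True)
```

Ranking by RPN is what turns the rating table into a prioritized list of process parameters for development, characterization, or validation.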

Relevance: 30.00%

Abstract:

REASONS FOR PERFORMING STUDY: Efficacy of medications for recurrent airway obstruction is typically tested using clinical, cytological and lung function examinations of severely affected animals. These trials are technically challenging and may not adequately reflect the spectrum of disease and owner complaints encountered in clinical practice. OBJECTIVE: To determine if owners of horses with chronic airway disease are better able to detect drug efficacy than a veterinarian who clinically examines horses infrequently. METHOD: In a double-blinded randomised controlled trial, owners and a veterinarian compared the efficacy of dexamethasone (0.1 mg/kg bwt per os, q. 24 h, for 3 weeks; n = 9) to placebo (n = 8) in horses with chronic airway disease. Before and after treatment, owners scored performance, breathing effort, coughing and nasal discharge using a visual analogue scale (VAS). The clinician recorded vital parameters, respiratory distress, auscultation findings, cough and nasal discharge, airway mucus score, bronchoalveolar lavage fluid (BALF) cytology and arterial blood gases. RESULTS: The VAS score improved significantly in dexamethasone- but not placebo-treated horses. In contrast, the clinician failed to differentiate between dexamethasone- and placebo-treated animals based on clinical observations, BALF cytology or endoscopic mucus score. Respiratory rate (RR) and arterial oxygen pressure (PaO(2)) improved with dexamethasone but not placebo. CONCLUSIONS AND CLINICAL RELEVANCE: In the design of clinical trials of airway disease treatments, more emphasis should be placed on owner-assessed VAS than on clinical, cytological and endoscopic observations made during brief examinations by a veterinarian. Quantifiable indicators reflecting lung function such as RR and PaO(2) provide a good assessment of drug efficacy.

Relevance: 30.00%

Abstract:

Background and Purpose—There is some controversy about the ability of the National Institutes of Health Stroke Scale (NIHSS) score to predict arterial occlusion on MR arteriography and CT arteriography in acute stroke. Methods—We analyzed NIHSS scores and arteriographic findings in 2152 patients (35.4% women, mean age 66±14 years) with acute anterior or posterior circulation strokes. Results—The study included 1603 patients examined with MR arteriography and 549 with CT arteriography. Of those, 1043 patients (48.5%; median NIHSS score 5, median time to clinical assessment 179 minutes) showed an occlusion: 887 in the anterior circulation (median NIHSS score 7, range 0–31) and 156 in the posterior circulation (median NIHSS score 3, range 0–32). Eight hundred sixty of the visualized occlusions (82.5%) were located centrally (ie, in the basilar, intracranial vertebral, or internal carotid artery, or in the M1/M2 segment of the middle cerebral artery). NIHSS scores turned out to be predictive of any vessel occlusion in the anterior circulation. The best cut-off values were NIHSS scores ≥9 within 3 hours after symptom onset (positive predictive value 86.4%) and NIHSS scores ≥7 within >3 to 6 hours (positive predictive value 84.4%). Only 5% of patients with central occlusions presenting within 3 hours had NIHSS scores <4. In the posterior circulation and in patients presenting after 6 hours, the predictive value of the NIHSS score for vessel occlusion was poor. Conclusions—There is a significant association between NIHSS scores and vessel occlusion in patients with anterior circulation strokes. This association is strongest within the first hours after symptom onset; thereafter, and in the posterior circulation, the association is poor.
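The positive predictive value of an NIHSS cut-off, as reported above, is the fraction of patients scoring at or above the cut-off who actually have an occlusion. A minimal sketch with an invented toy cohort (not the study's data):

```python
def positive_predictive_value(patients, cutoff):
    """Among patients scoring at or above the NIHSS cut-off, the fraction
    with a confirmed vessel occlusion. `patients` holds
    (nihss_score, occlusion_present) pairs."""
    flagged = [occluded for score, occluded in patients if score >= cutoff]
    return sum(flagged) / len(flagged) if flagged else float("nan")

# invented toy cohort of (NIHSS score, occlusion on arteriography) pairs
cohort = [(2, False), (4, False), (6, True), (9, True),
          (12, True), (15, True), (10, False), (3, False)]
ppv_9 = positive_predictive_value(cohort, cutoff=9)
```

Sweeping the cut-off over a real cohort, stratified by time from symptom onset, is how values such as the reported ≥9 (PPV 86.4%) within 3 hours would be selected.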