995 results for Standard fire curve
Abstract:
A practical guide is given to help aquaculture researchers identify and correct common problems associated with the colorimetric analysis of water. Hints on making standard solutions, choosing standard concentrations for a standard curve, and making measurements are included. Various types of standard curves and some problems are outlined, and details are provided regarding the evaluation of standard curves.
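To make the curve-evaluation step concrete, here is a minimal Python sketch, not taken from the guide, that fits a linear standard curve of absorbance against concentration, reports its R², and back-calculates an unknown; all concentrations and absorbances are invented for the example.

```python
# Illustrative only: fit and evaluate a linear standard curve
# (absorbance vs. concentration); the values are made up for the example.
import numpy as np

conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])               # standard concentrations (mg/L, hypothetical)
absorbance = np.array([0.002, 0.051, 0.098, 0.197, 0.399, 0.801])

slope, intercept = np.polyfit(conc, absorbance, 1)             # least-squares line
predicted = slope * conc + intercept
r_squared = 1 - np.sum((absorbance - predicted) ** 2) / np.sum((absorbance - absorbance.mean()) ** 2)
print(f"slope={slope:.4f}, intercept={intercept:.4f}, R^2={r_squared:.5f}")

# Interpolate an unknown sample from its measured absorbance
unknown_abs = 0.25
unknown_conc = (unknown_abs - intercept) / slope
print(f"estimated concentration: {unknown_conc:.2f} mg/L")
```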
Abstract:
This paper presents a comparison of fire field model predictions with experiment for the case of a fire within a compartment which is vented (buoyancy-driven) to the outside by a single horizontal ceiling vent. Unlike previous work, the mathematical model does not employ a mixing ratio to represent vent temperatures but allows the model to predict vent temperatures a priori. The experiment suggests that the flow through the vent produces oscillatory behaviour in vent temperatures, with puffs of smoke emerging from the fire compartment. This type of flow is also predicted by the fire field model. While the numerical predictions are in good qualitative agreement with observations, they overpredict the amplitudes of the temperature oscillations within the vent and also the compartment temperatures. The discrepancies are thought to be due to three-dimensional effects not accounted for in this model, as well as to the use of standard ‘practices’ normally adopted by the community with regard to discretization and turbulence models. Furthermore, it is important to note that the use of the k–ε turbulence model in a transient mode, as is done here, may have a significant effect on the results. The numerical results also suggest that a power-law relationship exists between the frequency of vent temperature oscillation (n) and the heat release rate (Q̇) of the type n ∝ Q̇^0.290, similar to that observed for compartments with two horizontal vents. This relationship is predicted to occur only for heat release rates below a critical value. Furthermore, the vent discharge coefficient is found to vary in an oscillatory fashion with a mean value of 0.58. Below the critical heat release rate the mean discharge coefficient is found to be insensitive to fire size.
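As a side note on the reported scaling, the short sketch below shows how an exponent of that kind can be recovered from (Q̇, n) pairs by linear regression in log-log space; the data are synthetic placeholders, not output of the fire field model.

```python
# Hypothetical data: estimate the exponent b in n = a * Q**b by
# linear regression in log-log space (illustration only, not the CFD model).
import numpy as np

Q = np.array([10., 20., 40., 80., 160.])     # heat release rates (kW, made up)
n = 0.7 * Q ** 0.29 * (1 + 0.02 * np.random.default_rng(0).standard_normal(Q.size))

b, log_a = np.polyfit(np.log(Q), np.log(n), 1)
print(f"fitted exponent b = {b:.3f} (expected ~0.29), prefactor a = {np.exp(log_a):.3f}")
```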
Abstract:
This presentation will attempt to address the issue of whether the engineering design community has the knowledge, data and tool sets required to undertake advanced evacuation analysis. In discussing this issue I want to draw on examples not only from the building industry but more widely, from wherever people come into contact with an environment fashioned by man. Prescriptive design regulations the world over suggest that if we follow a particular set of essentially configurational regulations concerning travel distances, number of exits, exit widths, etc., it should be possible to evacuate a structure within a pre-defined acceptable amount of time. In the U.K. for public buildings this turns out to be 2.5 minutes, internationally in the aviation industry it is 90 seconds, in the U.K. rail industry it is 90 seconds, and the international standard adopted by the maritime industry is 60 minutes. The difficulties and shortcomings of this approach are well known, so I will not repeat them here, save to say that it is usually littered with “magic numbers” that do not stand up to scrutiny. As we are focusing on human behaviour issues, it is also worth noting that, more generally, the approach fails to take into account how people actually behave, preferring to adopt an engineer’s view of what people should do in order to make the design work. Examples of the failure of this approach are legion and include the Manchester Boeing 737 fire, the King’s Cross underground station fire, the Piper Alpha oil platform explosion, the Ladbroke Grove rail crash and fire, the Mont Blanc tunnel fire, the Scandinavian Star ferry fire and the Station Nightclub fire.
Abstract:
A robust method for fitting to the results of gel electrophoresis assays of damage to plasmid DNA caused by radiation is presented. This method makes use of nonlinear regression to fit analytically derived dose-response curves to observations of the supercoiled, open circular and linear plasmid forms simultaneously, allowing for more accurate results than fitting to individual forms. Comparisons with a commonly used analysis method show that, while the benefit over that method is relatively small for data sets with small errors, the parameters generated by this method remain much more closely distributed around the true value in the face of increasing measurement uncertainties. This allows parameters to be specified with greater confidence, reflected in a reduction of the errors on fitted parameters. On test data sets, fitted uncertainties were reduced by 30%, similar to the improvement that would be offered by moving from triplicate to fivefold repeats (assuming standard errors). This method has been implemented in a popular spreadsheet package and made available online to improve its accessibility. (C) 2011 by Radiation Research Society
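The paper's analytically derived dose-response curves are not reproduced here, but the simultaneous-fit idea can be illustrated with a simplified break model (single-strand breaks at a rate k_ss per unit dose, double-strand breaks at k_ds) and a single stacked fit over all three plasmid forms; the model, parameter names and data below are assumptions made only for this sketch.

```python
# A minimal sketch of the simultaneous-fit idea under a simplified break model;
# it is not the analytical model of the paper, only an illustration of fitting
# all three plasmid fractions at once with stacked residuals.
import numpy as np
from scipy.optimize import curve_fit

def fractions(dose, k_ss, k_ds):
    s = np.exp(-(k_ss + k_ds) * dose)     # supercoiled fraction
    lin = 1.0 - np.exp(-k_ds * dose)      # linear fraction
    oc = 1.0 - s - lin                    # open circular fraction
    return np.concatenate([s, oc, lin])   # stacked so one fit sees all forms

dose = np.array([0., 5., 10., 20., 40.])  # Gy, hypothetical
true = fractions(dose, 0.05, 0.004)
observed = true + 0.01 * np.random.default_rng(1).standard_normal(true.size)

(k_ss, k_ds), cov = curve_fit(fractions, dose, observed, p0=[0.01, 0.001])
errs = np.sqrt(np.diag(cov))
print(f"k_ss = {k_ss:.4f} +/- {errs[0]:.4f} /Gy, k_ds = {k_ds:.4f} +/- {errs[1]:.4f} /Gy")
```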
Abstract:
Purpose: To compare the diagnostic abilities of the standard bracketing strategy (BR) and a fast strategy, tendency-oriented perimetry (TOP). Methods: Seventy-seven controls and 91 eyes from patients with glaucoma were analyzed with the TOP and BR strategies. Sensitivity (Se), specificity (Sp), the area under the receiver operating characteristic (ROC) curve (AC) and the optimum cutoff value (CO) were calculated for the visual field indices mean defect (MD), the square root of the loss variance (sLV) and the number of pathological points (NPP). Results: In the glaucoma group, the mean MD value using TOP and BR was 7.5 and 8.3 dB, respectively. The mean sLV value using TOP and BR was 5.0 and 5.3 dB, respectively. Indices provided by TOP had higher ROC values than those provided by BR. Using TOP, the index with the best diagnostic ability was sLV (Sp = 94.8, Se = 90.1, AC = 0.966, CO = 2.5 dB), followed by NPP and MD. Using BR, the best results were obtained for MD (Sp = 92.2, Se = 81.3, AC = 0.900, CO = 2.5 dB), followed by sLV and NPP. Conclusions: The fast strategy, TOP, showed better diagnostic ability than the standard BR. Although TOP provided lower LV values than BR, the diagnostic ability of this index was higher than that of the conventional strategy. Copyright © 2005 S. Karger AG.
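For readers who want to reproduce this kind of evaluation, the sketch below computes an ROC area and the sensitivity/specificity at a 2.5 dB cutoff for a hypothetical index; the group sizes match the abstract, but the index values are invented.

```python
# Illustration only: ROC area and sensitivity/specificity at a cutoff for a
# diagnostic index (e.g., sLV in dB); the values below are invented.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
slv_controls = rng.normal(1.8, 0.5, 77)      # hypothetical index values, controls
slv_glaucoma = rng.normal(5.0, 1.5, 91)      # hypothetical index values, glaucoma eyes

y_true = np.r_[np.zeros(77), np.ones(91)]
scores = np.r_[slv_controls, slv_glaucoma]
print("AUC:", round(roc_auc_score(y_true, scores), 3))

cutoff = 2.5                                  # dB
sensitivity = np.mean(slv_glaucoma > cutoff)
specificity = np.mean(slv_controls <= cutoff)
print(f"at cutoff {cutoff} dB: Se={sensitivity:.2f}, Sp={specificity:.2f}")
```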
Abstract:
Every year, particularly during the summer period, Portuguese forests are devastated by forest fires that destroy their ecosystems. In order to prevent these forest fires, public and private authorities frequently use methods for reducing the combustible mass, such as prescribed fire and mechanical vegetation pruning. All of these forest fire prevention methods alter the vegetation layer and/or the soil [1-2]. This work aimed to study the variation of some chemical characteristics of soil subjected to prescribed fire. The studied area, covering 54.6 ha, was located in the Serra da Cabreira (Figure 1). Twenty sampling points were randomly selected and samples were collected with a shovel before, just after the prescribed fire, and 125 and 196 days after that event. The parameters studied were: pH, soil moisture, organic matter, and total iron, magnesium and potassium concentrations. All analyses followed International Standard Methodologies. This work allowed the following conclusions: a) just after the prescribed fire, i) the pH remained practically equal to the initial value and ii) there was a slight increase in the average organic matter and total iron contents; b) at the end of the sampling period, compared to the initial values, i) the pH did not change significantly, ii) the average organic matter content decreased, and iii) the average total contents of Fe, Mg and K increased.
Abstract:
1. A method for obtaining the end-systolic left ventricular (LV) pressure-diameter and stress-diameter relationships in man was critically analyzed. 2. Pressure-diameter and stress-diameter relationships were determined throughout the cardiac cycle by combining standard LV manometry with M-mode echocardiography. Nine adult patients with heart disease and without heart failure were studied during intracardiac catheterization under three different conditions of arterial pressure, i.e., the basal (B) condition (mean ± SD systolic pressure, 102 ± 10 mmHg) and two stable states of arterial hypertension (H(I), 121 ± 12 mmHg; H(II), 147 ± 17 mmHg) induced by venous infusion of phenylephrine after parasympathetic autonomic blockade with 0.04 mg/kg atropine. 3. Significant reflex heart rate variation with arterial hypertension was observed (B, 115 ± 20 bpm; H(I), 103 ± 14 bpm; H(II), 101 ± 13 bpm) in spite of the parasympathetic blockade with atropine. The linear end-systolic pressure-diameter and stress-diameter relationships ranged from 53.0 to 160.0 mmHg/cm and from 97.0 to 195.0 g/cm3, respectively. 4. The end-systolic LV pressure-diameter and stress-diameter relationship lines presented high and variable slopes. The slopes, which are indicators of myocardial contractility, are susceptible to modifications by small deviations in the measurement of the ventricular diameter or by delay in the pressure curve recording.
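To make the slope estimate in point 4 concrete, the following sketch fits a straight line through three hypothetical end-systolic pressure-diameter points, one per afterload state; the numbers are invented and only illustrate the regression step, not the patients' data.

```python
# Sketch (invented numbers): slope of the end-systolic pressure-diameter
# relationship from three afterload states, via least-squares regression.
import numpy as np

es_diameter = np.array([3.5, 3.7, 3.9])      # end-systolic diameter (cm), hypothetical
es_pressure = np.array([100., 122., 145.])   # end-systolic pressure (mmHg), hypothetical

slope, intercept = np.polyfit(es_diameter, es_pressure, 1)
print(f"end-systolic slope = {slope:.1f} mmHg/cm (intercept {intercept:.1f} mmHg)")
```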
Abstract:
Bismuth was evaluated as an internal standard for the direct determination of Pb in vinegar by graphite furnace atomic absorption spectrometry, using Ru as a permanent modifier with co-injection of Pd/Mg(NO3)2. The correlation coefficient of the graph plotted from the normalized absorbance signals of Bi versus Pb was r = 0.989. Matrix effects were evaluated by comparing the slope of the analytical curve obtained from reference solutions prepared in 0.2% (v/v) HNO3 with the slopes of analytical curves obtained from Pb additions to red and white wine vinegar samples. The calculated slope ratios were around 1.04 and 1.02 for analytical curves established with an internal standard, and 1.3 and 1.5 for analytical curves without one. Analytical curves in the 2.5-15 µg L-1 Pb concentration interval were established using the ratio of Pb absorbance to Bi absorbance versus analyte concentration, and typical linear correlations of r = 0.999 were obtained. The proposed method was applied to the direct determination of Pb in 18 commercial vinegar samples, and the Pb concentration varied from 2.6 to 31 µg L-1. Results were in agreement at a 95% confidence level (paired t-test) with those obtained for digested samples. Recoveries of Pb added to vinegars varied from 96 to 108% with, and from 72 to 86% without, an internal standard. Two water standard reference materials diluted in vinegar sample were also analyzed, and the results were in agreement with the certified values at a 95% confidence level. The characteristic mass was 40 pg Pb and the useful lifetime of the tube was around 1600 firings. The limit of detection was 0.3 µg L-1, and the relative standard deviation was <= 3.8% and <= 8.3% (n = 12) for a sample containing 10 µg L-1 Pb with and without the internal standard, respectively. (C) 2007 Elsevier B.V. All rights reserved.
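The internal-standard calibration described above amounts to regressing the Pb/Bi absorbance ratio on concentration and then comparing slopes between aqueous and standard-addition curves; the sketch below shows that arithmetic with invented absorbances (it is not the authors' data or software).

```python
# Illustration with invented absorbances: analytical curve built from the
# Pb/Bi absorbance ratio, plus the slope-ratio check for matrix effects.
import numpy as np

conc = np.array([2.5, 5.0, 7.5, 10.0, 15.0])             # µg/L Pb
abs_pb = np.array([0.020, 0.041, 0.060, 0.082, 0.121])   # hypothetical Pb absorbances
abs_bi = np.array([0.100, 0.101, 0.099, 0.102, 0.100])   # hypothetical Bi (internal standard)

ratio = abs_pb / abs_bi
slope_is, intercept_is = np.polyfit(conc, ratio, 1)      # internal-standard analytical curve

# Slope ratio between a standard-addition curve in vinegar and the aqueous curve:
# values near 1 indicate the internal standard compensates the matrix effect.
slope_addition = 1.04 * slope_is                         # hypothetical matrix-matched slope
print("slope ratio:", round(slope_addition / slope_is, 2))
```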
Abstract:
With the objective of fitting non-linear models, monthly weight records of 10 collared peccary (Pecari tajacu) females, collected over two years at the Álvaro Adolfo experimental breeding facility of Embrapa Amazônia Oriental, Belém, PA, were used. The Von Bertalanffy, Brody, Gompertz and Logistic models were applied. The parameters were estimated using the NLIN procedure of the SAS software. The criteria used to assess model fit were: asymptotic standard deviation (ASD); coefficient of determination (R2); mean absolute residual deviation (ARD); and asymptotic index (AR). The Brody and Logistic models estimated, respectively, the highest (19.44 kg) and the lowest (19.18 kg) asymptotic weight (A), corresponding to the lowest (0.0064 kg/day) and the highest (0.0113 kg/day) maturation rate (K), given the antagonistic nature of these parameters, confirmed by phenotypic correlations ranging from -0.75 to -0.47. The Brody model yielded the lowest ARD value, the decisive factor in giving this model the lowest AR value. Considering the AR, the Brody model showed the best fit; nevertheless, judging by the values found, the other models also fitted the weight data of this species/sex adequately. Based on the AR adopted in this work, the Brody model is recommended for fitting the growth curve of collared peccary (Pecari tajacu) females. Given the estimated values, especially for K, this trait could be included in a selection index. However, studies with more representative groups raised under other conditions are warranted.
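The original fits were run with the NLIN procedure in SAS; as a rough equivalent, the sketch below fits the Brody model W(t) = A(1 - B e^(-Kt)) with scipy, using invented weight-age pairs in place of the peccary records.

```python
# Not the SAS NLIN run from the paper: a minimal Python sketch of fitting the
# Brody growth model W(t) = A * (1 - B * exp(-K * t)) to weight-age data
# (ages in days, weights in kg; values invented for the example).
import numpy as np
from scipy.optimize import curve_fit

def brody(t, A, B, K):
    return A * (1.0 - B * np.exp(-K * t))

age = np.array([30., 90., 180., 270., 360., 540., 720.])
weight = np.array([2.1, 5.8, 10.2, 13.5, 15.8, 18.0, 19.0])

(A, B, K), _ = curve_fit(brody, age, weight, p0=[20.0, 0.9, 0.005])
print(f"A = {A:.2f} kg (asymptotic weight), K = {K:.4f} /day")
```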
Abstract:
This paper presents the design of a high-speed coprocessor for Elliptic Curve Cryptography over a binary Galois field (ECC-GF(2^m)). The purpose of our coprocessor is to accelerate the scalar multiplication performed over elliptic curve points represented in affine coordinates in polynomial basis. Our method consists of using elliptic curve parameters over GF(2^163), in accordance with international security requirements, to implement a bit-parallel coprocessor on a field-programmable gate array (FPGA). Our coprocessor performs modular inversion using a process based on Stein's algorithm. Results are presented and compared to the results of other related works. We conclude that our coprocessor stands comparison with any other ECC hardware proposal, since its speed is comparable to that of projective-coordinate designs.
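The bit-parallel hardware itself is not reproduced here; as a software-level illustration of the underlying field arithmetic, the sketch below implements GF(2^163) multiplication in polynomial basis with the pentanomial x^163 + x^7 + x^6 + x^3 + 1 (the reduction polynomial of the NIST 163-bit binary curves) and obtains inverses by exponentiation, a^(2^m - 2), as a simpler substitute for the Stein-based inverter used in the coprocessor.

```python
# Software sketch of GF(2^163) polynomial-basis arithmetic (not the bit-parallel
# hardware design): elements are Python ints holding bit vectors, and inversion
# uses square-and-multiply exponentiation instead of the Stein-based inverter.
M = 163
F = (1 << 163) | (1 << 7) | (1 << 6) | (1 << 3) | 1   # x^163 + x^7 + x^6 + x^3 + 1

def gf_mul(a, b):
    """Carry-less multiplication of a and b, reduced modulo F."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    # reduce r (degree < 2*M - 1) modulo F
    for i in range(2 * M - 2, M - 1, -1):
        if (r >> i) & 1:
            r ^= F << (i - M)
    return r

def gf_inv(a):
    """Inverse via a^(2^M - 2); the multiplicative group has order 2^M - 1."""
    result, exp = 1, (1 << M) - 2
    while exp:
        if exp & 1:
            result = gf_mul(result, a)
        a = gf_mul(a, a)
        exp >>= 1
    return result

x = 0x3F0
assert gf_mul(x, gf_inv(x)) == 1
```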
Abstract:
In recent years, cancer research has focused on the development of drugs that counteract the formation of new blood vessels (angiogenesis) supplying oxygen and nutrients to tumor tissues, which are necessary for tumor growth and survival. Invasive techniques exist to evaluate the efficacy of these anti-angiogenic drugs: a sample of tumor tissue is taken by biopsy, and microscopic analysis is used to quantify the microvascular density (number of vessels per mm^2) of the sample. However, imaging techniques capable of assessing the effect of such therapies less invasively are gaining ground. Thanks to the technological progress of recent years, computed tomography is among the imaging techniques most used for this purpose, since it offers high spatial and temporal resolution. Computed tomography is used to quantify the perfusion of a contrast agent within tumor lesions by acquiring repeated scans of the lesion volume at short time intervals after injection of the contrast agent. From the acquired images, perfusion parameters are computed using different mathematical models proposed in the literature, implemented in commercial software or developed by research groups. At present there is no standard for the acquisition protocol or for image processing, which has led to poor intra- and inter-patient reproducibility of the results. The literature also lacks a study of the reliability of the computed perfusion parameters. The Computer Vision Group of the University of Bologna has developed a graphical interface that, in addition to computing the perfusion parameters, also provides indices of the quality of those parameters. This thesis, through the analysis of time-concentration curves, aims to study these indices and to assess how different values of these indicators are reflected in particular patterns of the time-concentration curves, in order to identify the presence of artifacts in the CT images that lead to an incorrect estimate of the perfusion parameters. In addition, through the analysis of the color maps of the various error indices, the aim is to identify the regions of the lesions where the perfusion computation is more or less accurate. The work then moves on to the analysis of the processing performed with this interface on several perfusion studies, including a follow-up study, and to the comparison with the information obtained from PET, in order to highlight the clinical usefulness of perfusion analysis. The entire work was carried out on perfusion computed tomography examinations of lung tumors performed at the Diagnostic Imaging Unit of the IRST (Istituto Scientifico Romagnolo per lo Studio e la Cura dei Tumori) in Meldola (FC). Thanks to the ongoing collaboration between the Computer Vision Group and the IRST, it was possible to submit the results obtained to the head of the Diagnostic Imaging Unit, so that they could be compared with clinical considerations.
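The thesis' perfusion models and quality indices are not reproduced here; as a generic illustration of how a perfusion parameter is derived from a time-concentration curve, the sketch below applies the maximum-slope method (blood flow approximated by the peak slope of the tissue curve divided by the peak arterial enhancement) to synthetic curves. None of the curves or constants come from the IRST data or the Computer Vision Group software.

```python
# Illustration of one common perfusion estimate from time-concentration curves,
# the maximum-slope method: BF ~ max d(tissue)/dt / peak(arterial input).
# The curves below are synthetic.
import numpy as np

t = np.arange(0, 40, 1.0)                                   # seconds
aif = 120.0 * np.exp(-0.5 * ((t - 12) / 3.0) ** 2)          # arterial input curve (HU), bolus-like
tissue = 18.0 / (1 + np.exp(-(t - 14) / 2.5))               # tissue enhancement curve (HU)

max_slope = np.max(np.gradient(tissue, t))                  # HU/s
peak_aif = np.max(aif)                                      # HU
bf = max_slope / peak_aif * 100.0 * 60.0                    # mL/min per 100 mL of tissue
print(f"max slope = {max_slope:.2f} HU/s, peak AIF = {peak_aif:.1f} HU, BF ~= {bf:.1f} mL/min/100 mL")
```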
Abstract:
This work aims to investigate the nature and properties of polynomials expressed in the Bernstein basis. Originally introduced at the beginning of the 20th century to solve the problem of approximating a continuous function on a closed and bounded interval of the real line (Stone-Weierstrass theorem), they achieved wide success only from the 1960s onwards, when they were applied in computer graphics to construct the so-called Bezier curves. These curves, inheriting their geometric properties from the analytic properties of the Bernstein polynomials, are intuitive and easily manipulated by interactive software, and they underlie all modern curve design: from industrial design to CAD systems, from the SVG standard to the representation of character fonts.
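As a small illustration of how Bernstein polynomials drive Bezier curves in practice, the sketch below evaluates a cubic Bezier curve from its control points twice, once through the Bernstein basis and once with de Casteljau's algorithm, and checks that the two agree; the control points are arbitrary.

```python
# Evaluate a cubic Bezier curve via the Bernstein basis and via de Casteljau.
import numpy as np
from math import comb

def bernstein_point(ctrl, t):
    n = len(ctrl) - 1
    basis = np.array([comb(n, i) * t**i * (1 - t)**(n - i) for i in range(n + 1)])
    return basis @ ctrl                       # weighted sum of control points

def de_casteljau(ctrl, t):
    pts = np.array(ctrl, dtype=float)
    while len(pts) > 1:                       # repeated linear interpolation
        pts = (1 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

control = np.array([[0., 0.], [1., 2.], [3., 3.], [4., 0.]])
for t in (0.0, 0.25, 0.5, 1.0):
    assert np.allclose(bernstein_point(control, t), de_casteljau(control, t))
print(de_casteljau(control, 0.5))             # point on the curve at t = 0.5
```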