940 results for flood frequency evaluation
Abstract:
Objectives: The purpose of this study was to evaluate the effectiveness of the Danger Rangers Fire Safety Curriculum in increasing the fire safety knowledge of low-income, minority children in pre-kindergarten to third grade in Austin, TX, during a summer day camp in 2007.
Methods: Data were collected from child participants via teacher- and researcher-administered tests at pretest, posttest (immediately after completion of the fire safety module), and at a 3-week follow-up to assess retention. In addition, a self-administered questionnaire was collected from parents pre- and post-intervention to assess home-related fire/burn risk factors. Paired t-tests were conducted using Stata 12.0 to evaluate pretest, posttest, and retention test mean scores, as well as the mean number of fire safety rules listed, by grade group. McNemar's test was used to determine whether there was a difference in fire-related risk factors as reported by the participants' parents before and after the intervention. Only those who had paired data for the tests/surveys being compared were included in the analysis.
Results: The first/second grade group and the third grade group scored significantly higher on fire safety knowledge on the posttest compared to the pretest (p<0.0001 for both groups). However, there was no significant change in knowledge scores for the pre-kindergarten/kindergarten group (p=0.14). Among the first/second grade group, knowledge did not significantly decline between the posttest and retention test (p=0.25); however, the third grade group had significantly lower fire safety knowledge scores on the retention test compared to the posttest (p<0.001). A similar increase was seen in the number of fire safety rules listed after the intervention (p<0.0001 between pretest and posttest for both the first/second grade and third grade groups), with no decline from the posttest to the retention test for the first/second grade group (p=0.50) but a significant decline in the third grade group (p=0.001). McNemar's test showed a significant increase in the percentage of participants' parents reporting regular smoke detector testing and having a family fire escape plan after the intervention (p=0.01 and p<0.0001, respectively). However, there was no significant change in the frequency of reports of the child playing in the kitchen while the parent cooks, or of the house/apartment having a working smoke detector.
Conclusion: We found that general fire safety knowledge improved and the number of specific fire safety rules listed increased among the first to third grade children who participated in the Danger Rangers fire safety program. However, the program did not significantly increase general fire safety knowledge in the pre-k/kindergarten group. This study also showed that a program targeted at children has the potential to influence familial risk factors by proxy. The Danger Rangers Fire Safety Curriculum should be further evaluated through a randomized controlled trial, using valid measures that assess fire safety attitudes, beliefs, and behaviors, as well as fire/burn-related outcomes.
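The two statistical analyses named in the abstract above can be sketched as follows; the scores and the 2×2 table are hypothetical illustrations, not the study's data.

```python
# Sketch of the paired t-test and McNemar's test described above.
# All numbers are hypothetical illustrations, not the study's data.
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical pretest/posttest knowledge scores for the same 10 children.
pre = np.array([4, 5, 3, 6, 5, 4, 7, 5, 6, 4])
post = np.array([7, 8, 6, 8, 7, 6, 9, 8, 8, 7])
t_stat, p_paired = ttest_rel(post, pre)  # paired t-test on the same children

# McNemar's test for a paired yes/no item (e.g. "has a fire escape plan"),
# asked before and after the intervention; it uses only the discordant pairs.
#              after: yes   after: no
# before: yes      20            3
# before: no       15           12
b, c = 3, 15                             # discordant counts
chi2 = (abs(b - c) - 1) ** 2 / (b + c)   # continuity-corrected statistic
```

The paired test uses each child as their own control, which is why only participants with paired data could enter the analysis.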
Abstract:
Background: HIV-associated B cell exhaustion is a notable characteristic of HIV-viremic adults. However, it is not known whether such alterations are present in perinatally HIV-infected children, whose viral dynamics differ from those seen in adults. In the present study we analyzed B cell subsets and measured antigen-specific memory B cells (MBC) in a pediatric HIV-infected cohort.
Methods: Peripheral blood mononuclear cells (PBMC) of perinatally HIV-infected individuals were characterized by FACS into naïve (CD21hi/CD27−), classical (CD27+), tissue-like (CD21lo/CD27−) and activated (CD27+CD21−) MBC. A memory ELISPOT assay was used to detect antibody-secreting cells. We measured total IgG and antibodies specific for influenza, HBV, mumps, measles, rubella, and VZV. Memory was expressed as spot-forming cells (SPC) per million PBMC. The Wilcoxon rank-sum test was used to compare unpaired groups, and linear regression analysis was used to determine predictors of B cell dysfunction.
Results: Forty-one perinatally HIV-infected children were included (51.2% female and 65.9% Black). Median (range) age at study was 8.78 years (4.39–11.57). At the time of testing they had a CD4% of 30.9 (23.2–39.4), a viral load (VL) of 1.95 log10 copies/ml (1.68–3.29), and a cumulative VL of 3.4 log10 copy × days (2.7–4.0). Ninety-two percent of the children had been on cARV for >6 months. Overall, HIV+ children had significantly fewer IgG and antigen-specific SPC than controls. In addition, they had a lower proportion of classical MBC (12.9 (8.09–19.85) vs 29.4 (18.7–39.05); p=0.01) but a significantly higher proportion of tissue-like MBC (6.01 (2.79–12.7) vs 0.99 (0.87–1.38); p=0.003) compared with controls. Patients were stratified by VL (<400 and ≥400 copies/ml) to evaluate the effect of VL on B cell status. Patients with a VL ≥400 copies/ml had significantly lower IgG, HBV, measles, rubella, and VZV SPC than those with a VL <400 copies/ml. There were no significant differences in B cell subpopulations between the groups. A moderate negative correlation was observed between the time of cARV initiation and the frequency of IgG memory B cells, suggesting that early initiation of cARV leads to better functionality of the IgG memory B cells (p=0.05). A statistically significant positive correlation was observed between the total number of IgG memory B cells and the number of antigen-specific SPC, suggesting that progressive recovery of the IgG memory B cell pool is accompanied by a progressive increase in the number of antigen-specific SPC.
Conclusion: A pediatric cohort in overall good status with respect to HIV infection and on ART has defects in B cell function and numbers (reduced total and antigen-specific MBC, increased tissue-like MBC, and reduced classical MBC).
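The unpaired group comparison described above (Wilcoxon rank-sum between patients and controls) can be sketched as follows; the SPC counts are made up for illustration.

```python
# Sketch of the Wilcoxon rank-sum comparison of SPC between two unpaired
# groups. The SPC counts below are invented for illustration only.
from scipy.stats import ranksums

hiv_spc = [12, 8, 15, 10, 9, 14, 7, 11]         # hypothetical IgG SPC/million PBMC
control_spc = [40, 35, 52, 47, 38, 44, 50, 41]  # hypothetical control values
stat, p = ranksums(hiv_spc, control_spc)        # rank-based test, no normality assumed
```

A rank-based test is a common choice here because ELISPOT counts are typically skewed and the groups are small.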
Abstract:
A 450 year spring-summer flood layer time series at seasonal resolution has been established from the varved sediment record of Lake Ammersee (southern Germany), applying a novel methodological approach. The main results are (1) the attainment of a precise chronology by microscopic varve counting, (2) the identification of detrital layers representing flood-triggered fluxes of catchment material into the lake, and (3) the recognition of the seasonality of these flood layers from their microstratigraphic position within a varve. Tracing flood layers in a proximal and a distal core and correlating them by application of the precise chronology provided information on the depositional processes. Comparing the seasonal flood layer record with daily runoff data of the inflowing River Ammer for the period from 1926 to 1999 allowed the definition of an approximate threshold in flood magnitude above which the formation of flood layers becomes very likely. Moreover, it was possible for the first time to estimate the "completeness" of the flood layer time series and to recognize that mainly floods in spring and summer, representing the main flood seasons in this region, are well preserved in the sediment archive. Their frequency distribution over the entire 450 year time series is not stationary but reveals maxima for colder periods of the Little Ice Age when solar activity was reduced. The observed spring-summer flood layer frequency further shows trends similar to those of the occurrence of flood-prone weather regimes since A.D. 1881, probably suggesting a causal link between solar variability and changes in midlatitude atmospheric circulation patterns.
Abstract:
Processes of founding and expanding cities in coastal areas have undergone great changes over time, driven by environmental conditions. Coastal settlements sought sites above flood levels and away from swamps and other wetlands whenever possible. As populations grew, cities expanded while trying to avoid low-lying and wet land, but no city has been able to limit its growth. The risk of flooding can never be eliminated, only reduced to the extent possible. Flooding of coastal areas is today dramatically attributed to eustatic sea level rise caused by global climate change, but this can be inaccurate. Current climate change is generating an average upward trend in sea level, yet other regional and local factors accentuate this trend in some places and attenuate, or even reverse, it in others. The intensity and frequency of coastal flooding around the planet are therefore not so much the result of this general eustatic rise alone as of the superposition of marine and crustal dynamic elements, the former also climate-related, which produce temporary short-term rises in mean sea level. Since the Little Ice Age the planet has undergone a global warming leading to sea level rise. The idea that this warming largely obeys anthropogenic factors may be attributed to Arrhenius (1896), though it gained prominence only after the 1960s. Never before had the human factor been capable of such an influence on climate. However, other types of sea level change are also apparent, resulting from vertical movements of the crust, from modifications of sea basins as continents fracture, drift and come together, or from different climate patterns. Coastal zones are thus doubly susceptible to floods. Precipitation immediately triggers pluvial flooding; if it continues upland, or when snow and glaciers melt, fluvial flooding can eventually occur. Urban development introduces further modifying factors.
Additional interference is caused by river and wastewater drainage systems. Climate also influences sea levels in coastal areas, where tides as well as the structure and dynamics of the geoid and the crust come into play. From the sea, waters can flood over, break, or push back berms and other coastline barriers. The sea level, which controls the mouth of the main channel of the basin's drainage system, is ultimately what governs flood levels: a temporary rise in sea level acts as a dam at the mouth. Even in the absence of global climate change, therefore, floods are likely to increase in many urban coastal areas, and innovative methodologies and practices will be needed to make cities more flood-resilient.
Abstract:
Two new features were proposed and used by the Universidad Politécnica de Madrid in the Rich Transcription Evaluation 2009, and they outperform the baseline system. The first feature is the intensity channel contribution, a feature related to the location of the speaker. The second is the logarithm of the interpolated fundamental frequency. This is the first time both features have been applied to the clustering stage of diarization of multiple-distant-microphone meetings. It is shown that the inclusion of both features improves the baseline results by 15.36% and 16.71% relative on the development set and the RT09 set, respectively. Considering speaker errors only, the relative improvement is 23% and 32.83% on the development set and the RT09 set, respectively.
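The second feature can be illustrated with a minimal sketch: given a frame-level F0 track with unvoiced frames marked as zero, interpolate across the gaps and then take the logarithm. The pitch values are synthetic, and the evaluation system's actual extraction pipeline is not reproduced here.

```python
# Minimal sketch of a log-interpolated-F0 feature; the pitch values are
# synthetic and the real system's extraction details are not reproduced.
import numpy as np

f0 = np.array([120.0, 0.0, 0.0, 130.0, 125.0, 0.0, 140.0])  # 0 = unvoiced frame
voiced = f0 > 0
# Linearly interpolate F0 across unvoiced gaps, then take the logarithm.
f0_interp = np.interp(np.arange(len(f0)), np.flatnonzero(voiced), f0[voiced])
log_f0 = np.log(f0_interp)
```

Interpolating before the log keeps the feature defined on every frame, which is what a frame-synchronous clustering stage requires.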
Abstract:
This study characterises the abatement effect of large dams with fixed-crest spillways under extreme design flood conditions. In contrast to previous studies, which used specific hydrographs for flow into the reservoir and simplifications to obtain analytical solutions, an automated tool was designed for calculations based on a Monte Carlo simulation environment, which integrates models representing the different physical processes in watersheds with areas of 150–2000 km². The tool was applied to 21 sites uniformly distributed throughout continental Spain, with 105 fixed-crest dam configurations. It allowed a set of hydrographs to be obtained as an approximation of the hydrological forcing of a dam, and the response of the dam to this forcing to be characterised. For all cases studied, we obtained a strong linear correlation between the peak flow entering the reservoir and the peak flow discharged by the dam, and a simple general procedure was proposed to characterise the peak-flow attenuation behaviour of the reservoir. Additionally, two dimensionless coefficients were defined to relate the variables governing both the generation of the flood and its abatement in the reservoir. Using these coefficients, a model was defined to estimate the flood abatement effect of a reservoir from the available information. This model should be useful in the hydrological design of spillways and in the evaluation of the hydrological safety of dams. Finally, the proposed procedure and model were evaluated and representative applications were presented.
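The strong linear relation reported between inflow peak and discharged peak can be sketched with synthetic data; the slope, intercept, and noise level below are invented numbers, not fitted values from the study.

```python
# Sketch of fitting a linear peak-flow attenuation relation Qout ≈ a·Qin + b.
# The synthetic slope (0.7), intercept (-20) and noise level are invented.
import numpy as np

rng = np.random.default_rng(0)
q_in = np.linspace(100.0, 2000.0, 50)                 # peak inflow, m³/s
q_out = 0.7 * q_in - 20.0 + rng.normal(0.0, 5.0, 50)  # attenuated peak outflow

slope, intercept = np.polyfit(q_in, q_out, 1)  # least-squares line
r = np.corrcoef(q_in, q_out)[0, 1]             # strength of the linear relation
```

Such a fitted line, one per dam configuration, is the kind of simple attenuation characterisation the abstract describes.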
Abstract:
This paper presents a security evaluation, energy consumption optimization, and spectrum scarcity analysis of artificial noise techniques for increasing physical-layer security in Cognitive Wireless Sensor Networks (CWSNs). These techniques introduce noise into the spectrum in order to hide real information, but they directly affect two important parameters in CWSNs: energy consumption and spectrum utilization. Both are affected because the number of packets transmitted by the network and the active period of the nodes increase. The security evaluation demonstrates that these techniques are effective against eavesdropping attacks, while the optimization makes these approaches feasible in low-resource networks such as CWSNs. In this work, the scenario is formally modeled, and the optimization based on the simulation results and the impact analysis on the frequency spectrum are presented.
Abstract:
In this paper, the applicability of the FRA technique as a method for detecting inter-turn faults in stator windings is discussed. Firstly, the method is tested on an individual medium-voltage stator coil, with satisfactory results. Secondly, the tests are extended to a medium-voltage induction motor stator winding, in which inter-turn faults are introduced at every coil end of one phase. The frequency response under inter-turn faults is evaluated in both cases for different fault resistance values, and the experimental setup is described for each experiment. The results of applying this technique to the detection of inter-turn faults justify further research into optimizing it for preventive maintenance.
Abstract:
Sport and Performance Psychology is an ever-evolving specialty, and its development has not been without challenges. Sport and performance psychologists work in a variety of settings and often come from similar, yet inherently different, training backgrounds. Individuals from both the sport sciences and psychology have made compelling arguments as to which approach provides quality services to their respective clients. The questions that remain, however, are: what are these quality services? Who are the clients, and what do they need from professionals in the field? Collegiate student athletes inherently face a number of typical issues related to their age and development, as well as a number of atypical difficulties arising from their status as student athletes. As such, they provide an adequate sample of potential presenting issues for sport and performance psychologists. The current study used a qualitative, exploratory method to evaluate the presenting issues that brought clients to seek services from professionals. This paper seeks to establish a foundation for a theoretical basis of the psychology of human performance, encompassing both performance and general mental health concerns, and for how sport and performance psychologists can most effectively intervene in this process. It is based on an analysis of seven years of service delivery within an NCAA Division I athletic department. Results indicate that collegiate student athletes seek services related to performance enhancement and to general mental health at relatively equal frequency. As such, training and service delivery in both areas would be most beneficial and best serve this population.
Abstract:
Increasing economic competition drives industry to implement tools that improve process efficiency. Process automation is one of these tools, and Real Time Optimization (RTO) is an automation methodology that considers economic aspects to update the process control in accordance with market prices and disturbances. Basically, RTO uses a steady-state phenomenological model to predict the process behavior and then optimizes an economic objective function subject to this model. Although largely implemented in industry, there is no general agreement about the benefits of implementing RTO, owing to some limitations discussed in the present work: structural plant/model mismatch, identifiability issues, and the low frequency of set-point updates. Some alternative RTO approaches have been proposed in the literature to handle the problem of structural plant/model mismatch; however, no systematic comparison has evaluated the scope and limitations of these approaches under different aspects. For this reason, the classical two-step method is compared with more recent derivative-based methods (Modifier Adaptation; Integrated System Optimization and Parameter Estimation; and Sufficient Conditions of Feasibility and Optimality) using a Monte Carlo methodology. The results of this comparison show that the classical RTO method is consistent, provided the model is flexible enough to represent the process topology, the parameter estimation method is appropriate for the measurement noise characteristics, and a method is available to improve the quality of the sample information. At each iteration, the RTO methodology updates key parameters of the model, where identifiability issues caused by lack of measurements and by measurement noise can be observed, resulting in poor prediction ability. Therefore, four different parameter estimation approaches (Rotational Discrimination; Automatic Selection and Parameter Estimation; Reparametrization via Differential Geometry; and classical nonlinear Least Squares) are evaluated with respect to prediction accuracy, robustness, and speed. The results show that the Rotational Discrimination method is the most suitable for implementation in an RTO framework, since it requires less a priori information, is simple to implement, and avoids the overfitting caused by the Least Squares method. The third RTO drawback discussed in the present thesis is the low frequency of set-point updates, which lengthens the period during which the process operates at suboptimal conditions. An alternative to handle this problem is proposed in this thesis by integrating classic RTO and Self-Optimizing Control (SOC) using a new Model Predictive Control strategy. The new approach demonstrates that it is possible to mitigate the problem of infrequent set-point updates, improving economic performance. Finally, the practical aspects of RTO implementation are examined in an industrial case study, a Vapor Recompression Distillation (VRD) process located in the Paulínea refinery of Petrobras. The conclusions of this study suggest that the model parameters are successfully estimated by the Rotational Discrimination method; that RTO is able to improve the process profit by about 3%, equivalent to 2 million dollars per year; and that the integration of SOC and RTO may be an interesting control alternative for the VRD process.
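The classical two-step RTO loop described above (fit a model parameter to plant measurements, then optimize the economic objective subject to the updated model) can be sketched with a toy scalar example; the plant, the model structure, and the prices are all invented for illustration.

```python
# Toy sketch of the classical two-step RTO loop: (1) fit a model parameter
# to a plant measurement, (2) optimize an economic objective on the model.
# The plant, model structure, and prices below are invented for illustration.
from scipy.optimize import minimize_scalar

def plant(u):
    """'True' steady-state process response (unknown to the optimizer)."""
    return 2.0 * u - 0.1 * u ** 2

def model(u, k):
    """Steady-state model with one adjustable gain parameter k."""
    return k * u - 0.1 * u ** 2

u = 5.0  # current set point
for _ in range(5):
    y = plant(u)                # step 1: measure the plant at the set point
    k = (y + 0.1 * u ** 2) / u  # fit k so the model matches the measurement
    # step 2: maximize profit = price*yield - cost*input under the model
    res = minimize_scalar(lambda v: -(3.0 * model(v, k) - 1.0 * v),
                          bounds=(0.0, 20.0), method="bounded")
    u = res.x                   # updated set point for the next iteration
```

With no structural mismatch, as in this toy, the loop converges to the plant optimum; the plant/model mismatch and identifiability problems discussed in the thesis arise precisely when step 1 cannot make the model match the plant.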
Abstract:
The present paper addresses the analysis of structural vibration transmission in the presence of structural joints. The problem is tackled numerically, analyzing several scenarios using finite element models. The numerical results are then compared with those evaluated using the vibration reduction index concept of the EN 12354 standard. It is shown that, even for the simplest cases, the behavior of a structural joint is complex and exhibits frequency dependence. Comparison with results obtained by empirical formulas reveals that the standard formulas cannot accurately reproduce the expected behavior, indicating that alternative or complementary calculation procedures are required. A simple methodology to estimate the difference between numerical and standard predictions is proposed here, allowing the calculation of an adaptation term that makes both approaches converge. This term was found to be solution-dependent, and thus should be evaluated for each structure.