948 results for deterministic safety analysis
First order k-th moment finite element analysis of nonlinear operator equations with stochastic data
Abstract:
We develop and analyze a class of efficient Galerkin approximation methods for uncertainty quantification of nonlinear operator equations. The algorithms are based on sparse Galerkin discretizations of tensorized linearizations at nominal parameters. Specifically, we consider abstract, nonlinear, parametric operator equations J(\alpha, u) = 0 for random input \alpha(\omega) with almost sure realizations in a neighborhood of a nominal input parameter \alpha_0. Under some structural assumptions on the parameter dependence, we prove existence and uniqueness of a random solution, u(\omega) = S(\alpha(\omega)). We derive a multilinear, tensorized operator equation for the deterministic computation of k-th order statistical moments of the random solution's fluctuations u(\omega) - S(\alpha_0). We introduce and analyze sparse tensor Galerkin discretization schemes for the efficient, deterministic computation of the k-th statistical moment equation. We prove a shift theorem for the k-point correlation equation in anisotropic smoothness scales and deduce that sparse tensor Galerkin discretizations of this equation converge at an accuracy-versus-complexity rate which equals, up to logarithmic terms, that of the Galerkin discretization of a single instance of the mean-field problem. We illustrate the abstract theory for nonstationary diffusion problems in random domains.
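The construction described above can be summarized schematically (in our notation, which may differ from the paper's) by the first-order linearization at the nominal parameter and the resulting deterministic k-th moment equation:

```latex
% Linearize J(\alpha, u) = 0 at (\alpha_0, u_0), with u_0 = S(\alpha_0),
% \delta\alpha = \alpha(\omega) - \alpha_0 and \delta u = u(\omega) - u_0:
D_u J(\alpha_0, u_0)\,\delta u
  = -\,D_\alpha J(\alpha_0, u_0)\,\delta\alpha + o(\|\delta\alpha\|)
% Tensorizing k copies and taking expectations yields a deterministic
% equation for the k-th moment \mathcal{M}^k u := \mathbb{E}[(\delta u)^{\otimes k}]:
\bigl(D_u J(\alpha_0, u_0)\bigr)^{\otimes k}\,\mathcal{M}^k u
  = \bigl(-\,D_\alpha J(\alpha_0, u_0)\bigr)^{\otimes k}\,\mathcal{M}^k \alpha
```

with \mathcal{M}^k \alpha := \mathbb{E}[(\delta\alpha)^{\otimes k}]; the sparse tensor Galerkin schemes then discretize this tensorized, deterministic equation.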
Abstract:
Tremendous progress in plant proteomics driven by mass spectrometry (MS) techniques has been made since 2000, when few proteomics reports were published and plant proteomics was in its infancy. These achievements include the refinement of existing techniques and the search for new techniques to address food security, safety, and health issues. It is projected that in 2050 the world's population will reach 9–12 billion people, demanding an increase in food production of 34–70% over today's levels (FAO, 2009). Providing food for such a demand in a sustainable and environmentally committed manner, without threatening natural resources, requires that agricultural production increase significantly and that postharvest handling and food manufacturing systems become more efficient, with lower energy expenditure, fewer postharvest losses, less waste generation, and food with a longer shelf life. There is also a need to look for protein sources alternative to animal-based ones (i.e., plant-based) to fulfill the increased protein demand by 2050. Thus, plant biology has a critical role to play as a science capable of addressing such challenges. In this review, we discuss proteomics, especially MS, as a platform utilized in plant biology research over the past 10 years with the potential to expedite the understanding of plant biology for human benefit. The increasing application of proteomics technologies to food security, analysis, and safety is emphasized in this review. We are aware, however, that no single approach or technology can address the global food issues. Proteomics-generated information and resources must be integrated and correlated with other omics-based approaches, information, and conventional programs to ensure sufficient food and resources for human development now and in the future.
Abstract:
The performance of rank-dependent preference functionals under risk is comprehensively evaluated using Bayesian model averaging. Model comparisons are made at three levels of heterogeneity plus three ways of linking deterministic and stochastic models: differences in utilities, differences in certainty equivalents, and contextual utility. Overall, the "best model", which is conditional on the form of heterogeneity, is a form of Rank Dependent Utility or Prospect Theory that captures the majority of behaviour at both the representative-agent and individual level. However, the curvature of the probability weighting function for many individuals is S-shaped, or ostensibly concave or convex, rather than the inverse S-shape commonly employed. Also, contextual utility is broadly supported across all levels of heterogeneity. Finally, the Priority Heuristic model, previously examined within a deterministic setting, is estimated within a stochastic framework; allowing for endogenous thresholds does improve model performance, although it does not compete well with the other specifications considered.
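The contrast between the inverse S-shaped and S-shaped probability weighting discussed above can be illustrated with the one-parameter Tversky–Kahneman weighting function; this particular parametric form is our illustrative choice, not necessarily the family estimated in the paper:

```python
def tk_weight(p: float, gamma: float) -> float:
    """Tversky-Kahneman (1992) probability weighting function:
    w(p) = p^g / (p^g + (1-p)^g)^(1/g).
    gamma < 1 gives the familiar inverse S-shape (small probabilities
    overweighted); gamma > 1 gives an S-shape (small probabilities
    underweighted)."""
    num = p ** gamma
    den = (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)
    return num / den

# Inverse S (gamma < 1): a 5% chance is treated as more likely than 5%.
assert tk_weight(0.05, 0.61) > 0.05
# S-shape (gamma > 1): the same 5% chance is underweighted.
assert tk_weight(0.05, 1.5) < 0.05
```

Fitting gamma per individual is what distinguishes the shapes the abstract reports: many estimated curvatures fall on the gamma > 1 side rather than the conventionally assumed gamma < 1.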
Abstract:
Social domains are classes of interpersonal processes, each with distinct procedural rules underpinning mutual understanding, emotion regulation and action. We describe the features of three domains of family life – safety, attachment and discipline/expectation – and contrast them with exploratory processes in terms of the emotions expressed, the role of certainty versus uncertainty, and the degree of hierarchy in an interaction. We argue that everything that people say and do in family life carries information about the type of interaction they are engaged in – that is, the domain. However, sometimes what they say or how they behave does not make the domain clear, or participants in the social interactions are not in the same domain (there is a domain mismatch). This may result in misunderstandings, irresolvable arguments or distress. We describe how it is possible to identify domains and judge whether they are clear or unclear, and matched or mismatched, in observed family interactions and in accounts of family processes. This then provides a focus for treatment and helps to define criteria for evaluating outcomes.
Abstract:
This study has explored the prediction errors of tropical cyclones (TCs) in the European Centre for Medium-Range Weather Forecasts (ECMWF) Ensemble Prediction System (EPS) for the Northern Hemisphere summer period for five recent years. Results for the EPS are contrasted with those for the higher-resolution deterministic forecasts. Various metrics of location and intensity errors are considered and contrasted for verification based on IBTrACS and the numerical weather prediction (NWP) analysis (NWPa). Motivated by the aim of exploring extended TC life cycles, location and intensity measures are introduced based on lower-tropospheric vorticity and contrasted with traditional verification metrics. Results show that location errors are almost identical when verified against IBTrACS or the NWPa. However, intensity, in the form of the mean sea level pressure (MSLP) minima and 10-m wind speed maxima, is significantly underpredicted relative to IBTrACS. Using the NWPa for verification results in much better consistency between the different intensity error metrics and indicates that the lower-tropospheric vorticity provides a good indication of vortex strength, with error results showing similar relationships to those based on MSLP and 10-m wind speeds for the different forecast types. The interannual variation in forecast errors is discussed in relation to changes in the forecast and NWPa system, and variations in forecast errors between different ocean basins are discussed in terms of the propagation characteristics of the TCs.
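The location errors referred to above are conventionally computed as the great-circle distance between the forecast and observed cyclone centres; a minimal sketch, assuming positions given in degrees and a spherical Earth:

```python
import math

def track_error_km(lat_f, lon_f, lat_o, lon_o):
    """Great-circle (haversine) distance in km between a forecast
    cyclone centre (lat_f, lon_f) and an observed centre (lat_o, lon_o),
    both in degrees."""
    R = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat_f), math.radians(lat_o)
    dp = math.radians(lat_o - lat_f)
    dl = math.radians(lon_o - lon_f)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# One degree of latitude is roughly 111 km.
err = track_error_km(20.0, 140.0, 21.0, 140.0)
assert 110.0 < err < 112.5
```

Mean or median values of this distance over all matched forecast–observation pairs give the location-error statistics that the study compares across the EPS and the deterministic forecasts.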
Abstract:
This paper presents a GIS-based multicriteria flood risk assessment and mapping approach applied to coastal drainage basins where hydrological data are not available. It addresses the risk of several types of process: coastal inundation (storm surge) and river, estuarine and flash floods, in either urban or natural areas, and fords. Based on the causes of these processes, several environmental indicators were used to build up the risk assessment. Geoindicators include geological-geomorphological properties of Quaternary sedimentary units, water table, drainage basin morphometry, coastal dynamics, beach morphodynamics and microclimatic characteristics. Bioindicators involve coastal plain and low-slope native vegetation categories and two alteration states. Anthropogenic indicators encompass land use categories and properties such as type, occupation density, urban structure type and degree of occupation consolidation. The selected indicators were stored within an expert Geoenvironmental Information System developed for the State of Sao Paulo Coastal Zone (SIIGAL), whose attributes were mathematically classified through deterministic approaches in order to estimate natural susceptibilities (Sn), human-induced susceptibilities (Sa), the return period of rain events (Ri), potential damages (Dp) and the risk classification (R), according to the equation R = (Sn.Sa.Ri).Dp. Thematic maps were automatically processed within the SIIGAL, in which automata cells ("geoenvironmental management units") aggregating geological-geomorphological and land use/native vegetation categories were the units of classification. The method has been applied to 32 small drainage basins on the Northern Littoral of the State of Sao Paulo (Brazil) and proved very useful for coastal zone public policy, civil defense programs and flood management.
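The risk classification R = (Sn.Sa.Ri).Dp amounts to a per-cell product of indicator scores; a minimal sketch, where the assumption that each indicator has already been classified onto a [0, 1] scale is ours, for illustration:

```python
def flood_risk(sn, sa, ri, dp):
    """Risk classification R = (Sn * Sa * Ri) * Dp for one
    geoenvironmental management unit (cell), with
    Sn: natural susceptibility, Sa: human-induced susceptibility,
    Ri: return-period score for rain events, Dp: potential damage.
    All inputs are assumed pre-classified to [0, 1] scores."""
    for v in (sn, sa, ri, dp):
        if not 0.0 <= v <= 1.0:
            raise ValueError("indicator scores must lie in [0, 1]")
    return (sn * sa * ri) * dp

# Worst case everywhere gives maximal risk; any low factor damps R.
assert flood_risk(1.0, 1.0, 1.0, 1.0) == 1.0
assert flood_risk(0.8, 0.5, 0.5, 0.9) < flood_risk(0.8, 0.9, 0.9, 0.9)
```

The multiplicative form means a cell is only classified as high risk when susceptibility, rainfall recurrence and potential damage are all high, which matches the intent of combining causes with consequences.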
Abstract:
We have investigated plasma turbulence at the edge of a tokamak plasma using data from electrostatic potential fluctuations measured in the Brazilian tokamak TCABR. Recurrence quantification analysis has been used to provide diagnostics of the deterministic content of the series. We have focused our analysis on the radial dependence of potential fluctuations and their characterization by recurrence-based diagnostics. Our main result is that the deterministic content of the experimental signals is most pronounced at the external part of the plasma column just before the plasma radius. Since the chaoticity of the signals follows the same trend, we have concluded that the electrostatic plasma turbulence at the tokamak plasma edge can be partially explained by means of a deterministic nonlinear system. (C) 2007 Elsevier B.V. All rights reserved.
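The recurrence-based diagnostics mentioned above can be sketched for a scalar series: build a binary recurrence matrix, then compute the determinism measure DET, the fraction of recurrence points lying on diagonal line structures (a standard RQA quantity; the thresholds and embedding actually used on the TCABR signals are not specified here):

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix: R[i, j] = 1 if |x[i] - x[j]| < eps."""
    x = np.asarray(x, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

def determinism(R, lmin=2):
    """DET: fraction of recurrence points on diagonal lines of length
    >= lmin, excluding the main diagonal. High DET suggests
    deterministic structure; low DET suggests stochastic behaviour."""
    n = R.shape[0]
    on_lines = 0
    total = 0
    for k in range(1, n):                # off-diagonals of the upper triangle
        diag = np.diagonal(R, offset=k)
        total += 2 * diag.sum()          # count both triangles by symmetry
        run = 0
        for v in np.append(diag, 0):     # trailing 0 closes the final run
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += 2 * run
                run = 0
    return on_lines / total if total else 0.0

# A periodic signal recurs along full diagonals: DET = 1.
assert determinism(recurrence_matrix([0, 1, 2] * 5, 0.5)) == 1.0
```

For experimental fluctuation data, DET computed radially across the plasma column is the kind of diagnostic that localizes where the deterministic content is most pronounced.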
Abstract:
Recently, the deterministic tourist walk has emerged as a novel approach for texture analysis. This method employs a traveler visiting image pixels using a deterministic walk rule. Resulting trajectories provide clues about pixel interaction in the image that can be used for image classification and identification tasks. This paper proposes a new walk rule for the tourist which is based on contrast direction of a neighborhood. The yielded results using this approach are comparable with those from traditional texture analysis methods in the classification of a set of Brodatz textures and their rotated versions, thus confirming the potential of the method as a feasible texture analysis methodology. (C) 2010 Elsevier B.V. All rights reserved.
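A minimal sketch of a deterministic tourist walk on a grayscale image, using the common nearest-intensity rule with a short memory window; note this illustrates the conventional rule, not the contrast-direction rule the paper proposes:

```python
import numpy as np

def tourist_walk(img, start, mu=1, max_steps=100):
    """Deterministic tourist walk on a grayscale image. From the current
    pixel, move to the 4-neighbour whose intensity differs least from the
    current pixel, skipping pixels visited within the last `mu` steps
    (the tourist's memory). Returns the (row, col) trajectory, whose
    transient and attractor lengths serve as texture features."""
    h, w = img.shape
    path = [start]
    r, c = start
    for _ in range(max_steps):
        recent = set(path[-mu:])
        best, best_d = None, None
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and (nr, nc) not in recent:
                d = abs(int(img[nr, nc]) - int(img[r, c]))
                if best_d is None or d < best_d:
                    best, best_d = (nr, nc), d
        if best is None:
            break
        r, c = best
        path.append(best)
    return path

img = np.array([[0, 1, 5], [7, 9, 2], [3, 4, 6]])
path = tourist_walk(img, (0, 0), mu=1, max_steps=10)
assert path[0] == (0, 0)  # walk begins at the chosen start pixel
```

Starting one walk per pixel and histogramming trajectory lengths yields the feature vectors used for texture classification; the paper's contribution is to replace the nearest-intensity criterion with a neighbourhood contrast direction.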
Abstract:
The cost of a road construction over its service life is a function of the design, quality of construction, maintenance strategies and maintenance operations. Unfortunately, designers often neglect a very important aspect: the possibility of performing future maintenance activities. The focus is mainly on other aspects, such as investment costs, traffic safety, aesthetic appearance, regional development and environmental effects. This licentiate thesis is part of a Ph.D. project entitled "Road Design for Lower Maintenance Costs" that aims to examine how life-cycle costs can be optimized by selection of appropriate geometrical designs for roads and their components. The result is expected to give a basis for a new method to be used in the road planning and design process, applying life-cycle cost analysis with particular emphasis on road maintenance. The project started with a literature review intended to study the conditions causing increased needs for road maintenance, the efforts made by road authorities to satisfy those needs, and the potential for improvement through consideration of maintenance aspects during planning and design. An investigation was carried out to identify the problems which obstruct due consideration of maintenance aspects during the road planning and design process. This investigation focused mainly on the road planning and design process at the Swedish Road Administration. However, the road planning and design processes in Denmark, Finland and Norway were also broadly evaluated to gain wider knowledge about the research subject. The investigation was carried out in two phases: data collection and data analysis. Data was collected by semi-structured interviews with expert actors involved in planning, design and maintenance, and by a review of design-related documents. Data analyses were carried out using a method called "Change Analysis".
This investigation revealed a complex combination of problems which result in inadequate consideration of maintenance aspects, and several urgent needs for changes to eliminate these problems were identified. Another study was carried out to develop a model for calculating the repair costs for damage to different road barrier types and to analyse how factors such as road type, speed limits, barrier type, barrier placement, type of road section, alignment and seasonal effects affect barrier damage and the associated repair costs. This study was carried out using the "Case Study Research Method". Data was collected from 1087 barrier repairs in two regional offices of the Swedish Road Administration, the Central Region and the Western Region. A table was established for both regions containing the repair cost per vehicle-kilometre for different combinations of barrier types, road types and speed limits. This table can be used by designers in calculating the life-cycle costs for different road barrier types.
Abstract:
The narrative of the United States is of a "nation of immigrants", in which the language shift patterns of earlier ethnolinguistic groups have tended towards linguistic assimilation through English. In recent years, however, changes in the demographic landscape and language maintenance by non-English-speaking immigrants, particularly Hispanics, have been perceived as threats and have led to calls for an official English language policy. This thesis aims to contribute to the study of language policy making from a societal security perspective, as expressed in attitudes regarding language and identity originating in the daily interaction between language groups. The focus is on the role of language and American identity in relation to immigration. The study takes an interdisciplinary approach combining language policy studies, security theory, and critical discourse analysis. The material consists of articles collected from four newspapers, namely USA Today, The New York Times, Los Angeles Times, and San Francisco Chronicle, between April 2006 and December 2007. Two discourse types are evident from the analysis, namely Loyalty and Efficiency. The former is mainly marked by concerns of national identity and contains speech acts of security related to language shift, language choice, and English for unity. Immigrants are represented as dehumanised and harmful. Immigration is framed as sovereignty-related, racial, and as war. The Efficiency discourse type is mainly instrumental and contains speech acts of security related to cost, provision of services, health and safety, and social mobility. Immigrants are further represented as a labour resource. These discourse types reflect how the construction of the linguistic 'we' is expected to be maintained. Loyalty is triggered by arguments that the collective identity is threatened and is itself used in reproducing the collective 'we' through hegemonic expressions of monolingualism in the public and semi-public space.
The denigration of immigrants is used as a tool for enhancing societal security through solidarity and as a possible justification for the denial of minority rights. Also, although language acquisition patterns still follow the historical trend of language shift, factors indicating cultural separateness such as the appearance of speech communities or the use of minority languages in the public space and semi-public space have led to manifestations of intolerance. Examples of discrimination and prejudice towards minority groups indicate that the perception of worth of a shared language differs from the actual worth of dominant language acquisition for integration purposes. The study further indicates that the efficient working of the free market by using minority languages to sell services or buy labour is perceived as conflicting with nation-building notions since it may create separately functioning sub-communities with a new cultural capital recognised as legitimate competence. The discourse types mainly represent securitising moves constructing existential threats. The perception of threat and ideas of national belonging are primarily based on a zero-sum notion favouring monolingualism. Further, the identity of the immigrant individual is seen as dynamic and adaptable to assimilationist measures whereas the identity of the state and its members are perceived as static. Also, the study shows that debates concerning language status are linked to extra-linguistic matters. To conclude, policy makers in the US need to consider the relationship between four factors, namely societal security based on collective identity, individual/human security, human rights, and a changing linguistic demography, for proposed language intervention measures to be successful.
Abstract:
The Survivability of Swedish Emergency Management Related Research Centers and Academic Programs: A Preliminary Sociology of Science Analysis. Despite being a relatively safe nation, Sweden has four different universities supporting four emergency management research centers and an equal and growing number of academic programs. In this paper, I discuss how these centers and programs survive within the current organizational environment. The sociology of science or sociology of scientific knowledge perspectives should provide a theoretical guide; yet scholars working in these perspectives have produced no research on these topics. Thus, the population ecology model and the notion of organizational niche provide my theoretical foundation. My data come from 26 interviews at those four institutions, the gathering of documents, and observations. I found that each institution has found its own niche with little or no competition, with one exception. Three of the universities do have an international focus, yet their foci have minimal overlap. Finally, I suggest that key aspects of Swedish culture, including safety and a need to aid the poor, help explain the extensive funding these centers and programs receive to survive.
Abstract:
Negative outcomes of a poor work environment are more frequent among young workers. The aim of the current study was to examine former pupils' conditions concerning occupational health and safety by investigating their workplaces' safety climate, the degree of implementation of systematic work environment management (SWEM), and their workplace introduction programs. Four branches were included in the study: Industrial, Restaurant, Transport, and Handicraft specialising in wood. Semi-structured dialogues were undertaken with 15 employers at companies in which former pupils were employed; they also answered a questionnaire about SWEM. Former pupils and experienced employees were, on the same occasion, asked to fill in a questionnaire about the safety climate at the workplace. Workplace introduction programs varied and were strongly linked to company size. Most of the former pupils and experienced employees rated the safety climate at their company as high or good. Employers in three of the branches rated the SWEM implemented at their workplaces as effective. The Industrial companies, which had the largest workplaces, gave the most systematic workplace introduction for new employees. This study produced no results explaining why young workers have a higher risk of workplace accidents.
Abstract:
Background: Tens of millions of patients worldwide suffer avoidable disabling injuries and death every year. Measuring the safety climate in health care is an important step in improving patient safety. The most commonly used instrument to measure safety climate is the Safety Attitudes Questionnaire (SAQ). The aim of the present study was to establish the validity and reliability of a translated version of the SAQ. Methods: The SAQ was translated and adapted to the Swedish context. The survey was then carried out with 374 respondents in the operating room (OR) setting. Data were received from three hospitals, a total of 237 responses. Cronbach's alpha and confirmatory factor analysis (CFA) were used to evaluate the reliability and validity of the instrument. Results: The Cronbach's alpha values for the factors of the SAQ ranged between 0.59 and 0.83. The CFA and its goodness-of-fit indices (SRMR 0.055, RMSEA 0.043, CFI 0.98) showed good model fit. The factors safety climate, teamwork climate, job satisfaction, perceptions of management, and working conditions showed moderate to high intercorrelations. The factor stress recognition had no significant correlation with teamwork climate, perceptions of management, or job satisfaction. Conclusions: The Swedish translation of the SAQ (OR version) has good construct validity. However, the reliability analysis suggested that some of the items need further refinement to establish sound internal consistency. As suggested by previous research, the SAQ is potentially a useful tool for evaluating safety climate; however, further psychometric testing with larger samples is required to establish the psychometric properties of the instrument for use in Sweden.
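Cronbach's alpha, the reliability statistic reported above, is computed from the item variances and the variance of the respondents' total scores; a minimal sketch:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
    where k is the number of items. Sample variances (ddof=1) are used."""
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

# Perfectly correlated items give alpha = 1 (maximal internal consistency).
assert abs(cronbach_alpha([[1, 1], [2, 2], [3, 3]]) - 1.0) < 1e-9
```

Values such as the 0.59–0.83 range reported for the SAQ factors come from applying this formula to the items loading on each factor; values below about 0.7 are what motivate the call for item refinement.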
Abstract:
The Hazard Analysis and Critical Control Point (HACCP) system is a preventive system intended to guarantee the safety and harmlessness of food. It improves product quality by eliminating possible defects during processing, and it saves costs by practically eliminating final-product inspection. This work describes the typical hazards encountered on the processing line for mushrooms destined for fresh consumption. Throughout the process, only the mushroom reception stage has been considered a critical control point (CCP). The main hazards at this stage were: the presence of unauthorised phytosanitary products; doses of such products larger than those permitted; and the presence of pathogenic bacteria or thermostable enterotoxins. Putting such knowledge into practice would provide any industry that processes mushrooms for fresh consumption with a self-control, HACCP-based system for its own production.