969 results for "probabilidade de default"


Relevance:

10.00%

Abstract:

In design-driven innovation projects, a detailed and systemic conception of the project increases the probability of success, preserving and raising the return on the resources invested. Development in stages is favoured so as to broaden awareness of the ecosystem and of the values associated with the project. In this way the process is conducted more appropriately, moving from context and objectives to the conceptual specification for execution. A survey of the parameters, processes, activities, forms of connection and interaction, environment, project elements and semantic context structures a methodological path, organised into modules and tools, that gradually refines the work from the initially stated objective to a successful innovation project.

Relevance:

10.00%

Abstract:

Predicting and averting the spread of invasive species is a core focus of resource managers in all ecosystems. Patterns of invasion are difficult to forecast, a problem compounded by a lack of user-friendly species distribution model (SDM) tools to help managers focus control efforts. This paper presents a web-based cellular automata hybrid modeling tool developed to study the invasion pattern of lionfish (Pterois volitans/miles) in the western Atlantic, a natural extension of our previous lionfish study. Our goal is to make this hybrid SDM tool publicly available and to demonstrate both a test case (P. volitans/miles) and a use case (Caulerpa taxifolia). The software derived from the model, titled Invasionsoft, is unique in its ability to examine multiple default or user-defined parameters and their relation to invasion patterns, and is presented in a rich web-browser-based GUI with an integrated results viewer. The beta version is not species-specific and includes a default parameter set tailored to the marine habitat. Invasionsoft is provided as copyright-protected freeware at http://www.invasionsoft.com.
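The abstract does not include code; as a rough illustration of the cellular-automata style of spread model it describes, here is a minimal sketch. The grid, the `suitability` layer and the `p_spread` parameter are hypothetical stand-ins, not Invasionsoft's actual parameter set.

```python
import random

def step(grid, suitability, p_spread=0.3, rng=random.Random(42)):
    """One cellular-automaton step: each occupied cell may colonise each
    of its 4 neighbours with probability p_spread * suitability there."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]  # synchronous update: read old, write new
    for r in range(rows):
        for c in range(cols):
            if grid[r][c]:
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols and not grid[nr][nc]:
                        if rng.random() < p_spread * suitability[nr][nc]:
                            new[nr][nc] = 1
    return new

# Seed a single introduction point on a uniformly suitable 5x5 grid.
grid = [[0] * 5 for _ in range(5)]
grid[2][2] = 1
suitability = [[1.0] * 5 for _ in range(5)]
for _ in range(10):
    grid = step(grid, suitability)
```

In a real SDM-style application the `suitability` layer would encode habitat parameters (depth, temperature, substrate) per cell, which is the kind of default or user-defined parameter set the tool exposes.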

Relevance:

10.00%

Abstract:

A novel monolithically integrated Michelson interferometer using intersecting twin-contact semiconductor optical amplifiers is proposed and implemented, whereby the two arms are gain-imbalanced to give enhanced noise suppression. Experimental OSNR improvements of 8.4 dB are demonstrated for pulses with durations of 8 ps and a default ER of 14 dB, at low driving currents of between 25 and 30 mA. This is believed to be the smallest Michelson interferometer to date.

Relevance:

10.00%

Abstract:

Most assessments of fish stocks use some measure of the reproductive potential of a population, such as spawning biomass. However, the correlation between spawning biomass and reproductive potential is not always strong, and it is likely weakest in the tropics and subtropics, where species tend to exhibit indeterminate fecundity and release eggs in batches over a protracted spawning season. In such cases, computing annual reproductive output requires estimates of batch fecundity and the annual number of batches, the latter being determined by spawning frequency and the duration of the spawning season. Batch fecundity is commonly measured by age (or size), but these other variables are not. Without the relevant data, the annual number of batches is assumed to be invariant across age. We reviewed the literature and found that this default assumption lacks empirical support, because both spawning duration and spawning frequency generally increase with age or size. We demonstrate the effects of this assumption on measures of reproductive value and spawning potential ratio, a metric commonly used to gauge stock status. Model applications showed substantial sensitivity to age dependence in the annual number of batches. If the annual number of batches increases with age but is incorrectly assumed to be constant, stock assessment models will tend to overestimate the biological reference points used for setting harvest rates. This study underscores the need to better understand the age- or size-dependent contrast in the annual number of batches, and we conclude that, for species without evidence to support invariance, the default assumption should be replaced with one that accounts for age or size dependence.
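To make the sensitivity argument concrete, here is a hedged sketch of a spawning-potential-ratio calculation in which the annual number of batches can be either constant or age-dependent. All parameter values and functional forms (natural mortality `M`, the fecundity, selectivity and batch schedules) are illustrative assumptions, not values from the study.

```python
import math

def spr(F, M=0.2, ages=range(1, 21),
        batches_at_age=lambda a: 20,                 # annual number of batches
        batch_fec=lambda a: 1000.0 * a ** 2,         # batch fecundity at age
        sel=lambda a: 1.0 if a >= 3 else 0.0):       # fishery selectivity
    """Spawning potential ratio: lifetime egg production per recruit under
    fishing mortality F, relative to the unfished (F = 0) level."""
    def lifetime_eggs(F):
        survival, total = 1.0, 0.0
        for a in ages:
            total += survival * batch_fec(a) * batches_at_age(a)
            survival *= math.exp(-(M + sel(a) * F))
        return total
    return lifetime_eggs(F) / lifetime_eggs(0.0)

# Constant vs. age-increasing annual number of batches (hypothetical schedules):
spr_const = spr(F=0.3)                                    # invariant across age
spr_age = spr(F=0.3, batches_at_age=lambda a: 5 + 2 * a)  # increases with age
```

Because the age-increasing schedule weights egg production toward older ages, which fishing depletes most, the SPR at the same F comes out lower than under the constant assumption, which is the direction of bias the abstract describes.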

Relevance:

10.00%

Abstract:

The occurrence of hypoxia, or low dissolved oxygen, is increasing in coastal waters worldwide and represents a significant threat to the health and economy of our Nation’s coasts and Great Lakes. This trend is exemplified most dramatically off the coast of Louisiana and Texas, where the second largest eutrophication-related hypoxic zone in the world is associated with the nutrient pollutant load discharged by the Mississippi and Atchafalaya Rivers. Aquatic organisms require adequate dissolved oxygen to survive. The term “dead zone” is often used in reference to the absence of life (other than bacteria) from habitats that are devoid of oxygen. The inability to escape low oxygen areas makes immobile species, such as oysters and mussels, particularly vulnerable to hypoxia. These organisms can become stressed and may die due to hypoxia, resulting in significant impacts on marine food webs and the economy. Mobile organisms can flee the affected area when dissolved oxygen becomes too low. Nevertheless, fish kills can result from hypoxia, especially when the concentration of dissolved oxygen drops rapidly. New research is clarifying when hypoxia will cause fish kills as opposed to triggering avoidance behavior by fish. Further, new studies are better illustrating how habitat loss associated with hypoxia avoidance can impose ecological and economic costs, such as reduced growth in commercially harvested species and loss of biodiversity, habitat, and biomass. Transient or “diel-cycling” hypoxia, where conditions cycle from supersaturation of oxygen late in the afternoon to hypoxia or anoxia near dawn, most often occurs in shallow, eutrophic systems (e.g., nursery ground habitats) and may have pervasive impacts on living resources because of both its location and frequency of occurrence.

Relevance:

10.00%

Abstract:

Three-dimensional bumps have been developed and investigated with the aim of achieving the two major objectives of shock-wave / boundary-layer interaction control, i.e. drag reduction and suppression of separation, simultaneously. An experimental investigation has been conducted for a default rounded bump in channel flow at the University of Cambridge, and a computational study has been performed for a spanwise series of rounded bumps mounted on a transonic aerofoil at the University of Stuttgart. Observed in both cases are wave drag reduction, owing to the λ-shock structures produced by the three-dimensional surface bumps, and mild control effects on the boundary layer. The effects of a rough surface and a tail extension have been investigated, as well as several geometric variations and multiple-bump configurations. A double configuration of narrow rounded bumps has been found to perform best amongst those tested, considerably reducing wave drag through a well-established λ-shock structure with little viscous penalty and thus achieving substantial overall drag reduction. Counter-rotating streamwise vortex pairs have been produced by some configurations as a result of local flow separation, but they have been observed to be confined to relatively narrow wake regions and are expected to be beneficial in suppressing large-scale separation under off-design conditions, despite an increase in viscous drag. On the whole, a large potential for three-dimensional control with discrete rounded bumps has been demonstrated both experimentally and numerically, and experimental investigation of bumps fitted on a transonic aerofoil or wing is suggested as a step toward practical application.

Relevance:

10.00%

Abstract:

The unscented Kalman filter (UKF) is a widely used method in control and time series applications. The UKF suffers from arbitrary parameters necessary for a step known as sigma point placement, causing it to perform poorly in nonlinear problems. We show how to treat sigma point placement in a UKF as a learning problem in a model-based view. We demonstrate that learning to place the sigma points correctly from data can make sigma point collapse much less likely. Learning can result in a significant increase in predictive performance over default settings of the parameters in the UKF and other filters designed to avoid the problems of the UKF, such as the GP-ADF. At the same time, we maintain a lower computational complexity than the other methods. We call our method UKF-L. © 2010 IEEE.
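A minimal sketch of the sigma point placement step the abstract refers to, for a one-dimensional state, may help. The scaled unscented transform below uses the standard `alpha`, `beta`, `kappa` placement parameters; the specific settings are illustrative, not the paper's learned values.

```python
import math

def sigma_points_1d(mean, var, alpha=1e-3, beta=2.0, kappa=0.0):
    """Scaled sigma points and weights for a 1-D state (n = 1); alpha,
    beta and kappa are the hand-tuned placement parameters that UKF-L
    proposes to learn from data instead."""
    n = 1
    lam = alpha ** 2 * (n + kappa) - n
    spread = math.sqrt((n + lam) * var)
    points = [mean, mean + spread, mean - spread]
    wm = [lam / (n + lam), 1.0 / (2 * (n + lam)), 1.0 / (2 * (n + lam))]
    wc = [wm[0] + 1.0 - alpha ** 2 + beta, wm[1], wm[2]]
    return points, wm, wc

def unscented_transform(points, wm, wc, f):
    """Push sigma points through f and recover mean and variance."""
    ys = [f(x) for x in points]
    mean = sum(w * y for w, y in zip(wm, ys))
    var = sum(w * (y - mean) ** 2 for w, y in zip(wc, ys))
    return mean, var

pts, wm, wc = sigma_points_1d(0.0, 1.0, alpha=1.0)  # wide point placement
m, v = unscented_transform(pts, wm, wc, lambda x: x * x)
```

With alpha = 1 the transform recovers the exact mean E[x²] = 1 for x ~ N(0, 1), whereas with the commonly used tiny default alpha = 1e-3 the points sit within ±0.001 of the mean; choices like these are exactly the arbitrary placement settings the paper learns from data.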

Relevance:

10.00%

Abstract:

The unscented Kalman filter (UKF) is a widely used method in control and time series applications. The UKF suffers from arbitrary parameters necessary for sigma point placement, potentially causing it to perform poorly in nonlinear problems. We show how to treat sigma point placement in a UKF as a learning problem in a model-based view. We demonstrate that learning to place the sigma points correctly from data can make sigma point collapse much less likely. Learning can result in a significant increase in predictive performance over default settings of the parameters in the UKF and other filters designed to avoid the problems of the UKF, such as the GP-ADF. At the same time, we maintain a lower computational complexity than the other methods. We call our method UKF-L. © 2011 Elsevier B.V.

Relevance:

10.00%

Abstract:

This paper aims to solve the fault-tolerant control problem of a wind turbine benchmark. A hierarchical controller with model predictive pre-compensators, a global model predictive controller and a supervisory controller is proposed. In the model predictive pre-compensator, an extended Kalman filter is designed to estimate the system states and various fault parameters. Based on these estimates, a group of model predictive controllers is designed to compensate for the fault effects in each component of the wind turbine. The global MPC is used to schedule the operation of the components and exploit potential system-level redundancies. Extensive simulations of various fault conditions show that the proposed controller has small transients when faults occur and uses smoother and smaller generator torque and pitch angle inputs than the default controller. This paper shows that MPC can be a good candidate for fault-tolerant control, especially when combined with an adaptive internal model and a parameter estimation and update mechanism such as an extended Kalman filter. © 2012 IFAC.

Relevance:

10.00%

Abstract:

Information and Communication Technology (ICT) is becoming increasingly central to many people’s lives, making it possible to be connected in any place at any time, to be unceasingly and instantly informed, and to benefit from greater economic and educational opportunities. With all the benefits afforded by these new-found capabilities, however, come potential drawbacks. A plethora of new PCs, laptops, tablets, smartphones, Bluetooth devices, the internet and Wi-Fi (the list goes on) expects us to know, or be able to guess, what, where and when to connect, click, double-click, tap, flick or scroll in order to realise these benefits, and to have the physical and cognitive capability to do all these things. One of the groups most affected by this increase in high-demand technology is older people. They do not understand and use technology in the same way that younger generations do, because they grew up in the simpler electro-mechanical era and embedded that particular model of the world in their minds. Any consequent difficulty in familiarising themselves with modern ICT and effectively applying it to their needs can also be exacerbated by age-related changes in vision, motor control and cognitive functioning. Such challenges lead to digital exclusion. Much has been written about this topic over the years, usually by academics from the area of inclusive product design. The issue is complex, and it is fair to say that no one researcher has the whole picture. It is difficult to understand and adequately address the issue of digital exclusion among the older generation without looking across disciplines and at industry’s and government’s understanding, motivation and efforts toward resolving this important problem. To do otherwise is to risk misunderstanding the true impact that ICT has, and could have, on people’s lives across all generations.
In this European Year of Active Ageing and Solidarity between Generations, and as the British government moves forward with its Digital by Default initiative as part of a wider objective to make ICT accessible to as many people as possible by 2015, the Engineering Design Centre (EDC) at the University of Cambridge collaborated with BT to produce a book of thought pieces to address, and where appropriate redress, these important and long-standing issues. “Ageing, Adaption and Accessibility: Time for the Inclusive Revolution!” brings together opinions and insights from twenty-one prominent thought leaders from government, industry and academia regarding the problems, opportunities and strategies for combating digital exclusion among senior citizens. The contributing experts were selected as individuals, rather than as representatives of organisations, to provide the broadest possible range of perspectives. They are renowned in their respective fields, and their opinions are formed not only from their own work but also from the contributions of others in their area. Their views were elicited through conversations conducted by the editors of this book, who then drafted the thought pieces to be edited and approved by the experts. We hope that this unique collection of thought pieces will give you a broader perspective on ageing and people’s adaptation to the ever-changing world of technology, as well as insights into better ways of designing digital devices and services for the older population.

Relevance:

10.00%

Abstract:

Ideally, one would like to perform image search using an intuitive and friendly approach. Many existing image search engines, however, present users only with sets of images arranged in some default order on the screen, typically relevance to a query. While this certainly has its advantages, arguably a more flexible and intuitive way would be to sort images into arbitrary structures such as grids, hierarchies or spheres, so that images that are visually or semantically alike are placed together. This paper focuses on designing such a navigation system for image browsers. This is a challenging task because an arbitrary layout structure makes it difficult, if not impossible, to compute cross-similarities between images and structure coordinates, the main ingredient of traditional layout approaches. For this reason, we resort to a recently developed machine learning technique: kernelized sorting. It is a general technique for matching pairs of objects from different domains without requiring cross-domain similarity measures, and hence it elegantly allows sorting images into arbitrary structures. Moreover, we extend it so that some images can be preselected, for instance forming the tip of the hierarchy, allowing the user to subsequently navigate through the search results in the lower levels in an intuitive way. Copyright 2010 ACM.

Relevance:

10.00%

Abstract:

Biofuels are increasingly promoted worldwide as a means of reducing greenhouse gas (GHG) emissions from transport. However, current regulatory frameworks and most academic life cycle analyses adopt a deterministic approach in determining the GHG intensities of biofuels and thus ignore the inherent risk associated with biofuel production. This study aims to develop a transparent stochastic method for evaluating UK biofuels that determines both the magnitude and uncertainty of GHG intensity on the basis of current industry practices. Using wheat ethanol as a case study, we show that the GHG intensity could span a range of 40-110 gCO2e MJ-1 when land use change (LUC) emissions and various sources of uncertainty are taken into account, as compared with a regulatory default value of 44 gCO2e MJ-1. This suggests that the current deterministic regulatory framework underestimates wheat ethanol GHG intensity and thus may not be effective in evaluating transport fuels. Uncertainties in determining the GHG intensity of UK wheat ethanol include limitations of available data at a localized scale, and significant scientific uncertainty in parameters such as soil N2O and LUC emissions. Biofuel policies should be robust enough to incorporate the currently irreducible uncertainties and flexible enough to be readily revised when better science is available. © 2013 IOP Publishing Ltd.
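The stochastic approach can be illustrated with a simple Monte Carlo sketch. The component distributions below (farming, processing, LUC, co-product credit) are invented placeholders chosen only so the total lands near the cited range; they are not the study's data.

```python
import random
import statistics

rng = random.Random(0)

def ghg_intensity_sample():
    """One Monte Carlo draw of a wheat-ethanol GHG intensity (gCO2e/MJ).
    All distributions are illustrative assumptions, not the paper's inputs."""
    farming = rng.gauss(35.0, 6.0)     # cultivation, incl. uncertain soil N2O
    processing = rng.gauss(15.0, 3.0)  # conversion-plant energy
    luc = rng.uniform(0.0, 40.0)       # land-use-change emissions (deep uncertainty)
    credit = rng.gauss(10.0, 2.0)      # co-product credit (subtracted)
    return farming + processing + luc - credit

samples = [ghg_intensity_sample() for _ in range(10_000)]
cuts = statistics.quantiles(samples, n=20)
lo, hi = cuts[0], cuts[-1]  # 5th and 95th percentile of the intensity
```

Reporting an interval such as (`lo`, `hi`) rather than a single deterministic number is the substantive change the study argues for: a regulatory default near the low end of the distribution can look compliant while most of the probability mass lies above it.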

Relevance:

10.00%

Abstract:

In this work, we performed an evaluation of the decay heat power of advanced, fast-spectrum, lead- and molten-salt-cooled reactors with flexible conversion ratios. The decay heat power was calculated using the BGCore computer code, which explicitly tracks over 1700 isotopes in the fuel throughout its burnup and subsequent decay. In the first stage, the capability of the BGCore code to accurately predict the decay heat power was verified by performing a benchmark calculation for typical UO2 fuel in a Pressurized Water Reactor environment against the ANSI/ANS-5.1-2005 standard ("Decay Heat Power in Light Water Reactors," American National Standard). Very good agreement (within 5%) between the two methods was obtained. Once BGCore's calculation capabilities were verified, we calculated decay power for fast reactors with different coolants and conversion ratios, for which no standard procedure is currently available. Notable differences were observed in the decay power of the advanced reactors as compared with the conventional UO2 LWR. The importance of the observed differences was demonstrated by performing a simulation of a Station Blackout transient with the RELAP5 computer code for a lead-cooled fast reactor. The simulation was performed twice: using the code-default ANS-79 decay heat curve and using the curve calculated specifically for the studied core by the BGCore code. The differences in the decay heat power resulted in failure to meet the maximum cladding temperature limit criterion by ∼100 °C in the latter case, while in the transient simulation with the ANS-79 decay heat curve all safety limits were satisfied. The results of this study show that the design of new reactor safety systems must be based on decay power curves specific to each individual case in order to assure the desired performance of these systems. © 2009 Elsevier B.V. All rights reserved.
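Decay heat curves of the kind compared above are typically represented as sums of decaying exponentials integrated over the operating history. The sketch below uses that finite-irradiation form with two placeholder exponential groups; the coefficients are illustrative, not the fitted values of ANSI/ANS-5.1-2005 or BGCore.

```python
import math

# Two placeholder exponential groups (a_i, lam_i) for a decay-power fit
# f(t) = sum(a_i * exp(-lam_i * t)); NOT the ANSI/ANS-5.1 coefficients.
GROUPS = [(0.6, 1.0e-1), (0.4, 1.0e-3)]

def decay_power(t, T):
    """Relative decay power at time t (s) after shutdown, following
    operation at constant power for T seconds.  The finite-irradiation
    form integrates the single-event curve over the operating history:
        P(t) ∝ sum_i (a_i / lam_i) * (exp(-lam_i t) - exp(-lam_i (t + T)))."""
    return sum(a / lam * (math.exp(-lam * t) - math.exp(-lam * (t + T)))
               for a, lam in GROUPS)
```

The point of the study carries over directly: swapping one set of group coefficients for another (e.g. an ANS-79 fit versus a core-specific BGCore fit) changes the whole post-shutdown power trajectory fed into a transient code such as RELAP5.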

Relevance:

10.00%

Abstract:

A transmission Volume Phase Holographic Grating (VPHG) is adopted as the spectral element in a real-time Optical Channel Performance Monitor (OCPM), which is in great demand in Dense Wavelength Division Multiplexing (DWDM) systems. The tolerance of the incident angle, which can be fully determined by two angles, θ and φ, is derived in this paper. Commonly, when the incident angle is mentioned, the default setting is that the incident plane is perpendicular to the fringes; here the situation away from the perpendicular is discussed. By combining the theoretical analysis of the VPHG with its use in the OCPM and varying θ and φ precisely in computation and experiment, the two physical quantities that fully specify the performance of the VPHG, namely the diffraction efficiency and the resolution, are analyzed. The results show that the diffraction efficiency varies greatly with changes in θ or φ, but, viewed over the whole C-band, only the peak diffraction efficiency drifts to another wavelength. As for the resolution, it deteriorates more rapidly than the diffraction efficiency with changes in φ, and more slowly with changes in θ. Only if |φ| ≤ 1° and αB − 0.5° ≤ θ ≤ αB + 0.5° is the performance of the VPHG good enough for use in an OCPM system.
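For context, the diffraction efficiency of a lossless transmission volume grating at Bragg incidence is commonly estimated with Kogelnik's coupled-wave formula. The sketch below implements that textbook formula; the index modulation, thickness and angle values are illustrative, not the parameters of the grating studied here.

```python
import math

def kogelnik_efficiency(delta_n, thickness, wavelength, bragg_angle):
    """First-order diffraction efficiency of a lossless transmission
    volume phase grating at Bragg incidence (Kogelnik):
        eta = sin^2(pi * delta_n * d / (lambda * cos(theta_B))).
    thickness and wavelength must share units; bragg_angle is in radians."""
    nu = math.pi * delta_n * thickness / (wavelength * math.cos(bragg_angle))
    return math.sin(nu) ** 2

# Illustrative values (not the studied grating): efficiency peaks at 1
# when delta_n * d equals lambda * cos(theta_B) / 2.
eta = kogelnik_efficiency(delta_n=0.02, thickness=20.0,
                          wavelength=1.55, bragg_angle=math.radians(30.0))
```

Sweeping the angle arguments in such a formula is the computational counterpart of the θ/φ tolerance analysis described in the abstract: small detunings shift where the efficiency peak falls within the C-band.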

Relevance:

10.00%

Abstract:

The simulating wave nearshore (SWAN) wave model has been widely used in coastal areas, lakes and estuaries. However, we found poor agreement between modeling results and measurements in four typical cases when the default parameters of the SWAN source function formulas were used to simulate waves in the Bohai Sea. It was also found that, for the same wind process, the simulated results of the two wind generation expressions (Komen, Janssen) differed greatly. Further study showed that the proportionality coefficient α in the linear growth term of the wave growth source function plays an unperceived role in the process of wave development. Based on experiments and analysis, we concluded that the coefficient α should vary rather than remain constant. Therefore, a coefficient α that changes with the friction velocity U* was introduced into the linear growth term of the wave growth source function. Four weather processes were adopted to validate this improvement of the linear growth term. The results from the improved coefficient α agree much better with the measurements than those from the default constant coefficient α. Furthermore, the large differences between the results of the Komen and Janssen wind generation expressions were eliminated. We also experimented with the four weather processes to test the new white-capping mechanisms based on the cumulative steepness method. It was found that the parameters of the new white-capping mechanisms are not suitable for the Bohai Sea, but Alkyon's white-capping mechanisms can be applied to the Bohai Sea after amendments, demonstrating that this improvement of the parameter α can improve the simulated results for the Bohai Sea.
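The proposed change, replacing a constant α with one that depends on the friction velocity U*, can be sketched schematically. The functional form and constants below are hypothetical illustrations, not the paper's fitted relation, and the growth term is shape-only rather than the actual SWAN source-term expression.

```python
ALPHA_DEFAULT = 1.5e-3  # illustrative constant coefficient, not SWAN's exact value

def alpha_variable(u_star, a=1.0e-3, b=5.0e-3, u_ref=0.5):
    """Hypothetical U*-dependent proportionality coefficient: grows with
    the friction velocity u_star (m/s) and saturates at a + b."""
    return a + b * u_star / (u_star + u_ref)

def linear_growth(alpha, u_star):
    """Schematic linear wind-input growth term, S_lin ~ alpha * U*^4
    (shape only; not the actual SWAN source-term formula)."""
    return alpha * u_star ** 4

# Relative to a constant coefficient, the variable one strengthens
# initial wave growth at high winds and weakens it at low winds:
g_lo = linear_growth(alpha_variable(0.2), 0.2)
g_hi = linear_growth(alpha_variable(1.0), 1.0)
```

Making α a monotone function of U* is one simple way to realise the paper's conclusion that the coefficient should vary with the wind forcing rather than stay fixed across all weather processes.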