Abstract:
Uncertainties associated with the structural model and measured vibration data may lead to unreliable damage detection. In this paper, we show that geometric and measurement uncertainty cause considerable problems in damage assessment, which can be alleviated by using a fuzzy logic-based approach for damage detection. Curvature damage factors (CDF) of a tapered cantilever beam are used as damage indicators. Monte Carlo simulation (MCS) is used to study the changes in the damage indicator due to uncertainty in the geometric properties of the beam. Variations in these CDF measures due to randomness in the structural parameters, further contaminated with measurement noise, are used for developing and testing a fuzzy logic system (FLS). Results show that the method correctly identifies both single and multiple damages in the structure. For example, the FLS detects damage with an average accuracy of about 95 percent in a beam having geometric uncertainty of 1 percent COV and measurement noise of 10 percent in the single-damage scenario. For the multiple-damage case, the FLS identifies damages in the beam with an average accuracy of about 94 percent in the presence of the above-mentioned uncertainties. The paper brings together the disparate areas of probabilistic analysis and fuzzy logic to address uncertainty in structural damage detection.
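As a rough illustration of the damage-indicator step (not the paper's fuzzy logic system), the sketch below uses an entirely hypothetical mode shape and damage location: a curvature damage factor is computed by central differences, while geometry is perturbed at 1 percent COV and the measurement carries 10 percent noise, as in the abstract.

    import numpy as np

    def curvature(mode, dx):
        # second spatial derivative of a mode shape by central differences
        return np.gradient(np.gradient(mode, dx), dx)

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 50)            # normalized beam axis
    dx = x[1] - x[0]
    base = np.sin(np.pi * x / 2)             # stand-in first mode of a cantilever

    cdf_runs = []
    for _ in range(1000):
        geom = 1.0 + 0.01 * rng.standard_normal()                 # 1% COV geometry
        noise = 0.10 * base.std() * rng.standard_normal(x.size)   # 10% noise
        # hypothetical local damage: a dip in the mode shape near x = 0.4
        damaged = geom * base * (1 - 0.2 * np.exp(-((x - 0.4) / 0.05) ** 2))
        cdf_runs.append(np.abs(curvature(damaged + noise, dx) - curvature(base, dx)))

    mean_cdf = np.mean(cdf_runs, axis=0)     # inputs to the fuzzy logic system
    print("mean CDF peaks near x =", x[np.argmax(mean_cdf)])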
Abstract:
Commercial environments may receive only a fraction of the genetic gains for growth rate predicted from the selection environment. This fraction is the result of undesirable genotype-by-environment interactions (GxE) and is measured by the genetic correlation (rg) of growth between environments. Genetic correlations are notoriously difficult to estimate with precision within a single generation. A new design is proposed in which genetic correlations can be estimated in one generation by utilising artificial mating from cryopreserved semen and unfertilised eggs stripped from a single female. We compare a traditional phenotype analysis of growth to a threshold model in which only the largest fish are genotyped for sire identification. The threshold model was robust to family mortality rates differing by up to 30%. The design is unique in that it negates potential re-ranking of families caused by an interaction between common maternal environmental effects and the growing environment. The design is suitable for rapid assessment of GxE over one generation, with a true genetic correlation of 0.70 yielding standard errors as low as 0.07. Different design scenarios were tested for bias and accuracy across a range of heritability values, numbers of half-sib families created, numbers of progeny within each full-sib family, numbers of fish genotyped, numbers of fish stocked, differing family survival rates and various simulated genetic correlation levels.
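A minimal simulation sketch of the design idea, with all parameter values assumed for illustration: sire effects correlated at a true rg of 0.70 across the two environments, half-sib progeny generated per sire, and rg crudely approximated by the correlation of sire family means. A variance-components analysis such as REML would be used on real data.

    import numpy as np

    rng = np.random.default_rng(1)
    n_sires, n_prog, rg, h2 = 100, 40, 0.70, 0.30
    var_s = h2 / 4                          # sire variance, phenotypic var = 1

    # correlated sire effects in the two environments (true rg = 0.70)
    cov = var_s * np.array([[1.0, rg], [rg, 1.0]])
    s = rng.multivariate_normal([0.0, 0.0], cov, size=n_sires)

    # half-sib progeny phenotypes in each environment
    res = np.sqrt(1.0 - var_s)
    p1 = s[:, [0]] + res * rng.standard_normal((n_sires, n_prog))
    p2 = s[:, [1]] + res * rng.standard_normal((n_sires, n_prog))

    # crude rg estimate: correlation of sire family means (downward-biased by
    # within-family sampling noise; REML on the full data is used in practice)
    print("estimated rg ~", round(np.corrcoef(p1.mean(1), p2.mean(1))[0, 1], 2))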
Abstract:
Common coral trout Plectropomus leopardus is an iconic fish of the Great Barrier Reef (GBR) and is the most important fish for the commercial fishery there. Most of the catch is exported live to Asia. This stock assessment was undertaken in response to declines in catches and catch rates in recent years, in order to gauge the status of the stock. It is the first stock assessment ever conducted of coral trout on the GBR, and brings together a multitude of different data sources for the first time. The GBR is very large and was divided into a regional structure based on the Bioregions defined by expert committees appointed by the Great Barrier Reef Marine Park Authority (GBRMPA) as part of the 2004 rezoning of the GBR. The regional structure consists of six Regions, from the Far Northern Region in the north to the Swains and Capricorn–Bunker Regions in the south. Region boundaries also closely follow the boundaries between Bioregions. Two of the northern Regions are split into Subregions on the basis of potential changes in fishing intensity between the Subregions; there are nine Subregions altogether, including the four Regions that are not split. Bioregions are split into Subbioregions along the Subregion boundaries. Finally, each Subbioregion is split into a "blue" population which is open to fishing and a "green" population which is closed to fishing. The fishery is unusual in that catch rates as an indicator of abundance of coral trout are heavily influenced by tropical cyclones. After a major cyclone, catch rates fall for two to three years and rebound after that. The effect is well correlated with the timing of cyclones, usually beginning in the same month that the cyclone strikes. However, statistical analyses correlating catch rates with cyclone wind energy attributed little of the decline in catch rates to cyclones. Alternative indicators of cyclone strength may explain more of the catch rate decline, and future work should investigate this. Another feature of catch rates is the phenomenon of social learning in coral trout populations, whereby individuals in a fished population quickly learn not to take bait, so that the catch rate falls sharply even while the population size is still high. The social learning may take place by fish directly observing their fellows being hooked, or perhaps by heeding a chemo-sensory cue emitted by fish that are hooked. As part of the assessment, analysis of data from replenishment closures of Boult Reef in the Capricorn–Bunker Region (closed 1983–86) and Bramble Reef in the Townsville Subregion (closed 1992–95) estimated a strong social learning effect. A major data source for the stock assessment was the large collection of underwater visual survey (UVS) data collected by divers who counted the coral trout they sighted. This allowed estimation of the density of coral trout in the different Bioregions (expressed as a number of fish per hectare). Combined with mapping data of all the 3000 or so reefs making up the GBR, the UVS results provided direct estimates of the population size in each Subbioregion. A regional population dynamic model was developed to account for the intricacies of coral trout population dynamics and catch rates.
Because the statistical analysis of catch rates did not attribute much of the decline to tropical cyclones (and thereby implied "real" declines in biomass), and because in contrast the UVS data indicate relatively stable population sizes, model outputs were unduly influenced by the unlikely hypothesis that falling catch rates are real. The alternative hypothesis, that the UVS data are closer to the mark and declining catch rates are an artefact of spurious effects (e.g., cyclone impacts), is much more probable. Judging by the population size estimates provided by the UVS data, there is no biological problem with the status of coral trout stocks. The estimate of the total number of Plectropomus leopardus in blue zones on the GBR in the mid-1980s (the time of the major UVS series) was 5.34 million legal-sized fish, or about 8400 t exploitable biomass, with an additional 3350 t in green zones (using the current zoning, which was introduced on 1 July 2004). For the offshore regions favoured by commercial fishers, the figure was about 4.90 million legal-sized fish in blue zones, or about 7700 t exploitable biomass. There is, however, an economic problem, as indicated by relatively low catch rates and anecdotal information provided by commercial fishers. The costs of fishing the GBR by hook and line (the only method compatible with the GBR's high conservation status) are high, and commercial fishers are unable to operate profitably when catch rates are depressed (e.g., after a tropical cyclone). The economic problem is compounded by the effect of social learning in coral trout, whereby catch rates fall rapidly if fishers keep returning to the same fishing locations. In response, commercial fishers tend to spread out over the GBR, including the Far Northern and Swains Regions, which are far from port and incur higher travel costs. The economic problem provides some logic for a reduction in the TACC. Such a reduction during good times, such as when the fishery is rebounding after a major tropical cyclone, could provide a net benefit to the fishery: it would provide a margin of stock safety and make the fishery more economically robust by supporting higher catch rates during subsequent periods of depressed catches. During hard times when catch rates are low (e.g., shortly after a major tropical cyclone), a change to the TACC would have little effect, as even a reduced TACC would not come close to being filled. Quota adjustments based on catch rates should take account of long-term trends in order to mitigate variability and cyclone effects in the data.
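As a small worked example of how the UVS figures combine, the sketch below back-calculates the mean fish weight implied by the abstract's own totals and shows the generic density-times-area scaling; the per-hectare density and reef area shown are hypothetical placeholders, not values from the assessment.

    # figures from the abstract
    fish_blue = 5.34e6           # legal-sized P. leopardus in blue (open) zones
    biomass_blue_t = 8400.0      # exploitable biomass in blue zones, tonnes
    print(f"implied mean weight: {1000 * biomass_blue_t / fish_blue:.2f} kg/fish")

    # generic UVS scaling for one Subbioregion (hypothetical placeholder values)
    density_per_ha = 12.0        # diver-count density, fish per hectare
    reef_area_ha = 35_000.0      # mapped reef area, hectares
    print(f"population estimate: {density_per_ha * reef_area_ha:,.0f} fish")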
Abstract:
In this thesis the use of the Bayesian approach to statistical inference in fisheries stock assessment is studied. The work was conducted in collaboration with the Finnish Game and Fisheries Research Institute, using the problem of monitoring and prediction of the juvenile salmon population in the River Tornionjoki as an example application. The River Tornionjoki is the largest salmon river flowing into the Baltic Sea. This thesis tackles the issues of model formulation and model checking, as well as computational problems related to Bayesian modelling in the context of fisheries stock assessment. Each article of the thesis provides a novel method either for extracting information from data obtained via a particular type of sampling system or for integrating the information about the fish stock from multiple sources in terms of a population dynamics model. Mark-recapture and removal sampling schemes and a random catch sampling method are covered for the estimation of population size. In addition, a method for estimating the stock composition of a salmon catch based on DNA samples is presented. For most of the articles, Markov chain Monte Carlo (MCMC) simulation has been used as a tool to approximate the posterior distribution. Problems arising from the sampling method are also briefly discussed and potential solutions for these problems are proposed. Special emphasis in the discussion is given to the philosophical foundation of the Bayesian approach in the context of fisheries stock assessment. It is argued that the role of subjective prior knowledge, needed in practically all parts of a Bayesian model, should be recognized and consequently fully utilised in the process of model formulation.
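For one of the sampling schemes covered, mark-recapture, a minimal sketch of the Bayesian estimation idea is shown below: a grid approximation to the posterior over population size under a hypergeometric likelihood and a vague (uniform) prior. The thesis itself uses MCMC, and the data values here are hypothetical.

    import numpy as np
    from scipy.stats import hypergeom

    # hypothetical data: n1 fish marked, n2 recaptured later, m of them marked
    n1, n2, m = 200, 150, 12

    # grid posterior for population size N under a uniform prior
    N = np.arange(n1 + n2 - m, 20001)
    post = hypergeom.pmf(m, N, n1, n2)
    post /= post.sum()

    cdf = post.cumsum()
    lo, hi = N[np.searchsorted(cdf, [0.025, 0.975])]
    print(f"posterior mean N = {(N * post).sum():.0f}, 95% interval [{lo}, {hi}]")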
Abstract:
Objectives: Decision support tools (DSTs) for invasive species management have had limited success in producing convincing results and meeting users' expectations. The problems could be linked to the functional form of the model which represents the dynamic relationship between the invasive species and crop yield loss in the DSTs. The objectives of this study were: a) to compile and review the models tested in field experiments and applied in DSTs; and b) to carry out an empirical evaluation of some popular models and alternatives. Design and methods: This study surveyed the literature and documented strengths and weaknesses of the functional forms of yield loss models. Some widely used models (linear, relative-yield and hyperbolic models) and two potentially useful models (the double-scaled and density-scaled models) were evaluated for a wide range of weed densities, maximum potential yield losses and maximum yield losses per weed. Results: Popular functional forms include hyperbolic, sigmoid, linear, quadratic and inverse models. Many basic models were modified to account for the effects of important factors (weather, tillage and growth stage of the crop at weed emergence) influencing weed–crop interaction and to improve prediction accuracy. This limited their applicability for use in DSTs, as they became less generalized in nature and were often applicable to a much narrower range of conditions than would be encountered in the use of DSTs. These factors' effects could be better accounted for by using other techniques. Among the models empirically assessed, the linear model is a very simple model which appears to work well at sparse weed densities, but it produces unrealistic behaviour at high densities. The relative-yield model exhibits the expected behaviour at high densities and high levels of maximum yield loss per weed but probably underestimates yield loss at low to intermediate densities. The hyperbolic model demonstrated reasonable behaviour at lower weed densities, but produced biologically unreasonable behaviour at low rates of loss per weed and high yield loss at the maximum weed density. The density-scaled model is not sensitive to the yield loss at maximum weed density in terms of the number of weeds that will produce a certain proportion of that maximum yield loss. The double-scaled model appeared to produce more robust estimates of the impact of weeds under a wide range of conditions. Conclusions: Previously tested functional forms exhibit problems for use in DSTs for crop yield loss modelling. Of the models evaluated, the double-scaled model exhibits desirable qualitative behaviour under most circumstances.
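For concreteness, the linear and hyperbolic forms discussed above can be written down directly. The sketch below uses the standard rectangular-hyperbola parameterization (initial slope i, asymptotic maximum loss a) and illustrates the linear model's unbounded behaviour at high densities; the double-scaled and density-scaled models are not reproduced here because their exact parameterizations are not given in this abstract.

    import numpy as np

    def linear_loss(d, i):
        """Yield loss (%) proportional to weed density d; i = loss per weed."""
        return i * d

    def hyperbolic_loss(d, i, a):
        """Rectangular hyperbola: initial slope i, asymptotic maximum loss a (%)."""
        return i * d / (1 + i * d / a)

    d = np.array([0, 5, 25, 100, 400])      # weeds per m^2
    print(linear_loss(d, i=0.8))            # grows without bound at high density
    print(hyperbolic_loss(d, i=0.8, a=60))  # saturates toward the maximum loss a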
Abstract:
Four species of large mackerels (Scomberomorus spp.) co-occur in the waters off northern Australia and are important to fisheries in the region. State fisheries agencies monitor these species for fisheries assessment; however, data inaccuracies may exist due to difficulties with identification of these closely related species, particularly when specimens are incomplete after fish processing. This study examined the efficacy of using otolith morphometrics to differentiate among, and predict, the four mackerel species off northeastern Australia. Seven otolith measurements and five shape indices were recorded from 555 mackerel specimens. Multivariate modelling, including linear discriminant analysis (LDA) and support vector machines, successfully differentiated among the four species based on otolith morphometrics. Cross-validation determined a predictive accuracy of at least 96% for both models. The optimum predictive model for the four mackerel species was an LDA model that included fork length, feret length, feret width, perimeter, area, roundness, form factor and rectangularity as explanatory variables. This analysis may improve the accuracy of fisheries monitoring, the estimates based on this monitoring (e.g., mortality rates) and the overall management of mackerel species in Australia.
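A minimal sketch of the classification step using scikit-learn, with stand-in random data in place of the real morphometrics (the study itself reports at least 96% cross-validated accuracy on the 555 specimens, which random features will of course not reproduce):

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    # stand-in feature matrix: one row per otolith with the eight variables
    # named in the abstract (fork length, feret length/width, perimeter, area,
    # roundness, form factor, rectangularity); y holds the four species labels
    rng = np.random.default_rng(2)
    X = rng.standard_normal((555, 8))
    y = rng.integers(0, 4, size=555)

    lda = LinearDiscriminantAnalysis()
    acc = cross_val_score(lda, X, y, cv=10)     # cross-validated accuracy
    print(f"mean CV accuracy: {acc.mean():.2f}")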
Abstract:
Hydrologic impacts of climate change are usually assessed by downscaling the General Circulation Model (GCM) output of large-scale climate variables to local-scale hydrologic variables. Such an assessment is characterized by uncertainty resulting from the ensembles of projections generated with multiple GCMs, known as intermodel or GCM uncertainty. Ensemble averaging with the assignment of weights to GCMs based on model evaluation is one method to address such uncertainty and is used in the present study for regional-scale impact assessment. GCM outputs of large-scale climate variables are downscaled to subdivisional-scale monsoon rainfall. Weights are assigned to the GCMs on the basis of model performance and model convergence, which are evaluated with the Cumulative Distribution Functions (CDFs) generated from the downscaled GCM output (for both the 20th Century [20C3M] and future scenarios) and observed data. The ensemble averaging approach, with the assignment of weights to GCMs, is itself characterized by uncertainty caused by partial ignorance, which stems from the nonavailability of the outputs of some of the GCMs for a few scenarios (in the Intergovernmental Panel on Climate Change [IPCC] data distribution center for Assessment Report 4 [AR4]). This uncertainty is modeled with imprecise probability, i.e., probability represented as an interval gray number. Furthermore, the CDF generated with one GCM is entirely different from that with another, and therefore the use of multiple GCMs results in a band of CDFs. Representing this band of CDFs with a single-valued weighted mean CDF may be misleading. Such a band of CDFs can only be represented with an envelope that contains all the CDFs generated with a number of GCMs. An imprecise CDF represents such an envelope, which not only contains the CDFs generated with all the available GCMs but also, to an extent, accounts for the uncertainty resulting from the missing GCM output. This concept of imprecise probability is also validated in the present study. The imprecise CDFs of monsoon rainfall are derived for three 30-year time slices (2020s, 2050s and 2080s) under the A1B, A2 and B1 scenarios. The model is demonstrated with the prediction of monsoon rainfall in the Orissa meteorological subdivision, which shows a possible decreasing trend in the future.
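A small numpy sketch of the two summaries described above, using three stand-in "GCM" rainfall samples and assumed weights: the weighted mean CDF as a single-valued summary, and the lower/upper envelope as the imprecise CDF that bounds every available model.

    import numpy as np

    def empirical_cdf(samples, grid):
        # empirical CDF of one GCM's downscaled rainfall on a common grid
        return np.searchsorted(np.sort(samples), grid, side="right") / len(samples)

    rng = np.random.default_rng(3)
    grid = np.linspace(500, 1500, 200)                  # rainfall (mm), hypothetical
    gcm_cdfs = np.array([empirical_cdf(rng.normal(mu, 120, 30), grid)
                         for mu in (950, 1000, 1080)])  # three stand-in GCMs
    weights = np.array([0.5, 0.3, 0.2])                 # performance/convergence weights

    weighted_mean_cdf = weights @ gcm_cdfs              # single-valued summary
    lower_env, upper_env = gcm_cdfs.min(axis=0), gcm_cdfs.max(axis=0)
    # [lower_env, upper_env] plays the role of the imprecise (interval-valued) CDF
    print("max envelope width:", (upper_env - lower_env).max())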
Abstract:
This paper introduces a policy-making support tool called the 'Micro-level Urban ecosystem Sustainability IndeX (MUSIX)'. The index serves as a sustainability assessment model that monitors six aspects of urban ecosystems (hydrology, ecology, pollution, location, design and efficiency) based on parcel-scale indicators. The index is applied in a case study investigation in Gold Coast City, Queensland, Australia. The outcomes reveal major environmental problems caused by increased impervious surfaces from growing urban development in the study area. The findings suggest that increased impervious surfaces are linked to increased surface runoff, car dependency, transport-related pollution, poor public transport accessibility and an unsustainable built environment. The paper presents how the MUSIX outputs can be used to guide policy-making through the evaluation of existing policies.
Abstract:
Downscaling to station-scale hydrologic variables from large-scale atmospheric variables simulated by general circulation models (GCMs) is usually necessary to assess the hydrologic impact of climate change. This work presents CRF-downscaling, a new probabilistic downscaling method that represents the daily precipitation sequence as a conditional random field (CRF). The conditional distribution of the precipitation sequence at a site, given the daily atmospheric (large-scale) variable sequence, is modeled as a linear chain CRF. CRFs do not make assumptions on independence of observations, which gives them flexibility in using high-dimensional feature vectors. Maximum likelihood parameter estimation for the model is performed using limited memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) optimization. Maximum a posteriori estimation is used to determine the most likely precipitation sequence for a given set of atmospheric input variables using the Viterbi algorithm. Direct classification of dry/wet days as well as precipitation amount is achieved within a single modeling framework. The model is used to project the future cumulative distribution function of precipitation. Uncertainty in precipitation prediction is addressed through a modified Viterbi algorithm that predicts the n most likely sequences. The model is applied for downscaling monsoon (June-September) daily precipitation at eight sites in the Mahanadi basin in Orissa, India, using the MIROC3.2 medium-resolution GCM. The predicted distributions at all sites show an increase in the number of wet days, and also an increase in wet day precipitation amounts. A comparison of current and future predicted probability density functions for daily precipitation shows a change in shape of the density function with decreasing probability of lower precipitation and increasing probability of higher precipitation.
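The decoding step can be illustrated compactly. Below is a generic Viterbi implementation for a linear-chain model in log space, with a made-up two-label (dry/wet) toy example; in the actual method the scores come from the fitted CRF features, and a modified version returns the n best sequences rather than only the single best one.

    import numpy as np

    def viterbi(emission, transition):
        """Most likely label sequence for a linear-chain model.
        emission: (T, K) log-scores of each label at each day;
        transition: (K, K) log-scores for label-to-label moves."""
        T, K = emission.shape
        delta = emission[0].copy()
        back = np.zeros((T, K), dtype=int)
        for t in range(1, T):
            scores = delta[:, None] + transition      # (K, K) predecessor scores
            back[t] = scores.argmax(axis=0)
            delta = scores.max(axis=0) + emission[t]
        path = [int(delta.argmax())]
        for t in range(T - 1, 0, -1):
            path.append(int(back[t][path[-1]]))
        return path[::-1]

    # toy example: labels 0 = dry, 1 = wet over five days; these scores are
    # made up, not fitted CRF weights
    em = np.log(np.array([[.8, .2], [.6, .4], [.3, .7], [.2, .8], [.7, .3]]))
    tr = np.log(np.array([[.7, .3], [.4, .6]]))
    print(viterbi(em, tr))    # -> [0, 0, 1, 1, 0]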
Abstract:
Indoor air quality (IAQ) is a critical factor in the classroom due to the high concentration of people in a single space. Indoor air pollutants may increase the likelihood of both long- and short-term health problems among students and staff, reduce the productivity of teachers and degrade the students' learning environment and comfort. Adequate air distribution strategies may reduce the risk of infection in classrooms, so the purpose of an air distribution system in a classroom is not only to maximize conditions for thermal comfort but also to remove indoor contaminants. Natural ventilation has the potential to play a significant role in achieving improvements in IAQ. The present study compares the risk of airborne infection between Natural Ventilation (opening windows and doors) and a Split-System Air Conditioner in a university classroom. The Wells-Riley model was used to predict the risk of indoor airborne transmission of infectious diseases such as influenza, measles and tuberculosis. For each case, the air exchange rate was measured using a CO2 tracer gas technique. It was found that opening windows and doors provided an air exchange rate of 2.3 air changes per hour (ACH), while the Split System provided 0.6 ACH. The risk of airborne infection ranged from 4.24 to 30.86% with Natural Ventilation and from 8.99 to 43.19% with the Split System. The difference in airborne infection risk between the Split System and Natural Ventilation ranged from 47 to 56%. Opening windows and doors maximizes Natural Ventilation, so the risk of airborne contagion is much lower than with the Split System.
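A minimal sketch of the Wells-Riley calculation, using the abstract's two measured air exchange rates but an assumed room volume, exposure time, quanta generation rate and breathing rate (so the printed risks will not match the study's figures):

    import numpy as np

    def wells_riley(I, q, p, t, ach, volume):
        """Wells-Riley probability of infection.
        I: infectors; q: quanta generation rate (quanta/h); p: breathing
        rate (m^3/h); t: exposure time (h); ach: air changes per hour;
        volume: room volume (m^3)."""
        Q = ach * volume                    # clean air supply rate (m^3/h)
        return 1.0 - np.exp(-I * q * p * t / Q)

    # assumed classroom: 200 m^3, one infector, influenza-like q, 2 h class
    for label, ach in (("natural ventilation", 2.3), ("split system", 0.6)):
        risk = wells_riley(I=1, q=13, p=0.48, t=2.0, ach=ach, volume=200.0)
        print(f"{label}: {100 * risk:.1f}% infection risk")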
Abstract:
Exposure to water-damaged buildings and the associated health problems have evoked concern and created confusion during the past 20 years. Individuals exposed to buildings with moisture problems report adverse health effects such as non-specific respiratory symptoms. Microbes, especially fungi, growing on the damp material have been considered potential sources of the health problems encountered in these buildings. Fungi and their airborne spores contain allergens and secondary metabolites which may trigger allergic as well as inflammatory types of responses in the eyes and airways. Although epidemiological studies have revealed an association between damp buildings and health problems, no direct cause-and-effect relationship has been established. Further knowledge is needed about the epidemiology and the mechanisms leading to the symptoms associated with exposure to fungi. Two different approaches have been used in this thesis to investigate the diverse health effects associated with exposure to moulds. In the first part, sensitization to moulds was evaluated and potential cross-reactivity studied in patients attending a hospital for suspected allergy. In the second part, one typical mould known to be found in water-damaged buildings and to produce toxic secondary metabolites was used to study airway responses in an experimental model. Exposure studies were performed on both naive and allergen-sensitized mice. The first part of the study showed that mould allergy is rare and highly dependent on the atopic status of the examined individual. The prevalence of sensitization was 2.7% to Cladosporium herbarum and 2.8% to Alternaria alternata in the patients, the majority of whom were atopic subjects. Some of the patients sensitized to mould suffered from atopic eczema. Frequently the patients were observed to possess specific serum IgE antibodies to a yeast present in the normal skin flora, Pityrosporum ovale. In some of these patients, the IgE binding was found to be partly due to shared glycoproteins in the mould and yeast allergen extracts. The second part of the study revealed that exposure to Stachybotrys chartarum spores induced airway inflammation in the lungs of mice. The inflammation was characterized by an influx of inflammatory cells, mainly neutrophils and lymphocytes, into the lungs, with almost no differences in airway responses seen between the satratoxin-producing and non-satratoxin-producing strains. On the other hand, when mice were exposed to S. chartarum and sensitized/challenged with ovalbumin, the extent of the inflammation was markedly enhanced. A synergistic increase in the numbers of inflammatory cells was seen in bronchoalveolar lavage (BAL) fluid, and severe inflammation was observed in the histological lung sections. In conclusion, the results of this thesis imply that exposure to moulds in water-damaged buildings may trigger health effects in susceptible individuals. The symptoms can rarely be explained by IgE-mediated allergy to moulds; other non-allergic mechanisms seem to be involved. Stachybotrys chartarum is one of the moulds potentially responsible for health problems. In this thesis, new reaction models for the airway inflammation induced by S. chartarum were established using experimental approaches. Immunological status played an important role in the airway inflammation, enhancing the effects of mould exposure. The results imply that sensitized individuals may be more susceptible to mould exposure than non-sensitized individuals.
Abstract:
We investigate the ability of a global atmospheric general circulation model (AGCM) to reproduce observed 20-year return values of the annual maximum daily precipitation totals over the continental United States as a function of horizontal resolution. We find that at the high resolutions enabled by contemporary supercomputers, the AGCM can produce values of comparable magnitude to high-quality observations. However, at the resolutions typical of the coupled general circulation models used in the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, the precipitation return values are severely underestimated.
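For reference, a 20-year return value is the daily total exceeded on average once every 20 years, i.e. the quantile with annual exceedance probability 1/20. A minimal sketch of one common way to estimate it, fitting a generalized extreme value (GEV) distribution to a hypothetical series of annual maxima (the abstract does not state which estimator the study used):

    from scipy.stats import genextreme

    # hypothetical annual maximum daily precipitation series (mm) at one grid
    # cell; in the study these come from the AGCM and from observations
    annual_max = genextreme.rvs(-0.1, loc=40, scale=12, size=60, random_state=4)

    # fit a GEV and read off the quantile with annual exceedance prob. 1/20
    c, loc, scale = genextreme.fit(annual_max)
    rv20 = genextreme.ppf(1 - 1 / 20, c, loc=loc, scale=scale)
    print(f"20-year return value: {rv20:.1f} mm/day")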
Abstract:
A compact model for the noise margin (NM) of single-electron transistor (SET) logic is developed as a function of the device capacitances and background charge (ζ). Noise margin is then used as a metric to evaluate the robustness of SET logic against background charge, temperature, and variation of the SET gate and tunnel junction capacitances (CG and CT). It is shown that choosing α = CT/CG = 1/3 maximizes the NM. An estimate of the maximum tolerable ζ is shown to be ±0.03 e. Finally, the effect of mismatch in device parameters on the NM is studied through exhaustive simulations, which indicate that α ∈ [0.3, 0.4] provides maximum robustness. It is also observed that mismatch can have a significant impact on static power dissipation.
Abstract:
Carrier phase ambiguity resolution over long baselines is challenging in BDS data processing. This is partially due to the variations of the hardware biases in BDS code signals and their dependence on elevation angle. We present an assessment of satellite-induced code bias variations in BDS triple-frequency signals and ambiguity resolution procedures involving both geometry-free and geometry-based models. First, since the elevation of a GEO satellite remains essentially unchanged, we propose to model the single-differenced fractional cycle bias using widespread ground stations. Second, the effects of code bias variations induced by GEO, IGSO and MEO satellites on ambiguity resolution of extra-wide-lane, wide-lane and narrow-lane combinations are analyzed. Third, together with the IGSO and MEO code bias variation models, the effects of code bias variations on ambiguity resolution are examined using 30 days of data collected in 2014 over baselines ranging from 500 to 2600 km. The results suggest that although the effect of code bias variations on the extra-wide-lane integer solution is almost negligible owing to its long wavelength, the wide-lane integer solutions are rather sensitive to the code bias variations. Wide-lane ambiguity resolution success rates are evidently improved when code bias variations are corrected. However, the improvement in narrow-lane ambiguity resolution is not obvious, since it is based on the geometry-based model and the code biases have only an indirect impact on the narrow-lane ambiguity solutions.
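A sketch of the standard wide-lane machinery that such code biases contaminate: the Melbourne-Wübbena combination of dual-frequency phase and code observables, here with BDS B1I/B2I frequencies and entirely hypothetical measurements. This is generic background, not the paper's bias-correction model.

    C = 299_792_458.0                       # speed of light (m/s)
    f1, f2 = 1561.098e6, 1207.140e6         # BDS B1I and B2I frequencies (Hz)
    lam_wl = C / (f1 - f2)                  # wide-lane wavelength, ~0.847 m

    def wide_lane_ambiguity(L1, L2, P1, P2):
        """Melbourne-Wubbena combination: wide-lane phase (m) minus the
        narrow-lane code average (m), expressed in wide-lane cycles."""
        phase_wl = (f1 * L1 - f2 * L2) / (f1 - f2)
        code_nl = (f1 * P1 + f2 * P2) / (f1 + f2)
        return (phase_wl - code_nl) / lam_wl

    # hypothetical observables (m); a code bias creeping into P1/P2 shifts the
    # float ambiguity and can flip the rounded integer, which is the effect the
    # elevation-dependent bias model is meant to remove
    n_float = wide_lane_ambiguity(L1=21_000_000.123, L2=21_000_000.845,
                                  P1=21_000_001.902, P2=21_000_002.341)
    print(f"float ambiguity {n_float:.2f} cycles -> integer {round(n_float)}")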
Abstract:
This paper describes a concept for a collision avoidance system for ships based on model predictive control. A finite set of alternative control behaviors is generated by varying two parameters: offsets to the guidance course angle commanded to the autopilot, and changes to the propulsion command ranging from nominal speed to full reverse. Using simulated predictions of the trajectories of the obstacles and the ship, compliance with the Convention on the International Regulations for Preventing Collisions at Sea (COLREGS) and the collision hazards associated with each of the alternative control behaviors are evaluated on a finite prediction horizon, and the optimal control behavior is selected. Robustness to sensing error, predicted obstacle behavior, and environmental conditions can be ensured by evaluating multiple scenarios for each control behavior. The method is conceptually and computationally simple, yet quite versatile, as it can account for the dynamics of the ship, the dynamics of the steering and propulsion system, forces due to wind and ocean current, and any number of obstacles. Simulations show that the method is effective and can manage complex scenarios with multiple dynamic obstacles and uncertainty associated with sensors and predictions.
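A toy sketch of the selection loop described above, with hypothetical dynamics, costs and a single constant-velocity obstacle; the real system simulates full ship and actuator dynamics, checks COLREGS compliance and evaluates multiple scenarios per behavior.

    import numpy as np

    def simulate(course_offset, speed_factor, horizon=60, dt=1.0):
        """Very simplified constant-velocity prediction of the own-ship track."""
        heading = np.deg2rad(45.0 + course_offset)      # nominal course 045 deg
        speed = 5.0 * speed_factor                      # nominal 5 m/s
        t = np.arange(1, horizon + 1) * dt
        return np.stack([speed * t * np.cos(heading),
                         speed * t * np.sin(heading)], axis=1)

    def hazard_cost(track, obstacle_track, course_offset, speed_factor):
        """Penalize close approach, course deviation and speed change."""
        miss = np.linalg.norm(track - obstacle_track, axis=1).min()
        collision = 1e3 if miss < 50.0 else 0.0         # hard safety penalty
        return (collision + 1.0 / max(miss, 1.0)
                + 0.05 * abs(course_offset) + 2.0 * (1.0 - speed_factor))

    # finite behavior set: course offsets (deg) x propulsion commands
    offsets = [-90, -60, -30, 0, 30, 60, 90]
    speeds = [1.0, 0.5, -1.0]                           # nominal, half, full reverse
    obstacle = simulate(0.0, 1.0) + np.array([200.0, -200.0])  # crossing traffic

    best = min(((o, s) for o in offsets for s in speeds),
               key=lambda b: hazard_cost(simulate(*b), obstacle, *b))
    print("selected behavior: course offset %+d deg, speed factor %.1f" % best)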