855 results for quantifying heteroskedasticity
Abstract:
Aggression is a complex behavior that influences social relationships and can be seen as adaptive or maladaptive depending on the context and intensity of expression. A model organism suitable for genetic dissection of the underlying neural mechanisms of aggressive behavior is still needed. Zebrafish has already proven to be a powerful vertebrate model organism for the study of normal and pathological brain function. Despite the fact that zebrafish is a gregarious species that forms shoals, when allowed to interact in pairs, both males and females express aggressive behavior and establish dominance hierarchies. Here, we describe two protocols that can be used to quantify aggressive behavior in zebrafish, using two different paradigms: (1) staged fights between real opponents and (2) mirror-elicited fights. We also discuss the methodology for the behavior analysis, the expected results for both paradigms, and the advantages and disadvantages of each paradigm in light of the specific goals of the study.
Abstract:
Transportation system resilience has been the subject of several recent studies. To assess the resilience of a transportation network, however, it is essential to model its interactions with and reliance on other lifelines. In this work, a bi-level, mixed-integer, stochastic program is presented for quantifying the resilience of a coupled traffic-power network under a host of potential natural or anthropogenic hazard-impact scenarios. A two-layer network representation is employed that includes details of both systems. Interdependencies between the urban traffic and electric power distribution systems are captured through linking variables and logical constraints. The modeling approach was applied to a case study developed on a portion of the signalized traffic-power distribution system in southern Minneapolis. The results of the case study show the importance of explicitly considering interdependencies between critical infrastructures in transportation resilience estimation. The results also provide insights into lifeline performance from an alternative power perspective.
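To make the linking-variable idea concrete, the following is a minimal, hypothetical Python sketch using PuLP. It is not the authors' bi-level stochastic program: it shows only a single-level toy problem in which a logical constraint ties each signalized intersection to the power feeder assumed to serve it. All network names, capacities and the damaged-feeder scenario are invented for illustration.

import pulp

feeders = ["F1", "F2"]                      # power distribution feeders (assumed)
intersections = ["I1", "I2", "I3"]          # signalized intersections (assumed)
served_by = {"I1": "F1", "I2": "F1", "I3": "F2"}                 # assumed dependency map
capacity_if_signalized = {"I1": 1800, "I2": 1600, "I3": 1500}    # veh/h (assumed)
capacity_if_dark = {k: 0.5 * v for k, v in capacity_if_signalized.items()}

prob = pulp.LpProblem("toy_resilience", pulp.LpMaximize)

# Binary state variables: feeder energized, intersection signal operating
energized = {f: pulp.LpVariable(f"energized_{f}", cat="Binary") for f in feeders}
signal_on = {i: pulp.LpVariable(f"signal_{i}", cat="Binary") for i in intersections}

# Logical linking constraint: a signal can operate only if its feeder is energized
for i in intersections:
    prob += signal_on[i] <= energized[served_by[i]]

# One hazard realization: assume feeder F2 is damaged in this scenario
prob += energized["F2"] == 0

# Objective: maximize total served intersection capacity (a crude resilience proxy)
prob += pulp.lpSum(
    capacity_if_dark[i] + (capacity_if_signalized[i] - capacity_if_dark[i]) * signal_on[i]
    for i in intersections
)

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({i: int(signal_on[i].value()) for i in intersections})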
Abstract:
Recreational shore fishing along 250 km of the south and south-west coast of Portugal was studied based on roving creel and aerial surveys. Surveys were conducted between August 2006 and July 2007, following a stratified random-sampling design, and provided information on catch and effort, harvest and discards, angler demographics and fishing habits. Overall, 192 roving creel surveys, 24 aerial surveys and 1321 interviews were conducted. Based on the aerial surveys, a mean ± s.e. total fishing effort of 705 236 ± 32 765 angler h year⁻¹ was estimated, corresponding to 166 430 ± 9792 fishing trips year⁻¹. Average time spent per fishing trip was 4.7 h. A total of 48 species, belonging to 22 families, were recorded in roving creel surveys. The most important species was Diplodus sargus, accounting for 44% of the total catches by number and 48% by mass. Estimated mean ± s.e. total annual recreational shore fishing catch was 160.2 ± 12.6 t year⁻¹ (788 049 ± 54 079 fishes year⁻¹), of which 147.4 ± 11.9 t year⁻¹ (589 132 ± 42 360 fishes year⁻¹) was retained. Although overall shore-based recreational catches only corresponded to 0.8% of the commercial landings (only common species considered), D. sargus catches by recreational shore anglers were considerable, corresponding to 65% of the commercial landings. The implications of these results for integrated fisheries management and conservation are discussed, and future research is proposed.
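As a rough illustration of how instantaneous aerial counts can be expanded to annual effort, the short Python sketch below applies a simple mean-count expansion. This is not necessarily the estimator used in the study: the aerial counts, the fishable hours per day and the expansion logic are assumptions made only to show the arithmetic; only the 4.7 h mean trip length comes from the abstract.

import statistics

aerial_counts = [190, 205, 178, 220, 199, 185]   # made-up anglers counted per flight
fishable_hours_per_day = 10                       # assumed
days_per_year = 365

mean_instantaneous_count = statistics.mean(aerial_counts)
annual_effort_angler_hours = mean_instantaneous_count * fishable_hours_per_day * days_per_year

mean_trip_length_h = 4.7                          # reported average trip duration
annual_trips = annual_effort_angler_hours / mean_trip_length_h

print(f"Effort: {annual_effort_angler_hours:,.0f} angler h/year")
print(f"Trips:  {annual_trips:,.0f} trips/year")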
Abstract:
Fishing trials with monofilament gill nets and longlines using small hooks were carried out in Algarve waters (southern Portugal) over a one-year period. Four hook sizes of "Mustad" brand, round bent, flatted sea hooks (Quality 2316 DT, numbers 15, 13, 12 and 11) and four mesh sizes of 25, 30, 35 and 40 mm (bar length) monofilament gill nets were used. Commercially valuable sea breams dominated the longline catches while small pelagics were relatively more important in the gill nets. Significant differences in the catch size frequency distributions of the two gears were found for all the most important species caught by both gears (Boops boops, Diplodus bellottii, Diplodus vulgaris, Pagellus acarne, Pagellus erythrinus, Spondyliosoma cantharus, Scomber japonicus and Scorpaena notata), with longlines catching larger fish and a wider size range than nets. Whereas longline catch size frequency distributions for most species for the different hook sizes were generally highly overlapped, suggesting little or no differences in size selectivity, gill net catch size frequency distributions clearly showed size selection. A variety of models were fitted to the gill net and hook data using the SELECT method, while the parameters of the logistic model were estimated by maximum likelihood for the longline data. The bi-normal model gave the best fits for most of the species caught with gill nets, while the logistic model adequately described hook selectivity. The results of this study show that the two static gears compete for many of the same species and have different impacts in terms of catch composition and size selectivity. This information will be useful for the improved management of these small-scale fisheries in which many different gears compete for scarce resources.
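To illustrate the maximum-likelihood fitting of a logistic selection curve mentioned above, the following Python sketch fits retention probability as a logistic function of fish length. The length classes, catch counts and the assumed numbers of fish available are invented for demonstration and do not come from the study; only the general form of the model (logistic, fitted by maximum likelihood) follows the abstract.

import numpy as np
from scipy.optimize import minimize

lengths = np.array([10, 12, 14, 16, 18, 20, 22, 24], dtype=float)   # fish length (cm), assumed
n_caught = np.array([2, 5, 12, 20, 25, 22, 15, 8])                   # retained on the hook (assumed)
n_total = np.array([30, 32, 35, 38, 36, 30, 20, 10])                  # fish available (assumed)

def neg_log_lik(params):
    a, b = params
    p = 1.0 / (1.0 + np.exp(-(a + b * lengths)))   # logistic retention probability
    p = np.clip(p, 1e-9, 1 - 1e-9)
    # Binomial log-likelihood of retained vs. not retained in each length class
    return -np.sum(n_caught * np.log(p) + (n_total - n_caught) * np.log(1 - p))

fit = minimize(neg_log_lik, x0=[-5.0, 0.3], method="Nelder-Mead")
a_hat, b_hat = fit.x
l50 = -a_hat / b_hat          # length at 50% retention
print(f"a={a_hat:.2f}, b={b_hat:.2f}, L50={l50:.1f} cm")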
Abstract:
The high degree of variability and inconsistency in cash flow study usage by property professionals demands improvement in knowledge and processes. Until recently, limited research had been undertaken on the use of cash flow studies in property valuations, but the growing acceptance of this approach for major investment valuations has resulted in renewed interest in this topic. Studies on valuation variations identify data accuracy, model consistency and bias as major concerns. In cash flow studies there are practical problems with the input data and the consistency of the models. This study will refer to the recent literature and identify the major factors in model inconsistency and data selection. A detailed case study will be used to examine the effects of changes in structure and inputs. The key variable inputs will be identified and proposals developed to improve the selection process for these key variables. The variables will be selected with the aid of sensitivity studies, and alternative ways of quantifying the key variables are explained. The paper recommends, with reservations, the use of probability profiles of the variables and the incorporation of this data in simulation exercises. The use of Monte Carlo simulation is demonstrated and the factors influencing the structure of the probability distributions of the key variables are outlined. This study relates to ongoing research into functional performance of commercial property within an Australian Cooperative Research Centre.
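The kind of Monte Carlo simulation referred to above can be sketched in a few lines of Python. The example below is illustrative only: the key variables (rental growth, discount rate, vacancy loss), their triangular probability profiles and the initial net income are assumed values rather than the case-study inputs, and the present-value model is deliberately simplified.

import numpy as np

rng = np.random.default_rng(1)
n_sims, years = 10_000, 10
initial_net_income = 1_000_000.0     # assumed annual net income ($)

# Assumed triangular profiles (min, mode, max) for the key variables
rental_growth = rng.triangular(0.00, 0.03, 0.06, size=(n_sims, years))
discount_rate = rng.triangular(0.07, 0.09, 0.12, size=n_sims)
vacancy_loss  = rng.triangular(0.00, 0.05, 0.15, size=(n_sims, years))

# Net income path and discounted present value for each simulation
growth_factors = np.cumprod(1.0 + rental_growth, axis=1)
net_income = initial_net_income * growth_factors * (1.0 - vacancy_loss)
discount_factors = (1.0 + discount_rate)[:, None] ** -np.arange(1, years + 1)
present_value = (net_income * discount_factors).sum(axis=1)

print(f"Mean PV:  ${present_value.mean():,.0f}")
print(f"5th-95th: ${np.percentile(present_value, 5):,.0f} - ${np.percentile(present_value, 95):,.0f}")

The simulated distribution of present values, rather than a single point estimate, is what a probability-profile approach of this sort contributes to the valuation.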
Abstract:
Knowledge of particle emission characteristics associated with forest fires and, in general, biomass burning is becoming increasingly important due to the impact of these emissions on human health. Of particular importance is developing a better understanding of the size distribution of particles generated from forest combustion under different environmental conditions, as well as provision of emission factors for different particle size ranges. This study was aimed at quantifying particle emission factors from four types of wood found in South East Queensland forests: Spotted Gum (Corymbia citriodora), Red Gum (Eucalypt tereticornis), Blood Gum (Eucalypt intermedia), and Iron bark (Eucalypt decorticans), under controlled laboratory conditions. The experimental set up included a modified commercial stove connected to a dilution system designed for the conditions of the study. Measurements of particle number size distribution and concentration resulting from the burning of woods with a relatively homogeneous moisture content (in the range of 15 to 26%) and for different rates of burning were performed using a TSI Scanning Mobility Particle Sizer (SMPS) in the size range from 10 to 600 nm and a TSI DustTrak for PM2.5. The results of the study, in terms of the relationship between particle number size distribution and different conditions of burning for different species, show that particle number emission factors and PM2.5 mass emission factors depend on the type of wood and the burning rate (fast or slow burning). The average particle number emission factors for fast burning conditions are in the range of 3.3 x 10^15 to 5.7 x 10^15 particles/kg, and for PM2.5 are in the range of 139 to 217 mg/kg.
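The sketch below shows, in Python, one way a particle number emission factor of this kind could be derived from a measured concentration, a dilution ratio, the duct flow and the mass of fuel burned. All numerical values and the simplifying assumption of a constant concentration over the burn are illustrative; they are not the study's measurements or its exact calculation procedure.

def emission_factor_particles_per_kg(mean_concentration_cm3, dilution_ratio,
                                     duct_flow_m3_per_min, burn_minutes, fuel_mass_kg):
    """Particles emitted per kg of wood burned.

    mean_concentration_cm3: diluted number concentration at the instrument (particles/cm^3)
    dilution_ratio:         dilution applied before measurement
    duct_flow_m3_per_min:   exhaust duct volumetric flow
    """
    cm3_per_m3 = 1e6
    undiluted_concentration = mean_concentration_cm3 * dilution_ratio          # particles/cm^3
    total_particles = (undiluted_concentration * cm3_per_m3
                       * duct_flow_m3_per_min * burn_minutes)
    return total_particles / fuel_mass_kg

# Example with assumed values: 2e5 particles/cm^3 (diluted), 20:1 dilution,
# 10 m^3/min duct flow, a 60 min burn and 1.5 kg of wood.
ef = emission_factor_particles_per_kg(2e5, 20, 10, 60, 1.5)
print(f"Emission factor: {ef:.2e} particles/kg")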
Abstract:
Three anaerobic ponds used to store and treat piggery wastes were fully covered with permeable materials manufactured from polypropylene geofabric, polyethylene shade cloth and supported straw. The covers were assessed in terms of efficacy in reducing odour emission rates over a 40-month period. Odour samples were collected from the surface of the covers, the surface of the exposed liquor and from the surface of an uncovered (control) pond at one of the piggeries. Relative to the emission rate of the exposed liquor at each pond, the polypropylene, shade cloth and straw covers reduced average emission rates by 76%, 69% and 66% respectively. At the piggery with an uncovered control pond, the polypropylene covers reduced average odour emission rates by 50% and 41% respectively. A plausible hypothesis, consistent with likely mechanisms for the odour reduction and the olfactometric method used to quantify the efficacy of the covers, is offered.
Abstract:
Risks and uncertainties are inevitable in engineering projects and infrastructure investments. Decisions about investment in infrastructure such as for maintenance, rehabilitation and construction works can pose risks, and may generate significant impacts on social, cultural, environmental and other related issues. This report presents the results of a literature review of current practice in identifying, quantifying and managing risks and predicting impacts as part of the planning and assessment process for infrastructure investment proposals. In assessing proposals for investment in infrastructure, it is necessary to consider social, cultural and environmental risks and impacts to the overall community, as well as financial risks to the investor. The report defines and explains the concept of risk and uncertainty, and describes the three main methodology approaches to the analysis of risk and uncertainty in investment planning for infrastructure, viz examining a range of scenarios or options, sensitivity analysis, and a statistical probability approach, listed here in order of increasing merit and complexity. Forecasts of costs, benefits and community impacts of infrastructure are recognised as central aspects of developing and assessing investment proposals. Increasingly complex modelling techniques are being used for investment evaluation. The literature review identified forecasting errors as the major cause of risk. The report contains a summary of the broad nature of decision-making tools used by governments and other organisations in Australia, New Zealand, Europe and North America, and shows their overall approach to risk assessment in assessing public infrastructure proposals. While there are established techniques to quantify financial and economic risks, quantification is far less developed for political, social and environmental risks and impacts. For risks that cannot be readily quantified, assessment techniques commonly include classification or rating systems for likelihood and consequence. The report outlines the system used by the Australian Defence Organisation and in the Australian Standard on risk management. After each risk is identified and quantified or rated, consideration can be given to reducing the risk, and managing any remaining risk as part of the scope of the project. The literature review identified use of risk mapping techniques by a North American chemical company and by the Australian Defence Organisation. This literature review has enabled a risk assessment strategy to be developed, and will underpin an examination of the feasibility of developing a risk assessment capability using a probability approach.
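To illustrate the likelihood/consequence rating approach described above, here is a small, generic Python sketch. The five-point scales and the rating cut-offs are illustrative assumptions only; they are not the Australian Defence Organisation's matrix or the one in the Australian Standard on risk management.

LIKELIHOOD = ["rare", "unlikely", "possible", "likely", "almost certain"]
CONSEQUENCE = ["insignificant", "minor", "moderate", "major", "catastrophic"]

def risk_rating(likelihood: str, consequence: str) -> str:
    """Combine qualitative likelihood and consequence scores into a risk rating."""
    score = LIKELIHOOD.index(likelihood) + CONSEQUENCE.index(consequence)  # 0..8
    if score <= 2:
        return "low"
    if score <= 4:
        return "medium"
    if score <= 6:
        return "high"
    return "extreme"

print(risk_rating("possible", "major"))        # -> high
print(risk_rating("rare", "minor"))            # -> low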
Abstract:
In recent years considerable effort has gone into quantifying the reuse and recycling potential of waste generated by residential construction. Unfortunately, less information is available for the commercial refurbishment sector. It is hypothesised that significant economic and environmental benefit can be derived from closer monitoring of the commercial construction waste stream. With the aim of assessing these benefits, the authors are involved in ongoing case studies to record both current standard practice and the most effective means of improving the eco-efficiency of materials use in office building refurbishments. This paper focuses on the issues involved in developing methods for obtaining the necessary information on better waste management practices and establishing benchmark indicators. The need to create databases to establish benchmarks of waste minimisation best practice in commercial construction is stressed. Further research will monitor the delivery of case study projects and the levels of reuse and recycling achieved in directly quantifiable ways.
Abstract:
Current policy decision making in Australia regarding non-health public investments (for example, transport, housing or social welfare programmes) does not quantify health benefits and costs systematically. To address this knowledge gap, this study proposes an economic model for quantifying the health impacts of public policies in dollar terms. The intention is to enable policy-makers to conduct economic evaluation of the health effects of non-health policies and to implement policies that reduce health inequalities as well as enhance the positive health gains of the target population. Health Impact Assessment (HIA) provides an appropriate framework for this study, since HIA assesses the beneficial and adverse effects of a programme or policy on public health and on health inequalities through the distribution of those effects. However, HIA usually tries to influence the decision-making process using its scientific findings, mostly epidemiological and toxicological evidence. In reality, this evidence cannot establish causal links between policy and health impacts, since it cannot explain how an individual or a community reacts to changing circumstances. The proposed economic model addresses this health-policy linkage using a consumer choice approach that can explain changes in group and individual behaviour in a given economic set up. The economic model suggested in this paper links epidemiological findings with economic analysis to estimate the health costs and benefits of public investment policies: that is, estimating dollar impacts when the health status of the exposed population group is changed by public programmes, for example, transport initiatives to reduce congestion by building new roads, highways or tunnels, or by imposing congestion taxes. For policy evaluation purposes, the model is incorporated in the HIA framework by establishing associations among identified factors that drive changes in the behaviour of the target population group and, in turn, in health outcomes. The economic variables identified to estimate health inequality and health costs are levels of income, unemployment, education, age groups, disadvantaged population groups, mortality/morbidity, etc. Model validation using case studies and/or available databases from the Australian non-health policy (say, transport) arena is planned as future work and is beyond the scope of this paper.
Abstract:
Healthcare-associated methicillin-resistant Staphylococcus aureus(MRSA) infection may cause increased hospital stay or, sometimes, death. Quantifying this effect is complicated because it is a time-dependent exposure: infection may prolong hospital stay, while longer stays increase the risk of infection. We overcome these problems by using a multinomial longitudinal model for estimating the daily probability of death and discharge. We then extend the basic model to estimate how the effect of MRSA infection varies over time, and to quantify the number of excess ICU days due to infection. We find that infection decreases the relative risk of discharge (relative risk ratio = 0.68, 95% credible interval: 0.54, 0.82), but is only indirectly associated with increased mortality. An infection on the first day of admission resulted in a mean extra stay of 0.3 days (95% CI: 0.1, 0.5) for a patient with an APACHE II score of 10, and 1.2 days (95% CI: 0.5, 2.0) for a patient with an APACHE II score of 30. The decrease in the relative risk of discharge remained fairly constant with day of MRSA infection, but was slightly stronger closer to the start of infection. These results confirm the importance of MRSA infection in increasing ICU stay, but suggest that previous work may have systematically overestimated the effect size.
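The daily discharge/death modelling idea can be illustrated with a frequentist stand-in for the Bayesian multinomial model described above. The Python sketch below simulates patient-day records with assumed covariates (a time-dependent MRSA indicator and an APACHE II score) and fits a multinomial logistic regression with statsmodels; all data, coefficients and column names are invented for demonstration and do not reproduce the paper's results.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_days = 2000
df = pd.DataFrame({
    "mrsa": rng.integers(0, 2, n_days),          # 1 if infected by that patient-day (assumed)
    "apache2": rng.integers(5, 35, n_days),      # severity score on admission (assumed)
})

# Simulated daily outcome: 0 = stay, 1 = discharge, 2 = death
logits_discharge = 0.5 - 0.4 * df["mrsa"] - 0.02 * df["apache2"]
logits_death = -3.0 + 0.08 * df["apache2"]
denom = 1 + np.exp(logits_discharge) + np.exp(logits_death)
p = np.column_stack([1 / denom,
                     np.exp(logits_discharge) / denom,
                     np.exp(logits_death) / denom])
df["outcome"] = [rng.choice(3, p=row) for row in p]

# Multinomial logistic regression: daily probability of discharge/death vs. staying
X = sm.add_constant(df[["mrsa", "apache2"]])
result = sm.MNLogit(df["outcome"], X).fit(disp=False)
print(result.summary())

# exp(coef) for 'mrsa' in the discharge equation approximates the relative risk
# ratio of discharge for infected vs. uninfected patient-days
print(np.exp(result.params))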
Abstract:
To navigate successfully in a previously unexplored environment, a mobile robot must be able to estimate the spatial relationships of the objects of interest accurately. A Simultaneous Localization and Mapping (SLAM) system employs its sensors to build incrementally a map of its surroundings and to localize itself in the map simultaneously. The aim of this research project is to develop a SLAM system suitable for self propelled household lawnmowers. The proposed bearing-only SLAM system requires only an omnidirectional camera and some inexpensive landmarks. The main advantage of an omnidirectional camera is the panoramic view of all the landmarks in the scene. Placing landmarks in a lawn field to define the working domain is much easier and more flexible than installing the perimeter wire required by existing autonomous lawnmowers. The common approach of existing bearing-only SLAM methods relies on a motion model for predicting the robot's pose and a sensor model for updating the pose. In the motion model, the error on the estimates of object positions accumulates due mainly to wheel slippage. Quantifying the uncertainty of object positions accurately is a fundamental requirement. In bearing-only SLAM, the Probability Density Function (PDF) of landmark position should be uniform along the observed bearing. Existing methods that approximate the PDF with a Gaussian estimation do not satisfy this uniformity requirement. This thesis introduces both geometric and probabilistic methods to address the above problems. The main novel contributions of this thesis are:
1. A bearing-only SLAM method not requiring odometry. The proposed method relies solely on the sensor model (landmark bearings only) without relying on the motion model (odometry). The uncertainty of the estimated landmark positions depends on the vision error only, instead of the combination of both odometry and vision errors.
2. The transformation of the spatial uncertainty of objects. This thesis introduces a novel method for translating the spatial uncertainty of objects estimated from a moving frame attached to the robot into the global frame attached to the static landmarks in the environment.
3. The characterization of an improved PDF for representing landmark position in bearing-only SLAM. The proposed PDF is expressed in polar coordinates, and the marginal probability on range is constrained to be uniform. Compared to the PDF estimated from a mixture of Gaussians, the PDF developed here has far fewer parameters and can be easily adopted in a probabilistic framework, such as a particle filtering system.
The main advantages of our proposed bearing-only SLAM system are its lower production cost and flexibility of use. The proposed system can be adopted in other domestic robots as well, such as vacuum cleaners or robotic toys when terrain is essentially 2D.
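The kind of landmark PDF described in contribution 3 can be sketched by sampling position hypotheses that are uniform in range along the observed bearing, with Gaussian spread in bearing. The Python sketch below illustrates the idea only; the range limits and bearing noise are assumed values, not the thesis parameters.

import numpy as np

def sample_landmark_particles(robot_xy, observed_bearing, n=5000,
                              r_min=0.5, r_max=15.0, bearing_sigma=np.radians(2.0),
                              rng=np.random.default_rng(0)):
    """Sample landmark position hypotheses in the global frame."""
    r = rng.uniform(r_min, r_max, n)                              # uniform marginal on range
    theta = observed_bearing + rng.normal(0.0, bearing_sigma, n)  # Gaussian bearing noise
    x = robot_xy[0] + r * np.cos(theta)
    y = robot_xy[1] + r * np.sin(theta)
    return np.column_stack([x, y])

particles = sample_landmark_particles(robot_xy=(0.0, 0.0),
                                      observed_bearing=np.radians(30.0))
print(particles.mean(axis=0), particles.shape)

A particle set of this form can be carried directly in a particle filter and progressively narrowed as further bearings to the same landmark are observed.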
Abstract:
Expert elicitation is the process of retrieving and quantifying expert knowledge in a particular domain. Such information is of particular value when the empirical data is expensive, limited, or unreliable. This paper describes a new software tool, called Elicitator, which assists in quantifying expert knowledge in a form suitable for use as a prior model in Bayesian regression. Potential environmental domains for applying this elicitation tool include habitat modeling, assessing detectability or eradication, ecological condition assessments, risk analysis, and quantifying inputs to complex models of ecological processes. The tool has been developed to be user-friendly, extensible, and facilitate consistent and repeatable elicitation of expert knowledge across these various domains. We demonstrate its application to elicitation for logistic regression in a geographically based ecological context. The underlying statistical methodology is also novel, utilizing an indirect elicitation approach to target expert knowledge on a case-by-case basis. For several elicitation sites (or cases), experts are asked simply to quantify their estimated ecological response (e.g. probability of presence), and its range of plausible values, after inspecting (habitat) covariates via GIS.
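The general flavour of such site-based, indirect elicitation can be sketched as follows. This Python example is not the Elicitator implementation: experts give a best estimate and a plausible range for the probability of presence at a few sites, which are transformed to the logit scale and regressed on the site covariates to suggest a normal prior for logistic regression coefficients. The site covariates, elicited values and the interval-to-variance rule are all assumptions for illustration.

import numpy as np
import statsmodels.api as sm

# Covariates at elicitation sites (e.g. standardized elevation, canopy cover) - assumed
X = sm.add_constant(np.array([[-1.0, 0.2],
                              [ 0.0, 1.0],
                              [ 0.5, -0.5],
                              [ 1.2, 0.8]]))

best = np.array([0.10, 0.55, 0.35, 0.80])       # elicited probability of presence
lower = np.array([0.05, 0.40, 0.20, 0.65])      # lower plausible bound
upper = np.array([0.20, 0.70, 0.50, 0.90])      # upper plausible bound

logit = lambda p: np.log(p / (1 - p))
y = logit(best)
# Wider elicited interval -> less certain expert -> larger variance on the logit scale
sigma = (logit(upper) - logit(lower)) / 4.0

wls = sm.WLS(y, X, weights=1.0 / sigma**2).fit()
print("Prior mean for coefficients:", wls.params)
print("Prior covariance:\n", wls.cov_params())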
Abstract:
Background: The aim of this work is to develop a more complete qualitative and quantitative understanding of the in vivo histology of the human bulbar conjunctiva. Methods: Laser scanning confocal microscopy (LSCM) was used to observe and measure morphological characteristics of the bulbar conjunctiva of 11 healthy human volunteer subjects. Results: The superficial epithelial layer of the bulbar conjunctiva is seen as a mass of small cell nuclei. Cell borders are sometimes visible. The light grey borders of basal epithelial cells are clearly visible, but nuclei cannot be seen. The conjunctival stroma is comprised of a dense meshwork of white fibres, through which traverse blood vessels containing cellular elements. Orifices at the epithelial surface may represent goblet cells that have opened and expelled their contents. Goblet cells are also observed in the deeper epithelial layers, as well as conjunctival microcysts and mature forms of Langerhans cells. The bulbar conjunctiva has a mean thickness of 32.9 ± 1.1 μm, and superficial and basal epithelial cell densities of 2212 ± 782 and 2368 ± 741 cells/mm², respectively. Overall goblet and mature Langerhans cell densities are 111 ± 58 and 23 ± 25 cells/mm², respectively. Conclusions: LSCM is a powerful technique for studying the human bulbar conjunctiva in vivo and quantifying key aspects of cell morphology. The observations presented here may serve as a useful marker against which changes in conjunctival morphology due to disease, surgery, drug therapy or contact lens wear can be assessed.
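As a simple illustration of how cell densities of this kind are obtained, the Python sketch below converts a cell count in a confocal field of known size into cells/mm². The field dimensions and count are assumed illustrative values, not measurements from the study.

def cells_per_mm2(cell_count, field_width_um, field_height_um):
    """Cell density from a count within a rectangular confocal field."""
    field_area_mm2 = (field_width_um / 1000.0) * (field_height_um / 1000.0)
    return cell_count / field_area_mm2

# e.g. 354 basal epithelial cells counted in an assumed 400 x 400 um field
print(f"{cells_per_mm2(354, 400, 400):.0f} cells/mm^2")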