975 results for Curves, Algebraic.


Relevance:

10.00%

Publisher:

Abstract:

The cum rule of Singh (1975) has been suggested in the literature for finding approximately optimum strata boundaries under proportional allocation, when the stratification is done on the study variable. This paper shows that, for the class of density functions arising from the Wang and Aggarwal (1984) representation of the Lorenz curve (or DBV curves in the case of inventory theory), the cum rule yields exactly, rather than approximately, optimum strata boundaries. It is also shown that the conjecture of Mahalanobis (1952), “. . . an optimum or nearly optimum solutions will be obtained when the expected contribution of each stratum to the total aggregate value of Y is made equal for all strata”, yields exactly optimum strata boundaries for the case considered in the paper.
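The cumulative-rule construction described above can be sketched numerically. The abstract's exact transform is not reproduced in this listing, so the sketch below uses the classical Dalenius-Hodges cum √f transform as an illustrative assumption: accumulate the square roots of the bin frequencies and cut the cumulative total into equal parts.

```python
import numpy as np

def cum_sqrt_f_boundaries(x, n_strata, n_bins=50):
    """Approximate strata boundaries via a cumulative rule (here the
    Dalenius-Hodges cum sqrt(f) variant): accumulate sqrt of the bin
    frequencies and cut the cumulative total into equal parts."""
    freq, edges = np.histogram(x, bins=n_bins)
    cum = np.cumsum(np.sqrt(freq))
    targets = cum[-1] * np.arange(1, n_strata) / n_strata
    # boundary = right edge of the bin where the cumulative crosses each target
    idx = np.searchsorted(cum, targets)
    return edges[idx + 1]

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=10_000)   # skewed study variable (toy)
bounds = cum_sqrt_f_boundaries(x, n_strata=4)
```

With four strata the rule returns three interior boundaries, increasingly widely spaced in the thin right tail of the skewed density.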

Relevance:

10.00%

Publisher:

Abstract:

Computation of the dependency basis is the fundamental step in solving the implication problem for MVDs in relational database theory. We examine this problem from an algebraic perspective. We introduce the notion of the inference basis of a set M of MVDs and show that it contains the maximum information about the logical consequences of M. We propose the notion of an MVD-lattice and develop an algebraic characterization of the inference basis using simple notions from lattice theory. We also establish several properties of MVD-lattices related to the implication problem. Founded on our characterization, we synthesize efficient algorithms for (a) computing the inference basis of a given set M of MVDs; (b) computing the dependency basis of a given attribute set w.r.t. M; and (c) solving the implication problem for MVDs. Finally, we show that our results extend naturally to incorporate FDs as well, enabling the solution of the implication problem for FDs and MVDs combined.
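The dependency-basis computation in step (b) can be illustrated with the classical partition-refinement idea (Beeri's algorithm); this is a standard textbook method, not necessarily the lattice-based algorithm the paper synthesizes.

```python
def dependency_basis(attrs, x, mvds):
    """Dependency basis of attribute set x w.r.t. a set of MVDs, by
    partition refinement: start from the single block U - x and split
    blocks against each MVD until no split applies.

    attrs: frozenset of all attributes U
    x:     frozenset, the determining attribute set
    mvds:  iterable of (lhs, rhs) frozenset pairs, read as lhs ->> rhs
    """
    basis = {frozenset(attrs - x)} - {frozenset()}
    changed = True
    while changed:
        changed = False
        for lhs, rhs in mvds:
            for block in list(basis):
                # a block may be split by V ->> W only if it avoids V
                if block & lhs:
                    continue
                inside, outside = block & rhs, block - rhs
                if inside and outside:
                    basis.remove(block)
                    basis.update({inside, outside})
                    changed = True
    return basis

U = frozenset("ABCD")
mvds = [(frozenset("A"), frozenset("B"))]   # A ->> B
basis = dependency_basis(U, frozenset("A"), mvds)
```

For A ->> B over U = ABCD, the basis of {A} partitions the remaining attributes into {B} and {C, D}.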

Relevance:

10.00%

Publisher:

Abstract:

The LISA Parameter Estimation Taskforce was formed in September 2007 to provide the LISA Project with vetted codes, source distribution models and results related to parameter estimation. The Taskforce's goal is to be able to quickly calculate the impact of any mission design changes on LISA's science capabilities, based on reasonable estimates of the distribution of astrophysical sources in the universe. This paper describes our Taskforce's work on massive black-hole binaries (MBHBs). Given present uncertainties in the formation history of MBHBs, we adopt four different population models, based on (i) whether the initial black-hole seeds are small or large and (ii) whether accretion is efficient or inefficient at spinning up the holes. We compare four largely independent codes for calculating LISA's parameter-estimation capabilities. All codes are based on the Fisher-matrix approximation, but in the past they used somewhat different signal models, source parametrizations and noise curves. We show that once these differences are removed, the four codes give results in extremely close agreement with each other. Using a code that includes both spin precession and higher harmonics in the gravitational-wave signal, we carry out Monte Carlo simulations and determine the number of events that can be detected and accurately localized in our four population models.

Relevance:

10.00%

Publisher:

Abstract:

Developing accurate and reliable crop detection algorithms is an important step towards harvesting automation in horticulture. This paper presents a novel approach to visual detection of highly occluded fruits. We use a conditional random field (CRF) on multi-spectral image data (colour and Near-Infrared Reflectance, NIR) to model two classes: crop and background. To describe these two classes, we explore a range of visual-texture features, including local binary patterns, histograms of oriented gradients, and learned auto-encoder features. The proposed methods are evaluated using hand-labelled images from a dataset captured on a commercial capsicum farm. Experimental results are presented, and performance is evaluated in terms of the Area Under the Curve (AUC) of the precision-recall curves. Our current results achieve a maximum performance of 0.81 AUC when combining all of the texture features in conjunction with colour information.
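The PR-curve AUC used as the evaluation metric can be computed directly from classifier scores. A minimal sketch with synthetic labels and scores (not the paper's data):

```python
import numpy as np

def pr_auc(labels, scores):
    """Area under the precision-recall curve: sweep a threshold down
    the sorted scores and integrate precision over recall with the
    trapezoidal rule."""
    order = np.argsort(-np.asarray(scores, dtype=float))
    y = np.asarray(labels)[order]
    tp = np.cumsum(y)            # true positives at each cut-off
    fp = np.cumsum(1 - y)        # false positives at each cut-off
    precision = tp / (tp + fp)
    recall = tp / y.sum()
    return float(np.sum(np.diff(recall) * (precision[1:] + precision[:-1]) / 2))

labels = [1, 1, 0, 1, 0, 0, 1, 0]               # toy ground-truth crop labels
scores = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2]
auc = pr_auc(labels, scores)
```

Sweeping the threshold from the highest score downward traces the whole precision-recall trade-off, and the trapezoidal sum gives a single summary number comparable across feature combinations.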

Relevance:

10.00%

Publisher:

Abstract:

The export of sediments from coastal catchments can have detrimental impacts on estuaries and near-shore reef ecosystems such as the Great Barrier Reef. Catchment management approaches aimed at reducing sediment loads require monitoring to evaluate their effectiveness in reducing loads over time. However, load estimation is not a trivial task, due to the complex behaviour of constituents in natural streams, the variability of water flows, and often a limited amount of data. Regression is commonly used for load estimation and provides a fundamental tool for trend estimation by standardising for other time-specific covariates such as flow. This study investigates whether load estimates and the resultant power to detect trends can be enhanced by (i) modelling the error structure so that temporal correlation can be better quantified, (ii) making use of predictive variables, and (iii) identifying an efficient and feasible sampling strategy that may be used to reduce sampling error. To achieve this, we propose a new regression model that includes an innovative compounding-errors model structure and uses two additional predictive variables (average discounted flow and turbidity). By combining this modelling approach with a new, regularly optimised, sampling strategy, which adds uniformity to the event sampling strategy, the predictive power was increased to 90%. Using the enhanced regression model proposed here, it was possible to detect a trend of 20% over 20 years. This result is in stark contrast to previous conclusions presented in the literature.
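The paper's compounding-errors structure is not specified in this abstract. As a generic illustration of why modelling temporal correlation matters for trend detection, the sketch below fits a flow-standardised trend by ordinary least squares and then measures the lag-1 autocorrelation left in the residuals, the quantity an autocorrelated-error model would exploit (all data synthetic).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
flow = rng.lognormal(mean=0.0, sigma=1.0, size=n)
t = np.arange(n) / n
# synthetic log-concentration with a downward trend and AR(1) noise
e = np.zeros(n)
for i in range(1, n):
    e[i] = 0.6 * e[i - 1] + rng.normal(scale=0.2)
log_conc = 1.0 + 0.5 * np.log(flow) - 0.3 * t + e

# OLS fit of the trend after standardising for flow
X = np.column_stack([np.ones(n), np.log(flow), t])
beta, *_ = np.linalg.lstsq(X, log_conc, rcond=None)
resid = log_conc - X @ beta
# lag-1 autocorrelation of the residuals: ignoring it understates
# the uncertainty of the fitted trend coefficient beta[2]
rho = np.corrcoef(resid[:-1], resid[1:])[0, 1]
```

A substantial residual autocorrelation signals that naive OLS standard errors on the trend would be too small, which motivates modelling the error structure explicitly.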

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a maximum likelihood method for estimating growth parameters for an aquatic species that incorporates growth covariates and takes into consideration multiple tag-recapture data. Individual variability in asymptotic length, age-at-tagging, and measurement error are also considered in the model structure. Using distribution theory, the log-likelihood function is derived under a generalised framework for the von Bertalanffy and Gompertz growth models. Owing to the generality of the derivation, covariate effects can be included for both models, with seasonality and tagging effects investigated. Method robustness is established via comparison with the Fabens, improved Fabens, James, and non-linear mixed-effects growth models, with the maximum likelihood method performing best. The method is illustrated further with an application to blacklip abalone (Haliotis rubra), for which a strong growth-retarding tagging effect that persisted for several months was detected.
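The Fabens estimator used as a comparison benchmark can be sketched as a Gaussian likelihood on tag-recapture growth increments. This is the simple benchmark form, not the paper's full model with individual variability and covariates, and the data below are synthetic.

```python
import numpy as np
from scipy.optimize import minimize

def fabens_nll(params, l1, l2, dt):
    """Negative log-likelihood for the Fabens form of the von
    Bertalanffy model on tag-recapture increments:
        E[l2 - l1] = (Linf - l1) * (1 - exp(-K * dt))
    with Gaussian measurement error of standard deviation sigma."""
    linf, k, sigma = params
    if linf <= 0 or k <= 0 or sigma <= 0:
        return np.inf
    pred = (linf - l1) * (1.0 - np.exp(-k * dt))
    r = (l2 - l1) - pred
    return 0.5 * np.sum(r**2 / sigma**2 + np.log(2 * np.pi * sigma**2))

rng = np.random.default_rng(2)
n = 300
l1 = rng.uniform(40, 100, size=n)        # length at tagging (toy units)
dt = rng.uniform(0.5, 3.0, size=n)       # time at liberty (years)
l2 = l1 + (120 - l1) * (1 - np.exp(-0.3 * dt)) + rng.normal(0, 2, size=n)

fit = minimize(fabens_nll, x0=[110.0, 0.2, 5.0], args=(l1, l2, dt),
               method="Nelder-Mead")
linf_hat, k_hat, sigma_hat = fit.x
```

With simulated truth Linf = 120 and K = 0.3, the maximised likelihood recovers both growth parameters and the measurement-error scale from the increments alone, without knowing age at tagging.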

Relevance:

10.00%

Publisher:

Abstract:

We consider the development of statistical models for prediction of constituent concentration of riverine pollutants, which is a key step in load estimation from frequent flow-rate data and less frequently collected concentration data. We consider how to capture the impacts of past flow patterns via the average discounted flow (ADF), which discounts the past flux based on the time elapsed: more recent fluxes are given more weight. However, the effectiveness of ADF depends critically on the choice of the discount factor, which reflects the unknown environmental cumulating process of the concentration compounds. We propose to choose the discount factor by maximising the adjusted R² values or the Nash-Sutcliffe model efficiency coefficient. The R² values are adjusted to take account of the number of parameters in the model fit. The resulting optimal discount factor can be interpreted as a measure of the constituent exhaustion rate during flood events. To evaluate the performance of the proposed regression estimators, we examine two different sampling scenarios by resampling fortnightly and opportunistically from two real daily datasets, which come from two United States Geological Survey (USGS) gaging stations located in the Des Plaines River and Illinois River basins. The generalised rating-curve approach produces biased estimates of the total sediment loads, by -30% to 83%, whereas the new approaches produce much lower biases, ranging from -24% to 35%. This substantial improvement in the estimates of the total load is due to the fact that the predictability of concentration is greatly improved by the additional predictors.
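The ADF covariate and the grid search over the discount factor can be sketched as follows. The recursive, normalised form of the discounted average is an assumption about the exact definition, and the flow series is synthetic.

```python
import numpy as np

def adf(flow, rho):
    """Average discounted flow: an exponentially discounted average of
    past flows, weighting flow_{t-k} by (1 - rho) * rho**k so that more
    recent fluxes receive more weight (running recursion)."""
    out = np.empty_like(flow, dtype=float)
    out[0] = flow[0]
    for t in range(1, len(flow)):
        out[t] = rho * out[t - 1] + (1 - rho) * flow[t]
    return out

def adjusted_r2(y, X):
    """Adjusted R^2 of an OLS fit, penalising the parameter count."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n, p = X.shape
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    return 1 - (ss_res / (n - p)) / (ss_tot / (n - 1))

rng = np.random.default_rng(3)
flow = rng.lognormal(size=500)
conc = 0.8 * adf(flow, 0.7) + rng.normal(scale=0.1, size=500)  # truth: rho = 0.7

# choose the discount factor by maximising adjusted R^2 over a grid
grid = np.linspace(0.05, 0.95, 19)
scores = [adjusted_r2(conc, np.column_stack([np.ones(500), adf(flow, r)]))
          for r in grid]
rho_hat = grid[int(np.argmax(scores))]
```

The grid search recovers a discount factor near the simulated truth; in application this value is then interpreted as the constituent exhaustion rate during flood events.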

Relevance:

10.00%

Publisher:

Abstract:

A flexible and simple Bayesian decision-theoretic design for dose-finding trials is proposed in this paper. In order to reduce the computational burden, we adopt a working model with conjugate priors, which is flexible enough to fit all monotonic dose-toxicity curves and produces analytic posterior distributions. We also discuss how to use a proper utility function to reflect the interest of the trial. Patients are allocated based not only on the utility function but also on the chosen dose selection rule. The most popular dose selection rule is the one-step-look-ahead (OSLA), which selects the best-so-far dose. A more complicated rule, such as the two-step-look-ahead, is theoretically more efficient than the OSLA only when the required distributional assumptions are met, which is, however, often not the case in practice. We carried out extensive simulation studies to evaluate these two dose selection rules and found that OSLA was often more efficient than the two-step-look-ahead under the proposed Bayesian structure. Moreover, our simulation results show that the proposed Bayesian method's performance is superior to that of several popular Bayesian methods and that the negative impact of prior misspecification can be managed in the design stage.
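The OSLA rule can be illustrated with a deliberately simplified conjugate model: independent Beta priors on each dose's toxicity probability. The paper's working model instead fits monotonic dose-toxicity curves, so this independence assumption is for illustration only.

```python
import numpy as np

def osla_next_dose(tox_counts, n_counts, target=0.3, a0=1.0, b0=1.0):
    """One-step-look-ahead dose selection under independent conjugate
    Beta(a0, b0) priors on each dose's toxicity probability: choose the
    dose whose posterior mean toxicity is closest to the target."""
    tox = np.asarray(tox_counts, dtype=float)
    n = np.asarray(n_counts, dtype=float)
    post_mean = (a0 + tox) / (a0 + b0 + n)   # Beta-binomial posterior means
    return int(np.argmin(np.abs(post_mean - target)))

# toxicities / patients observed so far at four dose levels (toy data)
tox_counts = [0, 1, 3, 5]
n_counts = [6, 6, 6, 6]
best = osla_next_dose(tox_counts, n_counts, target=0.3)
```

Here the posterior mean toxicities are 0.125, 0.25, 0.5, and 0.75, so the rule allocates the next cohort to the second dose level, the best-so-far dose relative to the 30% target.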

Relevance:

10.00%

Publisher:

Abstract:

Magnetic resonance studies reveal a marked difference between the binding of α-tocopherol and that of the corresponding acetate (vitamin E acetate) with dipalmitoylphosphatidylcholine (DPPC) vesicles. This is reflected in differences in the phase-transition curves of the DPPC vesicles incorporated with the two compounds, as well as in the 13C relaxation times and line widths. A model for the incorporation of these molecules in lipid bilayers has been suggested. α-Tocopherol binds strongly with the lipids, possibly through hydrogen bond formation between the hydroxyl group of the former and one of the oxygen atoms of the latter. The possibility of such hydrogen bond formation is excluded in vitamin E acetate, which binds loosely through the normal hydrophobic interaction. The model for lipid-vitamin interaction explains the in vitro decomposition of H2O2 by α-tocopherol. α-Tocopherol in conjunction with H2O2 can also act as a free-radical scavenger in the lipid phase. The incorporation of α-tocopherol and vitamin E acetate in DPPC vesicles enhances the permeability of lipid bilayers for small molecules such as sodium ascorbate.

Relevance:

10.00%

Publisher:

Abstract:

Statistical methods are often used to analyse commercial catch and effort data to provide standardised fishing effort and/or a relative index of fish abundance for input into stock assessment models. Achieving reliable results has proved difficult in Australia's Northern Prawn Fishery (NPF), due to a combination of such factors as the biological characteristics of the animals, some aspects of the fleet dynamics, and the changes in fishing technology. For this set of data, we compared four modelling approaches (linear models, mixed models, generalised estimating equations, and generalised linear models) with respect to the outcomes of the standardised fishing effort or the relative index of abundance. We also varied the number and form of vessel covariates in the models. Within a subset of data from this fishery, modelling correlation structures did not alter the conclusions from simpler statistical models. The random-effects models also yielded similar results. This is because the estimators are all consistent even if the correlation structure is mis-specified, and the data set is very large. However, the standard errors from different models differed, suggesting that different methods have different statistical efficiency. We suggest that there is value in modelling the variance function and the correlation structure, to make valid and efficient statistical inferences and gain insight into the data. We found that fishing power was separable from the indices of prawn abundance only when we offset the impact of vessel characteristics at assumed values from external sources. This may be due to the large degree of confounding within the data, and the extreme temporal changes in certain aspects of individual vessels, the fleet and the fleet dynamics.

Relevance:

10.00%

Publisher:

Abstract:

A new technique has been devised to achieve a steady-state polarisation of a stationary electrode with a helical shaft rotating coaxial to it. A simplified theory for the convective hydrodynamics prevalent under these conditions has been formulated. Experimental data are presented to verify the steady-state character of the current-potential curves and the predicted dependence of the limiting current on the rotation speed of the rotor, the bulk concentration of the depolariser and the viscosity of the solution. Promising features of the multiple-segment electrodes concentric to a central disc electrode are pointed out.

Relevance:

10.00%

Publisher:

Abstract:

This note is concerned with the problem of determining approximate solutions of Fredholm integral equations of the second kind. Approximating the solution of a given integral equation by means of a polynomial, an over-determined system of linear algebraic equations is obtained involving the unknown coefficients, which is finally solved by using the least-squares method. Several examples are examined in detail.
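The polynomial least-squares scheme described above can be sketched end-to-end. The kernel K(x, t) = xt on [0, 1] with exact solution y(x) = x is an illustrative test problem, not necessarily one of the note's examples.

```python
import numpy as np

def solve_fredholm(f, K, lam, degree=5, n_colloc=40, n_quad=201):
    """Approximate y(x) = f(x) + lam * int_0^1 K(x,t) y(t) dt by a
    polynomial ansatz y(x) = sum_j c_j x^j.  Enforcing the equation at
    more collocation points than unknown coefficients gives an
    over-determined linear system, solved by least squares."""
    xs = np.linspace(0.0, 1.0, n_colloc)
    ts = np.linspace(0.0, 1.0, n_quad)
    h = ts[1] - ts[0]
    w = np.full(n_quad, h)              # trapezoidal quadrature weights
    w[0] = w[-1] = h / 2
    Kmat = K(xs[:, None], ts[None, :])  # kernel sampled on the grid
    A = np.empty((n_colloc, degree + 1))
    for j in range(degree + 1):
        # row i, column j: x_i^j - lam * int K(x_i, t) t^j dt
        A[:, j] = xs**j - lam * (Kmat * ts**j) @ w
    c, *_ = np.linalg.lstsq(A, f(xs), rcond=None)
    return lambda x: sum(cj * x**j for j, cj in enumerate(c))

# kernel K(x,t) = x*t with lam = 1 has exact solution y(x) = x when
# f(x) = 2x/3, since int_0^1 t * t dt = 1/3
y = solve_fredholm(lambda x: 2.0 * x / 3.0, lambda x, t: x * t, lam=1.0)
err = abs(y(0.5) - 0.5)
```

Because the exact solution is itself a polynomial, the least-squares fit recovers it to within the quadrature error of the trapezoidal rule.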

Relevance:

10.00%

Publisher:

Abstract:

The kinetics of decomposition of the carbonate Sr2Zr2O5CO3, are greatly influenced by the thermal effects during its formation. (α−t) curves are found to be sigmoidal and they could be analysed based on power law equations followed by first-order decay. The presence of carbon in the vacuum-prepared sample of carbonate has a strong deactivating effect. The carbonate is fairly crystalline and its decomposition leads to the formation of crystalline strontium zirconate.

Relevance:

10.00%

Publisher:

Abstract:

Study Design Retrospective review of prospectively collected data. Objectives To analyze intervertebral (IV) fusion after thoracoscopic anterior spinal fusion (TASF) and explore the relationship between fusion scores and key clinical variables. Summary of Background Information TASF provides comparable correction with some advantages over posterior approaches, but the relationship of its reported mechanical complications to non-union and graft material is unclear. Similarly, the optimal combination of graft type and implant stiffness for effecting successful radiologic union remains undetermined. Methods A subset of patients from a large single-center series who had TASF for progressive scoliosis underwent low-dose computed tomographic scans 2 years after surgery. The IV fusion mass in the disc space was assessed using the 4-point Sucato scale, where 1 indicates <50% and 4 indicates 100% bony fusion of the disc space. The effects of rod diameter, rod material, graft type, fusion level, and mechanical complications on fusion scores were assessed. Results Forty-three patients with right thoracic major curves (mean age 14.9 years) participated in the study. Mean fusion scores for patient subgroups ranged from 1.0 (IV levels with rod fractures) to 2.2 (4.5-mm rod with allograft), with scores tending to decrease with increasing rod size and stiffness. Graft type (autograft vs. allograft) did not affect fusion scores. Fusion scores were highest in the middle levels of the rod construct (mean 2.52), dropping off by 20% to 30% toward the upper and lower extremities of the rod. IV levels where a rod fractured had lower overall mean fusion scores than levels without a fracture. Mean total Scoliosis Research Society (SRS) questionnaire scores were 98.9 out of a possible 120, indicating a good level of patient satisfaction.
Conclusions Results suggest that 100% radiologic fusion of the entire disc space is not necessary for successful clinical outcomes following thoracoscopic anterior selective thoracic fusion.

Relevance:

10.00%

Publisher:

Abstract:

Experimental results are presented for ionisation (α) and electron attachment (η) coefficients evaluated from the steady-state Townsend current-growth curves for SF6-N2 and CCl2F2-N2 mixtures over the range 60 ≤ E/P ≤ 240 (where E is the electric field in V cm⁻¹ and P is the pressure in Torr reduced to 20°C). In both mixtures the attachment coefficients (η_mix) were found to follow a relationship in which η is the attachment coefficient of the pure electronegative gas, F is the fraction of the electronegative gas in the mixture, and β is a constant. The ionisation coefficients (α_mix) generally obeyed a relationship involving α_N2 and α_A, the ionisation coefficients of nitrogen and the attaching gas respectively. However, in the case of CCl2F2-N2 mixtures, there were maxima in the α_mix values for CCl2F2 concentrations between 10% and 30% at all values of E/P investigated. Effective ionisation coefficients (α − η)/P obtained in these binary mixtures show that the critical E/P (corresponding to (α − η)/P = 0) increases with the concentration of the electronegative gas up to 40%. Further increase in the electronegative gas content does not seem to alter the critical E/P.
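The critical E/P determination described in the closing sentences amounts to locating the zero crossing of (α − η)/P along the measured E/P axis. A sketch with illustrative synthetic values (not the measured data):

```python
import numpy as np

def critical_e_over_p(e_over_p, alpha_eff):
    """Critical E/P: the value at which the effective ionisation
    coefficient (alpha - eta)/P crosses zero, found by linear
    interpolation between the bracketing data points."""
    s = np.sign(alpha_eff)
    idx = np.where(np.diff(s) > 0)[0]
    if idx.size == 0:
        raise ValueError("no zero crossing in the data")
    i = idx[0]
    x0, x1 = e_over_p[i], e_over_p[i + 1]
    y0, y1 = alpha_eff[i], alpha_eff[i + 1]
    return x0 - y0 * (x1 - x0) / (y1 - y0)

# illustrative (synthetic) data: (alpha - eta)/P rising through zero
ep = np.array([60.0, 90.0, 120.0, 150.0, 180.0])
aeff = np.array([-0.8, -0.35, -0.05, 0.25, 0.6])
ep_crit = critical_e_over_p(ep, aeff)
```

For these synthetic points the crossing lies between 120 and 150, and linear interpolation places it at 125; with real data, the critical E/P found this way is the quantity reported as increasing with electronegative-gas concentration up to 40%.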