966 results for alternative methods


Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: Fear and anxiety are part of all human experiences and may contribute directly to a patient's behavior. Atraumatic Restorative Treatment (ART) is a technique that may offer an alternative approach for treating special care patients or those who suffer from fear or anxiety. OBJECTIVE: The aim of this paper is to review the ART technique as an alternative to reduce pain and fear during dental treatment. MATERIAL AND METHODS: A search for the term "atraumatic restorative treatment" was carried out in the MEDLINE search engine. References from the last 10 years containing at least one of the terms "psychological aspects", "discomfort", "fear", "anxiety" or "pain" were selected. RESULTS: A total of 120 references were found, of which only 17 met the criteria. DISCUSSION: All authors agreed that ART causes less discomfort for patients, contributing to a reduction of anxiety and fear during dental treatment. The results also indicated that ART minimizes the pain reported by patients. CONCLUSIONS: The ART approach can be considered to have characteristics favorable to the patient, promoting an "atraumatic" treatment. The technique may be indicated for patients who suffer from fear or anxiety towards dental treatment and whose behavior may otherwise make treatment unfeasible or even impossible.

Relevance:

30.00%

Publisher:

Abstract:

Causal inference with a continuous treatment is a relatively under-explored problem. In this dissertation, we adopt the potential outcomes framework. Potential outcomes are responses that would be seen for a unit under all possible treatments. In an observational study where the treatment is continuous, the potential outcomes are an uncountably infinite set indexed by treatment dose. We parameterize this unobservable set as a linear combination of a finite number of basis functions whose coefficients vary across units. This leads to new techniques for estimating the population average dose-response function (ADRF). Some techniques require a model for the treatment assignment given covariates, some require a model for predicting the potential outcomes from covariates, and some require both. We develop these techniques using a framework of estimating functions, compare them to existing methods for continuous treatments, and simulate their performance in a population where the ADRF is linear and the models for the treatment and/or outcomes may be misspecified. We also extend the comparisons to a data set of lottery winners in Massachusetts. Next, we describe the methods and functions in the R package causaldrf using data from the National Medical Expenditure Survey (NMES) and Infant Health and Development Program (IHDP) as examples. Additionally, we analyze the National Growth and Health Study (NGHS) data set and deal with the issue of missing data. Lastly, we discuss future research goals and possible extensions.
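
The abstract's regression-style estimator can be illustrated with a short sketch. This is not the causaldrf R package itself, but a minimal Python analogue, on simulated data, of one of the techniques mentioned: fit an outcome model for E[Y | T, X] and average its predictions over the sample covariates at each candidate dose to trace out the ADRF. All variable names and the data-generating process here are assumptions for illustration.

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 2))                        # covariates
T = 1.0 + X[:, 0] + rng.normal(size=n)             # continuous treatment (dose)
Y = 2.0 + 0.5 * T + X[:, 1] + rng.normal(size=n)   # outcome, linear in dose

# Outcome model: regress Y on treatment, covariates, and a simple interaction
design = np.column_stack([T, X, T[:, None] * X])
model = LinearRegression().fit(design, Y)

def adrf(t_grid):
    """Average the model's predictions over the sample covariates at each dose."""
    out = []
    for t in t_grid:
        t_col = np.full(n, t)
        d = np.column_stack([t_col, X, t_col[:, None] * X])
        out.append(model.predict(d).mean())
    return np.array(out)

t_grid = np.linspace(T.min(), T.max(), 11)
for t, y in zip(t_grid, adrf(t_grid)):
    print(f"dose {t:6.2f} -> estimated mean outcome {y:6.2f}")

As the abstract notes, other estimators instead (or additionally) model the treatment assignment given covariates; the sketch above covers only the outcome-model case.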

Relevance:

30.00%

Publisher:

Abstract:

The main purpose of this study is to present an alternative benchmarking approach that can be used by national regulators of utilities. It is widely known that the lack of sizeable data sets limits the choice of benchmarking method and the specification of the model used to set price controls within incentive-based regulation. Ill-posed frontier models are a problem that some national regulators have been facing. Maximum entropy estimators are useful for estimating such ill-posed models, in particular models with small sample sizes, collinearity and non-normal errors, as well as models where the number of parameters to be estimated exceeds the number of observations available. The empirical study uses the sample of data employed by the Portuguese regulator of the electricity sector to set the parameters for the electricity distribution companies in the 2012-2014 regulatory period. DEA and maximum entropy methods are applied and the efficiency results are compared.
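
As a point of reference for the benchmarking comparison, the following is a minimal sketch of the classic input-oriented CCR DEA model solved as a linear program. The four "companies" and their input/output figures are toy numbers, not the Portuguese regulator's data, and the maximum entropy estimation itself is not shown.

import numpy as np
from scipy.optimize import linprog

# Toy data: rows = decision-making units (e.g. distribution companies)
X = np.array([[20.0, 300.0], [15.0, 200.0], [30.0, 450.0], [18.0, 260.0]])  # inputs
Y = np.array([[100.0], [80.0], [150.0], [95.0]])                            # outputs
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(o):
    # Decision vector z = [theta, lambda_1, ..., lambda_n]; minimise theta
    c = np.zeros(n + 1)
    c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(m):
        # sum_j lambda_j * x_ij <= theta * x_io
        A_ub.append(np.r_[-X[o, i], X[:, i]])
        b_ub.append(0.0)
    for r in range(s):
        # sum_j lambda_j * y_rj >= y_ro
        A_ub.append(np.r_[0.0, -Y[:, r]])
        b_ub.append(-Y[o, r])
    bounds = [(None, None)] + [(0.0, None)] * n
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=bounds, method="highs")
    return res.x[0]

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")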

Relevance:

30.00%

Publisher:

Abstract:

When transporting wood from the forest to the mills, many unforeseen events can occur that disrupt the planned trips (for example, weather conditions, forest fires, the arrival of new loads, etc.). When such events only become known during a trip, the truck making that trip must be diverted to an alternative route. Without information about such a route, the driver is likely to choose an alternative route that is needlessly long or, worse, one that is itself "closed" by an unforeseen event. It is therefore essential to provide drivers with real-time information, in particular suggested alternative routes when a planned road turns out to be impassable. The recourse options available when the unexpected happens depend on the characteristics of the supply chain under study, such as the presence of self-loading trucks and the transport management policy. We present three articles covering different application contexts, together with models and solution methods adapted to each context. In the first article, truck drivers have the full weekly plan for the current week. In this context, every effort must be made to minimize changes to the initial plan. Although the truck fleet is homogeneous, drivers are ranked by priority: the highest-priority drivers receive the largest workloads, and minimizing changes to their plans is also a priority. Since the consequences of unforeseen events on the transport plan are essentially the cancellation and/or delay of some trips, the proposed approach first handles the cancellation and delay of a single trip and is then generalized to handle more complex events. In this approach, we try to reschedule the affected trips within the same week so that a loader is free when the truck arrives both at the forest site and at the mill; in this way, the trips of the other trucks are not modified. The approach provides dispatchers with alternative plans within a few seconds. Better solutions could be obtained if the dispatcher were allowed to make more changes to the initial plan. In the second article, we consider a context in which only one trip at a time is communicated to the drivers: the dispatcher waits until a driver finishes a trip before revealing the next one. This context is more flexible and offers more recourse options when the unexpected happens. Moreover, the weekly problem can be split into daily problems, since demand is daily and the mills are open for limited periods during the day. We use a mathematical programming model based on a space-time network to react to disruptions. Although disruptions can affect the initial transport plan in different ways, a key feature of the proposed model is that it remains valid for handling any unforeseen event, whatever its nature. Indeed, the impact of these events is captured in the space-time network and in the input parameters rather than in the model itself. The model is re-solved for the current day each time an unforeseen event is revealed.
In the last article, the truck fleet is heterogeneous and includes trucks with on-board loaders. The route structure of these trucks differs from that of regular trucks, since they do not need to be synchronized with the loaders. We use a mathematical model in which the columns can be easily and naturally interpreted as truck routes, and we solve this model using column generation. We first relax the integrality of the decision variables and consider only a subset of the feasible routes. Routes with the potential to improve the current solution are added to the model iteratively. A space-time network is used both to represent the impacts of unforeseen events and to generate these routes. The solution obtained is generally fractional, and a branch-and-price algorithm is used to find integer solutions. Several disruption scenarios were developed to test the proposed approach on case studies from the Canadian forest industry, and numerical results are presented for the three contexts.
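
A minimal sketch of the space-time network idea used in the second and third articles: nodes are (location, time) pairs, arcs represent travelling or waiting, and an unforeseen event is captured by removing the affected arcs rather than by changing the model or the algorithm. The locations, travel times and horizon below are hypothetical, and a plain shortest-path search stands in for the mathematical programming models of the articles.

import heapq

def build_arcs(travel_times, horizon, closed=frozenset()):
    """Arcs of a space-time graph: (loc, t) -> list of ((loc2, t2), cost)."""
    arcs = {}
    locations = {a for a, _ in travel_times} | {b for _, b in travel_times}
    for t in range(horizon):
        for loc in locations:
            arcs.setdefault((loc, t), []).append(((loc, t + 1), 0.0))   # wait
        for (a, b), d in travel_times.items():
            if (a, b) in closed or t + d > horizon:
                continue                                                # road closed / too late
            arcs.setdefault((a, t), []).append(((b, t + d), float(d)))  # travel
    return arcs

def cheapest_path(arcs, source, targets):
    """Dijkstra over the space-time graph; targets is a set of (loc, t) nodes."""
    dist, heap = {source: 0.0}, [(0.0, source)]
    while heap:
        c, node = heapq.heappop(heap)
        if node in targets:
            return c, node
        if c > dist.get(node, float("inf")):
            continue
        for nxt, w in arcs.get(node, []):
            if c + w < dist.get(nxt, float("inf")):
                dist[nxt] = c + w
                heapq.heappush(heap, (c + w, nxt))
    return None

travel_times = {("forest", "mill"): 3, ("forest", "junction"): 1, ("junction", "mill"): 3}
horizon = 8
targets = {("mill", t) for t in range(horizon + 1)}

# Planned route uses the direct road; after a closure the detour is chosen instead.
print(cheapest_path(build_arcs(travel_times, horizon), ("forest", 0), targets))
print(cheapest_path(build_arcs(travel_times, horizon, closed={("forest", "mill")}),
                    ("forest", 0), targets))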

Relevance:

30.00%

Publisher:

Abstract:

The Water Framework Directive (WFD) establishes Environmental Quality Standards (EQS) in marine water for 34 priority substances. Among these substances, 25 are hydrophobic and bioaccumulative (2 metals and 23 organic compounds). For these 25 substances, monitoring in the water matrix is not appropriate and an alternative matrix should be developed. Bivalve mollusks, particularly mussels (Mytilus edulis, Mytilus galloprovincialis), have been used by Ifremer in France as a quantitative biological indicator since 1979 to assess marine water quality. This study was carried out in order to determine thresholds in mussels at least as protective as the EQS in marine water laid down by the WFD. Three steps are defined: - Provide an overview of current knowledge about the relations between contaminant concentrations in marine water and in mussels through the bioaccumulation factor (BAF) and the bioconcentration factor (BCF). This allows an examination of how a BCF or a BAF can be determined: a BCF can be determined experimentally (according to US EPA or ASTM standards) or by Quantitative Structure-Activity Relationship (QSAR) models, for which four equations can be used for mussels. A BAF can be determined by field experiment, but no standard exists; it could be determined using QSAR, although this method is considered invalid for mussels, or using an existing model such as the Dynamic Energy Budget model, which is complex to use. - Collect concentration data in marine water (Cwater) from the literature for these 25 substances, and compare them with concentrations in mussels (Cmussels) obtained through the French monitoring network for chemical contaminants (ROCCH) and the biological integrator network RINBIO. Depending on the available data, this makes it possible to determine the BAF or the BCF (Cmussels/Cwater) from field data. - Compare the BAF and BCF values (when available) obtained with the various methods for these substances: experimental BCF from the literature, BCF calculated by QSAR, and BAF determined from field data. This study points out that experimental BCF data are available for 3 substances (chlorpyrifos, HCH, pentachlorobenzene). BCF by QSAR can be calculated for 20 substances. The use of field data allows 4 BAF values for organic compounds and 2 BAF values for metals to be evaluated. Using these BAF or BCF values, thresholds in shellfish can be determined as an alternative to EQS in marine water.
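
The two quantities being compared can be written down in a few lines. The sketch below uses illustrative numbers only; the QSAR coefficients are placeholders, not the four mussel-specific equations referred to in the study.

def field_baf(c_mussel_ug_per_kg, c_water_ug_per_l):
    """Field bioaccumulation factor BAF = Cmussels / Cwater, in L/kg."""
    return c_mussel_ug_per_kg / c_water_ug_per_l

def qsar_bcf(log_kow, a=0.85, b=-0.70):
    """Generic QSAR form log10(BCF) = a * log10(Kow) + b; a and b are placeholders."""
    return 10 ** (a * log_kow + b)

# Example: convert a water EQS into an equivalent mussel threshold (hypothetical values)
eqs_water = 0.01                                                   # ug/L
baf = field_baf(c_mussel_ug_per_kg=50.0, c_water_ug_per_l=0.02)    # L/kg
print("field BAF (L/kg):", baf)
print("mussel threshold (ug/kg):", eqs_water * baf)
print("QSAR BCF for log Kow = 5:", round(qsar_bcf(5.0)))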

Relevance:

30.00%

Publisher:

Abstract:

Although mitigating GHG emissions is necessary to reduce the overall negative impacts of climate change on crop yields and agricultural production, certain mitigation measures may have unintended consequences for food availability and access due to land use competition and the economic burden of mitigation. Prior studies have examined the co-impacts on food availability and global producer prices caused by alternative climate policies. More recent studies have looked at the reduction in total caloric intake driven by both changing income and changing food prices under one specific climate policy. However, because calorie demand is inelastic, consumers' well-being is likely further reduced by increased food expenditures. Building on the existing literature, my dissertation explores how alternative climate policy designs might adversely affect both caloric intake and the staple food budget share to 2050, using the Global Change Assessment Model (GCAM) and a post-estimated metric of food availability and access (FAA). The dissertation first develops a set of new metrics and methods to explore new perspectives on food availability and access under new conditions. The FAA metric consists of two components: the fraction of GDP per capita spent on five categories of staple food, and total caloric intake relative to a reference level. Testing the metric against alternative expectations of the future yields results consistent with previous studies: economic growth dominates the improvement of FAA. As ambition increases toward stringent climate targets, two policy conditions tend to have large impacts on FAA, driven by competing land use and increasing food prices. Strict conservation policies confine the competition between bioenergy and agricultural production to existing commercial land, while pricing terrestrial carbon encourages large-scale afforestation. To avoid unintended outcomes for food availability and access for the poor, pricing land emissions in frontier forests has the advantage of selecting more productive land for agricultural activities compared to the full conservation approach, but the land carbon price should not be linked to the price of energy system emissions. These results are highly relevant to effective policy-making to reduce land use change emissions, such as Reducing Emissions from Deforestation and Forest Degradation (REDD).
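
The FAA metric's two components are easy to express directly. How they are combined below (a simple ratio) is an assumption for illustration and not the dissertation's actual formulation, and the reference caloric level and input numbers are likewise placeholders.

def staple_budget_share(staple_expenditure_per_capita, gdp_per_capita):
    """Fraction of GDP per capita spent on the five staple food categories."""
    return staple_expenditure_per_capita / gdp_per_capita

def relative_caloric_intake(kcal_per_capita_day, reference_kcal=2100.0):
    """Total caloric intake relative to an assumed reference level."""
    return kcal_per_capita_day / reference_kcal

def faa_indicator(expenditure, gdp, kcal):
    # Higher is better: more calories relative to the reference, achieved
    # with a smaller share of income spent on staples (illustrative combination).
    return relative_caloric_intake(kcal) / staple_budget_share(expenditure, gdp)

print(faa_indicator(expenditure=300.0, gdp=2000.0, kcal=1900.0))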

Relevance:

30.00%

Publisher:

Abstract:

For climate risk management, cumulative distribution functions (CDFs) are an important source of information. They are ideally suited to compare probabilistic forecasts of primary (e.g. rainfall) or secondary data (e.g. crop yields). Summarised as CDFs, such forecasts allow an easy quantitative assessment of possible, alternative actions. Although the degree of uncertainty associated with CDF estimation could influence decisions, such information is rarely provided. Hence, we propose Cox-type regression models (CRMs) as a statistical framework for making inferences on CDFs in climate science. CRMs were designed for modelling probability distributions rather than just mean or median values. This makes the approach appealing for risk assessments where probabilities of extremes are often more informative than central tendency measures. CRMs are semi-parametric approaches originally designed for modelling risks arising from time-to-event data. Here we extend this original concept beyond time-dependent measures to other variables of interest. We also provide tools for estimating CDFs and surrounding uncertainty envelopes from empirical data. These statistical techniques intrinsically account for non-stationarities in time series that might be the result of climate change. This feature makes CRMs attractive candidates to investigate the feasibility of developing rigorous global circulation model (GCM)-CRM interfaces for provision of user-relevant forecasts. To demonstrate the applicability of CRMs, we present two examples for El Niño/Southern Oscillation (ENSO)-based forecasts: the onset date of the wet season (Cairns, Australia) and total wet season rainfall (Quixeramobim, Brazil). This study emphasises the methodological aspects of CRMs rather than discussing merits or limitations of the ENSO-based predictors.
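
A minimal sketch of the core idea, assuming the Python lifelines package and simulated data: fit a Cox proportional-hazards model with an ENSO-like predictor and report the implied CDF, 1 - S(t | x), rather than only a central-tendency forecast. The variable names and data-generating process are illustrative, not those of the Cairns or Quixeramobim case studies.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300
enso = rng.normal(size=n)                                 # predictor (e.g. ENSO index)
onset = rng.exponential(scale=30 * np.exp(-0.4 * enso))   # "time to event": onset day
df = pd.DataFrame({"onset_day": onset, "observed": 1, "enso": enso})

cph = CoxPHFitter()
cph.fit(df, duration_col="onset_day", event_col="observed")

# CDF of wet-season onset for two contrasting ENSO states
x = pd.DataFrame({"enso": [-1.0, 1.0]})
cdf = 1.0 - cph.predict_survival_function(x)
print(cdf.iloc[::20])   # probability that onset has occurred by day t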

Relevance:

30.00%

Publisher:

Abstract:

Purpose: To evaluate the comparative efficiency of graphite furnace atomic absorption spectrometry (GFAAS) and hydride generation atomic absorption spectrometry (HGAAS) for trace analysis of arsenic (As) in natural herbal products (NHPs). Method: Arsenic analysis in natural herbal products and a standard reference material was conducted using atomic absorption spectrometry (AAS), namely hydride generation AAS (HGAAS) and graphite furnace AAS (GFAAS). The samples were digested with HNO3-H2O2 in a ratio of 4:1 using microwave-assisted acid digestion. The methods were validated with the aid of standard reference material 1515 Apple Leaves (SRM) from NIST. Results: Mean recovery for three different samples of NHPs using HGAAS and GFAAS ranged from 89.3 - 91.4 % and 91.7 - 93.0 %, respectively. The difference between the two methods was not significant for samples A (p = 0.5), B (p = 0.4) and C (p = 0.88). Relative standard deviation (RSD), i.e., precision, was 2.5 - 6.5 % and 2.3 - 6.7 % using the HGAAS and GFAAS techniques, respectively. Recovery of arsenic in the SRM was 98 and 102 % by GFAAS and HGAAS, respectively. Conclusion: GFAAS demonstrates acceptable levels of precision and accuracy. Both techniques possess comparable accuracy and repeatability. Thus, either method can be recommended as an alternative approach for trace analysis of arsenic in natural herbal products.
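
The per-sample significance tests and RSD figures reported above are straightforward to reproduce in outline. The sketch below uses invented replicate recoveries, not the paper's raw data, and a two-sample t-test is assumed as the comparison.

import numpy as np
from scipy import stats

# Hypothetical % recoveries for one NHP sample measured by each technique
hgaas = np.array([89.5, 91.0, 90.2, 92.1, 89.8])
gfaas = np.array([90.9, 92.4, 91.1, 93.0, 90.3])

t, p = stats.ttest_ind(hgaas, gfaas)
print(f"t = {t:.2f}, p = {p:.3f}")

# Precision expressed as relative standard deviation (RSD, %)
for name, x in (("HGAAS", hgaas), ("GFAAS", gfaas)):
    print(name, "RSD % =", round(100 * x.std(ddof=1) / x.mean(), 2))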

Relevance:

30.00%

Publisher:

Abstract:

Objective: To compare the efficacy and safety of primaquine regimens currently used to prevent relapses of Plasmodium vivax. Methods: A systematic review was carried out to identify clinical trials evaluating the efficacy and safety, in preventing P. vivax malaria recurrences, of the primaquine regimen of 0.5 mg/kg/day for 7 or 14 days compared to the standard regimen of 0.25 mg/kg/day for 14 days. Efficacy of primaquine was determined according to the cumulative incidence of recurrences after 28 days. The overall relative risk was estimated with a fixed-effects meta-analysis. Results: Seven studies were identified for the 0.5 mg/kg/day for 7 days regimen, showing an incidence of recurrence between 0% and 20% with follow-up of 60-210 days; only 4 studies compared it with the standard regimen of 0.25 mg/kg/day for 14 days, and no difference in recurrences between the two regimens was found (RR = 0.977, 95% CI = 0.670 to 1.423). Three clinical trials using the 0.5 mg/kg/day for 14 days regimen were identified, with an incidence of recurrences between 1.8% and 18.0% over 330-365 days; only one study compared it with the standard regimen (RR = 0.846, 95% CI = 0.484 to 1.477). A high risk of bias and differences in the handling of the included studies were found. Conclusion: The available evidence is insufficient to determine whether the primaquine (PQ) regimens currently used as alternatives to the standard treatment have better efficacy and safety in preventing relapse of P. vivax. Clinical trials are required to guide changes in the treatment regimen of vivax malaria.
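
A minimal sketch of the fixed-effects (inverse-variance) pooling of relative risks used in the review, with invented 2x2 counts in place of the included trials.

import math

# Each study: (events_alt, n_alt, events_std, n_std) - hypothetical numbers
studies = [(4, 50, 5, 48), (2, 60, 3, 61), (6, 40, 5, 42), (1, 30, 2, 29)]

log_rrs, weights = [], []
for e1, n1, e0, n0 in studies:
    rr = (e1 / n1) / (e0 / n0)
    var = 1 / e1 - 1 / n1 + 1 / e0 - 1 / n0      # variance of log(RR)
    log_rrs.append(math.log(rr))
    weights.append(1 / var)

pooled = sum(w * l for w, l in zip(weights, log_rrs)) / sum(weights)
se = math.sqrt(1 / sum(weights))
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled RR = {math.exp(pooled):.3f} "
      f"(95% CI {math.exp(lo):.3f} to {math.exp(hi):.3f})")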

Relevance:

30.00%

Publisher:

Abstract:

Traditionally, densities of newly built roadways are checked by direct sampling (cores) or by nuclear density gauge measurements. For roadway engineers, the density of asphalt pavement surfaces is essential to determine pavement quality. Unfortunately, field measurement of density by direct sampling or by nuclear measurement is a slow process. Therefore, I have explored the use of rapidly deployed ground penetrating radar (GPR) as an alternative means of determining pavement quality. The dielectric constant of the pavement surface may be a parameter that correlates with pavement density, and it can be used as a proxy when the density of the asphalt is not known from nuclear or destructive methods. The dielectric constant of the asphalt can be determined using GPR. In order to use GPR for evaluation of road surface quality, the relationship between the dielectric constants of asphalt and their densities must be established. Field GPR measurements were taken at four highway sites in Houghton and Keweenaw Counties, Michigan, where density values were also obtained in the field using nuclear methods. Laboratory studies involved asphalt samples taken from the field sites and samples created in the laboratory. These were tested in various ways, including density, thickness, and time domain reflectometry (TDR). In the field, GPR data were acquired using a 1000 MHz air-launched unit and a ground-coupled unit at 200 and 500 MHz. The equipment was owned and operated by the Michigan Department of Transportation (MDOT) and available for this study for a total of four days during summer 2005 and spring 2006. The analysis of the reflected waveforms included "routine" processing for velocity using commercial software and direct evaluation of reflection coefficients to determine a dielectric constant. The dielectric constants computed from velocities do not agree well with those obtained from reflection coefficients. Perhaps due to the limited range of asphalt types studied, no correlation between density and dielectric constant was evident. Laboratory measurements were taken with samples removed from the field and samples created for this study. Samples from the field were studied using TDR in order to obtain the dielectric constant directly, and these correlated well with the estimates made from reflection coefficients. Samples created in the laboratory were measured using 1000 MHz air-launched GPR and 400 MHz ground-coupled GPR, each under both wet and dry conditions. On the basis of these observations, I conclude that the dielectric constant of asphalt can be reliably measured from waveform amplitude analysis of GPR data, based on the consistent agreement with the values obtained in the laboratory using TDR. Because of the uniformity of the asphalts studied here, any correlation between dielectric constant and density is not yet apparent.
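
The two routes from GPR measurements to a dielectric constant described above reduce to two short formulas; the numbers below are examples, not values from the thesis.

def eps_from_reflection(a_surface, a_metal_plate):
    """Air-launched GPR: dielectric constant from the surface reflection coefficient
    r = A_surface / A_metal_plate, where the metal plate acts as a perfect reflector."""
    r = a_surface / a_metal_plate
    return ((1 + r) / (1 - r)) ** 2

def eps_from_velocity(thickness_m, two_way_time_ns):
    """Dielectric constant from layer velocity v = 2 * d / t, with c = 0.3 m/ns."""
    v = 2 * thickness_m / two_way_time_ns    # m/ns
    return (0.3 / v) ** 2

print(eps_from_reflection(a_surface=0.45, a_metal_plate=1.0))    # ~ 6.9
print(eps_from_velocity(thickness_m=0.05, two_way_time_ns=0.8))  # ~ 5.8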

Relevance:

30.00%

Publisher:

Abstract:

My dissertation is the first project on the Haitian Platform for Advocacy for an Alternative Development (PAPDA), a nation-building coalition founded by activists from varying sectors to coordinate one comprehensive nationalist movement against what they are calling an Occupation. My work not only provides information on this under-theorized popular movement but also situates it within the broader literature on the postcolonial nation-state as well as Latin American and Caribbean social movements. The dissertation analyzes the contentious relationship between local and global discourses and practices of citizenship. Furthermore, the research draws on transnational feminist theory to underline the scattered hegemonies that intersect to produce varied spaces and practices of sovereignty within the Haitian postcolonial nation-state. The dissertation highlights how race and class, gender and sexuality, education and language, and religion have been imagined and co-constituted by Haitian social movements in constructing 'new' collective identities that collapse the private and the public, the rural and the urban, the traditional and the modern. My project complements the scholarship on social movements and the postcolonial nation-state and pushes it forward by emphasizing its spatial dimensions. Moreover, the dissertation de-centers the state to underline the movement of capital, goods, resources, and populations that shapes the postcolonial experience. I re-define the postcolonial nation-state as a network of local, regional, international, and transnational arrangements between different political agents, including social movement actors. To conduct this interdisciplinary research project, I employed ethnographic methods, discourse and textual analysis, as well as basic mapping and statistical descriptions in order to present a historically rooted interpretation of individual and organizational negotiations for community-based autonomy and regional development.

Relevance:

30.00%

Publisher:

Abstract:

Recent marine long-offset transient electromagnetic (LOTEM) measurements yielded the offshore delineation of a fresh groundwater body beneath the seafloor in the region of Bat Yam, Israel. The LOTEM application was effective in detecting this freshwater body underneath the Mediterranean Sea and allowed an estimation of its seaward extent. However, the measured data set was insufficient to understand the hydrogeological configuration and the mechanism controlling the occurrence of this fresh groundwater discovery. In particular, the lateral geometry of the freshwater boundary, important for hydrogeological modelling, could not be resolved. Without such an understanding, rational management of this unexploited groundwater reservoir is not possible. Two new high-resolution marine time-domain electromagnetic methods are developed theoretically to derive the hydrogeological structure of the western aquifer boundary. The first is called the Circular Electric Dipole (CED). It is the land-based analogue of the Vertical Electric Dipole (VED), which is commonly applied to detect resistive structures in the subsurface. Although the CED shows exceptional detectability characteristics in the step-off signal towards the sub-seafloor freshwater body, an actual application was not carried out within the scope of this study. It was found that the method suffers from insufficient signal strength to adequately delineate the resistive aquifer under realistic noise conditions. Moreover, modelling studies demonstrated that severe signal distortions are caused by the slightest geometrical inaccuracies. As a result, a successful application of the CED in Israel proved to be rather doubtful. A second method, called the Differential Electric Dipole (DED), is developed as an alternative to the intended CED method. Compared to the conventional marine time-domain electromagnetic system, which commonly applies a horizontal electric dipole transmitter, the DED is composed of two horizontal electric dipoles in an in-line configuration that share a common central electrode. Theoretically, the DED has detectability and resolution characteristics similar to those of the conventional LOTEM system, but its superior lateral resolution towards multi-dimensional resistivity structures makes an application desirable. Furthermore, the method is less susceptible to geometrical errors, making an application in Israel feasible. Within the scope of this thesis, the novel marine DED method is substantiated using several one-dimensional (1D) and multi-dimensional (2D/3D) modelling studies. The main emphasis lies on the application in Israel. Preliminary resistivity models are derived from the previous marine LOTEM measurements and tested for a DED application. The DED method is effective in locating the two-dimensional resistivity structure at the western aquifer boundary. Moreover, a prediction regarding the hydrogeological boundary conditions is feasible, provided a brackish water zone exists at the head of the interface. A seafloor-based DED transmitter/receiver system was designed and built at the Institute of Geophysics and Meteorology at the University of Cologne. The first DED measurements were carried out in Israel in April 2016. The acquired data set is the first of its kind. The measured data are processed and subsequently interpreted using 1D inversion. The intended aim of interpreting both step-on and step-off signals failed due to the insufficient data quality of the latter.
Yet, the 1D inversion models of the DED step-on signals clearly detect the freshwater body for receivers located close to the Israeli coast. Additionally, a lateral resistivity contrast is observable in the 1D inversion models, which allows the seaward extent of this freshwater body to be constrained. A large-scale 2D modelling study followed the 1D interpretation. In total, 425,600 forward calculations were conducted to find a sub-seafloor resistivity distribution that adequately explains the measured data. The results indicate that the western aquifer boundary is located 3600 m - 3700 m off the coast. Moreover, a brackish water zone of 3 Ωm to 5 Ωm with a lateral extent of less than 300 m is likely located at the head of the freshwater aquifer. Based on these results, it is predicted that the sub-seafloor freshwater body is indeed open to the sea and may be vulnerable to seawater intrusion.

Relevance:

30.00%

Publisher:

Abstract:

A non-linear least-squares methodology for simultaneously estimating the parameters of selectivity curves with a pre-defined functional form, across size classes and mesh sizes, using catch size frequency distributions was developed based on the models of Kirkwood and Walker [Kirkwood, G.P., Walker, T.I., 1986. Gill net selectivities for gummy shark, Mustelus antarcticus Gunther, taken in south-eastern Australian waters. Aust. J. Mar. Freshw. Res. 37, 689-697] and Wulff [Wulff, A., 1986. Mathematical model for selectivity of gill nets. Arch. Fish. Wiss. 37, 101-106]. Observed catches of fish of size class i in mesh m are modeled as a function of the estimated numbers of fish of that size class in the population and the corresponding selectivities. A comparison was made with the maximum likelihood methodology of Kirkwood and Walker (1986) and Wulff (1986), using simulated catch data with known selectivity curve parameters and two published data sets. The estimated parameters and selectivity curves were generally consistent for both methods, with smaller standard errors for the parameters estimated by non-linear least-squares. The proposed methodology is a useful and accessible alternative which can be used to model selectivity in situations where the parameters of a pre-defined model can be assumed to be functions of gear size, facilitating statistical evaluation of different models and of goodness of fit. (C) 1998 Elsevier Science B.V.
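
A minimal sketch of the estimation idea, on simulated catches rather than the published data sets: the expected catch of size class i in mesh m is modelled as N_i times a selectivity curve whose mode scales with mesh size, and the population numbers and curve parameters are estimated jointly by non-linear least squares. The log-normal curve shape and all parameter values are assumptions for illustration.

import numpy as np
from scipy.optimize import least_squares

lengths = np.arange(40, 101, 5.0)          # size-class midpoints (cm)
meshes = np.array([6.0, 7.0, 8.0])         # mesh sizes

def selectivity(l, m, k, sigma):
    """Log-normal-shaped curve with modal length k * m and spread sigma."""
    return np.exp(-((np.log(l) - np.log(k * m)) ** 2) / (2 * sigma ** 2))

# Simulate "observed" catches from known parameters
rng = np.random.default_rng(2)
true_N = 400 * np.exp(-0.5 * ((lengths - 70) / 12) ** 2)
true_catch = np.array([true_N * selectivity(lengths, m, k=10.0, sigma=0.15)
                       for m in meshes])
obs = rng.poisson(true_catch)

def residuals(theta):
    k, sigma = theta[:2]
    logN = theta[2:]                       # log population numbers per size class
    pred = np.array([np.exp(logN) * selectivity(lengths, m, k, sigma)
                     for m in meshes])
    return (pred - obs).ravel()

theta0 = np.r_[9.0, 0.2, np.log(obs.sum(axis=0) + 1.0)]
fit = least_squares(residuals, theta0)
print("estimated k, sigma:", fit.x[:2])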

Relevance:

30.00%

Publisher:

Abstract:

In marginal lands, Opuntia ficus-indica (OFI) could be used as an alternative fruit and forage crop. Plant vigour and biomass production were evaluated in Portuguese germplasm (15 individuals from 16 ecotypes) by non-destructive methods, two years after planting in a marginal soil under dryland conditions. Two Italian cultivars (Gialla and Bianca) were included in the study for comparison purposes. Biomass production and plant vigour were estimated by measuring the cladode number and area, and the fresh weight (FW) and dry weight (DW) per plant. We selected linear models, using the biometric data from 60 cladodes, to predict the cladode area and the FW and DW per plant. Significant differences were found among ecotypes in the studied biomass-related parameters, and several homogeneous groups were established. Four Portuguese ecotypes had higher biomass production than the others, 3.20 Mg ha-1 on average, a value not significantly different from that of the improved 'Gialla' cultivar, which averaged 3.87 Mg ha-1. These ecotypes could be used to start a breeding program and to deploy material for animal feeding and fruit production.
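
A minimal sketch of the kind of non-destructive, linear-model calibration described above, using synthetic cladode measurements rather than the trial data: predict cladode area from simple biometric measurements, then aggregate per plant to estimate fresh weight.

import numpy as np
from numpy.linalg import lstsq

rng = np.random.default_rng(3)
n = 60                                    # calibration cladodes
length = rng.uniform(15, 40, n)           # cm
width = rng.uniform(8, 25, n)             # cm
area = 0.7 * length * width + rng.normal(0, 10, n)    # measured area (cm^2)
fresh_w = 2.5 * area + rng.normal(0, 60, n)           # measured FW (g)

# Linear models: area ~ length*width, FW ~ area (intercepts included)
X_area = np.column_stack([np.ones(n), length * width])
coef_area, *_ = lstsq(X_area, area, rcond=None)
X_fw = np.column_stack([np.ones(n), area])
coef_fw, *_ = lstsq(X_fw, fresh_w, rcond=None)

def predict_plant_fw(cladode_lengths, cladode_widths):
    """Sum the predicted fresh weight over all cladodes of one plant."""
    lw = np.asarray(cladode_lengths) * np.asarray(cladode_widths)
    pred_area = coef_area[0] + coef_area[1] * lw
    return float(np.sum(coef_fw[0] + coef_fw[1] * pred_area))

print("estimated plant FW (g):", round(predict_plant_fw([30, 25, 35], [18, 15, 22])))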