944 results for mixed verification methods
Abstract:
The objective of this work was to study the distribution of values of the coefficient of variation (CV) in experiments with the papaya crop (Carica papaya L.), proposing ranges to guide researchers in evaluating different characters in the field. The data used in this study were obtained through a bibliographic review of Brazilian journals, dissertations and theses. The following characters were considered: stalk diameter, insertion height of the first fruit, plant height, number of fruits per plant, fruit biomass, fruit length, equatorial diameter of the fruit, pulp thickness, fruit firmness, soluble solids and internal cavity diameter. For each character, ranges of CV values were obtained based on the methodologies proposed by Garcia and by Costa, and on the standard classification of Pimentel-Gomes. The results indicated that the ranges of CV values differed among characters, showing large variation, which justifies the need for a specific evaluation range for each character. In addition, the use of classification ranges obtained with the methodology of Costa is recommended.
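As a rough illustration of the classification step described above, the following Python sketch computes a CV and classifies it; the Pimentel-Gomes thresholds (10/20/30%) and the use of the median and pseudo-sigma (IQR/1.35) for the Costa-style ranges are assumptions, and the sample CV values are hypothetical.

```python
# Illustrative sketch (not the paper's code): classify coefficient-of-variation (CV)
# values using the standard Pimentel-Gomes thresholds and a Costa-style range built
# from the median and pseudo-sigma of the CVs observed for one character.
import numpy as np

def cv_percent(values):
    """Coefficient of variation of a sample, in percent."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

def classify_pimentel_gomes(cv):
    """Fixed ranges commonly attributed to Pimentel-Gomes (10/20/30%)."""
    if cv <= 10:
        return "low"
    if cv <= 20:
        return "medium"
    if cv <= 30:
        return "high"
    return "very high"

def costa_style_ranges(cvs):
    """Character-specific thresholds from the median (Md) and pseudo-sigma (PS = IQR/1.35)
    of the CVs reported in the literature for that character (illustrative assumption)."""
    cvs = np.asarray(cvs, dtype=float)
    md = np.median(cvs)
    ps = (np.percentile(cvs, 75) - np.percentile(cvs, 25)) / 1.35
    return {"low <=": md - ps, "medium <=": md + ps, "high <=": md + 2 * ps}  # above = very high

# Example: hypothetical CVs (%) collected from papaya trials for a single character
reported_cvs = [8.4, 12.1, 15.7, 9.9, 21.3, 14.2, 18.8]
print(classify_pimentel_gomes(cv_percent([2.1, 2.4, 2.2, 2.8])))
print(costa_style_ranges(reported_cvs))
```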
Abstract:
This article jointly examines the differences among laboratory versions of the Dutch clock open auction, a sealed-bid auction representing book building, and a two-stage sealed-bid auction used as a proxy for the “competitive IPO”, a recent innovation used in a few European equity initial public offerings. We investigate pricing, seller allocation, and buyer welfare allocation efficiency and conclude that the book building emulation seems to be as price efficient as the Dutch auction, even after investor learning, whereas the competitive IPO is not price efficient, regardless of learning. The competitive IPO is the most seller allocative efficient method because it maximizes offer proceeds. The Dutch auction emerges as the most buyer welfare allocative efficient method. Underwriters are probably seeking pricing efficiency rather than seller or buyer welfare allocative efficiency, and their discretionary pricing and allocation must be important, since book building is prominent worldwide.
Abstract:
LUDA is a research project of Key Action 4, "City of Tomorrow & Cultural Heritage", of the programme "Energy, Environment and Sustainable Development" within the Fifth Framework Programme of the European Commission.
Abstract:
Storm and tsunami deposits are generated by similar depositional mechanisms, making their discrimination hard to establish using classic sedimentologic methods. Here we propose an original approach to identify tsunami-induced deposits by combining numerical simulation and rock magnetism. To test our method, we investigate the tsunami deposit of the Boca do Rio estuary generated by the 1755 Lisbon earthquake, which is well described in the literature. We first test the 1755 tsunami scenario using a numerical inundation model to provide physical parameters for the tsunami wave. Then we use concentration-sensitive (MS, SIRM) and grain-size-sensitive (chi(ARM), ARM, B1/2, ARM/SIRM) magnetic proxies, coupled with SEM microscopy, to unravel the magnetic mineralogy of the tsunami-induced deposit and its associated depositional mechanisms. In order to study the connection between the tsunami deposit and the different sedimentologic units present in the estuary, the magnetic data were processed by multivariate statistical analyses. Our numerical simulation shows a large inundation of the estuary, with flow depths varying from 0.5 to 6 m and a run-up of about 7 m. Magnetic data show a dominance of paramagnetic minerals (quartz) mixed with a lesser amount of ferromagnetic minerals, namely titanomagnetite and titanohematite, both of detrital origin and reworked from the underlying units. Multivariate statistical analyses indicate a better connection between the tsunami-induced deposit and a mixture of Units C and D. All these results point to a scenario in which the energy released by the tsunami wave was strong enough to overtop and erode an important amount of sand from the littoral dune and mix it with reworked materials from underlying layers at least 1 m in depth. The method tested here represents an original and promising tool to identify tsunami-induced deposits in similar embayed beach environments.
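The abstract does not name the multivariate technique applied to the magnetic data, so the sketch below shows one common choice, a PCA of standardized magnetic proxies; the sample values, column names and unit labels are hypothetical.

```python
# Illustrative sketch only: the abstract does not specify the multivariate method,
# so this shows one common choice (PCA on standardized magnetic proxies) with
# hypothetical values and labels for the measured parameters.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical table: one row per sample, columns = magnetic proxies named in the abstract
df = pd.DataFrame(
    np.random.default_rng(0).lognormal(size=(40, 5)),
    columns=["MS", "SIRM", "chi_ARM", "ARM_over_SIRM", "B_half"],
)
df["unit"] = ["tsunami"] * 10 + ["A"] * 10 + ["C"] * 10 + ["D"] * 10  # stratigraphic labels

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(df.drop(columns="unit")))
# Tsunami-layer samples plotting between Units C and D would support a mixed provenance.
print(pd.DataFrame(scores, columns=["PC1", "PC2"]).assign(unit=df["unit"]).groupby("unit").mean())
```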
Abstract:
In this paper, a mixed-integer nonlinear approach is proposed to support decision-making for a hydro power producer, considering a head-dependent hydro chain. The aim is to maximize the profit of the hydro power producer from selling energy into the electric market. As a new contribution to earlier studies, a risk aversion criterion is taken into account, as well as head-dependency. The volatility of the expected profit is limited through the conditional value-at-risk (CVaR). The proposed approach has been applied successfully to solve a case study based on one of the main Portuguese cascaded hydro systems.
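As a minimal illustration of the risk measure mentioned above, the sketch below computes the conditional value-at-risk of a profit distribution from scenario profits and probabilities; the scenario data are hypothetical, and the full MINLP (head-dependency, market prices, reservoir dynamics) is not reproduced.

```python
# Minimal sketch (assumed scenario data): conditional value-at-risk (CVaR) of the
# profit distribution, i.e. the expected profit over the worst (1 - alpha) probability
# mass. In the paper this quantity bounds profit volatility inside an MINLP; here it
# is only evaluated for given scenario profits and probabilities.
import numpy as np

def cvar(profits, probs, alpha=0.95):
    """Expected profit of the worst (1 - alpha) probability mass (lower tail)."""
    order = np.argsort(profits)                               # worst scenarios first
    profits, probs = np.asarray(profits)[order], np.asarray(probs)[order]
    tail = 1.0 - alpha
    acc, value = 0.0, 0.0
    for p, w in zip(profits, probs):
        take = min(w, tail - acc)                             # probability mass still needed
        if take <= 0:
            break
        value += take * p
        acc += take
    return value / tail

scenario_profits = [120e3, 95e3, 80e3, 60e3, 30e3]            # EUR, hypothetical
scenario_probs   = [0.30, 0.25, 0.20, 0.15, 0.10]
print(f"CVaR(95%) = {cvar(scenario_profits, scenario_probs):,.0f} EUR")
```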
Abstract:
This paper is on the problem of short-term hydro scheduling (STHS), particularly concerning head-dependent reservoirs under competitive environment. We propose a novel method, based on mixed-integer nonlinear programming (MINLP), for optimising power generation efficiency. This method considers hydroelectric power generation as a nonlinear function of water discharge and of the head. The main contribution of this paper is that discharge ramping constraints and start/stop of units are also considered, in order to obtain more realistic and feasible results. The proposed method has been applied successfully to solve two case studies based on Portuguese cascaded hydro systems, providing a higher profit at an acceptable computation time in comparison with classical optimisation methods based on mixed-integer linear programming (MILP).
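For intuition on the head-dependent, nonlinear modeling mentioned above, the following sketch evaluates power as a nonlinear function of discharge and head and checks a discharge-ramping limit; the head curve, efficiency and schedule are hypothetical, and no MINLP solver is involved.

```python
# Rough sketch under simplifying assumptions (not the paper's MINLP): hydroelectric
# power as a nonlinear function of discharge and head, plus a discharge-ramping check.
RHO_G = 9.81e3  # specific weight of water, N/m^3

def head(volume_m3, q_m3s):
    """Assumed head-dependency: head rises with storage and drops slightly with discharge."""
    return 50.0 + 1e-7 * volume_m3 - 0.02 * q_m3s

def power_mw(q_m3s, volume_m3, eta=0.9):
    """P = eta * rho * g * q * h(v, q): nonlinear in both discharge and head."""
    return eta * RHO_G * q_m3s * head(volume_m3, q_m3s) / 1e6

def ramping_ok(discharges, max_ramp_m3s=20.0):
    """Discharge-ramping constraint: |q_t - q_{t-1}| <= max_ramp for every hour."""
    return all(abs(b - a) <= max_ramp_m3s for a, b in zip(discharges, discharges[1:]))

hourly_q = [40.0, 55.0, 70.0, 65.0]                     # m^3/s, hypothetical schedule
print([round(power_mw(q, 80e6), 1) for q in hourly_q])  # MW per hour
print("ramping feasible:", ramping_ok(hourly_q))
```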
Abstract:
This paper is on the problem of short-term hydro scheduling (STHS), particularly concerning a head-dependent hydro chain. We propose a novel mixed-integer nonlinear programming (MINLP) approach, considering hydroelectric power generation as a nonlinear function of water discharge and of the head. As a new contribution to earlier studies, we model the on-off behavior of the hydro plants using integer variables, in order to avoid water discharges in forbidden areas. Thus, an enhanced STHS is provided due to the more realistic modeling presented in this paper. Our approach has been applied successfully to solve a test case based on one of the Portuguese cascaded hydro systems with a negligible computational time requirement.
Abstract:
A new active-contraction visco-elastic numerical model of the pelvic floor (skeletal) muscle is presented. Our model includes all elements that represent the muscle constitutive behavior, contraction and relaxation. In contrast with previous models, the activation function can be null. The complete equations are shown and exactly linearized. Small verification and validation tests are performed, and the pelvis is modeled using data from intra-abdominal pressure tests.
Abstract:
This paper is an elaboration of the DECA algorithm [1] to blindly unmix hyperspectral data. The underlying mixing model is linear, meaning that each pixel is a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. The proposed method, like DECA, is tailored to highly mixed data in which geometric-based approaches fail to identify the simplex of minimum volume enclosing the observed spectral vectors. We therefore resort to a statistical framework, where the abundance fractions are modeled as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. With respect to DECA, we introduce two improvements: 1) the number of Dirichlet modes is inferred based on the minimum description length (MDL) principle; 2) the generalized expectation maximization (GEM) algorithm we adopt to infer the model parameters is improved by using alternating minimization and augmented Lagrangian methods to compute the mixing matrix. The effectiveness of the proposed algorithm is illustrated with simulated and real data.
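The sketch below is not DECA itself; it only illustrates the linear mixing model and the abundance constraints mentioned above (non-negativity and constant sum) by estimating abundances for a single pixel with constrained least squares, using a hypothetical endmember matrix.

```python
# Illustrative sketch only (not DECA): linear mixing model y = M a with the abundance
# constraints a >= 0 and sum(a) = 1, solved here by constrained least squares for one
# pixel. The endmember signatures in M are hypothetical.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
M = np.abs(rng.normal(size=(50, 3)))          # 50 bands, 3 endmember signatures (columns)
a_true = np.array([0.6, 0.3, 0.1])
y = M @ a_true + 0.01 * rng.normal(size=50)   # observed pixel spectrum with noise

res = minimize(
    lambda a: np.sum((M @ a - y) ** 2),       # least-squares data fit
    x0=np.full(3, 1 / 3),
    bounds=[(0, 1)] * 3,                      # non-negativity
    constraints=[{"type": "eq", "fun": lambda a: a.sum() - 1.0}],  # constant sum
    method="SLSQP",
)
print("estimated abundances:", np.round(res.x, 3))
```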
Abstract:
In this work, 14 primary schools in the city of Lisbon, Portugal, completed a questionnaire of the ISAAC (International Study of Asthma and Allergies in Childhood) Program in 2009/2010. The questionnaire contained questions to identify children with respiratory diseases (wheeze, asthma and rhinitis). Total particulate matter (TPM) was passively collected inside two classrooms of each of the 14 primary schools. Two types of filter matrices were used to collect TPM: Millipore (Isopore™) polycarbonate and quartz. Three campaigns were selected for the measurement of TPM: Spring, Autumn and Winter. The main difference between the two types of filters was that the mass of collected particles was higher on quartz filters than on polycarbonate filters, even though their correlation was excellent. The highest TPM depositions occurred between October 2009 and March 2010, when related to the rhinitis proportion. Rhinitis was found to be related to TPM when the data were grouped seasonally and averaged over all the schools. For the 2006/2007 data, the seasonal variation was found to be related to outdoor particle deposition (below 10 μm).
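As a schematic of the seasonal grouping described above, the sketch below averages TPM and rhinitis proportion by season and computes their correlation; the numbers and column names are invented for illustration and are not the study's data.

```python
# Hypothetical-data sketch: group TPM depositions by season and relate the averaged
# values to rhinitis proportion, as the abstract describes qualitatively.
import pandas as pd
from scipy.stats import pearsonr

records = pd.DataFrame({
    "school":   ["A", "A", "A", "B", "B", "B"],
    "season":   ["Spring", "Autumn", "Winter"] * 2,
    "tpm_ug":   [35.0, 62.0, 70.0, 28.0, 55.0, 66.0],   # collected TPM per campaign (illustrative)
    "rhinitis": [0.12, 0.18, 0.22, 0.10, 0.16, 0.20],   # proportion of children (illustrative)
})

seasonal = records.groupby("season")[["tpm_ug", "rhinitis"]].mean()
r, p = pearsonr(seasonal["tpm_ug"], seasonal["rhinitis"])
print(seasonal, f"\nPearson r = {r:.2f} (p = {p:.2f})")
```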
Abstract:
This paper is on the problem of short-term hydro scheduling, particularly concerning head-dependent cascaded hydro systems. We propose a novel mixed-integer quadratic programming approach, considering not only head-dependency, but also discontinuous operating regions and discharge ramping constraints. Thus, an enhanced short-term hydro scheduling is provided due to the more realistic modeling presented in this paper. Numerical results from two case studies, based on Portuguese cascaded hydro systems, illustrate the proficiency of the proposed approach.
Abstract:
Epidemiological studies of drug misusers have until recently relied on two main forms of sampling: probability and convenience. The former has been used when the aim was simply to estimate the prevalence of the condition, and the latter when in-depth studies of the characteristics, profiles and behaviour of drug users were required, but each method has its limitations. Probability samples become impracticable when the prevalence of the condition is very low (less than 0.5%, for example) or when the condition being studied is a clandestine activity such as illicit drug use. When stratified random samples are used, it may be difficult to obtain a truly representative sample, depending on the quality of the information used to develop the stratification strategy. The main limitation of studies using convenience samples is that the results cannot be generalised to the whole population of drug users, owing to selection bias and a lack of information concerning the sampling frame. New methods have been developed which aim to overcome some of these difficulties, for example social network analysis, snowball sampling, capture-recapture techniques, the privileged access interviewer method and contact tracing. All these methods have been applied to the study of drug misuse. The various methods are described and examples of their use are given, drawn from both the Brazilian and international drug misuse literature.
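As a small illustration of one of the methods listed above, the sketch below applies the classic Lincoln-Petersen capture-recapture estimator (with Chapman's bias correction) to two hypothetical overlapping samples of drug users.

```python
# Minimal sketch of the capture-recapture idea mentioned above, using the classic
# Lincoln-Petersen estimator and Chapman's bias-corrected variant. Inputs are
# hypothetical counts from two independent samples of the hidden population.
def lincoln_petersen(n1, n2, m):
    """n1, n2: sizes of the two samples; m: individuals appearing in both."""
    return n1 * n2 / m

def chapman(n1, n2, m):
    """Bias-corrected variant, usable even when the overlap m is small."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# e.g. 200 users recorded by treatment services, 150 by police records, 30 in both
print(round(lincoln_petersen(200, 150, 30)))  # ~1000 estimated population size
print(round(chapman(200, 150, 30)))           # ~978 with the bias correction
```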