852 results for "Initial data problem"


Relevance:

40.00%

Publisher:

Abstract:

A Bayesian optimization algorithm for the nurse scheduling problem is presented, which involves choosing a suitable scheduling rule from a set for each nurse's assignment. Unlike our previous work that used GAs to implement implicit learning, the learning in the proposed algorithm is explicit, i.e. eventually, we will be able to identify and mix building blocks directly. The Bayesian optimization algorithm is applied to implement such explicit learning by building a Bayesian network of the joint distribution of solutions. The conditional probability of each variable in the network is computed according to an initial set of promising solutions. Subsequently, each new instance for each variable is generated by using the corresponding conditional probabilities, until all variables have been generated, i.e. in our case, a new rule string has been obtained. Another set of rule strings will be generated in this way, some of which will replace previous strings based on fitness selection. If stopping conditions are not met, the conditional probabilities for all nodes in the Bayesian network are updated again using the current set of promising rule strings. Computational results from 52 real data instances demonstrate the success of this approach. It is also suggested that the learning mechanism in the proposed approach might be suitable for other scheduling problems.
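For readers unfamiliar with estimation-of-distribution methods, the following minimal Python sketch illustrates the learn/sample/replace loop described above, simplified to independent per-nurse rule probabilities (a univariate model rather than the full Bayesian network the authors build). All names, numbers and the placeholder fitness function are illustrative assumptions, not the authors' implementation.

import numpy as np

# Minimal sketch of the explicit-learning loop (illustrative parameters only).
N_NURSES, N_RULES, POP_SIZE, N_ELITE = 30, 4, 100, 30
rng = np.random.default_rng(0)

def fitness(rule_string):
    # Placeholder objective; the real evaluation would build and score a schedule.
    return -np.abs(rule_string - 1).sum()

population = rng.integers(0, N_RULES, size=(POP_SIZE, N_NURSES))
for generation in range(50):
    # Estimate per-nurse rule probabilities from the current set of promising solutions.
    ranked = np.argsort([fitness(s) for s in population])
    elite = population[ranked[-N_ELITE:]]
    probs = np.stack([(elite == r).mean(axis=0) for r in range(N_RULES)], axis=1)
    # Sample new rule strings variable by variable from the learned distribution,
    # then replace the worst half of the population (fitness-based replacement).
    offspring = np.array([[rng.choice(N_RULES, p=p / p.sum()) for p in probs]
                          for _ in range(POP_SIZE // 2)])
    population[ranked[:POP_SIZE // 2]] = offspring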

Relevance:

40.00%

Publisher:

Abstract:

A Bayesian optimization algorithm for the nurse scheduling problem is presented, which involves choosing a suitable scheduling rule from a set for each nurse’s assignment. Unlike our previous work that used GAs to implement implicit learning, the learning in the proposed algorithm is explicit, i.e. eventually, we will be able to identify and mix building blocks directly. The Bayesian optimization algorithm is applied to implement such explicit learning by building a Bayesian network of the joint distribution of solutions. The conditional probability of each variable in the network is computed according to an initial set of promising solutions. Subsequently, each new instance for each variable is generated by using the corresponding conditional probabilities, until all variables have been generated, i.e. in our case, a new rule string has been obtained. Another set of rule strings will be generated in this way, some of which will replace previous strings based on fitness selection. If stopping conditions are not met, the conditional probabilities for all nodes in the Bayesian network are updated again using the current set of promising rule strings. Computational results from 52 real data instances demonstrate the success of this approach. It is also suggested that the learning mechanism in the proposed approach might be suitable for other scheduling problems.

Relevance:

40.00%

Publisher:

Abstract:

How can we calculate earthquake magnitudes when the signal is clipped and over-run? When a volcano is very active, the seismic record may saturate (i.e., the full amplitude of the signal is not recorded) or be over-run (i.e., the end of one event is covered by the start of a new event). The duration, and sometimes the amplitude, of an earthquake signal are necessary for determining event magnitudes; thus, it may be impossible to calculate earthquake magnitudes when a volcano is very active. This problem is most likely to occur at volcanoes with limited networks of short period seismometers. This study outlines two methods for calculating earthquake magnitudes when events are clipped and over-run. The first method entails modeling the shape of earthquake codas as a power law function and extrapolating duration from the decay of the function. The second method draws relations between clipped duration (i.e., the length of time a signal is clipped) and the full duration. These methods allow for magnitudes to be determined within 0.2 to 0.4 units of magnitude. This error is within the range of analyst hand-picks and is within the acceptable limits of uncertainty when quickly quantifying volcanic energy release during volcanic crises. Most importantly, these estimates can be made when data are clipped or over-run. These methods were developed with data from the initial stages of the 2004-2008 eruption at Mount St. Helens. Mount St. Helens is a well-studied volcano with many instruments placed at varying distances from the vent. This fact makes the 2004-2008 eruption a good place to calibrate and refine methodologies that can be applied to volcanoes with limited networks.
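For illustration, here is a minimal Python sketch of the first method under the stated assumptions: the coda envelope is modeled as a power-law decay A(t) ≈ A0·t^(-q), fitted on the unclipped portion of the record and extrapolated to the pre-event noise level to recover the full duration. The function name, sampling and numbers are hypothetical, not taken from the study.

import numpy as np

def extrapolated_duration(t, envelope, noise_level):
    """Fit log A = log A0 - q log t on the usable coda; return the time (s after onset)
    at which the modeled envelope decays to the noise level, i.e. the full duration."""
    slope, log_a0 = np.polyfit(np.log(t), np.log(envelope), 1)
    q = -slope  # power-law decay exponent
    return float(np.exp((log_a0 - np.log(noise_level)) / q))

# Toy example: a synthetic t**-1.5 coda, observed only before it is over-run.
t = np.linspace(5.0, 20.0, 200)
print(extrapolated_duration(t, 400.0 * t ** -1.5, noise_level=2.0))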

Relevance:

40.00%

Publisher:

Abstract:

The fisheries for mackerel scad, Decapterus macarellus, are particularly important in Cape Verde, constituting almost 40% of total catches at the peak of the fishery in 1997 and 1998 (3700 tonnes). Catches have been stable at a much lower level of about 2100 tonnes in recent years. Given the importance of mackerel scad in terms of catch weight and local food security, there is an urgent need for an updated assessment. Stock assessment was carried out using a Bayesian approach to biomass dynamic modelling. In order to tackle the problem of a non-informative CPUE series, the intrinsic rate of increase, r, was estimated separately, and the ratio B0/K, initial biomass relative to carrying capacity, was assumed based on available information. The results indicated that the current level of fishing is sustainable. The probability of collapse is low, particularly in the short term, and it is likely that biomass may increase further above BMSY, indicating a healthy stock level. It would appear that it is relatively safe to increase catches even up to 4000 tonnes. However, the marginal posterior of r was almost identical to the prior, indicating that there is relatively little information content in the CPUE series. This was also the case for B0/K. There have been substantial increases in fishing efficiency which have not been adequately captured by the measure used for effort (days or trips), implying that the results may be overly optimistic and should be considered preliminary.
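As a rough illustration, the sketch below shows the surplus-production (Schaefer-type) dynamics that typically underlie such a Bayesian biomass dynamic assessment, with r drawn from its separately estimated prior and B0/K fixed by assumption, as described above. The exact model structure used in the paper and all numbers here are assumptions for illustration only.

import numpy as np

def project_biomass(catches, r, K, b0_over_k):
    """Schaefer dynamics: B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t]."""
    B = np.empty(len(catches) + 1)
    B[0] = b0_over_k * K
    for t, c in enumerate(catches):
        B[t + 1] = max(B[t] + r * B[t] * (1.0 - B[t] / K) - c, 1e-6)
    return B

# Risk of falling below Bmsy = K/2 under a hypothetical 4000 t catch scenario,
# approximated over draws from assumed distributions of r and K (illustrative values).
rng = np.random.default_rng(1)
draws = [(rng.normal(0.5, 0.1), rng.normal(20000, 3000)) for _ in range(2000)]
risk = np.mean([project_biomass([4000] * 10, r, K, 0.9)[-1] < K / 2 for r, K in draws])
print(f"P(B < Bmsy after 10 years) ~ {risk:.2f}")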

Relevance:

40.00%

Publisher:

Abstract:

This thesis investigates the legal, ethical, technical, and psychological issues of general data processing and artificial intelligence practices and the explainability of AI systems. It consists of two main parts. In the initial part, we provide a comprehensive overview of the big data processing ecosystem and the main challenges we face today. We then evaluate the GDPR's data privacy framework in the European Union. The Trustworthy AI Framework proposed by the EU's High-Level Expert Group on AI (AI HLEG) is examined in detail. The ethical principles for the foundation and realization of Trustworthy AI are analyzed along with the assessment list prepared by the AI HLEG. Then, we list the main big data challenges that European researchers and institutions have identified and provide a literature review on the technical and organizational measures to address these challenges. A quantitative analysis is conducted on the identified big data challenges and the measures to address them, which leads to practical recommendations for better data processing and AI practices in the EU. In the subsequent part, we concentrate on the explainability of AI systems. We clarify the terminology and list the goals of explainability for AI systems. We identify the reasons for the explainability-accuracy trade-off and how it can be addressed. We conduct a comparative cognitive analysis between human reasoning and machine-generated explanations, with the aim of understanding how explainable AI can contribute to human reasoning. We then focus on the technical and legal responses to remedy the explainability problem. In this part, the GDPR's right-to-explanation framework and safeguards are analyzed in depth, along with their contribution to the realization of Trustworthy AI. Then, we analyze the explanation techniques applicable at different stages of machine learning and propose several recommendations, in chronological order, to develop GDPR-compliant and Trustworthy XAI systems.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we address the problem of picking a subset of bids in a general combinatorial auction so as to maximize the overall profit using the first-price model. This winner determination problem assumes that a single bidding round is held to determine both the winners and the prices to be paid. We introduce six variants of biased random-key genetic algorithms for this problem. Three of them use a novel initialization technique that uses solutions of intermediate linear programming relaxations of an exact mixed integer linear programming model as initial chromosomes of the population. An experimental evaluation compares the effectiveness of the proposed algorithms with the standard mixed integer linear programming formulation, a specialized exact algorithm, and the best-performing heuristics proposed for this problem. The proposed algorithms are competitive and offer strong results, mainly for large-scale auctions.
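To make the biased random-key idea concrete, here is a minimal Python sketch of one plausible decoder for the winner determination problem: a chromosome of keys in [0, 1) orders the bids, which are then accepted greedily whenever they share no items with bids already accepted. This is an illustrative decoder under the single-round, first-price assumption, not the authors' six variants or their LP-relaxation-based initialization.

import random

def decode(keys, bids):
    """bids: list of (item_set, price); returns (profit, indices of winning bids)."""
    order = sorted(range(len(bids)), key=lambda i: keys[i])
    sold, profit, winners = set(), 0.0, []
    for i in order:
        items, price = bids[i]
        if sold.isdisjoint(items):  # bid is compatible with the current allocation
            sold |= items
            profit += price
            winners.append(i)
    return profit, winners

# One random chromosome; a BRKGA would evolve the key vectors, biasing crossover
# toward elite parents, and keep the best decoded allocation found.
bids = [({1, 2}, 10.0), ({2, 3}, 8.0), ({3}, 5.0)]
print(decode([random.random() for _ in bids], bids))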

Relevance:

30.00%

Publisher:

Abstract:

This study describes the sperm morphology of the mayfly Hexagenia (Pseudeatonica) albivitta (Ephemeroptera). Its spermatozoon measures approximately 30 μm, of which 9 μm corresponds to the head. The head is composed of an approximately round acrosomal vesicle and a cylindrical nucleus. The nucleus has two concavities: one at the anterior tip, where the acrosomal vesicle is inserted, and a deeper one at its base, where the flagellum components are inserted. The flagellum is composed of an axoneme, a mitochondrion and a dense rod adjacent to the mitochondrion. A centriolar adjunct is also observed in the initial portion of the flagellum, where it surrounds the axoneme in a half-moon shape and extends along the flagellum for at least 2 μm. The axoneme is the longest component of the flagellum and follows the 9+9+0 pattern, with no central pair of microtubules. In the posterior region of the flagellum, the mitochondrion has a dumbbell shape in cross section and, together with the rectangular mitochondrion-associated rod, is responsible for the flattened shape of the flagellum. An internal membrane is observed surrounding both the mitochondrion and its associated structure.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: The ability to predict and understand which biomechanical properties of the cornea are responsible for the stability or progression of keratoconus may be an important clinical and surgical tool for the eye-care professional. We have developed a finite element model of the cornea that attempts to predict keratoconus-like behavior and its evolution based on material properties of the corneal tissue. METHODS: Corneal material properties were modeled using bibliographic data, and corneal topography was based on literature values from a schematic eye model. Commercial software was used to simulate mechanical and surface properties when the cornea was subjected to changes in local parameters, such as elasticity. RESULTS: The simulation has shown that, depending on the initial corneal surface shape, changes in local material properties and different intraocular pressure values induce a localized protuberance and an increase in curvature when compared to the remaining portion of the cornea. CONCLUSIONS: This technique provides a quantitative and accurate approach to the problem of understanding the biomechanical nature of keratoconus. The implemented model has shown that changes in local material properties of the cornea and in intraocular pressure are intrinsically related to keratoconus pathology and its shape/curvature.

Relevance:

30.00%

Publisher:

Abstract:

We consider a nontrivial one-species population dynamics model with finite and infinite carrying capacities. Time-dependent intrinsic and extrinsic growth rates are considered in these models. Through the model's per capita growth rate, we obtain a heuristic general procedure for generating scaling functions that collapse the data onto a simple linear behavior, even if an extrinsic growth rate is included. With this data collapse, all the models studied become independent of the parameters and the initial condition. Analytical solutions are found when time-dependent coefficients are considered. These solutions allow us to perceive nontrivial transitions between species extinction and survival and to calculate the transition's critical exponents. Considering an extrinsic growth rate as a cancer treatment, we show that the relevant quantity depends not only on the intensity of the treatment but also on when the cancerous cell growth is maximum.
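For context, one Richards-type per capita growth rate consistent with the description above, with a time-dependent intrinsic rate κ(t), an extrinsic rate ε(t) and carrying capacity K (K → ∞ recovers the unbounded case); whether this matches the authors' exact model is an assumption:

\[
  \frac{1}{N}\frac{\mathrm{d}N}{\mathrm{d}t}
  = \kappa(t)\left[1 - \left(\frac{N}{K}\right)^{\tilde{q}}\right] - \varepsilon(t).
\]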

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The findings of prior studies of air pollution effects on adverse birth outcomes are difficult to synthesize because of differences in study design. OBJECTIVES: The International Collaboration on Air Pollution and Pregnancy Outcomes was formed to understand how differences in research methods contribute to variations in findings. We initiated a feasibility study to a) assess the ability of geographically diverse research groups to analyze their data sets using a common protocol and b) perform location-specific analyses of air pollution effects on birth weight using a standardized statistical approach. METHODS: Fourteen research groups from nine countries participated. We developed a protocol to estimate odds ratios (ORs) for the association between particulate matter ≤ 10 μm in aerodynamic diameter (PM10) and low birth weight (LBW) among term births, adjusted first for socioeconomic status (SES) and second for additional location-specific variables. RESULTS: Among locations with data for the PM10 analysis, ORs estimating the relative risk of term LBW associated with a 10-μg/m³ increase in average PM10 concentration during pregnancy, adjusted for SES, ranged from 0.63 [95% confidence interval (CI), 0.30-1.35] for the Netherlands to 1.15 (95% CI, 0.61-2.18) for Vancouver, with six research groups reporting statistically significant adverse associations. We found evidence of statistically significant heterogeneity in estimated effects among locations. CONCLUSIONS: Variability in PM10-LBW relationships among study locations remained despite use of a common statistical approach. A more detailed meta-analysis and use of more complex protocols for future analysis may uncover reasons for heterogeneity across locations. However, our findings confirm the potential for a diverse group of researchers to analyze their data in a standardized way to improve understanding of air pollution effects on birth outcomes.
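As a rough illustration of the standardized statistical step, the Python sketch below estimates the odds ratio for term LBW per 10-μg/m³ increase in mean pregnancy PM10 from a logistic model adjusted for an SES indicator. The column names and toy data are assumptions; each research group would substitute its own data and location-specific covariates.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy data standing in for a location-specific birth cohort (illustrative only).
df = pd.DataFrame({
    "lbw": np.random.binomial(1, 0.05, 5000),   # term LBW indicator
    "pm10": np.random.normal(30, 8, 5000),      # mean PM10 over pregnancy, ug/m3
    "ses": np.random.randint(0, 3, 5000),       # SES category
})
fit = smf.logit("lbw ~ pm10 + C(ses)", data=df).fit(disp=0)
or_per_10 = np.exp(10 * fit.params["pm10"])     # OR per 10 ug/m3 increase in PM10
ci = np.exp(10 * fit.conf_int().loc["pm10"])    # corresponding 95% CI
print(f"OR per 10 ug/m3: {or_per_10:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")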

Relevance:

30.00%

Publisher:

Abstract:

Aims. An analytical solution for the discrepancy between observed core-like profiles and predicted cusp profiles in dark matter halos is studied. Methods. We calculate the distribution function for Navarro-Frenk-White halos and extract energy from the distribution, taking into account the effects of baryonic physics processes. Results. We show with a simple argument that we can reproduce the evolution of a cusp to a flat density profile by a decrease of the initial potential energy.
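For reference, the Navarro-Frenk-White density profile whose central cusp is at issue here, with scale density ρ_s and scale radius r_s:

\[
  \rho_{\mathrm{NFW}}(r) = \frac{\rho_s}{\left(r/r_s\right)\left(1 + r/r_s\right)^{2}},
  \qquad \rho \propto r^{-1} \ \text{as } r \to 0 .
\]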

Relevance:

30.00%

Publisher:

Abstract:

We have measured the azimuthal anisotropy of π0 production for 1 < pT < 18 GeV/c for Au + Au collisions at √sNN = 200 GeV. The observed anisotropy shows a gradual decrease for 3 ≲ pT ≲ 7-10 GeV/c, but remains positive beyond 10 GeV/c. The magnitude of this anisotropy is underpredicted, up to at least ~10 GeV/c, by current perturbative QCD (pQCD) energy-loss model calculations. An estimate of the increase in anisotropy expected from initial-geometry modification due to gluon saturation effects and fluctuations is insufficient to account for this discrepancy. Calculations that implement a path-length dependence steeper than what is implied by current pQCD energy-loss models show reasonable agreement with the data.
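For reference, the azimuthal anisotropy discussed here is conventionally quantified by the second Fourier coefficient v2 of the particle yield with respect to the reaction plane Ψ_RP:

\[
  \frac{\mathrm{d}N}{\mathrm{d}\phi} \propto 1 + \sum_{n \ge 1} 2 v_n \cos\!\left[n\left(\phi - \Psi_{\mathrm{RP}}\right)\right],
  \qquad v_2 = \left\langle \cos\!\left[2\left(\phi - \Psi_{\mathrm{RP}}\right)\right] \right\rangle .
\]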

Relevance:

30.00%

Publisher:

Abstract:

The production of e+e- pairs for m(e+e-) < 0.3 GeV/c² and 1 < pT < 5 GeV/c is measured in p + p and Au + Au collisions at √sNN = 200 GeV. An enhanced yield above hadronic sources is observed. Treating the excess as photon internal conversions, the invariant yield of direct photons is deduced. In central Au + Au collisions, the excess of the direct photon yield over p + p is exponential in transverse momentum, with an inverse slope T = 221 ± 19(stat) ± 19(syst) MeV. Hydrodynamical models with initial temperatures ranging from Tinit ~ 300-600 MeV at times of ~0.6-0.15 fm/c after the collision are in qualitative agreement with the data. Lattice QCD predicts a phase transition to quark-gluon plasma at ~170 MeV.
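The quoted inverse slope refers to an exponential fit of the excess direct-photon yield in transverse momentum, of the schematic form below; the precise fit range and normalization are not given here:

\[
  \left.\frac{\mathrm{d}N}{p_T\,\mathrm{d}p_T}\right|_{\text{excess}} \propto \exp\!\left(-\frac{p_T}{T}\right),
  \qquad T = 221 \pm 19\,(\mathrm{stat}) \pm 19\,(\mathrm{syst})\ \mathrm{MeV}.
\]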

Relevance:

30.00%

Publisher:

Abstract:

Thermodynamic properties of bread dough (fusion enthalpy, apparent specific heat, initial freezing point and unfreezable water) were measured at temperatures from -40 °C to 35 °C using differential scanning calorimetry. The initial freezing point was also calculated based on the water activity of the dough. The apparent specific heat varied as a function of temperature: specific heat in the freezing region varied from 1.7 to 23.1 J g⁻¹ °C⁻¹, and was constant at temperatures above freezing (2.7 J g⁻¹ °C⁻¹). Unfreezable water content varied from 0.174 to 0.182 g/g of total product. Values of heat capacity as a function of temperature were correlated using thermodynamic models. A modification for low-moisture foodstuffs (such as bread dough) was successfully applied to the experimental data.
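A common ideal-solution (Clausius-Clapeyron) approximation for estimating the initial freezing point T_f from water activity a_w, with ΔH_f the molar latent heat of fusion of water, R the gas constant and T_0 = 273.15 K; whether this is the exact relation used in the study is an assumption:

\[
  \ln a_w = \frac{\Delta H_f}{R}\left(\frac{1}{T_0} - \frac{1}{T_f}\right)
  \quad\Longrightarrow\quad
  T_f = \left(\frac{1}{T_0} - \frac{R\,\ln a_w}{\Delta H_f}\right)^{-1}.
\]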