159 results for constraint satisfaction problem
Abstract:
Three regular polyhedra are called nested if they have the same number of vertices n, the same center, and the positions of the vertices of the inner polyhedron ri, those of the medium polyhedron si and those of the outer polyhedron Ri satisfy the relations si = ρ ri and Ri = R ri for some scale factors R > ρ > 1 and for all i = 1, . . . , n. We consider 3n masses located at the vertices of three nested regular polyhedra. We assume that the masses of the inner polyhedron are equal to m1, the masses of the medium one are equal to m2, and the masses of the outer one are equal to m3. We prove that if the ratios of the masses m2/m1 and m3/m1 and the scale factors ρ and R satisfy two convenient relations, then this configuration is central for the 3n-body problem. Moreover there is some numerical evidence that, first, for fixed values of the ratios m2/m1 and m3/m1, the 3n-body problem has a unique central configuration of this type; and second, that the number of nested regular polyhedra with the same number of vertices forming a central configuration for convenient masses and sizes can be made arbitrarily large.
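For orientation, recall the standard definition underlying this abstract (our notation: qi for the position of the i-th mass, c for the center of mass): a configuration of the 3n bodies is central when the gravitational acceleration on each body points toward the center with one common factor λ,

```latex
\sum_{\substack{j=1 \\ j \neq i}}^{3n}
  \frac{m_j \,(q_j - q_i)}{\lVert q_j - q_i \rVert^{3}}
  = -\lambda \,(q_i - c),
  \qquad i = 1, \dots, 3n,
```

with the same λ > 0 for every i. The "two convenient relations" of the abstract are the conditions on m2/m1, m3/m1, ρ and R under which this system holds.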
Abstract:
Background: Pharmacogenetic studies are essential in understanding the interindividual variability of drug responses. DNA sample collection for genotyping is a critical step in genetic studies. A method using dried blood samples from finger puncture, collected on DNA-cards, has been described as an alternative to the usual venepuncture technique. The purpose of this study is to evaluate the implementation of the DNA-cards method in a multicentre clinical trial, and to assess the degree of investigators' satisfaction and the patients' acceptance as perceived by the investigators. Methods: Blood samples were collected on DNA-cards. The quality and quantity of DNA recovered were analyzed. Investigators were questioned regarding their general interest, previous experience, safety issues, preferences and perceived patient satisfaction. Results: 151 patients' blood samples were collected. Genotyping of GST polymorphisms was achieved in all samples (100%). 28 investigators completed the survey. Investigators perceived patient satisfaction as very good (60.7%) or good (39.3%), with no reluctance toward finger puncture. Investigators preferred this method, which was considered safer and better than the usual methods. All investigators would recommend using it in future genetic studies. Conclusion: Within the clinical trial setting, the DNA-cards method was very well accepted by investigators and patients (as perceived by the investigators), and was preferred to conventional methods due to its ease of use and safety.
Abstract:
In this paper, a hybrid simulation-based algorithm is proposed for the Stochastic Flow Shop Problem. The main idea of the methodology is to transform the stochastic problem into a deterministic one and then apply simulation to the latter. To achieve this goal, we rely on Monte Carlo simulation and an adapted version of a deterministic heuristic. This approach aims to provide flexibility and simplicity, since it is not constrained by any prior assumptions and relies on well-tested heuristics.
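As an illustration of this transform-then-simulate idea (a sketch, not the authors' exact procedure), one can apply an NEH-style deterministic heuristic to the expected processing times and then estimate the resulting permutation's expected makespan by Monte Carlo simulation; the Gaussian noise model and its parameters are illustrative assumptions.

```python
import random

def makespan(perm, proc):
    """Permutation flow shop makespan; proc[job][machine] are processing times."""
    m = len(proc[0])
    comp = [0.0] * m  # completion time of the last scheduled job on each machine
    for j in perm:
        comp[0] += proc[j][0]
        for k in range(1, m):
            comp[k] = max(comp[k], comp[k - 1]) + proc[j][k]
    return comp[-1]

def neh_on_means(mean_proc):
    """NEH-style heuristic applied to the expected (deterministic) times."""
    order = sorted(range(len(mean_proc)), key=lambda j: -sum(mean_proc[j]))
    perm = []
    for j in order:  # insert each job at its best position so far
        best = min((makespan(perm[:i] + [j] + perm[i:], mean_proc), i)
                   for i in range(len(perm) + 1))
        perm.insert(best[1], j)
    return perm

def expected_makespan(perm, mean_proc, n_sims=200, cv=0.2, seed=1):
    """Monte Carlo estimate under illustrative Gaussian noise around the means."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        sample = [[max(0.0, rng.gauss(t, cv * t)) for t in row] for row in mean_proc]
        total += makespan(perm, sample)
    return total / n_sims
```

A heavier simulation budget (`n_sims`) trades run time for a tighter estimate, which is where the hybrid scheme's flexibility comes from.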
Abstract:
The existence of a new class of inclined periodic orbits of the collision restricted three-body problem is shown. The symmetric periodic solutions found are perturbations of elliptic Kepler orbits; they exist only for special values of the inclination and are related to the motion of a satellite around an oblate planet.
Abstract:
Amino-N is preserved because of the scarcity and nutritional importance of protein. Excretion requires its conversion to ammonia, later incorporated into urea. Under conditions of excess dietary energy, the body cannot easily dispose of the excess amino-N against the evolutionarily adapted schemes that prevent its wastage; thus ammonia and glutamine formation (and urea excretion) are decreased. High lipid (and energy) availability limits the utilisation of glucose, and high glucose spares the production of ammonium from amino acids, limiting the synthesis of glutamine and its utilisation by the intestine and kidney. The amino acid composition of the diet affects the production of ammonium depending on its composition and the individual amino acid catabolic pathways. Surplus amino acids enhance protein synthesis and growth, and the synthesis of non-protein-N-containing compounds. But these outlets are not enough; consequently, less-conventional mechanisms are activated, such as increased synthesis of NO∙ followed by higher nitrite (and nitrate) excretion and changes in the microbiota. There is also a significant production of N2 gas, through unknown mechanisms. Health consequences of amino-N surplus are difficult to fathom because of the sparse data available, but it can be speculated that the effects may be negative, largely because the fundamental N homeostasis is stretched out of normalcy, forcing N removal through pathways unprepared for that task. The unreliable results of hyperproteic diets, and part of the dysregulation found in the metabolic syndrome, may be an unwanted consequence of this N disposal conflict.
Abstract:
In this paper we propose a method for computing JPEG quantization matrices for a given mean square error (MSE) or PSNR. We then employ our method to compute JPEG standard progressive operation mode definition scripts using a quantization approach. Therefore, it is no longer necessary to use a trial-and-error procedure to obtain a desired PSNR and/or definition script, reducing cost. Firstly, we establish a relationship between a Laplacian source and its uniform quantization error. We apply this model to the coefficients obtained in the discrete cosine transform stage of the JPEG standard. An image may then be compressed using the JPEG standard under a global MSE (or PSNR) constraint and a set of local constraints determined by the JPEG standard and visual criteria. Secondly, we study the JPEG standard progressive operation mode from a quantization-based approach. A relationship between the measured image quality at a given stage of the coding process and a quantization matrix is found. Thus, the definition script construction problem can be reduced to a quantization problem. Simulations show that our method generates better quantization matrices than the classical method based on scaling the JPEG default quantization matrix. The estimation of PSNR usually has an error smaller than 1 dB, and this figure decreases for high PSNR values. Definition scripts may be generated that avoid an excessive number of stages and remove small stages that do not contribute a noticeable image quality improvement during the decoding process.
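For orientation, the sketch below shows two ingredients the abstract refers to: the PSNR-MSE relation for 8-bit images, and scaling of the JPEG default (Annex K) luminance quantization table. The IJG-style quality formula used here is a common convention and is our assumption about what "the classical method based on scaling" means, not the paper's own formulation.

```python
import math

# JPEG default (Annex K) luminance quantization table.
Q_LUM = [
    [16, 11, 10, 16, 24, 40, 51, 61],
    [12, 12, 14, 19, 26, 58, 60, 55],
    [14, 13, 16, 24, 40, 57, 69, 56],
    [14, 17, 22, 29, 51, 87, 80, 62],
    [18, 22, 37, 56, 68, 109, 103, 77],
    [24, 35, 55, 64, 81, 104, 113, 92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103, 99],
]

def psnr_from_mse(mse, peak=255.0):
    """PSNR in dB for a given mean square error and peak signal value."""
    return 10.0 * math.log10(peak * peak / mse)

def scale_quant_table(table, quality):
    """Classical approach: scale the default table by an IJG-style quality factor."""
    q = max(1, min(100, quality))
    scale = 5000 // q if q < 50 else 200 - 2 * q
    return [[min(255, max(1, (v * scale + 50) // 100)) for v in row]
            for row in table]
```

At quality 50 the scaling is the identity; the paper's point is that deriving the matrix directly from a target MSE/PSNR beats this one-parameter family.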
Abstract:
The continuous wavelet transform is obtained as a maximum entropy solution of the corresponding inverse problem. It is well known that although a signal can be reconstructed from its wavelet transform, the expansion is not unique due to the redundancy of continuous wavelets. Hence, the inverse problem has no unique solution. If we want to recognize one solution as "optimal", then an appropriate decision criterion has to be adopted. We show here that the continuous wavelet transform is an "optimal" solution in a maximum entropy sense.
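As a concrete reminder of the transform involved, a deliberately naive discretization of the continuous wavelet transform with a real Morlet-type wavelet might look like this (the wavelet choice and sampling are illustrative assumptions). The redundancy the abstract mentions is visible in the output: one coefficient per (scale, shift) pair, far more numbers than samples in the signal.

```python
import math

def morlet(t, w0=5.0):
    """Real-valued Morlet-type wavelet (illustrative; no admissibility correction)."""
    return math.cos(w0 * t) * math.exp(-t * t / 2.0)

def cwt(signal, scales, dt=1.0):
    """W(a, b) = (1/sqrt(a)) * sum_t x[t] * psi((t - b) * dt / a), brute force."""
    out = []
    for a in scales:
        row = []
        for b in range(len(signal)):
            acc = 0.0
            for t, x in enumerate(signal):
                acc += x * morlet((t - b) * dt / a)
            row.append(acc / math.sqrt(a))
        out.append(row)
    return out
```

Inverting this overcomplete map is exactly the non-unique inverse problem for which the maximum entropy criterion selects one solution.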
Abstract:
The General Assembly Line Balancing Problem with Setups (GALBPS) was recently defined in the literature. It adds sequence-dependent setup time considerations to the classical Simple Assembly Line Balancing Problem (SALBP) as follows: whenever a task is assigned immediately after another at the same workstation, a setup time must be added to compute the global workstation time, so a solution also provides the task sequence inside each workstation. This paper proposes over 50 priority-rule-based heuristic procedures to solve GALBPS, many of which improve upon the heuristic procedures published to date.
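A minimal sketch of one such priority-rule procedure, using the longest-task-time rule (one of many possible rules, not a specific one from the paper), shows where the sequence-dependent setup enters the station time:

```python
def priority_rule_balance(durations, preds, setup, cycle_time):
    """Greedy station-oriented assignment with sequence-dependent setups
    (GALBPS-style), longest-task-time priority rule.

    durations[t] -- processing time of task t
    preds[t]     -- set of tasks that must be assigned before t (must form a DAG)
    setup[i][j]  -- setup time when task j follows task i in the same station
    """
    assert all(d <= cycle_time for d in durations)
    unassigned = set(range(len(durations)))
    assigned, stations = set(), []
    while unassigned:
        station, load, last = [], 0.0, None
        while True:
            ready = [t for t in unassigned if preds[t] <= assigned]
            # A task fits only if its duration PLUS the setup after the
            # previously sequenced task still respects the cycle time.
            fitting = [t for t in ready
                       if load + (setup[last][t] if last is not None else 0.0)
                       + durations[t] <= cycle_time]
            if not fitting:
                break
            t = max(fitting, key=lambda x: durations[x])  # priority rule
            load += (setup[last][t] if last is not None else 0.0) + durations[t]
            station.append(t)
            assigned.add(t)
            unassigned.discard(t)
            last = t
        stations.append(station)
    return stations
```

Swapping the `max(..., key=...)` line for another ranking is how dozens of rule variants of this kind can be generated.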
Abstract:
A detailed mathematical analysis of the q = 1/2 non-extensive maximum entropy distribution of Tsallis is undertaken. The analysis is based upon the splitting of such a distribution into two orthogonal components. One of the components corresponds to the minimum norm solution of the problem posed by the fulfillment of the a priori conditions on the given expectation values. The remaining component takes care of the normalization constraint and is the projection of a constant onto the null space of the "expectation-values transformation".
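The splitting described here can be made concrete with standard linear algebra: if A is the linear map taking the distribution to the prescribed expectation values b, the minimum-norm component lies in the row space of A and the remainder is a projection onto its null space. A small NumPy sketch (our notation, illustrative only):

```python
import numpy as np

def orthogonal_split(A, b, c):
    """Split a solution into the minimum-norm solution of A p = b
    (row-space component) plus the projection of a constant vector c
    onto the null space of A (normalization component)."""
    A_pinv = np.linalg.pinv(A)
    p_min = A_pinv @ b                        # minimum-norm solution
    P_null = np.eye(A.shape[1]) - A_pinv @ A  # projector onto Null(A)
    return p_min, P_null @ c
```

The two returned components are orthogonal by construction, which is exactly the decomposition the abstract exploits.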
Abstract:
A maximum entropy statistical treatment of an inverse problem concerning frame theory is presented. The problem arises from the fact that a frame is an overcomplete set of vectors that defines a mapping with no unique inverse. Although any vector in the concomitant space can be expressed as a linear combination of frame elements, the coefficients of the expansion are not unique. Frame theory guarantees the existence of a set of coefficients which is “optimal” in a minimum norm sense. We show here that these coefficients are also “optimal” from a maximum entropy viewpoint.
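Concretely, the minimum-norm (canonical) frame coefficients can be computed through the frame operator S = F Fᵀ, where the columns of F are the frame vectors; this is equivalent to applying the pseudoinverse of F. A short NumPy sketch (our notation, not from the paper):

```python
import numpy as np

def canonical_coefficients(F, x):
    """Minimum-norm coefficients c with F @ c == x for a frame whose
    vectors are the columns of F, via the frame operator S = F F^T
    (i.e. analysis against the canonical dual frame S^{-1} F)."""
    S = F @ F.T
    return F.T @ np.linalg.solve(S, x)
```

For example, for the three-vector "Mercedes-Benz" frame in the plane, any x is reconstructed exactly, and among all coefficient vectors achieving that reconstruction these have the smallest norm.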
Abstract:
Drawing on a very rich data set from a recent cohort of PhD graduates, we examine the correlates and consequences of qualification and skills mismatch. We show that job characteristics such as the economic sector and the main activity at work play a fundamental direct role in explaining the probability of being well matched. However, the effect of academic attributes seems to be mainly indirect, since it disappears once we control for the full set of work characteristics. We detected a significant earnings penalty for those who are both overqualified and overskilled and also showed that being mismatched reduces job satisfaction, especially for those whose skills are underutilized. Overall, the problem of mismatch among PhD graduates is closely related to demand-side constraints of the labor market. Increasing the supply of adequate jobs and broadening the skills PhD students acquire during training should be explored as possible responses.