854 results for Generalization Problem
Abstract:
The pion spectrum for charged and neutral pions is investigated in pure neutron matter by letting the pions interact with a neutron Fermi sea in a self-consistent scheme that simultaneously renormalizes the mesons, considered the source of the interaction, and the nucleons. The possibility of obtaining different kinds of pion condensates is investigated; the result is that condensation cannot be reached even for values of the spin-spin correlation parameter, g', far below the commonly accepted range.
Abstract:
We show that the dispersal routes reconstruction problem can be stated as an instance of a graph theoretical problem known as the minimum cost arborescence problem, for which there exist efficient algorithms. Furthermore, we derive some theoretical results, in a simplified setting, on the possible optimal values that can be obtained for this problem. With this, we place the dispersal routes reconstruction problem on solid theoretical grounds, establishing it as a tractable problem that also lends itself to formal mathematical and computational analysis. Finally, we present an insightful example of how this framework can be applied to real data. We propose that our computational method can be used to define the most parsimonious dispersal (or invasion) scenarios, which can then be tested using complementary methods such as genetic analysis.
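The reduction described above can be sketched in a few lines. The code below is a brute-force illustration on a toy dispersal graph (site names and costs are hypothetical), not one of the efficient algorithms the abstract refers to (such as Chu-Liu/Edmonds): it picks one incoming edge per non-root site and keeps the cheapest choice that forms a tree rooted at the hypothesized origin.

```python
from itertools import product

def min_cost_arborescence(nodes, edges, root):
    """Brute force: choose one incoming edge per non-root node and keep the
    cheapest choice that forms a tree rooted at `root`.
    `edges` maps (u, v) -> dispersal cost.  Returns (total_cost, edge_list)."""
    others = [v for v in nodes if v != root]
    incoming = {v: [e for e in edges if e[1] == v] for v in others}
    best = None
    for choice in product(*(incoming[v] for v in others)):
        parent = {v: u for (u, v) in choice}
        def reaches_root(v):            # every node must trace back to the root
            seen = set()
            while v != root:
                if v in seen or v not in parent:
                    return False        # cycle, or a node with no path to root
                seen.add(v)
                v = parent[v]
            return True
        if all(reaches_root(v) for v in others):
            cost = sum(edges[e] for e in choice)
            if best is None or cost < best[0]:
                best = (cost, list(choice))
    return best

# hypothetical sites: A is the assumed origin; costs model dispersal effort
edges = {("A", "B"): 1, ("A", "C"): 5, ("B", "C"): 2,
         ("C", "D"): 1, ("B", "D"): 4}
cost, routes = min_cost_arborescence("ABCD", edges, root="A")
```

On this toy instance the most parsimonious scenario routes the invasion A → B → C → D at total cost 4. Exhaustive search is exponential in the number of sites, which is why real reconstructions would rely on the polynomial-time algorithms mentioned in the abstract.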
Abstract:
This paper analyses arguments that emerge from a recent discussion about how to assess the evidential value of correspondences observed between the characteristics of a crime stain and those of a sample from a suspect when (i) the suspect is found as a result of a database search and (ii) the remaining database members are excluded as potential sources (because of different analytical characteristics). Using a graphical probability approach (i.e., Bayesian networks), the paper clarifies that there is no need either to (i) introduce a correction factor equal to the size of the searched database (i.e., to reduce the likelihood ratio), or to (ii) adopt a propositional level not directly related to the suspect matching the crime stain (i.e., a proposition of the kind 'some person in (outside) the database is the source of the crime stain' rather than 'the suspect (some other person) is the source of the crime stain'). The present research thus confirms the existing literature on the topic, which has repeatedly demonstrated that requirements (i) and (ii) should not be a cause of concern.
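The quantitative point at issue can be illustrated with a toy calculation under a textbook simplification (uniform prior over a closed population), not the paper's Bayesian-network model; the population size, database size, and match probability below are all illustrative assumptions.

```python
# Toy posterior calculation under a uniform prior over N potential sources
# (a textbook simplification, NOT the paper's Bayesian-network model).
# N, n and gamma below are illustrative assumptions.
N = 1_000_000   # population of potential sources
n = 10_000      # size of the searched database
gamma = 1e-6    # random match probability of the corresponding characteristics

# probable-cause case: the suspect is tested directly, N - 1 alternatives remain
p_direct = 1 / (1 + (N - 1) * gamma)

# database-search case: the suspect matches and the other n - 1 database
# members are excluded, so only N - n alternative sources remain
p_search = 1 / (1 + (N - n) * gamma)

# excluding database members can only reduce the pool of alternative sources,
# so dividing the likelihood ratio by the database size (requirement (i)
# in the abstract) is not warranted under this model
print(p_direct, p_search)
```

Under these assumptions the database-search case yields a slightly *higher* posterior probability that the matching suspect is the source, mirroring the abstract's conclusion that a database-size correction factor is unnecessary.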
Abstract:
Hematocrit (Hct) is one of the most critical issues associated with the bioanalytical methods used for dried blood spot (DBS) sample analysis. Because Hct determines the viscosity of blood, it may affect the spreading of blood onto the filter paper. Hence, accurate quantitative data can only be obtained if the area of filter paper extracted contains a fixed blood volume. We describe for the first time a microfluidic-based sampling procedure that enables accurate blood volume collection on commercially available DBS cards. The system allows the collection of a controlled volume of blood (e.g., 5 or 10 μL) within several seconds. Reproducibility of the sampling volume was examined in vivo on capillary blood by quantifying caffeine and paraxanthine on 5 different extracted DBS spots at two different time points, and in vitro with a test compound, Mavoglurant, on 10 different spots at two Hct levels. Entire spots were extracted. In addition, the accuracy and precision (n = 3) of the Mavoglurant quantitation in blood at Hct levels between 26% and 62% were evaluated. The interspot precision was below 9.0%, equivalent to that obtained by manually spotting a fixed volume with a pipet. No Hct effect was observed in the quantitative results over the 26% to 62% Hct range. These data indicate that our microfluidic-based sampling procedure is accurate and precise and that the analysis of Mavoglurant is not affected by the Hct value. This provides a simple procedure for DBS sampling with a fixed volume of capillary blood, which could eliminate the recurrent Hct issue linked to DBS sample analysis.
Abstract:
The Iowa Department of Transportation (DOT) is continually improving its pavement management program and striving to reduce maintenance needs. Through a 1979 pavement management study, the Iowa DOT became a participant in a five-state Federal Highway Administration (FHWA) study of "Transverse Cracking of Asphalt Pavements". There were numerous conclusions and recommendations but no agreement as to the major factors contributing to transverse cracking or methods of preventing or reducing its occurrence. The project did, however, focus attention on the problem and generated ideas for research. This project is one of two state-funded research projects that were a direct result of the FHWA project. Iowa DOT personnel had been monitoring the temperature susceptibility of asphalt cements using the Norman McLeod Modified Penetration Index. Even though there are many variables from one asphalt mix to another, the trend seemed to indicate that the frequency of transverse cracking was highly dependent on temperature susceptibility. Research project HR-217, "Reducing the Adverse Effects of Transverse Cracking", was initiated to verify the concept. A final report has been published after a four-year evaluation. The crack frequency with the high-temperature-susceptible asphalt cement was substantially greater than with the low-temperature-susceptible asphalt cement. An increased asphalt cement content in the asphalt-treated base also reduced the crack frequency. This research on prevention of transverse cracking with fabric supports the following conclusions: 1. Engineering fabric does not prevent transverse cracking of asphalt cement concrete. 2. Engineering fabric may retard the occurrence of transverse cracking. 3. Engineering fabric does not contribute significantly to the structural capability of an asphalt concrete pavement.
Abstract:
For numerous shelly invertebrates, Cope's rule is shown in this paper to merely describe the particular case where volume increase is strictly coupled with diameter or length. Allometries, which are frequently observed in the evolution of the shells' geometry, mean that their size, volume and surface can vary independently. The consequences of this can be summarized as follows: 1) volume increase not coupled with an increase of diameter or length of the organisms generates increasing involution and/or lateral width in the shell of cephalopods, foraminifera and radiolarians; 2) an increase of the biomineralizing surface, not coupled with volume increase, generates increasing apparent complexity in the sutures and growth lines in ammonites, and an increase in the complexity and number of chambers in foraminifera.
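The decoupling described above can be stated as a standard scaling relation (textbook allometry, not a formula taken from the paper itself): under isometric growth, a single linear dimension rigidly couples surface and volume, whereas an independent allometric exponent lets them vary separately.

```latex
% isometric (Cope's-rule) case: one size parameter L controls everything
S \propto L^{2}, \qquad V \propto L^{3} \;\Rightarrow\; S \propto V^{2/3}
% allometric case: an independent exponent b decouples surface from volume
S \propto V^{\,b}, \qquad b \neq \tfrac{2}{3}
```

A biomineralizing surface growing with \(b > 2/3\) must fold or subdivide to fit the available volume, which is one way to read the increasing sutural complexity and chamber counts described above.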
Abstract:
A new aggregation method for decision making is presented, using induced aggregation operators and the index of maximum and minimum level. Its main advantage is that it can assess complex reordering processes in the aggregation that represent complex attitudinal characters of the decision maker, such as psychological or personal factors. A wide range of properties and particular cases of this new approach are studied. A further generalization using hybrid averages and immediate weights is also presented. The key advantage of this approach over the previous model is that the weighted average and the ordered weighted average can be used in the same formulation. Thus, we are able to consider both the subjective attitude and the degree of optimism of the decision maker in the decision process. The paper ends with an application to a decision-making problem based on assignment theory.
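The building blocks being combined can be sketched in a few lines: a weighted average (weights attach to the sources), an ordered weighted average (weights attach to ranked positions, modeling the degree of optimism), and a convex combination that uses both in one formulation. The paper's actual operators (induced reordering, immediate weights) are richer than this sketch, and the values, weights, and mixing parameter below are illustrative.

```python
# Minimal sketch of WA, OWA, and a convex combination of the two.
# All numbers below are illustrative; the paper's induced operators
# generalize this by reordering with order-inducing variables.
def weighted_average(values, weights):
    return sum(v * w for v, w in zip(values, weights))

def owa(values, weights):
    # reorder the arguments before weighting: weights apply to ranked
    # positions, which is what encodes the decision maker's optimism
    return sum(v * w for v, w in zip(sorted(values, reverse=True), weights))

def wa_owa_mix(values, wa_weights, owa_weights, beta):
    # one way to use both averages in a single formulation
    return (beta * owa(values, owa_weights)
            + (1 - beta) * weighted_average(values, wa_weights))

scores = [0.6, 0.9, 0.3]                           # ratings of one alternative
wa = weighted_average(scores, [0.5, 0.3, 0.2])     # ~0.63
pessimistic = owa(scores, [0.2, 0.3, 0.5])         # ~0.51 (low ranks weigh more)
mixed = wa_owa_mix(scores, [0.5, 0.3, 0.2], [0.2, 0.3, 0.5], beta=0.4)
```

Because the OWA weights here favor the lowest-ranked scores, the mixed value lands between the importance-driven average and the pessimistic ordered one, which is exactly the kind of attitudinal tuning the abstract describes.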