982 results for 437
Abstract:
Daly, R., Shen, Q., Aitken, S. Using ant colony optimisation in learning Bayesian network equivalence classes. Proceedings of the 2006 UK Workshop on Computational Intelligence, pp. 111-118.
Abstract:
Schierz, A.C., Soldatova, L., King, R.D. Overhauling the PDB. Nature Biotechnology, 25(4), April 2007, pp. 437-442. Sponsorship: Project IQ, EU IST-FET FP6-516169; BBSRC Project 2/BEP17028.
Abstract:
Abstract. The paper presents a list of 437 verbs which have not been recorded in lexicography. As the source of reference, the author consulted a spelling dictionary of the Polish language, Wielki słownik ortograficzny PWN, 2nd edition, 2006. The idea for this paper originated from the wish to satisfy curiosity after reading Indeks neologizmów [Index of Neologisms], prepared by Krystyna Waszakowa in her work Przejawy internacjonalizacji w słowotwórstwie współczesnej polszczyzny [Word-formative internationalisation processes in modern Polish]. The index contains a list of nouns. Given that K. Waszakowa did not take verbs into account (there are far (?) fewer neo-verbs than neo-nouns), the author decided to find out whether it is true that the number of verb neologisms is so small that their philological analysis is pointless from the point of view of research, vocabulary registration, etc. If nouns such as podczłowiek, miniokupacja, and redefinicja are of interest, why not record prefixal constructions of the do-, z-, od-, na-, w-, wy-, za-, od-, nad-, etc. -robić type? The analysis covered randomly selected texts from the „Rzeczpospolita” daily (without any thorough preparation with respect to content; the available texts were analysed sequentially until a satisfactory result was obtained). The texts under review came from an incomplete (it is virtually impossible to determine completeness in this case) electronic archive covering the years 1993–2006.
Abstract:
16 pages.
Abstract:
In 1986, New Zealand responded to the open-access problem by establishing the world's largest individual transferable quota (ITQ) system. Using a 15-year panel dataset from New Zealand that covers 33 species and more than 150 markets for fishing quotas, we assess trends in market activity, price dispersion, and the fundamentals determining quota prices. We find that market activity is sufficiently high in the economically important markets and that price dispersion has decreased. We also find evidence of economically rational behavior through the relationship between quota lease and sale prices and fishing output and input prices, ecological variability, and market interest rates. Controlling for these factors, our results show a greater increase in quota prices for fish stocks that faced significant reductions, consistent with increased profitability due to rationalization. Overall, this suggests that these markets are operating reasonably well, implying that ITQs can be effective instruments for efficient fisheries management. © 2004 Elsevier Inc. All rights reserved.
Abstract:
Absolute line intensities in the ν6 and ν8 interacting bands of trans-HCOOH, observed near 1105.4 and 1033.5 cm⁻¹, respectively, and the dissociation constant of the formic acid dimer (HCOOH)₂ have been measured using Fourier transform spectroscopy at a resolution of 0.002 cm⁻¹. Eleven spectra of formic acid, at 296.0(5) K and pressures ranging from 14.28(25) to 314.0(24) Pa, have been recorded between 600 and 1900 cm⁻¹ with an absorption path length of 19.7(2) cm. 437 integrated absorption coefficients have been measured for 72 lines in the ν6 band. Analysis of the pressure dependence yielded the dissociation constant of the formic acid dimer, k_p = 361(45) Pa, and the absolute intensities of the 72 lines of HCOOH. The accuracy of these results was carefully estimated. The absolute intensities of four lines of the weak ν8 band were also measured. Using an appropriate theory, the integrated intensities of the ν6 and ν8 bands were determined to be 3.47 × 10⁻¹⁷ and 4.68 × 10⁻¹⁹ cm⁻¹/(molecule cm⁻²), respectively, at 296 K. Both the dissociation constant and the integrated intensities were compared to earlier measurements. © 2007 American Institute of Physics.
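The reported dissociation constant fixes the monomer/dimer partition of formic acid vapour at any total pressure, which is the relation the pressure series above exploits. A minimal sketch (the function name and quadratic solution are mine; only k_p = 361 Pa comes from the abstract):

```python
import math

# (HCOOH)2 <=> 2 HCOOH, with K_p = p_m**2 / p_d and p_total = p_m + p_d.
# Substituting p_d = p_m**2 / K_p gives p_m**2 + K_p*p_m - K_p*p_total = 0.
def monomer_pressure(p_total_pa, k_p=361.0):
    """Monomer partial pressure (Pa) at a given total formic acid pressure."""
    return (-k_p + math.sqrt(k_p**2 + 4.0 * k_p * p_total_pa)) / 2.0

# At the lowest pressure studied (14.28 Pa) the vapour is mostly monomeric;
# the dimer fraction grows with pressure, which constrains k_p.
p_m = monomer_pressure(14.28)
p_d = p_m**2 / 361.0
```

At the highest pressure studied (314 Pa) the dimer fraction is appreciably larger, which is why measuring line intensities across the pressure series pins down both k_p and the monomer intensities.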
Abstract:
Periodicity as a property is identified naturally by individuals, and it is common for collectively created meanings to be used and carried over into different contexts where they are applied. The results obtained in studies such as Buendía (2004, 2005a) and Alcaraz (2005) provide not only cognitive elements but also tools that serve as valid arguments in recognising periodic behaviour. The periodic can constitute an entire language, spanning cultural, historical, and institutional settings and lending mathematical knowledge a practical character. The unit of analysis is the element that bridges an empirical treatment of periodicity and a scientific one (Montiel, 2005), which favours a meaningful construction of mathematical knowledge. Our theoretical framework is the socioepistemological approach, which focuses on the examination of social practices, understood as actions or activities carried out intentionally with a transformative aim and with the help of tools that favour the construction of mathematical knowledge, even before studying that knowledge itself.
Abstract:
This paper describes the application of computational fluid dynamics (CFD) to simulate the macroscopic bulk motion of solder paste ahead of a moving squeegee blade in the stencil printing process during the manufacture of electronic components. The successful outcome of the stencil printing process is dependent on the interaction of numerous process parameters. A better understanding of these parameters is required to determine their relation to print quality and improve guidelines for process optimization. Various modelling techniques have arisen to analyse the flow behaviour of solder paste, including macroscopic studies of the whole mass of paste as well as microstructural analyses of the motion of individual solder particles suspended in the carrier fluid. This work builds on the knowledge gained to date from earlier analytical models and CFD investigations by considering the important non-Newtonian rheological properties of solder pastes which have been neglected in previous macroscopic studies. Pressure and velocity distributions are obtained from both Newtonian and non-Newtonian CFD simulations and evaluated against each other as well as existing established analytical models. Significant differences between the results are observed, which demonstrate the importance of modelling non-Newtonian properties for realistic representation of the flow behaviour of solder paste.
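The non-Newtonian behaviour the paper emphasises is commonly captured by a shear-rate-dependent apparent viscosity. As a hedged sketch (the power-law form and the values of K and n are illustrative assumptions, not the paper's fitted paste model):

```python
def apparent_viscosity(shear_rate, k=100.0, n=0.4):
    """Power-law (Ostwald-de Waele) model: mu = K * gamma_dot**(n - 1).
    n < 1 gives shear-thinning, the behaviour expected of solder pastes;
    K (Pa.s^n) and n here are placeholders, not measured paste parameters."""
    return k * shear_rate ** (n - 1.0)
```

A Newtonian fluid corresponds to n = 1, where the viscosity no longer depends on shear rate; comparing the two cases is essentially what the Newtonian versus non-Newtonian CFD runs described above do.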
Abstract:
In judicial decision making, the doctrine of chances explicitly takes the odds into account. There is more to it: forensic statistics, along with various probabilistic approaches, together form the object of an enduring controversy in the scholarship of legal evidence. In this paper, we reconsider the circumstances of the Jama murder and inquiry (dealt with in Part I of this paper: "The Jama Model. On Legal Narratives and Interpretation Patterns") to illustrate yet another kind of probability or improbability. What is improbable about the Jama story is actually a given, which contributes in terms of dramatic underlining. In literary theory, concepts of narratives being probable or improbable date back to the eighteenth century, when both prescientific and scientific probability was infiltrating several domains, including law. An understanding of such a backdrop throughout the history of ideas is, I claim, necessary for AI researchers who may be tempted to apply statistical methods to legal evidence. The debate for or against probability (and especially Bayesian probability) in accounts of evidence has been flourishing among legal scholars. Nowadays both the Bayesians (e.g. Peter Tillers) and the Bayesioskeptics (e.g. Ron Allen) among those legal scholars who are involved in the controversy are willing to give AI research a chance to prove itself and strive towards models of plausibility that would go beyond probability as narrowly meant. This debate within law, in turn, has illustrious precedents: take Voltaire, who was critical of the application of probability even to litigation in civil cases; or take Boole, who was a starry-eyed believer in probability applications to judicial decision making (Rosoni 1995). Not unlike Boole, a founding father of computing, computer scientists approaching the field nowadays may happen to do so without full awareness of the pitfalls. Hence, the usefulness of the conceptual landscape I sketch here.
Abstract:
The continuous plankton recorder (CPR) survey is the largest multi-decadal plankton monitoring programme in the world. It was initiated in 1931 and by the end of 2004 had counted 207,619 samples and identified 437 phyto- and zooplankton taxa throughout the North Atlantic. CPR data are used extensively by the research community and in recent years have been used increasingly to underpin marine management. Here, we take a critical look at how best to use CPR data. We first describe the CPR itself, CPR sampling, and plankton counting procedures. We discuss the spatial and temporal biases in the Survey, summarise environmental data that have not previously been available, and describe the new data access policy. We supply information essential to using CPR data, including descriptions of each CPR taxonomic entity, the idiosyncrasies associated with counting many of the taxa, the logic behind taxonomic changes in the Survey, the semi-quantitative nature of CPR sampling, and recommendations on choosing the spatial and temporal scale of study. This forms the basis for a broader discussion on how to use CPR data to derive ecologically meaningful indices based on size, functional groups, and biomass that can support research and management. This contribution should be useful for plankton ecologists, modellers, and policy makers who actively use CPR data.