929 results for scientific computation
Abstract:
Various research fields, like organic agricultural research, are dedicated to solving real-world problems and contributing to sustainable development. Therefore, systems research and the application of interdisciplinary and transdisciplinary approaches are increasingly endorsed. However, research performance depends not only on self-conception, but also on framework conditions of the scientific system, which are not always of benefit to such research fields. Recently, science and its framework conditions have been under increasing scrutiny as regards their ability to serve societal benefit. This provides opportunities for (organic) agricultural research to engage in the development of a research system that will serve its needs. This article focuses on possible strategies for facilitating a balanced research evaluation that recognises scientific quality as well as societal relevance and applicability. These strategies are (a) to strengthen the general support for evaluation beyond scientific impact, and (b) to provide accessible data for such evaluations. Synergies of interest are found between open access movements and research communities focusing on global challenges and sustainability. As both are committed to increasing the societal benefit of science, they may support evaluation criteria such as knowledge production and dissemination tailored to societal needs, and the use of open access. Additional synergies exist between all those who scrutinise current research evaluation systems for their ability to serve scientific quality, which is also a precondition for societal benefit. Here, digital communication technologies provide opportunities to increase effectiveness, transparency, fairness and plurality in the dissemination of scientific results, quality assurance and reputation. Furthermore, funders may support transdisciplinary approaches and open access and improve data availability for evaluation beyond scientific impact. 
If they begin to use current research information systems that include societal impact data while reducing the requirements for narrative reports, documentation burdens on researchers may be relieved, with the funders themselves acting as data providers for researchers, institutions and tailored dissemination beyond academia.
Abstract:
A foundational model of concurrency is developed in this thesis. We examine issues in the design of parallel systems and show why the actor model is suitable for exploiting large-scale parallelism. Concurrency in actors is constrained only by the availability of hardware resources and by the logical dependence inherent in the computation. Unlike dataflow and functional programming, however, actors are dynamically reconfigurable and can model shared resources with changing local state. Concurrency is spawned in actors using asynchronous message-passing, pipelining, and the dynamic creation of actors. This thesis deals with some central issues in distributed computing. Specifically, problems of divergence and deadlock are addressed. For example, actors permit dynamic deadlock detection and removal. The problem of divergence is contained because independent transactions can execute concurrently and potentially infinite processes are nevertheless available for interaction.
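The actor primitives the abstract names, asynchronous message passing, mutable local state, and FIFO mailboxes, can be sketched in a few lines of Python. The `Actor` and `Counter` classes below are an illustrative toy, not code from the thesis:

```python
import queue
import threading

class Actor:
    """Minimal actor: a mailbox drained by its own thread (asynchronous message passing)."""
    def __init__(self):
        self._mailbox = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def send(self, msg):
        # Non-blocking: the sender continues immediately after enqueuing.
        self._mailbox.put(msg)

    def _run(self):
        while True:
            msg = self._mailbox.get()
            if msg is None:          # sentinel stops the actor
                break
            self.receive(msg)

    def receive(self, msg):
        raise NotImplementedError

    def stop(self):
        self._mailbox.put(None)
        self._thread.join()

class Counter(Actor):
    """An actor with changing local state, which pure dataflow cannot model directly."""
    def __init__(self):
        self.count = 0
        self.done = threading.Event()
        super().__init__()

    def receive(self, msg):
        if msg == "inc":
            self.count += 1
        elif msg == "read":
            self.done.set()   # all earlier messages have been processed (FIFO mailbox)

c = Counter()
for _ in range(5):
    c.send("inc")
c.send("read")
c.done.wait()
print(c.count)  # 5
c.stop()
```

Because the mailbox is FIFO, the "read" message is handled only after the five increments, even though every `send` returns immediately.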
Abstract:
This thesis takes an interdisciplinary approach to the study of color vision, focussing on the phenomenon of color constancy formulated as a computational problem. The primary contributions of the thesis are (1) the demonstration of a formal framework for lightness algorithms; (2) the derivation of a new lightness algorithm based on regularization theory; (3) the synthesis of an adaptive lightness algorithm using "learning" techniques; (4) the development of an image segmentation algorithm that uses luminance and color information to mark material boundaries; and (5) an experimental investigation into the cues that human observers use to judge the color of the illuminant. Other computational approaches to color are reviewed and some of their links to psychophysics and physiology are explored.
Abstract:
The dataflow model of computation exposes and exploits parallelism in programs without requiring programmer annotation; however, instruction-level dataflow is too fine-grained to be efficient on general-purpose processors. A popular solution is to develop a "hybrid" model of computation where regions of dataflow graphs are combined into sequential blocks of code. I have implemented such a system to allow the J-Machine to run Id programs, leaving exposed a high amount of parallelism, such as among loop iterations. I describe this system and provide an analysis of its strengths and weaknesses and those of the J-Machine, along with ideas for improvement.
Abstract:
In January 1983 a group of US government, industry and university information specialists gathered at MIT to take stock of efforts to monitor, acquire, assess, and disseminate Japanese scientific and technical information (JSTI). It was agreed that these efforts were uncoordinated and poorly conceived, and that a clearer understanding of Japanese technical information systems and a clearer sense of their importance to end users was necessary. That meeting led to formal technology assessments, Congressional hearings, and legislation; it also helped stimulate several private initiatives in JSTI provision. Four years later there exist better coordinated and better conceived JSTI programs in both the public and private sectors, but there remains much room for improvement. This paper will recount their development and assess future directions.
Abstract:
A review article in The New England Journal of Medicine recalls that almost a century ago, Abraham Flexner, a research scholar at the Carnegie Foundation for the Advancement of Teaching, undertook an assessment of medical education in the 155 medical schools then in operation in the United States and Canada. Flexner's report emphasized the nonscientific approach of American medical schools to preparation for the profession, which contrasted with the university-based system of medical education in Germany. At the core of Flexner's view was the notion that formal analytic reasoning, the kind of thinking integral to the natural sciences, should hold pride of place in the intellectual training of physicians. This idea was pioneered at Harvard University, the University of Michigan, and the University of Pennsylvania in the 1880s, but was most fully expressed in the educational program at Johns Hopkins University, which Flexner regarded as the ideal for medical education. (...)
Abstract:
Abstract taken from the publication.
Abstract:
Abstract taken from the publication.
Abstract:
Omnidirectional cameras offer a much wider field of view than perspective cameras and alleviate the problems due to occlusions. However, both types of camera suffer from a lack of depth perception. A practical method for obtaining depth in computer vision is to project a known structured light pattern onto the scene, avoiding the problems and costs involved in stereo vision. This paper is focused on the idea of combining omnidirectional vision and structured light with the aim of providing 3D information about the scene. The resulting sensor is formed by a single catadioptric camera and an omnidirectional light projector. How this sensor can be used in robot navigation applications is also discussed.
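The depth recovery that structured light enables can be illustrated with the classical planar camera/projector triangulation formula. The paper's catadioptric geometry is more involved; the pinhole model, the function name, and the numbers below are illustrative assumptions only:

```python
# Depth from a camera/projector pair by triangulation (planar pinhole sketch).
# A known pattern stripe is emitted from a known projector column; the camera
# observes where it lands, and the disparity between the two yields depth.

def depth_from_disparity(focal_px, baseline_m, x_cam_px, x_proj_px):
    """z = f * b / d, with d the disparity between the observed and
    projected positions of the same pattern stripe (both in pixels)."""
    d = x_cam_px - x_proj_px
    if d == 0:
        raise ValueError("zero disparity: point at infinity")
    return focal_px * baseline_m / d

# Illustrative numbers: 800 px focal length, 10 cm camera-projector baseline.
z = depth_from_disparity(focal_px=800.0, baseline_m=0.1,
                         x_cam_px=420.0, x_proj_px=380.0)
print(round(z, 3))  # 2.0 (metres)
```

Because the pattern is known, the correspondence problem of stereo vision disappears: each stripe identifies itself, which is the cost advantage the abstract mentions.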
Abstract:
Most network operators have considered reducing Label Switched Router (LSR) label spaces (i.e. the number of labels that can be used) as a means of simplifying the management of underlying Virtual Private Networks (VPNs) and, hence, reducing operational expenditure (OPEX). This letter discusses the problem of reducing label spaces in Multiprotocol Label Switching (MPLS) networks using label merging, better known as MultiPoint-to-Point (MP2P) connections. Because of their origins in IP, MP2P connections have been considered to have tree shapes with Label Switched Paths (LSPs) as branches. Due to this, previous works by many authors affirm that the problem of minimising the label space using MP2P in MPLS, the Merging Problem, cannot be solved optimally with a polynomial algorithm (it is NP-complete), since it involves a hard decision problem. However, in this letter the Merging Problem is analysed from the perspective of MPLS, and it is deduced that tree shapes in MP2P connections are irrelevant. By overriding this tree-shape consideration, it is possible to perform label merging in polynomial time. Based on how MPLS signalling works, this letter proposes an algorithm to compute the minimum number of labels using label merging: the Full Label Merging algorithm. In conclusion, we reclassify the Merging Problem as polynomial-solvable instead of NP-complete. In addition, simulation experiments confirm that, without the tree-branch selection problem, the label space can be reduced further.
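The saving that label merging buys can be sketched by counting outgoing labels at a single LSR. The toy set of LSPs below is an assumption for illustration, and the Full Label Merging algorithm itself is not reproduced here:

```python
# Label-space saving from MP2P merging, seen from one LSR.
# Without merging, each LSP through the router consumes its own outgoing
# label; with MP2P merging, all LSPs bound for the same egress can share one.

lsps = [  # (ingress, egress) pairs routed through this LSR (illustrative)
    ("A", "X"), ("B", "X"), ("C", "X"),
    ("A", "Y"), ("B", "Y"),
]

labels_p2p = len(lsps)                               # one label per LSP
labels_mp2p = len({egress for _, egress in lsps})    # one label per egress

print(labels_p2p, labels_mp2p)  # 5 2
```

Counting distinct egresses is a per-node view only; the letter's contribution is showing that the network-wide minimisation can also be done in polynomial time once the tree-shape assumption is dropped.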
Abstract:
The scientific community has been suffering from peer review for decades. This process (also called refereeing) subjects an author's scientific work or ideas to the scrutiny of one or more experts in the field. Publishers use it to select and screen manuscript submissions, and funding agencies use it to award research funds. The goal is to get authors to meet their discipline's standards and thus achieve scientific objectivity. Publications and awards that haven't undergone peer review are often regarded with suspicion by scholars and professionals in many fields. However, peer review, although universally used, has many drawbacks. We propose replacing peer review with an auction-based approach: the better the submitted paper, the more scientific currency the author is likely to bid to have it published. If the bid correctly reflects the paper's quality, the author is rewarded in this new scientific currency; otherwise, the author loses this currency. We argue that citations are an appropriate currency for all scientists. We believe that citation auctions encourage scientists to better control the quality of their submissions. They also inspire scientists to prepare more exciting talks for accepted papers and to invite discussion of their results at congresses and conferences and among their colleagues. In the long run, citation auctions could have the power to greatly improve scientific research.
Abstract:
This paper describes the basis of citation auctions as a new approach to selecting scientific papers for publication. Our main idea is to use an auction to select papers for publication through bids that, differently from the state of the art, consist of the number of citations a scientist expects to receive if the paper is published. Hence, the citation auction is the selection process itself, and no reviewers are involved. The benefits of the proposed approach are twofold. First, the cost of refereeing will be either totally eliminated or significantly reduced, because the citation auction process does not need prior understanding of the paper's content to judge the quality of its contribution. Additionally, the method does not prejudge the content of the paper, so it will increase the openness of publications to new ideas. Second, scientists will be much more committed to the quality of their papers, paying close attention to distributing and explaining their papers in detail to maximise the number of citations each paper receives. Sample analyses of the citation counts, obtained via Google Scholar, of papers published in the years 1999-2004 in one journal, and in the years 2003-2005 in a series of conferences (in a totally different discipline), are provided. Finally, a simple simulation of an auction is given to outline the behaviour of the citation auction approach.
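The mechanism can be sketched in a few lines. The selection rule mirrors the idea described above (highest bids win the available slots, no reviewers); the settlement rule, surplus kept or shortfall deducted, is an illustrative assumption, not taken from the paper:

```python
# Toy citation auction: bids are expected citation counts.

def select(papers, slots):
    """The auction itself: accept the `slots` highest bids, no reviewers involved."""
    return sorted(papers, key=lambda p: p["bid"], reverse=True)[:slots]

def settle(bid, actual):
    """Assumed settlement: change in the author's citation currency once real
    citations are known. Meet the bid and keep the surplus; miss it and pay
    the shortfall."""
    return actual - bid

papers = [{"id": "p1", "bid": 5}, {"id": "p2", "bid": 9}, {"id": "p3", "bid": 7}]
accepted = select(papers, slots=2)
print([p["id"] for p in accepted])   # ['p2', 'p3']
print(settle(bid=9, actual=14))      # 5: conservative bid, over-delivered
print(settle(bid=7, actual=2))       # -5: overbid, currency lost
```

Any settlement rule of this shape makes honest bidding the safest strategy, which is the incentive effect the paper argues for: authors who overstate a paper's expected impact pay for it in the same currency that buys future publication.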
Abstract:
The design of control, estimation, or diagnosis algorithms most often assumes that all available process variables represent the system state at the same instant of time. However, this is never true in networked systems, because of the unknown deterministic or stochastic transmission delays introduced by the communication network. During the diagnosis stage, this will often generate false alarms: under nominal operation, the different transmission delays associated with the variables used in the residual computation make the residuals deviate from zero. A technique aiming at minimising the resulting false-alarm rate, based on the explicit modelling of the communication delays and on their best-case estimation, is proposed.
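How an unmodelled delay pushes a residual away from zero even when the plant is healthy can be seen in a one-line model. The plant gain, ramp input, and delay value below are illustrative assumptions, not the paper's system:

```python
# A residual r = y - model(u) assumes y and u are sampled at the same instant.
# Delay the measurement by tau and, for a ramp input, a healthy plant still
# produces a nonzero residual, i.e. a false alarm.

def residual(y_t, u_t):
    # Assumed nominal model: y = 2*u, so the residual is zero when healthy
    # and synchronised.
    return y_t - 2.0 * u_t

slope = 1.0   # ramp input u(t) = slope * t
tau = 0.3     # network delay on the measurement channel (unknown to the FDI block)

t = 5.0
u_now = slope * t
y_delayed = 2.0 * slope * (t - tau)   # healthy plant, but a stale measurement

r_naive = residual(y_delayed, u_now)                     # nonzero: looks like a fault
r_compensated = residual(y_delayed, slope * (t - tau))   # delay modelled: zero again

print(round(r_naive, 3))        # -0.6
print(round(r_compensated, 3))  # 0.0
```

The naive residual is offset by exactly `-2 * slope * tau`; compensating with an estimate of the delay, as the proposed technique does with a best-case estimate, removes the discrepancy and hence the false alarm.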
Abstract:
What, in fact, are the advantages for students' science learning of the scientific devices used in school laboratories and in interactive science museums? Effective science learning requires comprehension. Generating questions to obtain information is one of the processes that indicate students' intention to understand a given piece of information. Moreover, the construction of new scientific knowledge begins with a good question. Therefore, stimulating the generation of information-seeking questions (ISQ) could be an element that improves deep learning of school science.
Abstract:
Crowdsourcing. Social Machines. Human computation. Co-construction Made Real