25 results for Mayr, Ernst: This is biology - the science of living world

at Universidad Politécnica de Madrid


Relevance: 100.00%

Abstract:

How easy is it to reproduce the results found in a typical computational biology paper? Either through experience or intuition, the reader will already know that the answer is "with difficulty" or "not at all". In this paper we attempt to quantify this difficulty by reproducing a previously published paper for different classes of users (ranging from users with little expertise to domain experts) and suggest ways in which the situation might be improved. Quantification is achieved by estimating the time required to reproduce each of the steps in the method described in the original paper and to make them part of an explicit workflow that reproduces the original results. Reproducing the method took several months of effort, and required using new versions and new software that posed challenges to reconstructing and validating the results. The quantification leads to “reproducibility maps” that reveal that novice researchers would only be able to reproduce a few of the steps in the method, and that only expert researchers with advanced knowledge of the domain would be able to reproduce the method in its entirety. The workflow itself is published as an online resource together with supporting software and data. The paper concludes with a brief discussion of the complexities of requiring reproducibility in terms of cost versus benefit, and a set of desiderata with our observations and guidelines for improving reproducibility. This has implications not only for reproducing the work of others from published papers, but also for reproducing work from one's own laboratory.

Relevance: 100.00%

Abstract:

Recognition of an increasing worldwide demand for high quality in fruits and vegetables has grown in recent years. Evidence of severe problems of mechanical damage is increasing, and this is affecting the trade of fruits in European and other countries. The potential market for fresh high-quality vegetables and fruits remains restricted by the lack of quality of the majority of products that reach consumers; this is the case for local as well as import/export markets, so a reduction in the consumption of fresh fruits in favour of other fixed-quality products (dairy in particular) may become widespread. In a recent survey (King, 1988, cited in Bellon, 1989), it appears that, for the moment, one third of the surveyed consumers are still continuing to increase their fresh produce consumption. The factors that appear to be most important in influencing the shopping behaviour of these consumers are taste/flavour, freshness/ripeness, appealing look, and cleanliness. Research on mechanical damage in fruit and vegetables has been under way for several years. The first research on the physical properties of fruits was in fact directed towards analysing the response of selected fruits to slow or rapid loading (Fridley et al., 1968; Horsefield et al., 1972). From that time on, research has expanded greatly, and different aspects of the problem have been approached. These include applicable mechanical models for the contact problem, the response of biological tissues to loading, devices for detecting the causes of damage in machines and equipment, and procedures for sensing bruises in grading and sorting.
This chapter will be devoted to the study of actual research results relative to the causes and mechanisms of mechanical damage in fruits (and secondarily in vegetables), the development of bruises in these commodities, the models that have been used up to now, and the different factors which have been recognized as influencing the appearance and development of mechanical damage in fruits. The study will be focused mainly on contact damage - that is, slow or rapid loads applied to the surface of the products and causing bruises. (A bruise is defined as an altered volume of fruit tissue below the skin that is discoloured and softened.) Other types of mechanical damage, like abrasion and scuffing, punctures and cuts, will also be mentioned briefly.
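The contact models mentioned above commonly take classical Hertz theory as a starting point; as a hedged illustration (not a derivation from this chapter), the Hertzian relation between contact force and deformation for a spherical body pressed against a surface reads:

```latex
F = \frac{4}{3}\,E^{*}\sqrt{R}\,\delta^{3/2},
\qquad
\frac{1}{E^{*}} = \frac{1-\nu_1^{2}}{E_1} + \frac{1-\nu_2^{2}}{E_2},
```

where $F$ is the applied load, $R$ the effective radius of curvature, $\delta$ the deformation at the contact, and $E^{*}$ the effective modulus combining the Young's moduli $E_i$ and Poisson ratios $\nu_i$ of the fruit and the impacting body.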

Relevance: 100.00%

Abstract:

The aim of this work is to solve a question raised for average sampling in shift-invariant spaces by using the well-known matrix pencil theory. In many common situations in sampling theory, the available data are samples of some convolution operator acting on the function itself: this leads to the problem of average sampling, also known as generalized sampling. In this paper we deal with the existence of a sampling formula involving these samples and having reconstruction functions with compact support. Thus, low computational complexity is involved and truncation errors are avoided. In practice, this is accomplished by means of an FIR filter bank. An answer is given in the light of the generalized sampling theory by using the oversampling technique: more samples than strictly necessary are used. The original problem reduces to finding a polynomial left inverse of a polynomial matrix intimately related to the sampling problem which, for a suitable choice of the sampling period, becomes a matrix pencil. This matrix pencil approach allows us to obtain a practical method for computing the compactly supported reconstruction functions for the important case where the oversampling rate is minimum. Moreover, the optimality of the obtained solution is established.
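As a sketch of the setting described above (standard generalized-sampling notation, not taken verbatim from the paper): given $s$ convolution systems $\mathcal{L}_j$ acting on a function $f$ in a shift-invariant space, the goal is a stable sampling formula of the form

```latex
f(t) = \sum_{j=1}^{s} \sum_{n \in \mathbb{Z}} \big(\mathcal{L}_j f\big)(n)\, S_j(t - n),
\qquad t \in \mathbb{R},
```

where oversampling (taking $s$ larger than strictly necessary) gives enough freedom to choose reconstruction functions $S_j$ with compact support, which is what makes an FIR filter-bank implementation possible.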

Relevance: 100.00%

Abstract:

The penalty corner is one of the most important game situations in field hockey, with one third of all goals resulting from this tactical situation. The aim of this study was to develop and apply a training method, based on previous studies, to improve the drag-flick skill in a young top-class field hockey player. A young top-class player exercised three times per week using specific drills over a four-week period. A VICON optoelectronic system (Oxford Metrics, Oxford, UK) was employed to capture twenty drag-flicks, with six cameras sampling at 250 Hz, before and after the training period. In order to analyze pre- and post-test differences, a dependent t-test was carried out. Angular velocities and the kinematic sequence were similar to previous studies. The player improved (albeit not significantly) the angular velocity of the stick. The player increased the front foot-to-ball distance at T1 (p < 0.01) and the drag-flick distances. The range of motion of the front leg decreased from T1 to T6 after the training period (p < 0.01). The specific training sessions conducted with the player improved some features of this particular skill. This article shows how technical knowledge can help with the design of training programs and whether some drills are more effective than others.
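The pre/post comparison described above relies on a dependent (paired) t-test. A minimal sketch of that statistic in Python, with invented angular-velocity values for illustration (not the study's data):

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Dependent (paired) t-statistic: mean of the per-subject differences
    divided by the standard error of those differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))

# Invented stick angular velocities (rad/s) before and after training
pre = [20.1, 19.8, 21.0, 20.5, 19.9]
post = [21.2, 20.9, 21.8, 21.5, 20.7]
print(round(paired_t(pre, post), 2))
```

With 20 captured flicks per session, the resulting statistic would be compared against the t distribution with n - 1 degrees of freedom.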

Relevance: 100.00%

Abstract:

This paper shows the development of a science-technological knowledge transfer model in Mexico, as a means to boost the limited relations between the scientific and industrial environments. The proposal is based on the analysis, carried out through a case study approach, of eight organizations (research centers and firms) with varying degrees of skill in the practice of science-technological knowledge transfer. The analysis highlights the synergistic use of the organizational and technological capabilities of each organization as a means of identifying the knowledge transfer mechanisms best suited to enabling the establishment of cooperative processes and achieving results in R&D and innovation activities.

Relevance: 100.00%

Abstract:

This article presents the model of a multi-agent system (SMAF), whose objectives are the input of fuzzy incidents, as human experts express them, with different degrees of severity, and the subsequent search for and suggestion of solutions. The solutions will later be confirmed or rejected by the users. This model was designed, implemented and tested in the telecommunications field, with heterogeneous agents in a cooperative model. In the design, different abstraction levels were considered, according to the agents' objectives, the ways they carry them out, and the environment in which they act. Each agent is modeled with a different spectrum of the knowledge base.
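As a rough sketch of how fuzzy severity degrees might be handled (illustrative membership functions and labels, not SMAF's actual knowledge base):

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b over the support [a, c]."""
    if x == b:
        return 1.0
    if x <= a or x >= c:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Invented severity vocabulary an expert might attach to an incident report
SEVERITY = {
    "low":      (0.0, 0.0, 0.5),
    "medium":   (0.2, 0.5, 0.8),
    "critical": (0.5, 1.0, 1.0),
}

def classify(severity):
    """Suggest the label with the highest membership degree for an incident."""
    return max(SEVERITY, key=lambda lbl: triangular(severity, *SEVERITY[lbl]))

print(classify(0.45))  # medium
```

Overlapping supports are the point: an incident reported at severity 0.45 has nonzero membership in both "low" and "medium", mirroring the imprecision of the experts' descriptions.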

Relevance: 100.00%

Abstract:

This paper illustrates the use of a top-down framework to obtain goal independent analyses of logic programs, a task which is usually associated with the bottom-up approach. While it is well known that the bottom-up approach can be used, through the magic set transformation, for goal dependent analysis, it is less known that the top-down approach can be used for goal independent analysis. The paper describes two ways of doing the latter. We show how the results of a goal independent analysis can be used to speed up subsequent goal dependent analyses. However, this speed-up may result in a loss of precision. The influence of domain characteristics on this precision is discussed, and an experimental evaluation using a generic top-down analyzer is described.

Relevance: 100.00%

Abstract:

Light detection and ranging (LiDAR) technology is beginning to have an impact on agriculture. Canopy volume and/or fruit tree leaf area can be estimated using terrestrial laser sensors based on this technology. However, the use of these devices may involve different options depending on the resolution and scanning mode. As a consequence, data accuracy and LiDAR-derived parameters are affected by the sensor configuration, and may vary according to the vegetative characteristics of tree crops. Given this scenario, users and suppliers of these devices need to know how to use the sensor in each case. This paper presents a computer program to determine the best configuration, allowing simulation and evaluation of different LiDAR configurations in various tree structures (or training systems). The ultimate goal is to optimise the use of laser scanners in field operations. The software presented generates a virtual orchard and then allows the scanning simulation with a laser sensor. Trees are created using a hidden Markov tree (HMT) model. Varying the foliar structure of the orchard, the LiDAR simulation was applied to twenty different artificially created orchards, with or without leaves, from two positions (lateral and zenith). To validate the laser sensor configuration, the leaf surface of the simulated trees was compared with the parameters obtained from LiDAR measurements: the impacted leaf area, the impacted total area (leaves and wood), and the impacted area in the three outer layers of leaves.
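A much-simplified sketch of the kind of scanning simulation described (a toy 2D grid with one beam per row, not the paper's HMT-based software):

```python
def lateral_scan(canopy):
    """Fire one horizontal beam per row from the left of a 2D grid.

    canopy: list of strings, 'L' = leaf cell, '.' = empty space.
    Returns (impacted, occluded): leaf cells the beam hits first vs.
    leaf cells hidden behind an earlier impact in the same row.
    """
    impacted = occluded = 0
    for row in canopy:
        hit = False
        for cell in row:
            if cell == "L":
                if hit:
                    occluded += 1
                else:
                    impacted += 1
                    hit = True
    return impacted, occluded

# Tiny artificial canopy, three rows deep
canopy = [
    "..L.L",
    "L.LL.",
    ".....",
]
print(lateral_scan(canopy))  # (2, 3)
```

Comparing the impacted count against the total leaf area of the simulated tree is, in miniature, the validation the paper performs between LiDAR measurements and the known leaf surface of the virtual orchard.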

Relevance: 100.00%

Abstract:

Teamwork is one of the abilities most highly valued today in the professional arena, with great importance attached to the various personal and interpersonal skills associated with it. In this context, the Technical University of Madrid is developing a coordinated educational innovation project whose main objective is to develop methodological and assessment tools for the acquisition of the personal skills necessary to improve the employability of graduates and their skills for project management. Within this context, this paper proposes a methodology composed of various activities and indicators, as well as specific assessment instruments linked to the teamwork competence. A series of systematic steps allowed the design of an instrument and the construction of a scale for measuring the teamwork competence. The practical application of the methodology has been carried out in Projects lectures at different Schools of Engineering at the Technical University of Madrid, and the results are presented in this document as a pilot experience. The results show the various aspects and methods that teachers should consider in evaluating the teamwork competence, including analysis of the quality of results through reliability and construct validity. They also show the advantages of applying this methodology in the field of project management teaching.

Relevance: 100.00%

Abstract:

Transport is responsible for 41% of CO2 emissions in Spain, and around 65% of that figure is due to road traffic. Tolled motorways are currently managed according to economic criteria: minimizing operational costs and maximizing revenues from tolls. Within this framework, this paper develops a new methodology for managing motorways based on a target of maximum energy efficiency. It includes technological and demand-driven policies, which are applied to two case studies. Various conclusions emerge from this study. One is that the use of intelligent payment systems is recommended; another is that the most sustainable policy would involve defining the most efficient strategy for each motorway section, including the maximum use of its capacity, the toll level which attracts the most vehicles, and the optimum speed limit for each type of vehicle.
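As an illustration of choosing the optimum speed limit for a motorway section, a toy per-kilometre consumption curve can be minimized over candidate limits (the coefficients below are invented, not the paper's model):

```python
def fuel_per_km(v, a=300.0, b=0.05, c=0.0002):
    """Toy U-shaped per-km consumption curve (invented coefficients):
    a/v for engine overhead per km, b for rolling losses, c*v**2 for drag."""
    return a / v + b + c * v * v

def optimum_speed(candidates):
    """Choose the candidate speed limit minimizing consumption per km."""
    return min(candidates, key=fuel_per_km)

print(optimum_speed([60, 80, 100, 120]))  # 100
```

The same search, run with coefficients fitted per vehicle type, is the kind of per-section, per-vehicle optimum the methodology defines.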

Relevance: 100.00%

Abstract:

In this work, the dimensional synthesis of a spherical parallel manipulator (PM) with a -1S kinematic chain is presented. The goal of the synthesis is to find a set of parameters that defines the PM with the best performance in terms of workspace capabilities, dexterity and isotropy. The PM is parametrized in terms of a reference element, and a non-directed search of these parameters is carried out. First, the inverse kinematics and instantaneous kinematics of the mechanism are presented. The latter are found using the screw theory formulation. An algorithm that explores a bounded set of parameters and determines the corresponding value of global indexes is presented. The concepts of a novel global performance index and a compound index are introduced. Simulation results are shown and discussed. The best PMs found in terms of each performance index evaluated are analyzed locally in terms of their workspace and local dexterity. The relationship between the performance of the PM and its parameters is discussed, and a prototype with the best performance in terms of the compound index is presented and analyzed.
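Local dexterity is commonly measured as the inverse condition number of the manipulator Jacobian, a value in [0, 1] where 1 means isotropy. A minimal sketch for a 2x2 Jacobian, assuming this standard definition rather than the paper's specific indexes:

```python
import math

def dexterity_2x2(J):
    """Local dexterity: inverse condition number (in [0, 1]) of a 2x2
    Jacobian, computed from its singular values (the square roots of
    the eigenvalues of J^T J)."""
    (a, b), (c, d) = J
    p = a * a + c * c  # (J^T J)[0][0]
    q = a * b + c * d  # (J^T J)[0][1] == (J^T J)[1][0]
    r = b * b + d * d  # (J^T J)[1][1]
    disc = math.sqrt((p - r) ** 2 + 4 * q * q)
    s_max = math.sqrt((p + r + disc) / 2)
    s_min = math.sqrt(max((p + r - disc) / 2, 0.0))
    return 0.0 if s_max == 0 else s_min / s_max

print(dexterity_2x2([[1, 0], [0, 1]]))  # 1.0 (isotropic configuration)
print(dexterity_2x2([[2, 0], [0, 1]]))  # 0.5
```

Averaging such a local measure over the reachable workspace gives a global index of the kind searched over during the dimensional synthesis.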

Relevance: 100.00%

Abstract:

The contribution to global energy consumption of the information and communications technology (ICT) sector has increased considerably in the last decade, along with its growing relevance to the overall economy. This trend will continue due to the seemingly ever greater use of these technologies, with broadband data traffic generated by the usage of telecommunication networks as a primary component. In fact, in response to user demand, the telecommunications industry is initiating the deployment of next generation networks (NGNs). However, energy consumption is mostly absent from the debate on these deployments, in spite of the potential impact on both expenses and sustainability. In addition, consumers are unaware of the energy impact of their choices in ultra-broadband services. This paper focuses on forecasting energy consumption in the access part of NGNs by modelling the combined effect of the deployment of two different ultra-broadband technologies (FTTH-GPON and LTE), the evolution of traffic per user, and the energy consumption in each of the networks and user devices. Conclusions are presented on the levels of energy consumption, their cost and the impact of different network design parameters. The effect of technological developments, techno-economic and policy decisions on energy consumption is highlighted. On the consumer side, practical figures and comparisons across technologies are provided. Although the paper focuses on Spain, the analysis can be extended to similar countries.
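A toy sketch of the kind of access-network energy model described, with an always-on per-line term plus a traffic-dependent term (all parameter values below are invented for illustration, not the paper's figures):

```python
def access_energy_kwh(users, gb_per_user_month, watts_per_line, joules_per_gb):
    """Monthly access-network energy: always-on per-line power plus a
    traffic-dependent term (toy model; all inputs are illustrative)."""
    hours = 30 * 24
    baseline = users * watts_per_line * hours / 1000.0           # W*h -> kWh
    traffic = users * gb_per_user_month * joules_per_gb / 3.6e6  # J -> kWh
    return baseline + traffic

# Hypothetical comparison for 1,000 subscribers at 60 GB/month each;
# the per-line and per-GB figures below are invented, not the paper's.
ftth = access_energy_kwh(1000, 60, 2.0, 1.0e4)  # fixed line, tiny per-GB cost
lte = access_energy_kwh(1000, 60, 0.0, 3.0e6)   # no dedicated line, costly per GB
print(round(ftth), round(lte))  # 1607 50000
```

Even this toy version shows the structural contrast the paper exploits: fixed-access consumption is dominated by the always-on term, while mobile consumption scales with traffic per user.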

Relevance: 100.00%

Abstract:

An accepted fact in software engineering is that software must undergo a verification and validation process during development to ascertain and improve its quality level. But there are more techniques than a single developer could master, and it is impossible to be certain that software is free of defects. So, it is crucial for developers to be able to choose, from the available evaluation techniques, the one most suitable and likely to yield optimum quality results for different products. Although some knowledge is available on the strengths and weaknesses of the available software quality assurance techniques, not much is known yet about the relationship between different techniques and their contextual behavior. Objective: This research investigates the effectiveness of two testing techniques (equivalence class partitioning and decision coverage) and one review technique (code review by abstraction) in terms of their fault detection capability. This will be used to strengthen the practical knowledge available on these techniques.
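Of the techniques named above, equivalence class partitioning is easy to sketch: the input domain is divided into classes expected to behave alike, and one representative per class is tested (a toy example, not the study's experimental material):

```python
def classify_score(score):
    """Toy function under test: grade a score on a 0-100 scale."""
    if not 0 <= score <= 100:
        raise ValueError("score out of range")
    return "pass" if score >= 50 else "fail"

# Equivalence class partitioning: one representative input per class,
# instead of exhaustively testing every value in the domain.
assert classify_score(25) == "fail"  # class: valid, below the pass mark
assert classify_score(75) == "pass"  # class: valid, at or above the mark
for invalid in (-1, 101):            # classes: out of range (low and high)
    try:
        classify_score(invalid)
        raise AssertionError("expected ValueError")
    except ValueError:
        pass
```

Decision coverage, by contrast, would require test inputs driving each branch of `classify_score` to both outcomes, which the four representatives above happen to achieve as well.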

Relevance: 100.00%

Abstract:

This work compared the quantification of soluble fibre in feeds using different chemical and in vitro approaches, and studied the potential interference between soluble fibre and mucin determinations. Six ingredients: sugar beet pulp (SBP), SBP pectins, insoluble SBP, wheat straw, sunflower hulls and lignocellulose, and seven rabbit diets, differing in soluble fibre content, were evaluated. In experiment 1, ingredients and diets were analyzed for total dietary fibre (TDF), insoluble dietary fibre (IDF), soluble dietary fibre (SDF), aNDFom (corrected for protein, aNDFom-cp) and 2-step pepsin/pancreatin in vitro DM indigestibility (corrected for ash and protein, ivDMi2). Soluble fibre was estimated by difference using three procedures: TDF - IDF (SDFIDF), TDF - ivDMi2 (SDFivDMi2), and TDF - aNDFom-cp (SDFaNDFom-cp). Soluble fibre determined directly (SDF) or by difference as SDFivDMi2 were not different (109 g/kg DM, on average). However, when it was calculated as SDFaNDFom-cp the value was 40% higher (153 g/kg DM, P < 0.05), whereas SDFIDF (124 g/kg DM) did not differ from any of the other methods. The correlation between the four methods was high (r ≥ 0.96; P ≤ 0.001; n = 13), but it decreased or even disappeared when SBP pectins and SBP were excluded and a lower and narrower range of variation of soluble fibre was used. In experiment 2, the ivDMi2 values obtained using crucibles (reference method) were compared to those obtained using individual or collective Ankom bags in order to simplify the determination of SDFivDMi2. The ivDMi2 was not different when using crucibles or individual or collective Ankom bags. In experiment 3, the potential interference between soluble fibre and intestinal mucin determinations was studied using rabbit intestinal raw mucus, digesta and SBP pectins, lignocellulose and a rabbit diet.
An interference was observed between the determinations of soluble fibre and crude mucin, as contents of TDF and apparent crude mucin were high in SBP pectins (994 and 709 g/kg DM) and rabbit intestinal raw mucus (571 and 739 g/kg DM). After a pectinase treatment, the coefficient of apparent mucin recovery of SBP pectins was close to zero, whereas that of rabbit mucus was not modified. An estimation of the crude mucin carbohydrates retained in digesta TDF is proposed to correct TDF and soluble fibre digestibility. In conclusion, the values of soluble fibre depend on the methodology used. The contamination of crude mucin with soluble fibre is avoided using pectinase.
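The by-difference procedures compared above are simple subtractions from TDF; a minimal sketch with invented figures (g/kg DM, not the study's data):

```python
# Soluble fibre estimated by difference, following the three procedures
# described above. All figures are invented for illustration (g/kg DM).
TDF = 350.0        # total dietary fibre
IDF = 240.0        # insoluble dietary fibre
ivDMi2 = 245.0     # 2-step in vitro DM indigestibility (ash/protein corrected)
aNDFom_cp = 200.0  # aNDFom corrected for protein

SDF_IDF = TDF - IDF              # procedure 1: TDF - IDF
SDF_ivDMi2 = TDF - ivDMi2        # procedure 2: TDF - ivDMi2
SDF_aNDFom_cp = TDF - aNDFom_cp  # procedure 3: TDF - aNDFom-cp

print(SDF_IDF, SDF_ivDMi2, SDF_aNDFom_cp)  # 110.0 105.0 150.0
```

Because each estimate inherits the bias of its subtrahend, the three procedures need not agree, which is exactly the methodological point of experiment 1.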

Relevance: 100.00%

Abstract:

New digital artifacts are emerging in data-intensive science. For example, scientific workflows are executable descriptions of scientific procedures that define the sequence of computational steps in an automated data analysis, supporting reproducible research and the sharing and replication of best-practice and know-how through reuse. Workflows are specified at design time and interpreted through their execution in a variety of situations, environments, and domains. Hence it is essential to preserve both their static and dynamic aspects, along with the research context in which they are used. To achieve this, we propose the use of multidimensional digital objects (Research Objects) that aggregate the resources used and/or produced in scientific investigations, including workflow models, provenance of their executions, and links to the relevant associated resources, along with the provision of technological support for their preservation and efficient retrieval and reuse. In this direction, we specified a software architecture for the design and implementation of a Research Object preservation system, and realized this architecture with a set of services and clients, drawing together practices in digital libraries, preservation systems, workflow management, social networking and Semantic Web technologies. In this paper, we describe the backbone system of this realization, a digital library system built on top of dLibra.
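A schematic sketch of what such a Research Object aggregates (illustrative field names and paths, not the system's actual manifest vocabulary):

```python
# Toy representation of a Research Object: an aggregation of the workflow,
# the provenance of its executions, and links to associated resources.
research_object = {
    "id": "ro/example-investigation",
    "aggregates": [
        {"type": "workflow",   "path": "workflow.t2flow"},
        {"type": "provenance", "path": "runs/run-001.prov"},
        {"type": "dataset",    "path": "data/input.csv"},
    ],
    "annotations": [
        {"about": "workflow.t2flow", "creator": "alice",
         "body": "Sequence of computational steps for the analysis."},
    ],
}

def resources_of(ro, kind):
    """List the aggregated resources of a given type."""
    return [r["path"] for r in ro["aggregates"] if r["type"] == kind]

print(resources_of(research_object, "workflow"))  # ['workflow.t2flow']
```

A preservation system built over such aggregations can then index, retrieve and re-execute the workflow together with the context it was run in.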