902 results for Distributed computer-controlled systems
Abstract:
An experimental procedure for precise and accurate measurements of isotope abundances by a miniature laser ablation mass spectrometer for space research is described. The measurements were conducted on different untreated NIST standards and galena samples by applying pulsed UV laser radiation (266 nm, 3 ns and 20 Hz) for ablation, atomisation, and ionisation of the sample material. Mass spectra of the released ions are measured by a reflectron-type time-of-flight mass analyser. A computer-controlled performance optimiser was used to operate the system at maximum ion transmission and mass resolution. At optimal experimental conditions, the best relative accuracy and precision achieved for Pb isotope compositions are at the per mill level and were obtained within a range of applied laser irradiances and a defined number of accumulated spectra. A similar relative accuracy and precision was achieved in the study of Pb isotope compositions in terrestrial galena samples. The results for the galena samples are similar to those obtained with a thermal ionisation mass spectrometer (TIMS). Studies of the isotope composition of other elements also yielded relative accuracy and precision at the per mill level, with characteristic instrument parameters for each element. The relative accuracy and precision of the measurements degrade with decreasing element/isotope concentration in a sample; for elements with abundances below 100 ppm these values drop to the percent level. Depending on the isotopic abundances of Pb in minerals, 207Pb/206Pb ages with an accuracy in the range of tens of millions of years can be achieved.
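The 207Pb/206Pb ages quoted above follow from the standard Pb-Pb age equation, which relates the radiogenic 207Pb*/206Pb* ratio to age through the two uranium decay constants. The sketch below is a minimal, hypothetical illustration of that calculation (standard decay constants, simple bisection, and an assumed common-Pb-corrected ratio); it is not the instrument's actual data-reduction pipeline.

```python
import math

# Standard decay constants (1/yr) and present-day 238U/235U ratio.
LAMBDA_238 = 1.55125e-10
LAMBDA_235 = 9.8485e-10
U238_U235 = 137.88

def radiogenic_207_206(t_yr: float) -> float:
    """Radiogenic 207Pb*/206Pb* predicted by the Pb-Pb age equation at age t."""
    return (math.expm1(LAMBDA_235 * t_yr) / math.expm1(LAMBDA_238 * t_yr)) / U238_U235

def pb_pb_age(ratio_207_206: float, lo=1e6, hi=4.6e9, tol=1e4) -> float:
    """Invert the age equation for t by bisection (the ratio grows monotonically with t)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if radiogenic_207_206(mid) < ratio_207_206:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical common-Pb-corrected ratio of 0.16 -> an age of roughly 2.46 Gyr.
print(f"{pb_pb_age(0.16) / 1e9:.2f} Gyr")
```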
Abstract:
Diet-related chronic diseases severely affect personal and global health. However, managing or treating these diseases currently requires long training and high personal involvement to succeed. Computer vision systems could assist with the assessment of diet by detecting and recognizing different foods and their portions in images. We propose novel methods for detecting a dish in an image and segmenting its contents with and without user interaction. All methods were evaluated on a database of over 1600 manually annotated images. The dish detection achieved an average accuracy of 99% with a run time of 0.2 s/image, while the automatic and semi-automatic dish segmentation methods reached average accuracies of 88% and 91% respectively, with an average run time of 0.5 s/image, outperforming competing solutions.
Abstract:
Diet management is a key factor for the prevention and treatment of diet-related chronic diseases. Computer vision systems aim to provide automated food intake assessment using meal images. We propose a method for the recognition of already segmented food items in meal images. The method uses a 6-layer deep convolutional neural network to classify food image patches. For each food item, overlapping patches are extracted and classified, and the class receiving the majority of votes is assigned to the item. Experiments on a manually annotated dataset with 573 food items justified the choice of the involved components and demonstrated the effectiveness of the proposed system, which yields an overall accuracy of 84.9%.
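The patch-voting step described above can be pictured with a short sketch. The patch size, stride, and stand-in classifier below are illustrative assumptions; in the actual system the trained 6-layer CNN plays the role of the patch classifier.

```python
import numpy as np
from collections import Counter

def extract_patches(item_img: np.ndarray, size: int = 32, stride: int = 16):
    """Yield overlapping square patches from a segmented food-item image."""
    h, w = item_img.shape[:2]
    for y in range(0, h - size + 1, stride):
        for x in range(0, w - size + 1, stride):
            yield item_img[y:y + size, x:x + size]

def classify_item(item_img: np.ndarray, patch_classifier) -> int:
    """Classify every patch and assign the label that receives the most votes."""
    votes = Counter(patch_classifier(p) for p in extract_patches(item_img))
    return votes.most_common(1)[0][0]

# Usage with a stand-in classifier (hypothetical two-class stub based on brightness):
dummy_classifier = lambda patch: int(patch.mean() > 128)
item = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
print(classify_item(item, dummy_classifier))
```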
Abstract:
Aims. We present an inversion method based on Bayesian analysis to constrain the interior structure of terrestrial exoplanets, in the form of the chemical composition of the mantle and the core size. Specifically, we identify which parts of the interior structure of terrestrial exoplanets can be determined from observations of mass, radius, and stellar elemental abundances. Methods. We perform a full probabilistic inverse analysis to formally account for observational and model uncertainties and obtain confidence regions of interior structure models. This enables us to characterize how model variability depends on data and associated uncertainties. Results. We test our method on the terrestrial solar system planets and find that our model predictions are consistent with independent estimates. Furthermore, we apply our method to synthetic exoplanets up to 10 Earth masses and up to 1.7 Earth radii, and to the exoplanet Kepler-36b. Importantly, the inversion strategy proposed here provides a framework for understanding the level of precision required to characterize the interior of exoplanets. Conclusions. Our main conclusions are (1) observations of mass and radius are sufficient to constrain core size; (2) stellar elemental abundances (Fe, Si, Mg) are principal constraints to reduce degeneracy in interior structure models and to constrain mantle composition; (3) the inherent degeneracy in determining interior structure from mass and radius observations depends not only on measurement accuracies, but also on the actual size and density of the exoplanet. We argue that precise observations of stellar elemental abundances are central in order to place constraints on planetary bulk composition and to reduce model degeneracy. We provide a general methodology for analyzing the interior structures of exoplanets that may help to understand how interior models are distributed among star systems. The methodology we propose is sufficiently general to allow its future extension to more complex internal structures, including hydrogen- and water-rich exoplanets.
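The idea of the probabilistic inversion can be conveyed with a toy example: sample candidate interior structures from a prior, compute the mass and radius they imply, and keep those consistent with the observations. The two-layer constant-density model, the rejection rule, and the 5% uncertainties below are illustrative assumptions only; the paper's models use self-consistent interior physics rather than fixed densities.

```python
import numpy as np

# Toy two-layer planet: iron core + silicate mantle with constant densities (kg/m^3).
RHO_CORE, RHO_MANTLE = 13000.0, 4500.0   # assumed representative values
M_EARTH, R_EARTH = 5.972e24, 6.371e6

def planet_mass(radius_m, core_fraction):
    """Mass of a planet whose core radius is core_fraction * total radius."""
    v_core = 4.0 / 3.0 * np.pi * (core_fraction * radius_m) ** 3
    v_total = 4.0 / 3.0 * np.pi * radius_m ** 3
    return RHO_CORE * v_core + RHO_MANTLE * (v_total - v_core)

def invert(mass_obs, mass_sigma, radius_obs, radius_sigma, n=200_000, seed=0):
    """Rejection-sample core-size fractions consistent with observed mass and radius."""
    rng = np.random.default_rng(seed)
    core_frac = rng.uniform(0.0, 1.0, n)               # flat prior on core size
    radius = rng.normal(radius_obs, radius_sigma, n)   # radius drawn from its uncertainty
    mass = planet_mass(radius, core_frac)
    keep = np.abs(mass - mass_obs) < 2.0 * mass_sigma  # crude 2-sigma acceptance window
    return core_frac[keep]

# Hypothetical Earth-like observations with 5% uncertainties:
posterior = invert(M_EARTH, 0.05 * M_EARTH, R_EARTH, 0.05 * R_EARTH)
print(f"core radius fraction: {posterior.mean():.2f} +/- {posterior.std():.2f}")
```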
Abstract:
The aim of this work was to evaluate changes in growth and productivity parameters of different precocious hybrids and a naturalized variety of papaya under both greenhouse and field cultivation in a temperate climate (the centre of the province of Santa Fe, Argentina). The purpose of the research was to identify genotypes better suited to the cultivation of this species in temperate climates and to demonstrate the need for semi-controlled systems to make the cultivation of these promising genotypes possible at middle latitudes. The average yield was 291% higher in the greenhouse than in the field. The average productivity of the hybrid genotypes was more than double that of the naturalized variety in both environments. Considering height, leaf area index and yield parameters, hybrids H2 (principally) and H4 showed good adaptation for use in semi-forced systems. The use of greenhouses and short-stature papaya hybrids makes its cultivation in non-tropical climates feasible and profitable.
Abstract:
The wet bulk density is one of the most important physical and geological properties of marine sediments. The density is directly connected with the sedimentation history and several other sediment properties, and knowledge of the fine-scale density-depth structure is the basis for many model calculations in both sedimentological and palaeoclimatic research. A density measurement system was designed and built at the Alfred Wegener Institute in Bremerhaven for measuring the wet bulk density of sediment cores non-destructively and at high resolution. The density is determined by measuring the absorption of gamma rays in the sediment, a principle that has been used since the 1950s in materials research and in the geosciences. In the present case, 137Cs is used as the radioactive source and the intensity is measured by a detector system (scintillator and photomultiplier). Density values are obtainable both in longitudinal core sections and in planar cross-sections (the latter as a function of the axial rotation angle). Special studies of inhomogeneity can be carried out with core rotation, which also makes the detection of ice-rafted debris (IRD) possible. The processes that run the density measurement system are computer-controlled. Besides the absorption measurement, the core diameter at every measurement point is determined with a potentiometric system, and the acquired data are stored on a personal computer. Before routine measurements on the sediment cores were started, experiments concerning the statistical aspects of the gamma-ray signal and its accuracy were carried out; these yielded, among other things, the optimum operational parameters. A high spatial resolution in the millimetre range is possible with the 4 mm thin gamma-ray beam. Within five seconds the wet bulk density can be determined with an absolute accuracy of 1%. A comparison between data measured with the new system and conventional measurements on core samples taken after core splitting shows agreement within ±5% for most values. For this thesis, density determinations were carried out on ten sediment cores. Several sediment characteristics can be obtained from the standard measurement results alone, without core rotation. In addition to differences and steps in the absolute density, variations in the "frequency" of the density-depth structure can be detected thanks to the close spatial measurement interval and high resolution. Examples from measurements with small (9°) and large (90°) angle increments show that abrupt and smooth transitional changes of sediment layers, as well as ice-rafted debris of several sizes, can be detected and clearly distinguished. After the presentation of the wet bulk density results, a comparison with data from other investigations was made. Measurements of the electrical resistivity correlate very well with the density data because both parameters are closely related to the porosity of the sediment. Additionally, results from measurements of the magnetic susceptibility and from ultrasonic wave velocity investigations were considered for an integrative interpretation. The correlation of these two parameters with the wet bulk density data is strongly dependent on the local (environmental) conditions. Finally, the densities were compared with recordings from sediment-echographic soundings and an X-ray computed tomography analysis. The individual results of all investigations were then combined into an accurate picture of the core.
Problems of ambiguity, which exist when only one parameter is determined, can be reduced according to the number of parameters and sedimentary characteristics measured. The important role of the density data among the parameters of such an integrated interpretation is evident, owing to the high resolution of the measurement, the excellent accuracy, and its key position among the methods and parameters concerning marine sediments.
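The density determination rests on Beer-Lambert attenuation of the 137Cs gamma beam: with the measured intensity, a reference intensity, and the core diameter recorded by the potentiometric system, the wet bulk density follows from rho = ln(I0/I) / (mu * d). The sketch below illustrates this relation; the mass attenuation coefficient and the count rates are assumed illustrative values, not calibration constants of the actual instrument.

```python
import math

# Beer-Lambert attenuation: I = I0 * exp(-mu_mass * rho * d)
#   =>  rho = ln(I0 / I) / (mu_mass * d)
MU_MASS = 0.077  # cm^2/g, assumed representative value for 662 keV gammas in wet sediment

def wet_bulk_density(i_measured: float, i_reference: float, diameter_cm: float) -> float:
    """Wet bulk density (g/cm^3) from count rates through the core and a reference path."""
    return math.log(i_reference / i_measured) / (MU_MASS * diameter_cm)

# Hypothetical example: counts drop from 10000 to 1500 across a 12 cm core diameter.
print(f"{wet_bulk_density(1500, 10000, 12.0):.2f} g/cm^3")  # ~2.05 g/cm^3
```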
Abstract:
Cultural content on the Web is available in various domains (cultural objects, datasets, geospatial data, moving images, scholarly texts and visual resources), concerns various topics, is written in different languages, is targeted at both laymen and experts, and is provided by different communities (libraries, archives, museums and the information industry) and individuals (Figure 1). The integration of information technologies and cultural heritage content on the Web is expected to have an impact on everyday life from the point of view of institutions, communities and individuals. In particular, collaborative environments can recreate 3D navigable worlds that can offer new insights into our cultural heritage (Chan 2007). However, the main barrier is for end-users of cultural content, as well as for the organisations and communities managing and producing it, to find and relate cultural heritage information. In this paper, we explore several visualisation techniques for supporting cultural interfaces, where the role of metadata is essential for supporting search and communication among end-users (Figure 2). A conceptual framework was developed to integrate the data, purpose, technology, impact, and form components of a collaborative environment. Our preliminary results show that collaborative environments can help with cultural heritage information sharing and communication tasks because of the way in which they provide a visual context to end-users. They can be regarded as distributed virtual reality systems that offer graphically realised, potentially infinite, digital information landscapes. Moreover, collaborative environments also provide a new way of interaction between an end-user and a cultural heritage data set. Finally, the visualisation of the metadata of a dataset plays an important role in helping end-users in their search for heritage content on the Web.
Abstract:
The aim is to obtain computationally more powerful, neurophysiologically founded artificial neurons and neural nets. Artificial Neural Nets (ANN) of the Perceptron type evolved from the original proposal in the classical paper by McCulloch and Pitts [1]. Essentially, they keep the computing structure of a linear machine followed by a non-linear operation. The McCulloch-Pitts formal neuron (which was never considered by its authors to be a model of real neurons) is the simplest case: a linear computation on the inputs followed by a threshold. One-layer networks cannot compute every logical function of the inputs, but only those which are linearly separable. Thus, the simple exclusive OR (contrast detector) function of two inputs requires two layers of formal neurons.
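The linear-separability limitation can be made concrete: a single threshold unit cannot compute XOR, but a two-layer arrangement of the same units can. The weights below are one illustrative choice, not values taken from the paper.

```python
import numpy as np

def threshold_neuron(x, w, b):
    """McCulloch-Pitts style unit: weighted sum of the inputs followed by a hard threshold."""
    return int(np.dot(w, x) + b > 0)

def xor_two_layer(x1, x2):
    """XOR computed with two layers of threshold units (illustrative weights)."""
    x = np.array([x1, x2])
    h1 = threshold_neuron(x, np.array([1, 1]), -0.5)    # fires for x1 OR x2
    h2 = threshold_neuron(x, np.array([-1, -1]), 1.5)   # fires unless x1 AND x2
    return threshold_neuron(np.array([h1, h2]), np.array([1, 1]), -1.5)  # AND of both

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", xor_two_layer(a, b))
```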
Abstract:
Nowadays, the Internet is a place where social networks have had a major impact on collaboration among people all over the world. This article proposes a new paradigm for building CSCW business tools following the ideas introduced by the social web to collaborate and generate awareness. An implementation of these concepts is described, including the components we provide for collaborating in workspaces (such as videoconference, chat, desktop sharing, forums or temporal events) and the way we generate awareness from these complex social data structures. Figures and validation results are also presented to show that this architecture has been designed to support awareness generation by joining current and future social data from the business and social-network worlds, based on the idea of using social data stored in the cloud.
Abstract:
This paper proposes a novel design for a reconfigurable humanoid robot head, based on the biological likeness of human beings, so that the humanoid robot can interact agreeably with people in various everyday tasks. The proposed humanoid head has a modular and adaptive structural design and is equipped with three main components: the frame, the neck motion system and the omnidirectional stereovision system modules. The omnidirectional stereovision system module, a distinguishing contribution with respect to the computer vision systems implemented in earlier humanoids, opens new research possibilities for achieving human-like behaviour. A proposal for a real-time catadioptric stereovision system is presented, including the stereo geometry for rectifying the system configuration and estimating depth. The methodology for an initial approach to visual servoing tasks is divided into two phases: first, the robust detection of moving objects, their depth estimation and position calculation; and second, the development of attention-based control strategies. The perception capabilities provided allow the extraction of 3D information over a wide field of view in uncontrolled dynamic environments, and the results are illustrated through a number of experiments.
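For a rectified stereo configuration, the depth estimation step reduces to triangulation from disparity, Z = f·B/d. The catadioptric geometry used in the paper is more involved, but the sketch below shows the underlying relation with assumed calibration values.

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Rectified-stereo triangulation: depth Z = f * B / d (illustrative, pinhole model)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Hypothetical calibration: 800 px focal length, 0.12 m baseline, 16 px disparity.
print(f"{depth_from_disparity(16, 800, 0.12):.2f} m")  # -> 6.00 m
```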
Abstract:
This paper analyzes the role of Computer Algebra Systems (CAS) in a competence-based model of learning. The proposal is an e-learning Linear Algebra course for Engineering, which includes the use of a CAS (Maxima) and focuses on problem solving. A reference model has been taken from the Spanish Open University. The proper use of CAS is defined as an indicator of the generic competence Use of Technology. Additionally, we show that using CAS can help to enhance the following generic competences: Self Learning, Planning and Organization, Communication and Writing, Mathematical and Technical Writing, Information Management, and Critical Thinking.
Abstract:
The urban microclimate plays an important role in the energy consumption of buildings and in comfort sensation in outdoor spaces. The urgent need to increase energy efficiency, reduce pollutant emissions and mitigate the evident lack of sustainability affecting cities has drawn attention to bioclimatic urbanism as a reference for changing the way cities are designed and lived in. Until now, research on microclimate and energy efficiency has concentrated mainly on guiding the design of new developments. However, the main sustainability problems of existing conurbations are the result of the speculative, highly resource-depleting growth model that characterized the real estate boom of recent decades. In Spain, as in other European countries, there is therefore a need to redirect the construction sector towards the refurbishment of the built environment, as a more sustainable alternative for the real estate market. In improving the quality of today's cities, public space plays a key role, above all as a place for citizens to meet and socialize. Thermal sensation conditions the perception of an environment, so the microclimate can be decisive for the success or failure of an urban space. The main objective of the research is therefore the definition of strategies for the bioclimatic design of existing urban environments, grounded in the morpho-typological and climatic components and in the comfort requirements of citizens. As a further element of novelty, the study addresses the regeneration of neighbourhoods built in the middle of the twentieth century, which make up much of the extended periphery of modern cities and in many cases constitute deprived areas. The research methodology is based on the evaluation of the climatic and thermal comfort conditions of different project scenarios, applied to three case studies located in a peri-urban neighbourhood of Madrid. The climatic parameters were obtained through computer simulation, based on the principles of fluid dynamics, thermodynamics and radiative exchange in the built environment. Simulation tools make it possible to predict the microclimatic conditions of the current situation and the effects of proposed measures; the great advantage of such computational systems is that different project scenarios can be evaluated and the one offering the best environmental performance selected. The results obtained for the different scenarios were compared with the comfort values of the current state, using the UTCI index as the indicator of thermal sensation. The comparative analysis led to a summary table showing the evaluation of the different refurbishment solutions. It was thus shown that there is no single constructive solution that is effective for all applications; each situation must be studied individually and the most appropriate measures applied case by case.
While computer simulation systems can provide important support during the design phase, it remains the designer's responsibility to use the most suitable tools at each stage and to choose the most appropriate solutions to meet the project objectives.
Abstract:
The term "Smart Product" has become commonly used in recent years. This is because there has been an increasing interest in these kinds of products as part of the consumer goods industry, impacting everyday life and industry. Nevertheless, the term "Smart Product" is used with different meanings in different contexts and application domains. The use of the term "Smart Product" with different meanings and underlying semantics can create important misunderstandings and dissent. The aim of this paper is to analyze the different definitions of Smart Product available in the literature, and to explore and analyze their commonalities and differences, in order to provide a consensus definition that satisfies, and can therefore be used by, all parties. To embrace the identified definitions, the concept of "Smart Thing" is introduced. The methodology used was a systematic literature review. The definition is expressed as an ontology.
Abstract:
Interconnected regulated systems are prone to impedance-based interactions, making them sensitive to instability and transient-performance degradation. The applied control method significantly affects the characteristics of the converter in terms of its sensitivity to different impedance interactions. This paper provides, for the first time, the whole set of impedance-type internal parameters and the formulas with which the interaction sensitivity can be fully explained and analyzed. The formulation given in this paper can be applied equally to measured frequency responses and to predicted analytic transfer functions. Usually, distributed dc-dc systems are constructed from ready-made power modules without thorough knowledge of the actual power-stage and control-system designs. As a consequence, the interaction characterization has to be based on the frequency responses measurable via the input and output terminals. A buck converter with four different control methods is experimentally characterized in the frequency domain to demonstrate the effect of the control method on the interaction sensitivity. The presented analytical models are used to explain the phenomena behind the changes in interaction sensitivity.
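The interaction sensitivity at a source-load interface is commonly assessed through the impedance ratio (minor-loop gain) formed by the source output impedance and the load input impedance, whether these come from measured frequency responses or from analytic transfer functions. The sketch below shows that kind of check; the margin rule, the synthetic output-impedance peak, and the constant-power-load-like input impedance are generic illustrative assumptions, not the specific formulation derived in the paper.

```python
import numpy as np

def minor_loop_gain(z_out_source: np.ndarray, z_in_load: np.ndarray) -> np.ndarray:
    """Impedance ratio Zo/Zin at the interface, evaluated on a common frequency grid."""
    return z_out_source / z_in_load

def flagged_frequencies(freqs_hz, z_out_source, z_in_load, limit_db=-6.0):
    """Frequencies where |Zo/Zin| exceeds a chosen margin (generic rule of thumb)."""
    gain_db = 20.0 * np.log10(np.abs(minor_loop_gain(z_out_source, z_in_load)))
    return freqs_hz[gain_db > limit_db]

# Synthetic data: a source output impedance with a resonant peak near 1 kHz and a
# constant-power-load-like (negative incremental) input impedance of -2 ohm.
f = np.logspace(1, 5, 400)
z_out = 0.2 + 1j * 2 * np.pi * f * 100e-6 / (1 - (f / 1e3) ** 2 + 1j * 0.1 * f / 1e3)
z_in = np.full(f.shape, -2.0 + 0.0j)
risky = flagged_frequencies(f, z_out, z_in)
print(f"{risky.size} points flagged between {risky.min():.0f} and {risky.max():.0f} Hz")
```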
Abstract:
Over the past 20 years, the use of Computer Algebra Systems (CAS) has helped with the teaching of mathematics in engineering schools. However, the traditional use of CAS only in math labs has led to a narrow view among students: the CAS is additional work, not part of the learning process. The didactic guidelines of the European Higher Education Area (EHEA) propose a new teaching–learning model based on competencies, and we suggest that the use of CAS be adapted to these new rules. In this paper, we present a model for the integrated use of CAS, and we describe and analyze two experiments carried out in the academic year 2011–2012. Our analysis suggests that the use of CAS in all learning and assessment activities has the potential to positively influence the development of competencies.