18 results for Remote Data Acquisition and Storage
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Nowadays, there are several services and applications that allow users to locate and move around different tourist areas using a mobile device. These systems can be used either via the internet or by downloading an application at specific locations such as a visitor centre. Although such applications facilitate locating and searching for points of interest, in most cases they do not meet the needs of each individual user. This paper aims to provide a solution by studying the main projects, services and applications, their routing algorithms and their treatment of real geographical data on Android mobile devices, focusing on data acquisition and processing to improve route searches in off-line environments.
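The abstract above does not name a specific routing algorithm; as a minimal sketch of the kind of off-line route search it refers to, the following Python example runs Dijkstra's algorithm over a small road graph held entirely in memory on the device (the graph, place names and costs are hypothetical).

import heapq

def dijkstra(graph, source, target):
    """Shortest path over an adjacency dict {node: [(neighbour, cost), ...]}."""
    dist = {source: 0.0}
    prev = {}
    queue = [(0.0, source)]
    visited = set()
    while queue:
        d, node = heapq.heappop(queue)
        if node in visited:
            continue
        visited.add(node)
        if node == target:
            break
        for neighbour, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour] = nd
                prev[neighbour] = node
                heapq.heappush(queue, (nd, neighbour))
    # Reconstruct the path by walking back from the target.
    path, node = [], target
    while node != source:
        path.append(node)
        node = prev[node]
    path.append(source)
    return list(reversed(path)), dist[target]

# Hypothetical points of interest stored locally on the device.
road_graph = {
    "visitor_centre": [("museum", 1.2), ("park", 0.8)],
    "park": [("museum", 0.5), ("viewpoint", 2.0)],
    "museum": [("viewpoint", 1.1)],
    "viewpoint": [],
}
print(dijkstra(road_graph, "visitor_centre", "viewpoint"))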
Abstract:
This article describes a method for determining the polydispersity index Ip2 = Mz/Mw of the molecular weight distribution (MWD) of linear polymeric materials from linear viscoelastic data. The method uses the Mellin transform of the relaxation modulus of a simple molecular rheological model. One of the main features of this technique is that it enables interesting MWD information to be obtained directly from dynamic shear experiments. It is not necessary to compute the relaxation spectrum, so the associated ill-posed problem is avoided. Furthermore, no particular shape of the continuous MWD has to be assumed in order to obtain the polydispersity index. The technique has been developed to deal with entangled linear polymers, whatever the form of the MWD. The rheological information required to obtain the polydispersity index is the storage G′(ω) and loss G″(ω) moduli, extending from the terminal zone to the plateau region. The method shows good agreement between the proposed theoretical approach and the experimental polydispersity indices of several linear polymers over a wide range of average molecular weights and polydispersity indices. It is also applicable to binary blends.
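For reference, a minimal sketch (in LaTeX notation) of the quantities named in the abstract; the definitions of the polydispersity index and of the Mellin transform are standard, while the way the method connects them is only indicated schematically here, not reproduced from the paper:

  I_{p2} = \frac{M_z}{M_w}, \qquad
  \mathcal{M}\{G\}(s) = \int_0^{\infty} G(t)\, t^{\,s-1}\, \mathrm{d}t ,

where G(t) is the relaxation modulus, itself determined by the measured storage and loss moduli G'(\omega) and G''(\omega) over the terminal-to-plateau range.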
Abstract:
The application of compositional data analysis through log ratio transformations corresponds to a multinomial logit model for the shares themselves. This model is characterized by the property of Independence of Irrelevant Alternatives (IIA). IIA states that the odds ratio, in this case the ratio of shares, is invariant to the addition or deletion of outcomes to the problem. It is exactly this invariance of the ratio that underlies the commonly used zero replacement procedure in compositional data analysis. In this paper we investigate using the nested logit model, which does not embody IIA, and an associated zero replacement procedure, and compare its performance with that of the more usual approach of using the multinomial logit model. Our comparisons exploit a data set that combines voting data by electoral division with corresponding census data for each division for the 2001 Federal election in Australia.
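As a minimal numerical illustration of the IIA property described above (with made-up shares rather than the Australian election data): under the log-ratio / multinomial logit view, the ratio of any two shares is unchanged when a third category is dropped and the remaining shares are renormalised, which is what licenses the usual zero replacement procedure.

import numpy as np

# Hypothetical vote shares for four parties in one electoral division.
shares = np.array([0.42, 0.33, 0.20, 0.05])

# Additive log-ratio transform relative to the last category,
# i.e. the quantity modelled by a multinomial logit for the shares.
alr = np.log(shares[:-1] / shares[-1])

# Drop the third category and renormalise the remaining shares.
reduced = np.delete(shares, 2)
reduced = reduced / reduced.sum()

# Under IIA the ratio of the first two shares is unchanged.
print(shares[0] / shares[1])     # 1.2727...
print(reduced[0] / reduced[1])   # 1.2727... (identical)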
Abstract:
The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data present a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system’s architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm-related data due to the system’s ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data that is distributed across multiple sites in a secure way, in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.
Abstract:
Before firms decide whether to enter a new market or not, they have the opportunity to buy information about several variables that might affect the profitability of this market. Our model differs from the existing literature on endogenous information acquisition in two respects: (1) there is uncertainty about more than one variable, and (2) information is acquired secretly. When the cost of acquiring information is small, entry decisions will be as if there was perfect information. Equilibria where each firm acquires only a small amount of information are more robust than the socially undesirable equilibria where all firms gather all information. Examples illustrate the importance of assumptions (1) and (2).
Abstract:
Gaia is the most ambitious space astrometry mission currently envisaged and is a technological challenge in all its aspects. We describe a proposal for the payload data handling system of Gaia, as an example of a high-performance, real-time, concurrent, and pipelined data system. This proposal includes the front-end systems for the instrumentation, the data acquisition and management modules, the star data processing modules, and the payload data handling unit. We also review other payload and service module elements and we illustrate a data flux proposal.
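The payload data handling chain is described above only at the architectural level; the sketch below is a generic, hypothetical illustration of a concurrent, pipelined data system in Python, with an acquisition stage feeding a processing stage through a bounded queue. It is not the actual Gaia design.

import queue
import threading

raw_frames = queue.Queue(maxsize=16)   # acquisition -> processing buffer
SENTINEL = None

def acquire(n_frames):
    """Stand-in for a front-end data acquisition module."""
    for i in range(n_frames):
        raw_frames.put({"frame": i, "counts": [i, i + 1, i + 2]})
    raw_frames.put(SENTINEL)

def process():
    """Stand-in for a downstream data processing module."""
    while True:
        frame = raw_frames.get()
        if frame is SENTINEL:
            break
        total = sum(frame["counts"])
        print(f"frame {frame['frame']}: total counts {total}")

producer = threading.Thread(target=acquire, args=(5,))
consumer = threading.Thread(target=process)
producer.start(); consumer.start()
producer.join(); consumer.join()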
Abstract:
The Powell Basin is a small oceanic basin located at the NE end of the Antarctic Peninsula, developed during the Early Miocene and mostly surrounded by the continental crusts of the South Orkney Microcontinent, the South Scotia Ridge and the Antarctic Peninsula margins. Gravity data from the SCAN 97 cruise obtained with the R/V Hespérides, together with data from the Global Gravity Grid and Sea Floor Topography (GGSFT) database (Sandwell and Smith, 1997), are used to determine the 3D geometry of the crustal-mantle interface (CMI) by numerical inversion methods. The water layer contribution and sedimentary effects were removed from the Free Air anomaly to obtain the total anomaly. Sedimentary effects were obtained from the analysis of existing and new SCAN 97 multichannel seismic (MCS) profiles. The regional anomaly was obtained after spectral analysis and filtering. The smooth 3D geometry of the crustal-mantle interface obtained after inversion of the regional anomaly shows an increase in the thickness of the crust towards the continental margins and a NW-SE oriented axis of symmetry coinciding with the position of an older oceanic spreading axis. The interface shows a moderate uplift towards the western part and two main uplifts in the northern and eastern sectors.
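The separation of the regional anomaly is described above only as spectral analysis and filtering; one common way to sketch such a separation is a low-pass filter in the wavenumber domain, shown below in Python with synthetic data and an assumed Gaussian cut-off (not the filter actually used in the study).

import numpy as np

def regional_anomaly(grid, dx_km, cutoff_km):
    """Low-pass a gridded anomaly in the wavenumber domain (Gaussian filter)."""
    ny, nx = grid.shape
    kx = np.fft.fftfreq(nx, d=dx_km)          # cycles per km
    ky = np.fft.fftfreq(ny, d=dx_km)
    k = np.sqrt(kx[None, :]**2 + ky[:, None]**2)
    lowpass = np.exp(-(k * cutoff_km)**2)     # ~1 at long wavelengths, ~0 at short ones
    return np.real(np.fft.ifft2(np.fft.fft2(grid) * lowpass))

# Synthetic total anomaly: long-wavelength trend plus short-wavelength noise.
y, x = np.mgrid[0:64, 0:64]
total = 20 * np.sin(x / 40.0) + np.random.normal(0, 2, (64, 64))
regional = regional_anomaly(total, dx_km=5.0, cutoff_km=50.0)
residual = total - regional   # short-wavelength part left for local interpretation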
Abstract:
Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of Excitation-Emission Matrices (EEM), has become an indispensable tool to study DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, Self-Organising Maps (SOM) have been proposed as a tool to explore patterns in large EEM data sets. SOM is a pattern recognition method which clusters and reduces the dimensionality of input EEMs without relying on any assumption about the data structure. In this paper, we show how SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology was that outlier samples appeared naturally integrated in the analysis. We conclude that SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
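The SOM analysis is described above at the level of method; the sketch below only illustrates, with a from-scratch NumPy implementation and random data, how EEMs might be unfolded into vectors and clustered on a small self-organising map. It is not the software or settings the authors used.

import numpy as np

def train_som(data, grid=(4, 4), epochs=200, lr0=0.5, sigma0=1.5, seed=0):
    """Minimal SOM: data is an (n_samples, n_features) array of unfolded EEMs."""
    rng = np.random.default_rng(seed)
    n_units = grid[0] * grid[1]
    weights = rng.normal(size=(n_units, data.shape[1]))
    # (row, col) position of each map unit, used by the neighbourhood function.
    pos = np.array([(r, c) for r in range(grid[0]) for c in range(grid[1])], dtype=float)
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)
        sigma = sigma0 * (1 - t / epochs) + 0.1
        for x in data[rng.permutation(len(data))]:
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))   # best matching unit
            dist2 = np.sum((pos - pos[bmu])**2, axis=1)
            h = np.exp(-dist2 / (2 * sigma**2))                    # neighbourhood kernel
            weights += lr * h[:, None] * (x - weights)
    return weights

# Hypothetical data set: 30 EEMs of 20 x 15 excitation/emission pairs, unfolded.
eems = np.random.rand(30, 20 * 15)
codebook = train_som(eems)
clusters = np.argmin(np.linalg.norm(eems[:, None, :] - codebook[None, :, :], axis=2), axis=1)
print(clusters)   # map unit assigned to each sample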
Abstract:
A beautiful smile is directly associated with white teeth. Nowadays, oral care has grown and developed procedures for achieving beautiful smiles. Dental bleaching is frequently used in odontology, not just for health care but also for aesthetic treatment. With the possibility of tooth bleaching, the key question becomes how white the tooth actually is, because color is related to individual perception. In order to assess the correct color of teeth, many color guides, models, color spaces and analytical methods have been developed. Despite all of these useful tools, color interpretation depends on environmental factors, the position of the sample during data acquisition and, most importantly, the sensitivity of the instrument. The common methods have proved useful: they are easy to handle and some are portable, but they do not have high sensitivity. The present work is based on the integration of a new analytical technique for color acquisition. Hyperspectral imaging (HSI) is able to perform image analysis with high quality and efficiency. HSI is used in many fields, and here we used it for color image analysis of the bleaching process. The main comparison was carried out between HSI and a colorimeter over the course of two different bleaching protocols. The results showed that HSI has higher sensitivity than the colorimeter. While analyzing the dental surface with HSI, we were also able to detect surface changes; these changes were analyzed by roughness studies.
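The abstract does not state which color difference metric was used; as a hypothetical sketch of how a bleaching-induced color change could be quantified from two CIELAB measurements, the standard CIE76 color difference is computed below (the L*a*b* readings are invented).

import math

def delta_e_cie76(lab1, lab2):
    """Euclidean distance between two CIELAB colors (CIE76 color difference)."""
    return math.sqrt(sum((a - b)**2 for a, b in zip(lab1, lab2)))

# Hypothetical L*a*b* readings of the same tooth before and after bleaching.
before = (72.4, 1.8, 18.5)
after = (78.1, 0.9, 12.3)
print(delta_e_cie76(before, after))   # larger values = more perceptible change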
Abstract:
Research project that carries out a classification study of the courses taken in the Administración y Dirección de Empresas (Business Administration and Management) degree at the UOC in relation to their results. Different methods and models are proposed for understanding the environment in which the study is carried out.
Abstract:
Marketing scholars have suggested a need for more empirical research on consumer response to malls, in order to gain a better understanding of the variables that explain consumer behavior. The CHAID (Chi-square Automatic Interaction Detection) segmentation methodology was used to identify the profiles of consumers with regard to their activities at malls, on the basis of socio-demographic variables and behavioral variables (how and with whom they go to the malls). A sample of 790 subjects answered an online questionnaire. The CHAID analysis of the results was used to identify the profiles of consumers with regard to their activities at malls. Among the variables analyzed, the mode of transport used to go shopping and the frequency of visits to the centers are the main predictors of behavior in malls. The results provide guidelines for the development of effective strategies to attract consumers to malls and retain them there.
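CHAID selects splits using chi-square tests of independence between a candidate predictor and the response; the sketch below shows such a test with SciPy on a made-up contingency table (not the actual 790-respondent data set).

import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows = transport used to reach the mall
# (car, public transport, on foot), columns = main activity (shopping, leisure).
table = np.array([
    [120,  60],
    [ 45,  70],
    [ 30,  65],
])

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p_value:.4f}, dof = {dof}")
# In CHAID, the predictor with the most significant (smallest p) split is chosen first.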