830 results for Ubiquitous and pervasive computing


Relevance:

100.00%

Publisher:

Abstract:

Dissociation of molecular hydrogen is an important step in a wide variety of chemical, biological, and physical processes. Due to the light mass of hydrogen, it is recognized that quantum effects are often important to its reactivity. However, understanding how quantum effects impact the reactivity of hydrogen is still in its infancy. Here, we examine this issue using a well-defined Pd/Cu(111) alloy that allows the activation of hydrogen and deuterium molecules to be examined at individual Pd atom surface sites over a wide range of temperatures. Experiments comparing the uptake of hydrogen and deuterium as a function of temperature reveal completely different behavior of the two species. The rate of hydrogen activation increases at lower sample temperature, whereas deuterium activation slows as the temperature is lowered. Density functional theory simulations in which quantum nuclear effects are accounted for reveal that tunneling through the dissociation barrier is prevalent for H2 up to ∼190 K and for D2 up to ∼140 K. Kinetic Monte Carlo simulations indicate that the effective barrier to H2 dissociation is so low that hydrogen uptake on the surface is limited merely by thermodynamics, whereas the D2 dissociation process is controlled by kinetics. These data illustrate the complexity and inherent quantum nature of this ubiquitous and seemingly simple chemical process. Examining these effects in other systems with a similar range of approaches may uncover temperature regimes where quantum effects can be harnessed, yielding greater control of bond-breaking processes at surfaces and uncovering useful chemistries such as selective bond activation or isotope separation.
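The isotope effect described here can be illustrated, very roughly, with the lowest-order Wigner tunneling correction to a classical rate. This is a minimal sketch under assumed numbers, not the quantum-nuclear DFT or kinetic Monte Carlo machinery used in the work:

```python
import numpy as np

HBAR = 1.054571817e-34  # reduced Planck constant, J s
KB = 1.380649e-23       # Boltzmann constant, J/K

def wigner_correction(omega_barrier, T):
    """Lowest-order Wigner tunneling enhancement of a classical rate.
    omega_barrier: magnitude of the imaginary barrier frequency (rad/s).
    Valid only for modest enhancements; used here just to show the trend."""
    u = HBAR * omega_barrier / (KB * T)
    return 1.0 + u**2 / 24.0

# Assumed barrier frequency for H2; for D2 it scales as 1/sqrt(2)
# because the isotope is twice as heavy on the same potential surface.
omega_h2 = 2 * np.pi * 3.0e13   # rad/s, illustrative value only
omega_d2 = omega_h2 / np.sqrt(2)

for T in (100, 140, 190, 300):  # K
    print(f"T = {T:3d} K: kappa(H2) = {wigner_correction(omega_h2, T):6.1f}, "
          f"kappa(D2) = {wigner_correction(omega_d2, T):6.1f}")
```

The lighter isotope's larger low-temperature enhancement mirrors the reported trend that tunneling persists to higher temperatures for H2 (up to ∼190 K) than for D2 (up to ∼140 K).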

Relevance:

100.00%

Publisher:

Abstract:

Tensor analysis plays an important role in modern image and vision computing problems. Most existing tensor analysis approaches are based on the Frobenius norm, which makes them sensitive to outliers. In this paper, we propose L1-norm-based tensor analysis (TPCA-L1), which is robust to outliers. Experimental results on face and other datasets demonstrate the advantages of the proposed approach. © 2006 IEEE.
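TPCA-L1 generalizes L1-norm PCA to tensors. As a hedged illustration of the underlying robustness idea only, here is the fixed-point iteration for the leading L1-norm principal direction in the plain vector case (Kwak's PCA-L1; the tensor version in the paper differs in its details):

```python
import numpy as np

def pca_l1(X, n_iter=100, seed=0):
    """Leading L1-norm principal direction by fixed-point iteration.
    X: (n_samples, n_features), assumed centered. Maximizes
    sum_i |w . x_i| instead of the Frobenius/L2 criterion
    sum_i (w . x_i)^2, which down-weights outliers."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        s = np.sign(X @ w)          # each sample votes with bounded weight
        s[s == 0] = 1.0
        w_new = X.T @ s
        w_new /= np.linalg.norm(w_new)
        if np.allclose(w_new, w):   # converged
            break
        w = w_new
    return w
```

Because each sample contributes sign(w·x)·x rather than (w·x)·x, a far-away outlier pulls on the direction with bounded weight, which is the source of the robustness.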

Relevance:

100.00%

Publisher:

Abstract:

As massive data sets become increasingly available, people face the problem of how to effectively process and understand them. Traditional sequential computing models are giving way to parallel and distributed computing models such as MapReduce, owing both to the large size of the data sets and to their high dimensionality. This dissertation, in the same direction as other MapReduce-based research, develops effective techniques and applications using MapReduce that can help people solve large-scale problems. Three different problems are tackled in the dissertation. The first deals with processing terabytes of raster data in a spatial data management system: aerial imagery files are broken into tiles to enable data-parallel computation. The second and third problems deal with dimension reduction techniques for data sets of high dimensionality. Three variants of the nonnegative matrix factorization technique are scaled up to factorize matrices with dimensions on the order of millions in MapReduce, based on different matrix multiplication implementations. Two algorithms, which compute CANDECOMP/PARAFAC and Tucker tensor decompositions respectively, are parallelized in MapReduce by carefully partitioning the data and arranging the computation to maximize data locality and parallelism.
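The matrix multiplication building block has a classic MapReduce formulation: map both matrices by their shared inner index, join in the reducer, then sum partial products per output cell. A single-machine sketch that mimics that dataflow (the dissertation's actual implementations shard this across a cluster):

```python
from collections import defaultdict

def mapreduce_matmul(A_entries, B_entries):
    """Sparse C = A @ B in the classic join-on-inner-index style.
    A_entries: triples (i, k, value); B_entries: triples (k, j, value)."""
    # "map" phase: group both matrices by the shared inner index k
    by_k = defaultdict(lambda: ([], []))
    for i, k, v in A_entries:
        by_k[k][0].append((i, v))
    for k, j, v in B_entries:
        by_k[k][1].append((j, v))
    # "reduce" phase: join on k, then sum partial products per (i, j)
    C = defaultdict(float)
    for a_vals, b_vals in by_k.values():
        for i, va in a_vals:
            for j, vb in b_vals:
                C[(i, j)] += va * vb
    return dict(C)

# usage: A = [[1, 2], [3, 4]], B = [[5, 6], [7, 8]]
A = [(0, 0, 1), (0, 1, 2), (1, 0, 3), (1, 1, 4)]
B = [(0, 0, 5), (0, 1, 6), (1, 0, 7), (1, 1, 8)]
print(mapreduce_matmul(A, B))  # {(0,0): 19, (0,1): 22, (1,0): 43, (1,1): 50}
```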

Relevance:

100.00%

Publisher:

Abstract:

Many systems and applications continuously produce events. These events record the status of the system and trace its behavior; by examining them, system administrators can check for potential problems. If the temporal dynamics of the systems are further investigated, underlying patterns can be discovered, and the uncovered knowledge can be leveraged to predict future system behavior or to mitigate potential risks. Moreover, system administrators can use the temporal patterns to set up event management rules that make the system more intelligent. With the popularity of data mining techniques in recent years, these events have gradually become more and more useful. Despite recent advances in data mining, its application to system event mining is still at a rudimentary stage. Most existing work still focuses on episode mining or frequent pattern discovery. These methods are unable to provide a brief yet comprehensible summary that reveals the valuable information from a high-level perspective, and they provide little actionable knowledge to help system administrators better manage their systems. To make better use of the recorded events, more practical techniques are required. From the perspective of data mining, three correlated directions are considered helpful for system management: (1) providing concise yet comprehensive summaries of the running status of the systems; (2) making the systems more intelligent and autonomous; (3) effectively detecting abnormal system behavior. Owing to the richness of the event logs, all three directions can be pursued in a data-driven manner; in this way, the robustness of the systems can be enhanced and the goal of autonomous management approached. This dissertation focuses on these directions, leveraging temporal mining techniques to facilitate system management. More specifically, three concrete topics are discussed: event summarization, resource demand prediction, and streaming anomaly detection. Besides the theoretical contributions, experimental evaluations are presented to demonstrate the effectiveness and efficiency of the corresponding solutions.
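Of the three topics, streaming anomaly detection admits a compact illustration. A minimal sketch assuming a simple exponentially weighted z-score test over an event-rate stream; the dissertation's techniques are more elaborate:

```python
class StreamingZScoreDetector:
    """Toy streaming anomaly detector over a numeric event-rate series.
    Maintains an exponentially weighted mean/variance and flags points
    far from the running estimate."""

    def __init__(self, alpha=0.05, threshold=3.0):
        self.alpha = alpha          # smoothing factor for the moments
        self.threshold = threshold  # flag if |x - mean| > threshold * std
        self.mean = None
        self.var = 0.0

    def update(self, x):
        if self.mean is None:       # first observation seeds the estimate
            self.mean = x
            return False
        diff = x - self.mean
        std = self.var ** 0.5
        is_anomaly = std > 0 and abs(diff) > self.threshold * std
        # update the running moments only after testing the point
        self.mean += self.alpha * diff
        self.var = (1 - self.alpha) * (self.var + self.alpha * diff * diff)
        return is_anomaly

# usage over a stream of per-minute event counts
detector = StreamingZScoreDetector()
for count in [10, 11, 9, 10, 12, 10, 48, 11]:
    if detector.update(count):
        print("anomalous event rate:", count)
```

Each arriving point is tested against the running estimate before the estimate is updated, so a large spike cannot mask itself.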

Relevance:

100.00%

Publisher:

Abstract:

Autism Spectrum Disorder (ASD) is defined as “the presence of severe and pervasive impairments in reciprocal social interaction and in verbal and nonverbal communication skills” (Diagnostic & Statistical Manual, 2000). It is estimated that 1 in 68 children across the United States is diagnosed with ASD. One of the most common delays that children diagnosed with ASD experience is a language delay. Children with ASD who have a language delay will often develop maladaptive behaviors as a result of poor communication skills (Carr & Durand, 1985). The failure to develop mand acquisition in typical fashion results in behaviors ranging from social withdrawal to self-injurious behaviors (Cooper et al., 2007). A lack of a strong tact repertoire can further impede and complicate the learning of other necessary components of language, owing to the inability to successfully label items and events in the child's physical environment. The purpose of this study is to replicate, with a reversal of the verbal operant training order, the procedures described in Wallace et al. (2006), in which two children with ASD underwent tact training to facilitate the formation of mands; essentially, this study aims to conduct mand training first to establish tacts. It is hypothesized that mand training will result in a greater repertoire of tacts because of the strength of the relationship between mands and control over the social environment (Cooper et al., 2007). The two children in the study will be taught to mand for items ranked in order of preference via a stimulus preference assessment. This study is of great importance because of the indispensable value of effective social communication skills. Data gathered on improving communication skills are of great value to the ASD community, as the implications for functional skills include better communication with family and greater control of individual functioning.

Relevance:

100.00%

Publisher:

Abstract:

Multi-cloud applications are composed of services offered by multiple cloud platforms, where the user/developer has full knowledge of the use of such platforms. Using multiple cloud platforms avoids the following problems: (i) vendor lock-in, i.e., the application's dependency on a particular cloud platform, which is harmful in the case of degradation or failure of platform services, or of price increases for service usage; (ii) degradation or failure of the application due to fluctuations in the quality of service (QoS) provided by some cloud platform, or to the failure of a service. In a multi-cloud scenario it is possible to replace a failing service, or one with QoS problems, with an equivalent service from another cloud platform. For an application to adopt the multi-cloud perspective, mechanisms are needed that can select which cloud services/platforms should be used in accordance with the requirements set by the programmer/user. In this context, the major challenges in developing such applications include: (i) choosing which underlying services and cloud computing platforms should be used, based on the user requirements defined in terms of functionality and quality; (ii) continually monitoring the dynamic information related to cloud services (such as response time, availability, and price), in addition to coping with the wide variety of services; and (iii) adapting the application if QoS violations affect user-defined requirements. This PhD thesis proposes an approach for the dynamic adaptation of multi-cloud applications, to be applied when a service becomes unavailable or when the user/developer's requirements indicate that another available multi-cloud configuration would meet them more efficiently. The proposed strategy is composed of two phases. The first phase consists of modeling the application, exploiting the capacity to represent commonalities and variability proposed in the context of the Software Product Line (SPL) paradigm. This phase uses an extended feature model to specify the cloud service configuration to be used by the application (commonalities) and the different possible providers for each service (variability). Furthermore, the non-functional requirements associated with cloud services are specified by properties in this model that describe dynamic information about these services. The second phase consists of an autonomic process based on the MAPE-K control loop, which is responsible for optimally selecting a multi-cloud configuration that meets the established requirements and for performing the adaptation. The proposed adaptation strategy is independent of the programming technique used to perform the adaptation; in this work we implement it using several techniques, such as aspect-oriented programming, context-oriented programming, and component- and service-oriented programming. Based on the proposed steps, we assess: (i) whether the modeling process and the specification of non-functional requirements can ensure effective monitoring of user satisfaction; (ii) whether the optimal selection process presents significant gains compared with a sequential approach; and (iii) which techniques offer the best trade-off between development effort/modularity and performance.
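The second phase's MAPE-K loop can be sketched as follows. Provider names, QoS fields, and thresholds are invented, and selection is reduced to a "cheapest feasible provider" rule rather than the thesis's optimal selection process:

```python
import random  # stands in for real monitoring probes

def monitor(providers):
    """M: collect dynamic QoS info (response time, availability, price)."""
    return {p: {"rt": random.uniform(50, 500),
                "avail": random.uniform(0.95, 1.0),
                "price": random.uniform(0.01, 0.10)} for p in providers}

def analyze(qos, reqs):
    """A: detect providers violating the user-defined requirements."""
    return {p for p, m in qos.items()
            if m["rt"] > reqs["max_rt"] or m["avail"] < reqs["min_avail"]}

def plan(qos, reqs):
    """P: pick the cheapest provider that meets the requirements."""
    feasible = [p for p, m in qos.items()
                if m["rt"] <= reqs["max_rt"] and m["avail"] >= reqs["min_avail"]]
    return min(feasible, key=lambda p: qos[p]["price"]) if feasible else None

def execute(current, target):
    """E: perform the adaptation (rebind the service)."""
    if target and target != current:
        print(f"rebinding service: {current} -> {target}")
        return target
    return current

# one loop iteration over hypothetical providers (K = the shared QoS data)
current = "cloudA"
reqs = {"max_rt": 300, "min_avail": 0.99}
qos = monitor(["cloudA", "cloudB", "cloudC"])
if current in analyze(qos, reqs):
    current = execute(current, plan(qos, reqs))
```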

Relevance:

100.00%

Publisher:

Abstract:

A method of accurately controlling the position of a mobile robot using an external Large Volume Metrology (LVM) instrument is presented in this paper. By utilizing an LVM instrument such as a laser tracker in mobile robot navigation, many of the most difficult problems in mobile robot navigation can be simplified or avoided. Using the real-time position information from the laser tracker, a very simple navigation algorithm, and a low-cost robot, 5 mm repeatability was achieved over a volume of 30 m radius. A surface digitization scan of a wind turbine blade section was also demonstrated, illustrating possible applications of the method to manufacturing processes. © Springer-Verlag Berlin Heidelberg 2010.
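The core idea — close the control loop on externally measured positions rather than on odometry — can be sketched with a simple go-to-goal law. The gains, the kinematic model, and the control law itself are illustrative assumptions, not the navigation algorithm of the paper:

```python
import math

def control_step(measured_pose, target, k_lin=0.5, k_ang=1.5):
    """Compute (v, w) commands from an externally measured pose.
    measured_pose = (x, y, heading) as reported by the tracker;
    target = (x, y). Proportional gains are illustrative assumptions."""
    x, y, th = measured_pose
    dx, dy = target[0] - x, target[1] - y
    dist = math.hypot(dx, dy)
    err = math.atan2(dy, dx) - th
    err = math.atan2(math.sin(err), math.cos(err))  # wrap to [-pi, pi]
    return k_lin * dist, k_ang * err  # forward speed, turn rate

# each cycle: read the tracker, compute commands, send them to the robot;
# because the pose is measured externally, odometry drift never accumulates
pose = (0.0, 0.0, 0.0)  # stand-in for a tracker reading
v, w = control_step(pose, target=(2.0, 1.0))
print(f"v = {v:.2f} m/s, w = {w:.2f} rad/s")
```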

Relevance:

100.00%

Publisher:

Abstract:

Dimensional and form inspections are key to the manufacturing and assembly of products. Product verification can involve a number of different measuring instruments, each operated using its own dedicated software. Typically, each of these instruments with its associated software is better suited than the others to verifying a particular pre-specified quality characteristic of the product. The number of different systems and software applications needed to perform a complete measurement of products and assemblies within a manufacturing organisation is therefore expected to be large, and it grows as advances in measurement technologies are made. The idea of a universal software application for any instrument still appears to be only a theoretical possibility, so a need for information integration is apparent. In this paper, the design of an information system to consistently manage (store, search, retrieve, secure) measurement results from various instruments and software applications is introduced. The system rests on two main ideas. First, the structures and formats of measurement files are abstracted from the data, so that the complexity of, and incompatibility between, different approaches to measurement data modelling are avoided. Secondly, the information within a file is enriched with meta-information to facilitate its consistent storage and retrieval. To demonstrate the designed information system, a web application is implemented. © Springer-Verlag Berlin Heidelberg 2010.
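The meta-information idea can be illustrated with a record structure that wraps an opaque vendor file in searchable metadata. Field names here are hypothetical, not the schema from the paper:

```python
import json

# a sketch of the meta-information envelope around an opaque result file
record = {
    "meta": {
        "instrument": "laser tracker",         # measuring instrument used
        "software": "vendor-suite 4.2",        # producing application
        "quality_characteristic": "flatness",  # what was verified
        "part_id": "blade-section-017",
        "timestamp": "2010-06-01T09:30:00Z",
        "file_format": "proprietary-binary",   # format kept abstract
    },
    # the raw file is stored as-is (e.g., a blob reference), so the store
    # never needs to parse every vendor format to index and retrieve it
    "payload_ref": "blob://measurements/blade-section-017/run-003",
}
print(json.dumps(record, indent=2))
```

Queries run against the metadata alone, which is how the design avoids coupling storage and retrieval to any one instrument's file format.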

Relevance:

100.00%

Publisher:

Abstract:

Integration of the measurement activity into the production process is an essential rule in digital enterprise technology, especially for large-volume product manufacturing such as the aerospace, shipbuilding, power generation, and automotive industries. Measurement resource planning is a structured method of selecting and deploying the measurement resources necessary to implement the quality aims of product development. In this research, a new mapping approach for measurement resource planning is proposed. Firstly, quality aims are identified in the form of a number of specifications and engineering requirements on quality characteristics (QCs) at a specific stage of the product life cycle, and measurement systems are classified according to the attributes of the QCs. Secondly, a matrix mapping approach for measurement resource planning is outlined, together with an optimization algorithm for matching quality aims with measurement systems. Finally, the proposed methodology is studied in shipbuilding to solve a measurement resource planning problem, in which the measurement resources are deployed to satisfy all the quality aims. © Springer-Verlag Berlin Heidelberg 2010.
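Matching quality aims to measurement systems can be posed as an assignment problem. A sketch using the Hungarian algorithm via SciPy, with invented costs; the paper's matrix mapping approach and optimization algorithm may differ:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# cost[i][j]: cost of covering quality aim i with measurement system j
# (values are illustrative; the paper derives them from QC attributes)
cost = np.array([
    [2.0, 5.0, 4.0],
    [6.0, 1.0, 3.0],
    [4.0, 7.0, 2.0],
])
aims, systems = linear_sum_assignment(cost)  # minimizes total cost
for i, j in zip(aims, systems):
    print(f"quality aim {i} -> measurement system {j} (cost {cost[i, j]})")
print("total deployment cost:", cost[aims, systems].sum())
```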

Relevance:

100.00%

Publisher:

Abstract:

Product quality planning is a fundamental part of quality assurance in manufacturing. It comprises the distribution of quality aims over each phase of product development and the deployment of quality operations and resources to accomplish these aims. This paper proposes a quality planning methodology based on risk assessment, in which the planning tasks of product development are translated into evaluations of risk priorities. Firstly, a comprehensive model for quality planning is developed to address the deficiencies of traditional quality function deployment (QFD) based quality planning. Secondly, a novel failure knowledge base (FKB) based method is discussed. A mathematical method and algorithm for risk assessment are then presented for target decomposition, measure selection, and sequence optimization. Finally, the proposed methodology has been implemented in a web-based prototype software system, QQ-Planning, to solve the quality planning problem of distributing quality targets and deploying quality resources in such a way that the product requirements are satisfied and the enterprise resources are highly utilized. © Springer-Verlag Berlin Heidelberg 2010.
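One common way of turning planning tasks into risk priorities is the FMEA-style risk priority number. A minimal sketch under that assumption (the paper's risk-assessment mathematics may differ):

```python
# FMEA-style risk priority numbers: RPN = severity * occurrence * detection,
# each rated on a 1-10 scale; failure modes and ratings are invented.
failure_modes = [
    # (name, severity, occurrence, detection)
    ("weld porosity",     8, 4, 6),
    ("dimensional drift", 6, 7, 3),
    ("surface scratch",   3, 5, 2),
]
ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"{name}: RPN = {s * o * d}")
# highest-RPN items receive quality measures and resources first
```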

Relevance:

100.00%

Publisher:

Abstract:

This paper details a method of determining the uncertainty of dimensional measurement for a three-dimensional coordinate measuring machine. An experimental procedure was developed to compare three-dimensional coordinate measurements with calibrated reference points. The reference standard used to calibrate these reference points was a fringe-counting interferometer, with the multilateration technique employed to establish three-dimensional coordinates. This is an extension of the established technique of comparing measured lengths with calibrated lengths. Specifically, a distributed coordinate measurement device consisting of a network of Rotary-Laser Automatic Theodolites (R-LATs) was tested; this system is known commercially as indoor GPS (iGPS). The method was found to be practical and established that the expanded uncertainty of the basic iGPS system was approximately 1 mm at a 95% confidence level. © Springer-Verlag Berlin Heidelberg 2010.
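The headline figure can be reproduced in miniature: expanded uncertainty at 95% confidence is conventionally U = k·u with coverage factor k ≈ 2. A GUM-style sketch with invented residuals, not the paper's full procedure:

```python
import numpy as np

def expanded_uncertainty(errors, k=2.0):
    """Expanded uncertainty U = k * u from a sample of measurement errors
    (measured minus calibrated reference coordinates). k = 2 corresponds
    to ~95% coverage for a normal distribution."""
    u = np.std(errors, ddof=1)  # standard uncertainty from repeats
    return k * u

# hypothetical residuals (mm) between iGPS points and reference points
residuals = np.array([0.42, -0.38, 0.51, -0.47, 0.30, -0.55, 0.44, -0.33])
print(f"U(95%) = {expanded_uncertainty(residuals):.2f} mm")
```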

Relevance:

100.00%

Publisher:

Abstract:

Humans are profoundly affected by the surroundings they inhabit. Environmental psychologists have produced numerous credible theories describing optimal human environments, based on the concept of congruence or “fit” (1, 2). Lack of person/environment fit can lead to stress-related illness and a lack of psychosocial well-being (3). Conversely, appropriately designed environments can promote wellness (4) or “salutogenesis” (5). Increasingly, research in the area of Evidence-Based Design, largely concentrated in healthcare architecture, has tended to bear out these theories (6). Patients and long-term care residents, because of injury, illness or physical/cognitive impairment, are less likely to be able to intervene to modify their immediate environment, unless it is designed specifically to facilitate their particular needs. In the context of care settings, the detailed design of personal space therefore takes on enormous significance. MyRoom conceptualises a personalisable room, using sensing and networked computing to enable the environment to respond directly and continuously to the occupant. Bio-signals collected and relayed to the system actuate applications intended to positively influence user well-being. Drawing on the evidence base for therapeutic design interventions (7), real-time changes in ambient lighting, colour, image, etc. respond continuously to the user's physiological state, optimising congruence. Based on research evidence, consideration is also given to the development of an application using natural images (8). It is envisaged that actuation will require machine learning based on interpretation of data gathered by sensors; sensing arrangements may vary depending on context and end-user. Such interventions aim to reduce inappropriate stress or provide stimulation, supporting both instrumental and cognitive tasks.
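A toy version of the sensing-to-actuation loop, with an invented bio-signal and mapping; as noted above, real actuation would be machine-learned rather than a fixed rule:

```python
def lighting_level(heart_rate_bpm, baseline=65.0):
    """Map arousal above a personal baseline to a dimmer light level in
    [0.2, 1.0]; higher inferred stress -> softer ambient light. The
    signal, baseline, and mapping are illustrative assumptions."""
    stress = max(0.0, (heart_rate_bpm - baseline) / baseline)
    return max(0.2, 1.0 - stress)

for hr in (60, 70, 85, 100):  # simulated sensor readings (bpm)
    print(hr, "bpm ->", round(lighting_level(hr), 2))
```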

Relevance:

100.00%

Publisher:

Abstract:

In this thesis we study aspects of (0,2) superconformal field theories (SCFTs), which are suitable for compactification of the heterotic string. In the first part, we study a class of (2,2) SCFTs obtained by fibering a Landau-Ginzburg (LG) orbifold CFT over a compact Kähler base manifold. While such models are naturally obtained as phases in a gauged linear sigma model (GLSM), our construction is independent of such an embedding. We discuss the general properties of such theories and present a technique to study the massless spectrum of the associated heterotic compactification. We test the validity of our method by applying it to hybrid phases of GLSMs and comparing spectra among the phases. In the second part, we turn to the study of the role of accidental symmetries in two-dimensional (0,2) SCFTs obtained by RG flow from (0,2) LG theories. These accidental symmetries are ubiquitous, and, unlike in the case of (2,2) theories, their identification is key to correctly identifying the IR fixed point and its properties. We develop a number of tools that help to identify such accidental symmetries in the context of (0,2) LG models and provide a conjecture for a toric structure of the SCFT moduli space in a large class of models. In the final part, we study the stability of heterotic compactifications described by (0,2) GLSMs with respect to worldsheet instanton corrections to the space-time superpotential following the work of Beasley and Witten. We show that generic models elude the vanishing theorem proved there, and may not determine supersymmetric heterotic vacua. We then construct a subclass of GLSMs for which a vanishing theorem holds.

Relevance:

100.00%

Publisher:

Abstract:

The goal of this thesis is to explore the seismic potential of pulsating white dwarf stars, in particular those with hydrogen-rich atmospheres, the ZZ Ceti stars. The technique of asteroseismology exploits the information contained in the normal modes of vibration that can be excited during particular phases of a star's evolution. These modes modulate the emergent flux of the pulsating star and manifest themselves mainly as multi-periodic luminosity variations. Asteroseismology therefore consists of examining the luminosity of pulsating stars as a function of time, in order to extract the periods, apparent amplitudes, and relative phases of the detected pulsation modes, using standard signal-processing methods such as Fourier techniques. The next step is to compare the observed pulsation periods with periods generated by a stellar model, seeking the best match with a physical model that reproduces the pulsating star as faithfully as possible. To ensure an optimal search of the parameter space, good physical models, an efficient period-matching optimization algorithm, and considerable computing power are needed. The pulsation periods of white dwarf stellar models can generally be computed precisely and reliably on the basis of the linear theory of stellar pulsations in its adiabatic version. To fully define a static white dwarf model suitable for asteroseismological analysis, it is necessary to specify the surface gravity, the effective temperature, and various parameters describing the layered structure of the envelope. By using in parallel the information obtained independently (effective temperature and surface gravity) from the spectroscopic method, it becomes possible to verify the validity of the solution obtained and to dramatically narrow the parameter space. A successful asteroseismological exercise therefore leads to the precise determination of the global structural parameters of the pulsating star and provides unique information on its internal structure and evolutionary state. This thesis presents the complete, successful analysis, from frequency extraction to the seismic solution, of four pulsating white dwarf stars. It was possible to determine the structural parameters of these stars and to compare them remarkably well with all the independent constraints available in the literature, but also to make inferences about the internal dynamics and to reconstruct the internal rotation profile. First, the pair of ZZ Ceti stars GD 165 and Ross 548 is analyzed in order to understand the differences between their pulsation properties, despite the fact that they are, spectroscopically speaking, similar stars in every respect. The seismic analysis reveals different internal structures and uncovers the sensitivity of certain pulsation modes to the internal composition of the stellar core. To cope with this newly discovered sensitivity, and to do justice to the exceptional-quality data provided by the Kepler and Kepler2 space missions, we develop a new parameterization of the chemical profiles in the core and validate the robustness of our technique and models through numerous tests.
With the new core parameterization in hand, we finally attain the “Holy Grail” of asteroseismology, being able for the first time to reproduce the observed periods to within the precision of the observations, in the seismic study of the stars KIC 08626021 and GD 1212.
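The first step described above — extracting periods from unevenly sampled luminosity time series with Fourier-type techniques — can be sketched with a Lomb-Scargle periodogram. The light curve below is synthetic and the numbers invented; the thesis's actual frequency-extraction pipeline is more involved:

```python
import numpy as np
from scipy.signal import lombscargle

# synthetic light curve with two pulsation periods (seconds are typical
# for ZZ Ceti stars) plus noise, sampled at uneven times
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 6 * 3600, 2000))  # observation times (s)
flux = (0.010 * np.sin(2 * np.pi * t / 215.0)
        + 0.006 * np.sin(2 * np.pi * t / 271.0)
        + 0.002 * rng.standard_normal(t.size))

# Fourier-style search: periodogram power over a grid of trial periods
periods = np.linspace(150, 400, 5000)        # trial periods (s)
ang_freqs = 2 * np.pi / periods              # lombscargle wants rad/s
power = lombscargle(t, flux - flux.mean(), ang_freqs)
print("strongest period ~", round(periods[np.argmax(power)], 1), "s")
```

In practice the strongest peak would be measured, its sinusoid subtracted ("prewhitening"), and the search repeated until all significant modes are extracted.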