896 results for Pattern-based interaction models


Relevance: 100.00%

Abstract:

Point pattern matching in Euclidean spaces is one of the fundamental problems in pattern recognition, with applications ranging from computer vision to computational chemistry. Whenever two complex patterns are encoded by two sets of points identifying their key features, their comparison can be seen as a point pattern matching problem. This work proposes a single approach to both exact and inexact point set matching in Euclidean spaces of arbitrary dimension. For exact matching, the approach is guaranteed to find an optimal solution; for inexact matching (when noise is involved), experimental results confirm its validity. We start by regarding point pattern matching as a weighted graph matching problem, and then formulate weighted graph matching as Bayesian inference in a probabilistic graphical model. By exploiting fundamental constraints on patterns embedded in Euclidean spaces, we prove that for exact point set matching a simple graphical model is equivalent to the full model. Exact probabilistic inference in this simple model has polynomial time complexity with respect to the number of elements in the patterns to be matched. This gives rise to a technique that, for exact matching, provably finds a global optimum in polynomial time for any dimensionality of the underlying Euclidean space. Computational experiments comparing this technique with well-known probabilistic relaxation labeling show significant performance improvements for inexact matching, and the proposed approach remains significantly more robust as the sizes of the involved patterns grow. In the absence of noise, the results are always perfect.
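To make the problem concrete, here is a minimal brute-force sketch of exact point pattern matching as a pairwise-distance consistency search (our illustration, with factorial complexity; the thesis's graphical-model inference is what brings this down to polynomial time):

```python
import numpy as np
from itertools import permutations

def exact_match(P, Q, tol=1e-9):
    """Map pattern P onto a subset of scene Q so that all pairwise
    Euclidean distances are preserved. Brute force, for illustration only."""
    n = len(P)
    dP = np.linalg.norm(P[:, None] - P[None, :], axis=-1)  # pattern distance matrix
    for cand in permutations(range(len(Q)), n):            # candidate assignments
        Qs = Q[list(cand)]
        dQ = np.linalg.norm(Qs[:, None] - Qs[None, :], axis=-1)
        if np.allclose(dP, dQ, atol=tol):                  # isometry-consistent?
            return dict(enumerate(cand))                   # pattern index -> scene index
    return None

P = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
Q = np.vstack([[5.0, 5.0], P + [3.0, 1.0]])  # clutter point + translated copy of P
print(exact_match(P, Q))                     # {0: 1, 1: 2, 2: 3}
```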

Relevance: 100.00%

Abstract:

Ubiquitous computing raises new usability challenges that cut across design and development. We are particularly interested in environments enhanced with sensors, public displays and personal devices. How can prototypes be used to explore users' mobility and interaction, both explicit and implicit, with services in these environments? Because of the potential cost of development and of design failure, these systems must be explored through early assessment techniques, rather than by deploying versions of the system that could disrupt the target environment. Such techniques make it possible to evaluate alternative solutions before deciding to deploy the system on location, which is crucial for a successful development that anticipates potential user problems and reduces the cost of redesign. This thesis reports on the development of a framework for the rapid prototyping and analysis of ubiquitous computing environments that facilitates the evaluation of design alternatives. It describes APEX, a framework that brings together an existing 3D Application Server with a modelling tool. APEX-based prototypes enable users to navigate a virtual world simulation of the envisaged ubiquitous environment, and by this means to experience many of the features of the proposed design. Prototypes and their simulations are generated in the framework to help the developer understand how the user might experience the system. They are supported through three different layers: a simulation layer (using a 3D Application Server), a modelling layer (using a modelling tool) and a physical layer (using external devices and real users). APEX allows the developer to move between these layers to evaluate different features. It supports exploration of the user experience through observation of how users might behave with the system, as well as exhaustive analysis based on models. The models support checking of properties based on patterns, adapted from ones that have been used successfully in interactive system analysis in other contexts; they help the analyst to generate and verify relevant properties. Where a property fails, the scenarios suggested by the failure provide an important aid to redesign.
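As an illustration of the kind of pattern-based property involved, the following shows a generic "response" pattern in CTL-style notation (the predicates are hypothetical, not taken from the thesis): whenever a user carrying a registered device enters a sensed area, the nearby public display is eventually updated for that user.

```latex
% Response pattern (illustrative; predicate names are assumed):
\[
AG\,\bigl(\mathit{enters}(u,\mathit{area}) \;\rightarrow\; AF\,\mathit{displayed}(u,\mathit{area})\bigr)
\]
```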

Relevance: 100.00%

Abstract:

With the increasing offer of education services in Brazil, it is necessary to evaluate the quality of educational services, especially in institutions for vocational education, which have closer interaction with the labor market, in order to train qualified professionals and meet the growing demand the country faces today. In Brazil, the evaluation of the quality of library services has influenced the assessment of educational institutions, and in this context a process is needed to monitor the quality of the services provided by libraries. However, a service is not delivered in a single moment; for a more detailed assessment it must be measured and evaluated at each point where the customer uses it. Therefore, the aim of this work is to measure quality at every moment of truth in a cycle of library services, in order to assess which moments are most relevant, from the client's perspective, in building the overall quality of the service at the library of the Federal Institute of Education, Science and Technology of Rio Grande do Norte (IFRN), Campus João Câmara. The literature review drew on internal secondary sources, from the database of the institution studied, and on external sources, through books, articles, dissertations, theses and journals on quality, service quality, service cycles, quality measurement, satisfaction, teaching activities, and library services specifically. A questionnaire was applied to students in the library, based on the quality measurement models SERVPERF and SERVQUAL and their variations, weighted SERVQUAL and weighted SERVPERF. Through an analysis based on the reliability and validity of measuring instruments, it was found that the SERVPERF model is the instrument whose quality dimensions most closely match customer satisfaction as measured by the questionnaire. The results, analyzed with statistical techniques, indicated that the initial and final moments of truth in the service cycle had the greatest influence on overall customer satisfaction with the library service.
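For reference, the scoring rules behind the four instruments compared, in their usual textbook form (notation ours: $P_i$ is the perceived performance on item $i$, $E_i$ the corresponding expectation, and $w_i$ an importance weight):

```latex
\begin{align*}
\text{SERVQUAL:}          &\quad Q_i = P_i - E_i \\
\text{weighted SERVQUAL:} &\quad Q_i = w_i\,(P_i - E_i) \\
\text{SERVPERF:}          &\quad Q_i = P_i \\
\text{weighted SERVPERF:} &\quad Q_i = w_i\,P_i
\end{align*}
```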

Relevance: 100.00%

Abstract:

The phenomenon of adsorption is of fundamental importance for the treatment of textile effluents and the removal of dyes. Chitosan is characterized as an excellent adsorbent material, not only for its adsorption capacity but also for its low production cost. Equilibrium and kinetic studies were carried out in this work to describe the mechanism of adsorption of the anionic azo dye Orange G on chitosan, with isotherms obtained from the variation of the dye concentration in the continuous phase. The kinetics of the process was analyzed with models involving the adsorption of dye molecules at nonpolar and polar sites. Adsorption experiments were performed in water and in saline media with different NaCl concentrations, both to determine the equilibrium time and isotherms and to build kinetic curves in which the amount of dye adsorbed, measured indirectly, was followed over time. The experiments revealed the opening of the biopolymer structure with increasing Orange G concentration, accompanied by high pH values and a change in the type of interaction between the dye and the adsorbent surface: the behavior follows the Langmuir equation in a certain range of adsorbate concentration and Henry's law at higher concentrations, owing to the increased number of sites available for adsorption. The studies also showed that the saline medium reduces chitosan's adsorption capacity beyond a certain concentration, with the occurrence of cooperative adsorption in steps suggested as a new kinetic mechanism for interpreting the phenomenon.
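The two regimes invoked correspond to the standard isotherm equations, given here in their textbook form (not taken from the thesis; $q_e$ is the amount adsorbed at equilibrium, $C_e$ the equilibrium dye concentration, $q_{\max}$ the monolayer capacity, and $K_L$, $K_H$ the Langmuir and Henry constants):

```latex
\[
\text{Langmuir:}\quad q_e = \frac{q_{\max}\,K_L\,C_e}{1 + K_L\,C_e}
\qquad\qquad
\text{Henry's law:}\quad q_e = K_H\,C_e
\]
```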

Relevance: 100.00%

Abstract:

This paper presents the development of an application created to assist the teaching of dental structures, generating rich content and different modes of interaction. An ontology was created to provide semantic information for the virtual models. We also used two gesture-based interaction devices: the Kinect and the Wii Remote. A system was developed that uses intuitive interaction and is able to generate three-dimensional images, making the teaching/learning experience motivating. The projection environment used by the system was called the Mini CAVE. © 2012 IEEE.

Relevance: 100.00%

Abstract:

Graduate Program in Music - IA

Relevance: 100.00%

Abstract:

Background: Several studies in Drosophila have shown excessive movement of retrogenes from the X chromosome to autosomes, and that these genes are frequently expressed in the testis. This phenomenon has led to several hypotheses invoking natural selection as the process driving male-biased genes to the autosomes. Metta and Schlotterer (BMC Evol Biol 2010, 10:114) analyzed a set of retrogenes whose parental gene has subsequently been lost. They assumed that this class of retrogenes replaced the ancestral functions of the parental gene, and reported that these retrogenes, although mostly originating from movement out of the X chromosome, showed female-biased or unbiased expression. These observations led the authors to suggest that selective forces (such as meiotic sex chromosome inactivation and sexual antagonism) were not responsible for the observed pattern of retrogene movement out of the X chromosome. Results: We reanalyzed the dataset published by Metta and Schlotterer and found several issues that led us to a different conclusion. In particular, Metta and Schlotterer used a dataset combined with expression data in which significant sex-biased expression is not detectable. First, the authors used a segmental dataset in which the genes selected for analysis were less testis-biased in expression than those excluded from the study. Second, sex-biased expression was defined by comparing male and female whole-body data rather than the expression of these genes in gonadal tissues. This approach significantly reduces the probability of detecting sex-biased expressed genes, which explains why the vast majority of the genes analyzed (parental and retrogenes) were equally expressed in both males and females. Third, the female-biased expression observed by Metta and Schlotterer is mostly found for parental genes located on the X chromosome, which is known to be enriched in genes with female-biased expression. Fourth, using additional gonad expression data, we found that the autosomal genes analyzed by Metta and Schlotterer are less up-regulated in ovaries and have a higher chance of being expressed in meiotic cells of spermatogenesis than X-linked genes. Conclusions: The criteria used to select retrogenes and the sex-biased expression data based on whole adult flies generated a segmental dataset of female-biased and unbiased expressed genes that was unable to detect the higher propensity of autosomal retrogenes to be expressed in males. Thus, there is no support for the authors' view that the movement of new retrogenes, which originated from X-linked parental genes, was not driven by selection. Therefore, selection-based genetic models remain the most parsimonious explanation for the observed chromosomal distribution of retrogenes.

Relevance: 100.00%

Abstract:

Current trends in software development highlight the need to cope with a multiplicity of diverse activities and interaction styles characterizing complex and distributed application domains, in such a way that the resulting dynamics exhibit some degree of order, i.e. in terms of system evolution and desired equilibrium. Autonomous agents and multiagent systems are argued in the literature to be among the most immediate approaches to such challenges. Indeed, agent research seems to converge towards the definition of renewed abstraction tools aimed at better capturing the new demands of open systems. Besides agents, which are assumed to be autonomous entities pursuing a series of design objectives, multiagent systems introduce new first-class notions aimed, above all, at modelling institutional/organizational entities, responsible for normative regulation, interaction and teamwork management, as well as environmental entities, acting as resources that further support and regulate agent work. The starting point of this thesis is the recognition that both organizations and environments can be rooted in a unifying perspective. Whereas recent research in agent systems accounts for a set of diverse approaches, each specifically facing at most one of the aspects mentioned above, this work proposes a unifying approach in which both agents and their organizations can be straightforwardly situated in properly designed working environments. Along this line, the work pursues the reconciliation of environments with sociality, social interaction with environment-based interaction, and environmental resources with organizational functionalities, with the aim of smoothly integrating the various aspects of complex and situated organizations in a coherent programming approach. Rooted in the Agents and Artifacts (A&A) meta-model, which has recently been introduced in the context of agent-oriented software engineering and programming, the thesis promotes the notion of Embodied Organizations, characterized by computational infrastructures attaining a seamless integration between agents, organizations and environmental entities.
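To make the A&A notion of artifact concrete, here is a toy Python rendition (ours, not the thesis's actual infrastructure): an artifact exposes operations that update observable properties, and registered observers, standing in for agents, are notified of every change.

```python
class Artifact:
    """Toy A&A-style artifact: operations mutate observable properties."""
    def __init__(self):
        self.obs = {}          # observable properties
        self.observers = []    # callbacks standing in for observing agents

    def define_obs_property(self, name, value):
        self.obs[name] = value

    def _signal(self, name):   # notify all observers of a property change
        for agent in self.observers:
            agent(name, self.obs[name])

class Counter(Artifact):
    def __init__(self):
        super().__init__()
        self.define_obs_property("count", 0)

    def inc(self):             # an artifact *operation* usable by agents
        self.obs["count"] += 1
        self._signal("count")

c = Counter()
c.observers.append(lambda n, v: print(f"agent observes {n} = {v}"))
c.inc()                        # -> agent observes count = 1
```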

Relevance: 100.00%

Abstract:

In the framework of developing defect-based life models, in which breakdown is explicitly associated with partial discharge (PD)-induced damage growth from a defect, ageing tests and PD measurements were carried out in the lab on polyethylene (PE) layered specimens containing artificial cavities. PD activity was monitored continuously during ageing. A quasi-deterministic series of stages can be observed in the behavior of the main PD parameters (i.e. discharge repetition rate and amplitude). Phase-resolved PD patterns at various ageing stages were reproduced by numerical simulation based on a physical discharge model devoid of adaptive parameters. The evolution of the simulation parameters provides insight into the physical-chemical changes taking place at the dielectric/cavity interface during the ageing process. PD activity shows similar time behavior under constant cavity gas volume and constant cavity gas pressure conditions, suggesting that the variation of PD parameters cannot be attributed to the variation of the gas pressure. Brownish PD byproducts, consisting of oxygen-containing moieties, and degradation pits were found at the dielectric/cavity interface. It is speculated that the change in PD activity is related to the composition of the cavity gas, as well as to the properties of the dielectric/cavity interface.
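For intuition, the sketch below is a toy phase-resolved PD simulation in the spirit of such physical discharge models: a discharge fires when the cavity field exceeds an inception threshold and a starting electron happens to be available, collapsing the field to a residual level. All parameter values are hypothetical; the paper's calibrated model is considerably richer.

```python
import numpy as np

rng = np.random.default_rng(0)
f, cycles, dt = 50.0, 200, 1e-5     # AC frequency [Hz], no. of cycles, time step [s]
E_amp = 4.0e6                       # peak cavity field without discharges [V/m]
E_inc, E_res = 2.5e6, 0.4e6         # inception and residual fields [V/m]
ne_rate = 2e3                       # starting-electron generation rate [1/s]

E_q = 0.0                           # opposing field left by previous discharges
events = []                         # (phase [deg], field collapse) per discharge
for ti in np.arange(0.0, cycles / f, dt):
    E_cav = E_amp * np.sin(2 * np.pi * f * ti) - E_q
    if abs(E_cav) > E_inc and rng.random() < ne_rate * dt:
        dE = abs(E_cav) - E_res             # field collapse at this discharge
        E_q += np.sign(E_cav) * dE          # deposited charge opposes the field
        events.append(((360.0 * f * ti) % 360.0, dE))

print(f"{len(events)} discharges over {cycles} cycles")
```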

Relevance: 100.00%

Abstract:

Two of the main features of today's complex software systems, such as pervasive computing systems and Internet-based applications, are distribution and openness. Distribution revolves around three orthogonal dimensions: (i) distribution of control: systems are characterised by several independent computational entities and devices, each representing an autonomous and proactive locus of control; (ii) spatial distribution: entities and devices are physically distributed and connected in a global (such as the Internet) or local network; and (iii) temporal distribution: interacting system components come and go over time, and are not required to be available for interaction at the same time. Openness deals with the heterogeneity and dynamism of system components: complex computational systems are open to the integration of diverse components, heterogeneous in terms of architecture and technology, and are dynamic, since they allow components to be updated, added, or removed while the system is running. The engineering of open and distributed computational systems mandates the adoption of a software infrastructure whose underlying model and technology can provide the required level of uncoupling among system components. This is the main motivation behind current research trends in the area of coordination middleware that exploit tuple-based coordination models in the engineering of complex software systems, since such models intrinsically provide coordinated components with communication uncoupling; further details can be found in the references therein. An additional daunting challenge for tuple-based models comes from knowledge-intensive application scenarios, namely, scenarios where most activities are based on knowledge in some form, and where knowledge becomes the prominent means by which systems get coordinated. Handling knowledge in tuple-based systems induces problems of syntax (e.g., two tuples containing the same data may not match due to differences in tuple structure) and, mostly, of semantics (e.g., two tuples representing the same information may not match because they adopt different syntaxes). Until now, the problem has been faced by exploiting tuple-based coordination within middleware for knowledge-intensive environments: e.g., experiments with tuple-based coordination within Semantic Web middleware, and analogous approaches surveyed in the related literature. However, these appear to be designed to tackle coordination for specific application contexts, such as the Semantic Web and Semantic Web Services, and they result in rather involved extensions of the tuple space model. The main goal of this thesis was to conceive a more general approach to semantic coordination. In particular, the model and technology of semantic tuple centres were developed. The tuple centre model is adopted as the main coordination abstraction to manage system interactions. A tuple centre can be seen as a programmable tuple space, i.e. an extension of a Linda tuple space where the behaviour of the tuple space can be programmed so as to react to interaction events. By encapsulating coordination laws within coordination media, tuple centres promote coordination uncoupling among coordinated components. The tuple centre model was then semantically enriched: a main design choice in this work was not to completely redesign the existing syntactic tuple space model, but rather to provide a smooth extension that, although supporting semantic reasoning, keeps tuples and tuple matching as simple as possible. By encapsulating the semantic representation of the domain of discourse within coordination media, semantic tuple centres promote semantic uncoupling among coordinated components. The main contributions of the thesis are: (i) the design of the semantic tuple centre model; (ii) the implementation and evaluation of the model on top of an existing coordination infrastructure; (iii) a view of the application scenarios in which semantic tuple centres seem suitable as coordination media.
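To fix ideas, here is a toy Python rendition of a programmable tuple space (ours, far simpler than the actual infrastructure): out/rd/in primitives plus registrable reactions to interaction events. The semantic extension discussed in the thesis would amount to replacing the purely syntactic `_match` below with an ontology-aware matching predicate.

```python
class TupleCentre:
    """Toy programmable tuple space: Linda primitives plus reactions."""
    def __init__(self):
        self.space = []                   # the tuple set
        self.reactions = []               # (event name, coordination law)

    def react(self, event, law):          # program the tuple centre's behaviour
        self.reactions.append((event, law))

    def _fire(self, event, tup):
        for ev, law in self.reactions:
            if ev == event:
                law(self, tup)

    def out(self, tup):                   # insert a tuple, then trigger reactions
        self.space.append(tup)
        self._fire("out", tup)

    def rd(self, template):               # non-destructive read of first match
        return next((t for t in self.space if self._match(template, t)), None)

    def in_(self, template):              # destructive take of first match
        t = self.rd(template)
        if t is not None:
            self.space.remove(t)
        return t

    @staticmethod
    def _match(template, tup):            # positional match; None is a wildcard
        return len(template) == len(tup) and all(
            a is None or a == b for a, b in zip(template, tup))

tc = TupleCentre()
tc.react("out", lambda c, t: print("reaction fired on", t))
tc.out(("temperature", "room1", 21))
print(tc.rd(("temperature", "room1", None)))  # ('temperature', 'room1', 21)
```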

Relevance: 100.00%

Abstract:

Until a few years ago, 3D modelling was a topic confined to professional environments. Nowadays, technological innovations, the 3D printer above all, have attracted novice users to this application field. This sudden breakthrough has not been supported by adequate software solutions: the 3D editing tools currently available do not assist the non-expert user during the various stages of generation, interaction and manipulation of 3D virtual models. This is mainly due to the current paradigm, which is largely built on two-dimensional input/output devices and strongly affected by obvious geometrical constraints. We identified three main phases that characterize the creation and management of 3D virtual models, and investigated these directions by evaluating and simplifying the classic editing techniques in order to propose more natural and intuitive tools in a pure 3D modelling environment. In particular, we focused on freehand sketch-based modelling to create 3D virtual models, on interaction and navigation in a 3D modelling environment, and on advanced editing tools for free-form deformation and object composition. In pursuing these goals we asked how new gesture-based interaction technologies can be successfully employed in 3D modelling environments, how depth perception and interaction in 3D environments could be improved, and which operations could be developed to simplify the classical virtual model editing paradigm. Our main aim was to propose a set of solutions with which a common user can realize an idea in a 3D virtual model, drawing in the air just as they would on paper. Moreover, we used gestures and mid-air movements to explore and interact with the 3D virtual environment, and we studied simple and effective 3D form transformations. The work adopts the discrete representation of models, thanks to its intuitiveness, but especially because it is full of open challenges.
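As a concrete example of the free-form deformation family of tools mentioned above, here is a minimal 2D Bezier-lattice FFD in the classic Sederberg-Parry style (a sketch under our own assumptions, not the thesis's implementation):

```python
import numpy as np
from math import comb

def ffd_2d(points, lattice):
    """Deform 2D points with coordinates in the unit square through a Bezier
    control lattice of shape (l+1, m+1, 2) holding control point positions."""
    l, m = lattice.shape[0] - 1, lattice.shape[1] - 1
    out = np.empty_like(points, dtype=float)
    for n, (s, t) in enumerate(points):
        p = np.zeros(2)
        for i in range(l + 1):
            bi = comb(l, i) * s**i * (1 - s)**(l - i)      # Bernstein basis in s
            for j in range(m + 1):
                bj = comb(m, j) * t**j * (1 - t)**(m - j)  # Bernstein basis in t
                p += bi * bj * lattice[i, j]
        out[n] = p
    return out

# A regular 3x3 control grid is the identity; dragging the centre control
# point pulls nearby points along with it.
lat = np.array([[(i / 2, j / 2) for j in range(3)] for i in range(3)])
lat[1, 1] += (0.2, 0.0)
pts = np.array([[0.5, 0.5], [0.25, 0.75]])
print(ffd_2d(pts, lat))   # [[0.55, 0.5], [~0.278, 0.75]]
```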

Relevance: 100.00%

Abstract:

The identification of molecular processes involved in cancer development and prognosis has opened avenues for targeted therapies, which made treatment more tumor-specific and less toxic than conventional therapies. One important example is the epidermal growth factor receptor (EGFR) and EGFR-specific inhibitors (e.g. erlotinib). However, challenges such as drug resistance still remain in targeted therapies; therefore, novel candidate compounds and new strategies are needed to improve therapy efficacy. Shikonin and its derivatives are cytotoxic constituents of the traditional Chinese herbal medicine Zicao (Lithospermum erythrorhizon). In this study, we investigated the molecular mechanisms underlying the anti-cancer effects of shikonin and its derivatives in glioblastoma and leukemia cells. Most of the shikonin derivatives showed strong cytotoxicity towards erlotinib-resistant glioblastoma cells, especially U87MG.ΔEGFR cells, which overexpress a deletion-activated EGFR (ΔEGFR). Moreover, shikonin and some derivatives acted synergistically with erlotinib in killing EGFR-overexpressing cells, and combination treatment with shikonin and erlotinib overcame the resistance of these cells to erlotinib. Western blotting analysis revealed that shikonin inhibited ΔEGFR phosphorylation and led to corresponding decreases in the phosphorylation of EGFR downstream molecules. By means of the Loewe additivity and Bliss independence drug interaction models, we found that erlotinib and shikonin or its derivatives cooperatively suppressed ΔEGFR phosphorylation; we believe this to be a main mechanism responsible for their synergism in U87MG.ΔEGFR cells. In leukemia cells, which do not express EGFR, shikonin and its derivatives exhibited even greater cytotoxicity, suggesting the existence of other mechanisms. Microarray-based gene expression analysis uncovered the transcription factor c-MYC as the molecule commonly deregulated by shikonin and its derivatives. As validated by Western blotting analysis, DNA-binding assays and molecular docking, shikonin and its derivatives bound and inhibited c-MYC. Furthermore, the deregulation of ERK, JNK MAPK and AKT activity was closely associated with the reduction of c-MYC, indicating the involvement of these signaling molecules in shikonin-triggered c-MYC inactivation. In conclusion, the inhibition of EGFR signaling, the synergism with erlotinib and the targeting of c-MYC illustrate the multi-targeted nature of natural naphthoquinones such as shikonin and its derivatives. This may open attractive possibilities for their use in molecularly targeted cancer therapy.
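The two reference models quantify drug interaction as follows (standard formulations, notation ours: $E$ denotes fractional effects of drugs A and B alone or combined, $d_A$, $d_B$ the doses used in combination, and $D_A$, $D_B$ the doses of each single drug producing the combination's effect). Under Bliss, an observed combined effect above $E_{AB}$ indicates synergy; under Loewe, a combination index $d_A/D_A + d_B/D_B$ below 1 does.

```latex
\[
\text{Bliss independence:}\quad E_{AB} = E_A + E_B - E_A E_B
\qquad\qquad
\text{Loewe additivity:}\quad \frac{d_A}{D_A} + \frac{d_B}{D_B} = 1
\]
```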

Relevance: 100.00%

Abstract:

Up to 80% of patients with severe posttraumatic stress disorder suffer from "unexplained" chronic pain. Theories about the links between traumatization and chronic pain have attracted increased interest over the last several years. We give a short summary of the existing interaction models, which emphasize above all psychological and behavioral aspects of this interaction. After a synopsis of the most important psychoneurobiological mechanisms of pain in the context of traumatization, we introduce the hypermnesia-hyperarousal model, which focuses on two psychoneurobiological aspects of the physiology of learning. This hypothesis answers the hitherto open question about the origin of pain persistence and pain sensitization following a traumatic event, and also provides a straightforward explanatory model for educational purposes.

Relevance: 100.00%

Abstract:

Finite element analysis is an accepted method to predict vertebral body compressive strength. This study compares measurements obtained from in vitro tests with those from two different simulation models: clinical quantitative computed tomography (QCT)-based homogenized finite element (hFE) models and pre-clinical high-resolution peripheral QCT (HR-pQCT)-based hFE models. In total, 37 vertebral body sections were prepared by removing end-plates and posterior elements, scanned with QCT (390/450 μm voxel size) as well as HR-pQCT (82 μm voxel size), and tested in compression up to failure. Non-linear viscous damage hFE models were created from the QCT/HR-pQCT images and compared to the experimental results in terms of stiffness and ultimate load. As expected, the predictability of the QCT/HR-pQCT-based hFE models for both apparent stiffness (r² = 0.685/0.801) and strength (r² = 0.774/0.924) increased when the better image resolution was used. An analysis of the damage distribution showed similar damage locations in all cases. In conclusion, HR-pQCT-based hFE models increased the predictability considerably and do not need any tuning of input parameters; in contrast, QCT-based hFE models usually need some tuning but are clinically the only possible choice at the moment.

Relevance: 100.00%

Abstract:

Computed tomography (CT)-based finite element (FE) models assess vertebral strength better than dual-energy X-ray absorptiometry. Osteoporotic vertebrae are usually loaded via degenerated intervertebral discs (IVD) and are potentially at higher risk under forward bending, but the influence of the IVD and of the loading conditions is generally overlooked. Accordingly, magnetic resonance imaging was performed on 14 lumbar discs to generate FE models for the healthiest and the most degenerated specimens. Compression, torsion, bending, flexion and extension experiments were used to calibrate both models. They were combined with CT-based FE models of 12 lumbar vertebral bodies to evaluate the effect of disc degeneration, compared to loading via endplates embedded in a stiff resin, the usual experimental paradigm. Compression and lifting were simulated, and load and damage pattern were evaluated at failure. Adding flexion to the compression (lifting) and higher disc degeneration reduce the failure load (by 8–14% and 5–7%, respectively) and increase damage in the vertebrae. Under both loading scenarios, decreasing the disc height slightly increases the failure load; embedding and the degenerated IVD provide the highest and lowest failure loads, respectively. Embedded vertebrae are more brittle, but failure loads induced via IVDs correlate highly with vertebral strength. In conclusion, osteoporotic vertebrae with degenerated IVDs are consistently weaker, especially under lifting, but clinical assessment of their strength is possible via FE analysis without extensive disc modelling, by extrapolating measures from the embedded situation.