878 results for Unified User Experience Model


Relevance:

30.00%

Publisher:

Abstract:

Some phase space transport properties for a conservative bouncer model are studied. The dynamics of the model is described by a two-dimensional measure-preserving mapping for the variables velocity and time. The system is characterized by a control parameter epsilon and experiences a transition from integrable (epsilon = 0) to nonintegrable (epsilon not equal 0). For small values of epsilon, the phase space shows a mixed structure where periodic islands, chaotic seas, and invariant tori coexist. As epsilon increases and reaches a critical value epsilon(c), all invariant tori are destroyed and the chaotic sea spreads over the phase space, leading the particle to diffuse in velocity and experience Fermi acceleration (unlimited energy growth). During the dynamics the particle can be temporarily trapped near periodic and stable regions, an effect we visualize using the finite-time Lyapunov exponent. The survival probability was used to obtain some of the transport properties in the phase space. For large epsilon the survival probability decays exponentially, turning into a slower decay as the control parameter epsilon is reduced. The slower decay is related to trapping dynamics, which slows down the Fermi acceleration, i.e., the unbounded growth of the velocity.
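The abstract does not spell out the mapping; a minimal sketch of a commonly used simplified conservative bouncer map (assumed form, not necessarily the paper's exact equations) illustrates the velocity-time dynamics and the integrable limit epsilon = 0:

```python
import math

def bouncer_map(v, phi, eps):
    """One iteration of a simplified conservative bouncer map.

    Assumed textbook form (not taken from the paper): the phase advances
    with the flight time ~ 2*v, then the velocity receives a kick of
    amplitude eps from the moving wall; the absolute value keeps v >= 0.
    """
    phi_next = (phi + 2.0 * v) % (2.0 * math.pi)
    v_next = abs(v - 2.0 * eps * math.sin(phi_next))
    return v_next, phi_next

def iterate(v0, phi0, eps, n):
    """Return the orbit [(v0, phi0), ..., (vn, phin)] of the map."""
    v, phi = v0, phi0
    traj = [(v, phi)]
    for _ in range(n):
        v, phi = bouncer_map(v, phi, eps)
        traj.append((v, phi))
    return traj

# For eps = 0 the map is integrable: the velocity is conserved exactly,
# matching the transition described in the abstract.
traj = iterate(1.0, 0.3, 0.0, 100)
```

For eps > 0 the same orbit wanders in velocity, which is the mechanism behind the diffusion and Fermi acceleration discussed above.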

Relevance:

30.00%

Publisher:

Abstract:

Aircraft composite structures must have high stiffness and strength at low weight, which increases the payload of airplanes without compromising airworthiness. However, the mechanical behavior of composite laminates is very complex due to their inherent anisotropy and heterogeneity. Many researchers have developed different progressive failure analyses and damage models in order to predict the complex failure mechanisms. This work presents a damage model and progressive failure analysis that requires only simple experimental tests and achieves good accuracy. Firstly, the paper explains the damage initiation and propagation criteria and a procedure to identify the material parameters. In the second stage, the model was implemented as a UMAT (User Material Subroutine) linked to the finite element software ABAQUS (TM) in order to predict the behavior of composite structures. Afterwards, some case studies, mainly off-axis coupons under tensile or compression loads with different types of stacking sequence, were analyzed using the proposed material model. Finally, the computational results were compared to the experimental results, verifying the capability of the damage model to predict the behavior of composite structures. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Care delivery moves from the action to the person: a process built from professional experience that gains special characteristics when the service is delivered by telephone. The goal of this research was to understand the interaction between professionals and users in a remote care service. To do so, a study is presented using Grounded Theory and Symbolic Interactionism as theoretical references. Data were collected through eight interviews with professionals who deliver care by telephone. The theoretical understanding led to the creation of the theoretical model of the Imaginative Construction of Care, which shows the interaction processes the professional experiences when delivering care by telephone. In this model, individual and social facts are combined, showing the link between the concepts, with special emphasis on uncertainty, sensitivity and professional responsibility as essential components of this experience.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a performance analysis of a baseband multiple-input single-output ultra-wideband system over scenarios CM1 and CM3 of the IEEE 802.15.3a channel model, incorporating four different pre-distortion schemes: time reversal, zero-forcing pre-equaliser, constrained least squares pre-equaliser, and minimum mean square error pre-equaliser. For the third case, a simple solution based on the steepest-descent (gradient) algorithm is adopted and compared with theoretical results. The channel estimates at the transmitter are assumed to be truncated and noisy. Results show that the constrained least squares algorithm offers a good trade-off between intersymbol interference reduction and signal-to-noise ratio preservation, providing performance comparable to the minimum mean square error method but with lower computational complexity. Copyright (C) 2011 John Wiley & Sons, Ltd.
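The idea behind the zero-forcing pre-equaliser named above is to pre-filter the transmitted signal so that the cascade of pre-filter and channel approximates a pure delay, removing intersymbol interference at the transmitter. A least-squares sketch under assumed toy values (the channel `h`, tap count, and delay are illustrative, not from the paper):

```python
import numpy as np

def conv_matrix(h, n):
    """Toeplitz convolution matrix: conv_matrix(h, n) @ g == np.convolve(h, g)."""
    m = len(h) + n - 1
    H = np.zeros((m, n))
    for i in range(n):
        H[i:i + len(h), i] = h
    return H

def zf_preequalizer(h, n_taps, delay):
    """Least-squares zero-forcing pre-filter (illustrative sketch).

    Chooses g so that the cascade h * g approximates a unit impulse at
    the given delay; with noisy/truncated channel estimates the paper's
    constrained or MMSE designs trade this off against noise enhancement.
    """
    H = conv_matrix(h, n_taps)
    d = np.zeros(H.shape[0])
    d[delay] = 1.0
    g, *_ = np.linalg.lstsq(H, d, rcond=None)
    return g

h = np.array([1.0, 0.5, 0.2])   # toy channel impulse response (assumed)
g = zf_preequalizer(h, 16, 8)
cascade = np.convolve(h, g)     # close to a unit impulse at index 8
```

The residual taps of `cascade` away from the chosen delay measure the remaining intersymbol interference.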

Relevance:

30.00%

Publisher:

Abstract:

The fatigue crack growth behavior of metals and alloys under constant amplitude test conditions is usually described by relationships between the crack growth rate da/dN and the stress intensity factor range Delta K. In the present work, an enhanced two-parameter exponential equation of fatigue crack growth was introduced to describe the sub-critical crack propagation behavior of Al 2524-T3 alloy, commonly used in aircraft engineering applications. It was demonstrated that, besides adequately correlating the load ratio effects, the exponential model also accounts for the slight deviations from linearity shown by the experimental curves. A comparison with the Elber, Kujawski and "Unified Approach" models showed that the exponential model outperforms the other tested models. (C) 2012 Elsevier Ltd. All rights reserved.
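The baseline the abstract alludes to is the two-parameter Paris law, da/dN = C * (Delta K)^m, fitted by linear regression in log-log space. A minimal sketch of that fit (the paper's exponential model is a different, unspecified functional form, but the fitting procedure is analogous):

```python
import math

def fit_paris(delta_k, dadn):
    """Least-squares fit of the Paris law da/dN = C * (dK)^m.

    Taking logs gives log(da/dN) = log(C) + m * log(dK), a straight
    line; an ordinary least-squares fit recovers m (slope) and C.
    """
    xs = [math.log(k) for k in delta_k]
    ys = [math.log(r) for r in dadn]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    m = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    c = math.exp(ybar - m * xbar)
    return c, m

# Synthetic data generated with C = 1e-10, m = 3 (assumed toy values);
# the regression recovers both parameters.
dk = [5.0, 10.0, 20.0, 40.0]
rates = [1e-10 * k ** 3 for k in dk]
C, m = fit_paris(dk, rates)
```

Real da/dN curves deviate slightly from this straight line, which is exactly the behavior the exponential model above is claimed to capture.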

Relevance:

30.00%

Publisher:

Abstract:

Objective: to understand the meaning of the childbirth experience for Brazilian primiparas in the postpartum period. Design: a qualitative approach using semi-structured interviews. Content analysis was used to derive the two themes that emerged from the discourses. Setting: participants were recruited at four primary-level health-care units in Ribeirao Preto, Brazil. After providing written informed consent, an appointment was made for an interview at the participants' homes. Participants: 20 primiparas in the postpartum period, aged 15-26 years, who attended the health-care units to vaccinate their infants and test for phenylketonuria. Findings: two thematic categories emerged from the interviews: the meaning attributed to childbirth (with four subcategories) and perceptions of care. Among the participants, the childbirth experience was marked by the 'fear of death' and of 'losing the child'. The pain of giving birth was expected, and the moment of childbirth was associated with pain of high intensity. Key conclusions: childbirth is considered synonymous with physical and emotional suffering, pain, fear and risk of death. Implications for practice: this research indicates the need to break with the current mechanistic model of care on which health professionals' actions are based. Care during childbirth must be guided by the principle that women are the subjects of childbirth actions, in an attempt to emphasise actions that grant them the autonomy and empowerment needed to experience the situation. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Background: An important challenge for transcript counting methods such as Serial Analysis of Gene Expression (SAGE), "Digital Northern" or Massively Parallel Signature Sequencing (MPSS) is to carry out statistical analyses that account for the within-class variability, i.e., variability due to the intrinsic biological differences among sampled individuals of the same class, and not only variability due to technical sampling error. Results: We introduce a Bayesian model that accounts for the within-class variability by means of a mixture distribution. We show that the previously available approaches of aggregation in pools ("pseudo-libraries") and the Beta-Binomial model are particular cases of the mixture model. We illustrate our method with a brain tumor vs. normal comparison using SAGE data from public databases. We show examples of tags regarded as differentially expressed with high significance if the within-class variability is ignored, but clearly not so significant if one accounts for it. Conclusion: Using available information about biological replicates, one can transform a list of candidate transcripts showing differential expression into a more reliable one. Our method is freely available, under the GPL/GNU copyleft, through a user-friendly web-based online tool or as R language scripts at a supplemental website.
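The Beta-Binomial model mentioned above captures within-class variability that a plain Binomial ignores: replicate libraries with the same underlying biology can still differ in tag frequency. A sketch comparing the two likelihoods on assumed toy counts (the shape parameters 0.2 and 12.3 are illustrative, chosen so the Beta-Binomial mean matches the pooled frequency):

```python
from math import lgamma, log

def log_betabinom(k, n, a, b):
    """Log pmf of Beta-Binomial(n, a, b): C(n,k) * B(k+a, n-k+b) / B(a, b).

    The Beta mixing distribution models biological variation in the tag
    frequency across replicates (overdispersion relative to a Binomial).
    """
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + lgamma(k + a) + lgamma(n - k + b) - lgamma(n + a + b)
            + lgamma(a + b) - lgamma(a) - lgamma(b))

def log_binom(k, n, p):
    """Log pmf of Binomial(n, p): technical sampling error only."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

# Two replicate libraries with tag counts far more spread out than a
# Binomial with the pooled frequency allows (assumed toy data).
counts, n = [2, 30], 1000
p = sum(counts) / (2 * n)                      # pooled estimate
ll_bin = sum(log_binom(k, n, p) for k in counts)
ll_bb = sum(log_betabinom(k, n, 0.2, 12.3) for k in counts)
```

The overdispersed Beta-Binomial assigns the replicates a much higher likelihood, which is why ignoring within-class variability inflates the apparent significance of such tags.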

Relevance:

30.00%

Publisher:

Abstract:

Background: Several mathematical and statistical methods have been proposed in the last few years to analyze microarray data. Most of those methods involve complicated formulas and software implementations that require advanced computer programming skills. Researchers from other areas may experience difficulties when attempting to use those methods in their research. Here we present a user-friendly toolbox that allows large-scale gene expression analysis to be carried out by biomedical researchers with limited programming skills. Results: We introduce a user-friendly toolbox called GEDI (Gene Expression Data Interpreter), an extensible, open-source, and freely available tool that we believe will be useful to a wide range of laboratories and to researchers with no background in Mathematics and Computer Science, allowing them to analyze their own data by applying both classical and advanced approaches developed and recently published by Fujita et al. Conclusion: GEDI is an integrated user-friendly viewer that combines the state-of-the-art SVR, DVAR and SVAR algorithms, previously developed by us. It facilitates the application of SVR, DVAR and SVAR beyond the mathematical formulas presented in the corresponding publications, and allows one to better understand the results by means of the available visualizations. Both running the statistical methods and visualizing the results are carried out within the graphical user interface, rendering these algorithms accessible to the broad community of researchers in Molecular Biology.

Relevance:

30.00%

Publisher:

Abstract:

CONTEXT: Liver metastases are a common event in the clinical outcome of patients with colorectal cancer and account for 2/3 of deaths from this disease. There is considerable controversy among the data in the literature regarding the results of surgical treatment and the prognostic factors of survival, and no analysis has been done in a large cohort of patients in Brazil. OBJECTIVES: To characterize the results of surgical treatment of patients with colorectal liver metastases, and to establish prognostic factors of survival in a Brazilian population. METHOD: This was a retrospective study of patients undergoing liver resection for colorectal metastases in a tertiary cancer hospital from 1998 to 2009. We analyzed epidemiologic variables and the clinical characteristics of the primary tumors, metastatic disease and its treatment, surgical procedures and follow-up, and survival results. Survival analyses were done by the Kaplan-Meier method, and the log-rank test was applied to determine the influence of variables on overall and disease-free survival. All variables associated with survival at P<0.20 in univariate analysis were included in multivariate analysis using a Cox proportional hazards regression model. RESULTS: During the period analyzed, 209 procedures were performed on 170 patients. Postoperative mortality at 90 days was 2.9% and 5-year overall survival was 64.9%. Its independent prognostic factors were the presence of extrahepatic disease at diagnosis of the liver metastases, bilateral nodules and the occurrence of major complications after liver surgery. The estimated 5-year disease-free survival was 39.1%, and its prognostic factors included R1 resection, extrahepatic disease, bilateral nodules, lymph node involvement in the primary tumor and primary tumors located in the rectum.
CONCLUSION: Liver resection for colorectal metastases is safe and effective, and the analysis of prognostic factors of survival in a large cohort of Brazilian patients showed results similar to those reported in international series. The occurrence of major postoperative complications appears to compromise overall survival, and further investigation is needed on this topic.
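The Kaplan-Meier method used in the analysis above estimates survival as a product over event times: at each time with deaths, the current survival is multiplied by (at-risk - deaths) / at-risk, with censored patients leaving the risk set without a step. A minimal sketch on assumed toy data (not the study's data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate S(t) at each observed death time.

    times: follow-up time per patient; events: 1 = death observed,
    0 = censored. Returns a list of (time, S(t)) steps. Ties are handled
    by counting deaths before removing censored patients at that time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        total = sum(1 for tt, e in data if tt == t)
        if deaths:
            s *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, s))
        n_at_risk -= total
        i += total
    return curve

# Five toy patients: deaths at t = 1, 2, 3; censoring at t = 2 and 4.
times = [1, 2, 2, 3, 4]
events = [1, 1, 0, 1, 0]
curve = kaplan_meier(times, events)
```

The log-rank test and Cox regression mentioned in the abstract then compare such curves between groups and adjust for covariates, respectively.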

Relevance:

30.00%

Publisher:

Abstract:

Aspects related to users' cooperative work are not considered in the traditional approach of software engineering, since the user is viewed independently of his/her workplace environment or group, with the individual model generalized to the study of the collective behavior of all users. This work proposes a software requirements process to address issues involving cooperative work in information systems in which coordination of the users' actions is distributed and communication among users occurs indirectly, through the data entered while using the software. To achieve this goal, this research uses concepts from ergonomics, the 3C cooperation model, awareness and software engineering. Action-research is used as the research methodology, applied in three cycles during the development of a corporate workflow system in a technological research company. This article discusses the third cycle, which corresponds to the process that deals with the refinement of the cooperative work requirements with the software in actual use in the workplace, where the inclusion of a computer system changes the users' workplace from face-to-face interaction to interaction mediated by the software. The results showed that a higher degree of user awareness of their own activities and of other system users contributes to a decrease in errors and in inappropriate use of the system.

Relevance:

30.00%

Publisher:

Abstract:

The behavior of composed Web services depends on the results of the invoked services; unexpected behavior of one of the invoked services can threaten the correct execution of an entire composition. This paper proposes an event-based approach to black-box testing of Web service compositions based on event sequence graphs, which are extended by facilities to deal not only with service behavior under regular circumstances (i.e., where cooperating services are working as expected) but also with their behavior in undesirable situations (i.e., where cooperating services are not working as expected). Furthermore, the approach can be used independently of the artifacts (e.g., Business Process Execution Language) or the type of composition (orchestration/choreography). A large case study, based on a commercial Web application, demonstrates the feasibility of the approach and analyzes its characteristics. Test generation and execution are supported by dedicated tools. In particular, the use of an enterprise service bus for test execution is noteworthy and differs from other approaches. The results of the case study suggest that the new approach can detect faults systematically, performing properly even with complex and large compositions. Copyright © 2012 John Wiley & Sons, Ltd.
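An event sequence graph models which events may legally follow which others; test generation then walks the graph to produce event sequences. A minimal sketch with a hypothetical shop-like graph (assumed example, and only the regular-behavior half; the paper additionally derives sequences for invalid event pairs to exercise faulty-service behavior):

```python
def edge_covering_sequences(graph, start, end):
    """Generate start-to-end event sequences that together cover
    every edge of the event sequence graph at least once.

    graph: dict mapping an event to the events allowed to follow it.
    Simplified DFS-based generation, shown only to illustrate the idea.
    """
    sequences = []
    covered = set()

    def dfs(node, path):
        if node == end and len(path) > 1:
            sequences.append(path)
            return
        for nxt in graph.get(node, []):
            if (node, nxt) not in covered:
                covered.add((node, nxt))
                dfs(nxt, path + [nxt])

    dfs(start, [start])
    return sequences

# Hypothetical composition: login, search, optionally buy, then logout.
graph = {
    "start": ["login"],
    "login": ["search"],
    "search": ["buy", "logout"],
    "buy": ["logout"],
}
seqs = edge_covering_sequences(graph, "start", "logout")
```

Each generated sequence becomes one test case; an execution harness (in the paper, driven through an enterprise service bus) fires the events against the composition and checks the observed behavior.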

Relevance:

30.00%

Publisher:

Abstract:

In electronic commerce, systems development is based on two fundamental types of models: business models and process models. A business model is concerned with value exchanges among business partners, while a process model focuses on operational and procedural aspects of business communication. Thus, a business model defines the what in an e-commerce system, while a process model defines the how. Business process design can be facilitated and improved by a method for systematically moving from a business model to a process model. Such a method would provide support for traceability, evaluation of design alternatives, and a seamless transition from analysis to realization. This work proposes a unified framework that can be used as a basis to analyze, interpret and understand the different concepts associated with different stages in e-commerce system development. In this thesis, we illustrate how UN/CEFACT's recommended metamodels for business and process design can be analyzed, extended and then integrated for the final solutions based on the proposed unified framework. Also, as an application of the framework, we demonstrate how process-modeling tasks can be facilitated in e-commerce system design. The proposed methodology, BP3, stands for Business Process Patterns Perspective. The BP3 methodology uses a question-answer interface to capture different business requirements from the designers. It is based on pre-defined process patterns, and the final solution is generated by applying the captured business requirements by means of a set of production rules to complete the inter-process communication among these patterns.
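The production-rule step described above can be pictured as matching the designers' question answers against rule conditions and collecting the patterns of the rules that fire. A minimal sketch; the rule format, question names, and pattern names are entirely hypothetical, not BP3's actual catalogue:

```python
def apply_rules(answers, rules):
    """Select process patterns by firing production rules against the
    captured question-answer session.

    answers: dict question -> answer captured from the designer.
    rules: list of (condition_dict, pattern) pairs; a rule fires when
    every condition answer matches the captured answers.
    """
    selected = []
    for cond, pattern in rules:
        if all(answers.get(q) == a for q, a in cond.items()):
            selected.append(pattern)
    return selected

# Hypothetical rules and answers, for illustration only.
rules = [
    ({"payment_before_delivery": "yes"}, "prepayment-pattern"),
    ({"payment_before_delivery": "no"}, "invoice-pattern"),
    ({"goods_type": "digital"}, "direct-download-pattern"),
]
answers = {"payment_before_delivery": "yes", "goods_type": "digital"}
patterns = apply_rules(answers, rules)
```

The selected patterns would then be wired together (the inter-process communication step the abstract mentions) to produce the final process model.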

Relevance:

30.00%

Publisher:

Abstract:

[ES] Webcam App is an application whose main social goal is to let people hold videoconferences through the web, free of charge and in a simple way. For its development, the elements HTML5 provides for multimedia support were of great use: <video> and <audio>. Two of the APIs implemented by WebRTC are also used for the real-time transmission of audio and video obtained from the webcam: MediaStream (getUserMedia) and RTCPeerConnection. To support this application, Node.js was chosen as the web server, since among its strengths is the ability to keep many connections open, an essential feature in a video-call application where thousands of users create and send connection requests simultaneously. To give the application a pleasant appearance and a usable, familiar environment, the Elgg CMS is used as the social-network framework; Elgg provides common functionality such as connecting with friends, sending messages, and sharing content. The Unified Software Development Process is used as the base methodology, allowing this work to be carried out in an organized way and producing development artifacts. As a result, an Open Source solution is obtained that serves as a model of real-time communication without the need to download, install, or update any third-party plug-in, and that demonstrates the reliability of systems based on HTML5 and WebRTC.

Relevance:

30.00%

Publisher:

Abstract:

Interactive theorem provers (ITPs for short) are tools whose final aim is to certify proofs written by human beings. To reach that objective they have to fill the gap between the high-level language used by humans for communicating and reasoning about mathematics and the lower-level language that a machine is able to "understand" and process. The user perceives this gap in terms of missing features or inefficiencies. The developer tries to accommodate the user's requests without increasing the already high complexity of these applications. We believe that satisfactory solutions can only come from a strong synergy between users and developers. We devoted most of our PhD to designing and developing the Matita interactive theorem prover. The software was born in the computer science department of the University of Bologna as the result of composing together all the technologies developed by the HELM team (to which we belong) for the MoWGLI project. The MoWGLI project aimed at giving accessibility through the web to the libraries of formalised mathematics of various interactive theorem provers, taking Coq as the main test case. The motivations for giving life to a new ITP are: • to study the architecture of these tools, with the aim of understanding the source of their complexity; • to exploit that knowledge to experiment with new solutions that, for backward compatibility reasons, would be hard (if not impossible) to test on a widely used system like Coq. Matita is based on the Curry-Howard isomorphism, adopting the Calculus of Inductive Constructions (CIC) as its logical foundation. Proof objects are thus, to some extent, compatible with the ones produced with the Coq ITP, which is itself able to import and process the ones generated using Matita. Although the systems have a lot in common, they share no code at all, and even most of the algorithmic solutions are different.
The thesis is composed of two parts, in which we respectively describe our experience as a user and as a developer of interactive provers. In particular, the first part is based on two different formalisation experiences: • our internship in the Mathematical Components team (INRIA), which is formalising the finite group theory required to attack the Feit-Thompson Theorem. To tackle this result, giving an effective classification of finite groups of odd order, the team adopts the SSReflect Coq extension, developed by Georges Gonthier for the proof of the four colours theorem; • our collaboration on the D.A.M.A. Project, whose goal is the formalisation of abstract measure theory in Matita, leading to a constructive proof of Lebesgue's Dominated Convergence Theorem. The most notable issues we faced, analysed in this part of the thesis, are the following: the difficulties arising when using "black box" automation in large formalisations; the impossibility for a user (especially a newcomer) to master the context of a library of already formalised results; the uncomfortable big-step execution of proof commands historically adopted in ITPs; the difficult encoding of mathematical structures with a notion of inheritance in a type theory without subtyping like CIC. In the second part of the manuscript many of these issues are analysed through the eyes of an ITP developer, describing the solutions we adopted in the implementation of Matita to solve these problems: integrated searching facilities to assist the user in handling large libraries of formalised results; a small-step execution semantics for proof commands; a flexible implementation of coercive subtyping allowing multiple inheritance with shared substructures; and automatic tactics, integrated with the searching facilities, that generate proof commands (and not only proof objects, usually kept hidden from the user), one of which is specifically designed to be user driven.
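The Curry-Howard isomorphism mentioned above identifies propositions with types and proofs with programs: a proof object in a CIC-based system like Matita or Coq is literally a well-typed term. As a minimal illustration (shown in Lean 4 syntax for familiarity, not Matita's concrete syntax), a proof of the commutativity of conjunction is just a pair-swapping function:

```lean
-- Under Curry-Howard, this proof object is a program on pairs:
-- it takes evidence ⟨hp, hq⟩ for p ∧ q and returns ⟨hq, hp⟩ for q ∧ p.
theorem and_swap (p q : Prop) : p ∧ q → q ∧ p :=
  fun ⟨hp, hq⟩ => ⟨hq, hp⟩
```

It is such proof terms, not the surface tactic scripts, that the abstract describes as being (to some extent) interchangeable between Matita and Coq.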

Relevance:

30.00%

Publisher:

Abstract:

[EN] A numerical model for the evaluation of solar radiation in different locations is presented. The solar radiation model is implemented taking into account the terrain surface using two-dimensional adaptive meshes of triangles that are constructed using a refinement/derefinement procedure in accordance with the variations of terrain surface and albedo. The selected methodology defines the terrain characteristics with a minimum number of points so that the computational cost is reduced for a given accuracy. The model can be used in atmospheric sciences as well as in other fields such as electrical engineering, since it allows the user to find the optimal location for maximum power generation in photovoltaic or solar thermal power plants...
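The refinement idea described above can be sketched in one dimension: split any interval where the terrain varies more than a tolerance, so points cluster where the surface changes and stay sparse where it is flat. A toy 1-D analogue under an assumed step-shaped terrain (the paper works on 2-D triangle meshes with a full refinement/derefinement procedure):

```python
def refine(xs, height, tol):
    """One refinement pass over a sorted 1-D mesh: insert the midpoint
    of any interval whose endpoint heights differ by more than tol."""
    out = [xs[0]]
    for a, b in zip(xs, xs[1:]):
        if abs(height(b) - height(a)) > tol:
            out.append((a + b) / 2.0)
        out.append(b)
    return out

def adapt(xs, height, tol, max_passes=10):
    """Refine until no interval exceeds tol or max_passes is reached
    (a discontinuous 'terrain' never converges, so the cap matters)."""
    for _ in range(max_passes):
        nxt = refine(xs, height, tol)
        if len(nxt) == len(xs):
            return nxt
        xs = nxt
    return xs

# A steep bump between x = 0.4 and 0.6 attracts points; the flat
# surroundings keep only the initial coarse nodes.
height = lambda x: 1.0 if 0.4 < x < 0.6 else 0.0
mesh = adapt([0.0, 0.25, 0.5, 0.75, 1.0], height, tol=0.5)
```

This is the cost-saving mechanism the abstract describes: accuracy is concentrated where the terrain (or albedo) varies, with a minimum number of points elsewhere.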