878 results for Unified User Experience Model
Abstract:
CONCLUSION: Our self-developed planning and navigation system has proven its capacity for accurate surgery on the anterior and lateral skull base. With the incorporation of augmented reality, image-guided surgery will evolve into 'information-guided surgery'. OBJECTIVE: Microscopic or endoscopic skull base surgery is technically demanding, and its outcome has a great impact on a patient's quality of life. The project aimed to develop and evaluate enabling navigation tools for surgical simulation, planning, training, education, and performance. This clinically applied technological research was complemented by a series of patients (n=406) who underwent anterior and lateral skull base procedures between 1997 and 2006. MATERIALS AND METHODS: Optical tracking technology was used for positional sensing of instruments. A newly designed dynamic reference base with specific registration techniques using a fine-needle pointer or ultrasound enables the surgeon to work with a target error of < 1 mm. An automatic registration assessment method, which provides the user with a color-coded fused representation of CT and MR images, indicates to the surgeon the location and extent of registration (in)accuracy. Integration of a small tracker camera mounted directly on the microscope permits an ergonomically advantageous way of working in the operating room. Additionally, guidance information (augmented reality) from multimodal datasets (CT, MRI, angiography) can be overlaid directly onto the surgical microscope view. The virtual simulator, as a training tool in endonasal and otological skull base surgery, provides an understanding of the anatomy as well as preoperative practice using real patient data. RESULTS: Using our navigation system, no major complications occurred, even though the series included difficult skull base procedures. Surgical outcomes improved compared with our control group without navigation and with the literature. Surgical time was reduced and more minimally invasive approaches became possible. According to the participants' questionnaires, the educational effect of the virtual simulator in our residency program was rated highly.
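The sub-millimetre target error reported above is the kind of figure a navigation system derives from fiducial-based registration between image space and patient space. As a generic, hedged illustration (not the authors' registration pipeline, and with invented fiducial coordinates), the sketch below aligns matched point sets with the standard Kabsch/SVD rigid registration and reports the resulting RMS fiducial error:

```python
# Generic sketch of point-based rigid registration (Kabsch/SVD) and the RMS
# fiducial error a navigation system might report. Not the authors' method;
# the fiducial coordinates below are invented for illustration.
import numpy as np

def rigid_register(src, dst):
    """Least-squares rotation R and translation t mapping src points onto dst."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)              # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

def rms_error(src, dst, R, t):
    """Root-mean-square residual after applying the fitted transform."""
    residual = (R @ np.asarray(src, float).T).T + t - np.asarray(dst, float)
    return float(np.sqrt((residual ** 2).sum(axis=1).mean()))

if __name__ == "__main__":
    image_pts = [[0, 0, 0], [50, 0, 0], [0, 60, 0], [0, 0, 40]]        # CT fiducials (mm)
    patient_pts = [[10.2, 5.1, 0.3], [60.1, 5.0, 0.2],
                   [9.9, 65.3, 0.1], [10.1, 4.9, 40.4]]                # digitized points (mm)
    R, t = rigid_register(image_pts, patient_pts)
    print(f"RMS fiducial error: {rms_error(image_pts, patient_pts, R, t):.2f} mm")
```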
Abstract:
Drug-induced respiratory depression is a common side effect of the agents used in anesthesia practice to provide analgesia and sedation. Depression of the ventilatory drive in the spontaneously breathing patient can lead to severe cardiorespiratory events and is considered a primary cause of morbidity. Reliable predictions of respiratory inhibition in the clinical setting would therefore provide a valuable means of improving the safety of drug delivery. Although multiple studies have investigated the regulation of breathing in humans, both in the presence and in the absence of ventilatory depressant drugs, a unified description of respiratory pharmacodynamics is not available. This study proposes a mathematical model of human metabolism and cardiorespiratory regulation that integrates several isolated physiological and pharmacological aspects of acute drug-induced ventilatory depression into a single theoretical framework. The description of respiratory regulation has a parsimonious yet comprehensive structure with substantial predictive capability. Simulations of the synergistic interaction of the hypercarbic and hypoxic respiratory drives and the global effect of drugs on the control of breathing are in good agreement with published experimental data. Besides providing clinically relevant predictions of respiratory depression, the model can also serve as a test bed to investigate issues of drug tolerability and dose finding/control under non-steady-state conditions.
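As a purely illustrative toy (not the model proposed in this study), the sketch below combines a threshold-linear hypercarbic drive, a CO2 gain that rises under hypoxia (the synergistic interaction mentioned above), and a Hill-type fractional drug depression of the chemoreflex drive; every parameter value is an assumption chosen only to make the example run:

```python
# Toy sketch, not the paper's model: minute ventilation as a function of
# arterial CO2/O2 tensions with a multiplicative drug-depression term.
# All parameter values are illustrative assumptions, not fitted data.

def minute_ventilation(paco2, pao2, drug_conc, c50=0.1, gamma=1.5):
    """Rough minute ventilation (L/min) under drug-induced depression.

    paco2, pao2 -- arterial CO2 / O2 partial pressures in mmHg
    drug_conc   -- effect-site drug concentration (arbitrary units)
    c50, gamma  -- assumed Hill parameters of the depressant effect
    """
    # Hypoxia steepens the CO2 response (synergistic interaction).
    co2_gain = 2.0 * (1.0 + 30.0 / max(pao2 - 30.0, 1.0))    # L/min per mmHg
    hypercarbic_drive = max(paco2 - 37.0, 0.0) * co2_gain     # ~37 mmHg threshold
    baseline = 5.0                                            # resting ventilation
    # Fractional depression of the chemoreflex drive (Emax/Hill form).
    depression = drug_conc ** gamma / (c50 ** gamma + drug_conc ** gamma)
    return baseline + hypercarbic_drive * (1.0 - depression)

if __name__ == "__main__":
    for dose in (0.0, 0.05, 0.2):
        print(dose, round(minute_ventilation(paco2=45, pao2=90, drug_conc=dose), 1))
```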
Abstract:
BACKGROUND: The sensory drive hypothesis predicts that divergent sensory adaptation in different habitats may lead to premating isolation upon secondary contact of populations. Speciation by sensory drive has traditionally been treated as a special case of speciation as a byproduct of adaptation to divergent environments in geographically isolated populations. However, if habitats are heterogeneous, local adaptation in the sensory systems may cause the emergence of reproductively isolated species from a single unstructured population. In polychromatic fishes, visual sensitivity might become adapted to local ambient light regimes and the sensitivity might influence female preferences for male nuptial color. In this paper, we investigate the possibility of speciation by sensory drive as a byproduct of divergent visual adaptation within a single initially unstructured population. We use models based on explicit genetic mechanisms for color vision and nuptial coloration. RESULTS: We show that in simulations in which the adaptive evolution of visual pigments and color perception are explicitly modeled, sensory drive can promote speciation along a short selection gradient within a continuous habitat and population. We assumed that color perception evolves to adapt to the modal light environment that individuals experience and that females prefer to mate with males whose nuptial color they are most sensitive to. In our simulations color perception depends on the absorption spectra of an individual's visual pigments. Speciation occurred most frequently when the steepness of the environmental light gradient was intermediate and dispersal distance of offspring was relatively small. In addition, our results predict that mutations that cause large shifts in the wavelength of peak absorption promote speciation, whereas we did not observe speciation when peak absorption evolved by stepwise mutations with small effect. CONCLUSION: The results suggest that speciation can occur where environmental gradients create divergent selection on sensory modalities that are used in mate choice. Evidence for such gradients exists from several animal groups, and from freshwater and marine fishes in particular. The probability of speciation in a continuous population under such conditions may then critically depend on the genetic architecture of perceptual adaptation and female mate choice.
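A minimal sketch of the mate-choice ingredient described above (not the authors' individual-based simulation): a Gaussian pigment sensitivity curve, an ambient-light gradient, and a female preference that simply equals her sensitivity to a male's nuptial color. The functional forms and all numbers are illustrative assumptions:

```python
# Toy sketch of sensory-drive mate choice, not the authors' simulation model.
import math

def sensitivity(lambda_max, wavelength, width=30.0):
    """Relative sensitivity of a pigment peaking at lambda_max (nm); assumed Gaussian."""
    return math.exp(-((wavelength - lambda_max) ** 2) / (2 * width ** 2))

def ambient_peak(position, gradient_steepness=50.0):
    """Modal wavelength of local light along a 0..1 spatial gradient (assumed linear)."""
    return 500.0 + gradient_steepness * position

def preference(female_lambda_max, male_color):
    """Females prefer the male whose nuptial color they are most sensitive to."""
    return sensitivity(female_lambda_max, male_color)

if __name__ == "__main__":
    # A female whose pigment matches the light at the shallow end of the gradient
    # choosing between a blue-shifted and a red-shifted male.
    female = ambient_peak(0.1)
    for male_color in (505.0, 560.0):
        print(male_color, round(preference(female, male_color), 3))
```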
Percutaneous autologous venous valve transplantation: short-term feasibility study in an ovine model
Abstract:
BACKGROUND: Limited experience with bioprosthetic venous valves percutaneously inserted into the femoral veins of 15 patients showed promising short-term but disappointing long-term results. Percutaneous autogenous venous valve (PAVV) transplantation was explored in an ovine model as a possible alternative treatment. METHODS: The PAVV consisted of a vein segment containing a valve that was attached to a stent template. The stent templates (n = 9) were designed and hand made in our research laboratory. They consist of two stainless steel square stents 13 or 15 mm in diameter to fit the ovine jugular veins (JV), which range from 10 to 15 mm in diameter. A valve-containing segment of JV was harvested and attached with sutures and barbs inside the stent template (n = 9). The valve devices were then manually folded, front loaded inside the 4 cm chamber of the 13F delivery sheath, and delivered into the contralateral JV by a femoral vein approach. Transplanted PAVVs were studied by immediate and 3-month venograms. Animals were euthanized at 3 months, and the jugular veins were harvested for in vitro angioscopic evaluation. RESULTS: PAVV transplantation was successful in all nine animals. Good valve function with no reflux was observed on immediate and 3-month venograms in eight valves. The maximal diameter of the transplanted JV ranged from 10.2 mm to 15.4 mm (mean 13.1 +/- 1.5 mm). Venoscopic examination revealed intact, flexible, nonthickened valve leaflets in eight specimens. One PAVV exhibited normal function of one leaflet only; the other cusp was accidentally cut during the transplantation procedure. All transplanted autologous valves were free of thrombus and incorporated into the vein wall of the host vessel. CONCLUSION: This study demonstrated that autogenous valve transplants remained patent and competent without long-term anticoagulation for up to 3 months. The percutaneous autogenous venous valve may in the future provide a minimally invasive treatment for patients with chronic deep venous insufficiency, but long-term studies are needed to document its continued patency and function.
Abstract:
Comments on an article by Kashima et al. (see record 2007-10111-001). In their target article, Kashima and colleagues try to show how a connectionist conceptualization of the self is best suited to capture the self's temporal and socio-culturally contextualized nature. They propose a new model and, to support it, conduct computer simulations of psychological phenomena whose importance for the self has long been clear, even if not formally modeled, such as imitation and the learning of sequence and narrative. As explicated when we advocated connectionist models as a metaphor for the self in Mischel and Morf (2003), we fully endorse the utility of such a metaphor, as these models have some of the processing characteristics necessary for capturing key aspects and functions of a dynamic cognitive-affective self-system. As elaborated in that chapter, we see their principal strength in the fact that connectionist models can take account of multiple simultaneous processes without invoking a single central control. All outputs reflect a distributed pattern of activation across a large number of simple processing units, the nature of which depends on (and changes with) the connection weights between the links and the satisfaction of mutual constraints across these links (Rumelhart & McClelland, 1986). This allows a simple account of why certain input features will at times predominate, while others take over on other occasions. (PsycINFO Database Record (c) 2008 APA, all rights reserved)
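For readers unfamiliar with this class of models, here is a minimal sketch (not Kashima et al.'s model, and with arbitrary weights and inputs) of simple units whose activations settle into a distributed pattern under mutual constraints encoded as connection weights:

```python
# Minimal constraint-satisfaction sketch; weights and inputs are arbitrary.
import numpy as np

def settle(weights, external_input, steps=50, rate=0.2):
    """Iteratively update unit activations until the pattern roughly stabilizes."""
    a = np.zeros(len(external_input))
    for _ in range(steps):
        net = weights @ a + external_input           # mutual constraints + input
        a = (1 - rate) * a + rate * np.tanh(net)     # gradual, bounded update
    return a

if __name__ == "__main__":
    # Three units: 0 and 1 support each other, 2 inhibits both.
    W = np.array([[0.0, 0.8, -0.5],
                  [0.8, 0.0, -0.5],
                  [-0.5, -0.5, 0.0]])
    print(np.round(settle(W, external_input=np.array([0.3, 0.0, 0.1])), 2))
```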
Abstract:
New user input system designs have resulted from developing technologies and specialized user demands. Conventional keyboard and mouse input devices still dominate in input speed, but other input mechanisms are demanded in special application scenarios. Touch screen and stylus input methods have been widely adopted by PDAs and smartphones. Reduced keypads are necessary for mobile phones. A new design trend explores applications requiring single-handed, even eyes-free, input on small mobile devices. This calls for as few keys on the input device as possible to make one-handed operation feasible. But representing many characters with fewer keys can make the input ambiguous. Accelerometers embedded in mobile devices provide opportunities to combine device movements with keys to disambiguate input signals. Recent research has explored this design space for text input. In this dissertation, an accelerometer-assisted single-key positioning input system is developed. It uses device tilt directions as input signals and maps their sequences to output characters and functions. A generic positioning model is developed as a guideline for designing positioning input systems. A calculator prototype and a text input prototype for the 4+1 (5-position) and 8+1 (9-position) positioning input systems are implemented using accelerometer readings on a smartphone. Users operate with one physical key, and feedback is audible. Controlled experiments are conducted to evaluate the feasibility, learnability, and design space of the accelerometer-assisted single-key positioning input system. This research can provide inspiration and innovative references for researchers and practitioners in positioning input design, applications of accelerometer readings, and the development of standard machine-readable sign languages.
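A hedged sketch of the general idea (not the dissertation's actual implementation): classify a two-axis accelerometer reading into one of the 8+1 positions and decode a sequence of positions, entered with a single key, into a character. The dead zone, axis convention, and letter table are invented for illustration:

```python
# Illustrative 8+1 positioning decoder; the real layout and thresholds differ.
import math

POSITIONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]   # plus "CENTER"

def classify_tilt(ax, ay, dead_zone=0.15):
    """Map a 2-axis tilt reading (in g) to one of the 9 positions."""
    if math.hypot(ax, ay) < dead_zone:
        return "CENTER"
    angle = math.degrees(math.atan2(ax, ay)) % 360          # 0 deg = tilt away (N)
    return POSITIONS[int((angle + 22.5) // 45) % 8]

# Hypothetical two-position sequences -> characters (not the thesis layout).
SEQUENCE_TABLE = {("N", "CENTER"): "a", ("N", "E"): "b", ("E", "CENTER"): "e"}

def decode(readings):
    """Turn (ax, ay) samples captured at successive key presses into a character."""
    positions = tuple(classify_tilt(ax, ay) for ax, ay in readings)
    return SEQUENCE_TABLE.get(positions, "?")

if __name__ == "__main__":
    print(decode([(0.0, 0.9), (0.05, 0.05)]))    # tilt away, then level -> "a"
```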
Abstract:
The demands on production and the associated costs of power generation from non-renewable resources are increasing at an alarming rate. Solar energy is one of the renewable resources with the potential to minimize this increase. Utilization of solar energy has been concentrated mainly on heating applications. Using solar energy in building cooling systems would contribute greatly to the goal of minimizing non-renewable energy use. The solar heating system research carried out by institutions such as the University of Wisconsin-Madison and the building heat flow modeling research conducted by Oklahoma State University can be used to develop and optimize solar cooling building systems. This research uses both approaches to develop Graphical User Interface (GUI) software for an integrated solar absorption cooling building model, capable of simulating and optimizing an absorption cooling system that uses solar energy as the main energy source to drive the cycle. The software was then put through a series of validation tests to verify its integrity. The tests were conducted on building cooling system data sets from similar applications around the world, and the outputs of the software matched the established experimental results from those data sets. Software developed by other research groups caters to advanced users; the software developed in this research is not only reliable in its code integrity but, through its integrated approach, also caters to new users. Hence, this dissertation aims to correctly model a complete building with an absorption cooling system in an appropriate climate as a cost-effective alternative to a conventional vapor compression system.
Abstract:
File system security is fundamental to the security of UNIX and Linux systems, since in these systems almost everything is in the form of a file. To protect system files and other sensitive user files from unauthorized access, certain security schemes are chosen and used by different organizations in their computer systems. A file system security model provides a formal description of a protection system. Each security model is associated with specified security policies which focus on one or more of the security principles: confidentiality, integrity, and availability. A security policy is not only about “who” can access an object, but also about “how” a subject can access an object. To enforce the security policies, each access request is checked against the specified policies to decide whether it is allowed or rejected. The current protection schemes in UNIX/Linux systems focus on access control. Besides the basic access control scheme of the system itself, which includes permission bits, the setuid and seteuid mechanisms, and the root account, there are other protection models, such as Capabilities, Domain Type Enforcement (DTE), and Role-Based Access Control (RBAC), supported and used in certain organizations. These models protect the confidentiality of the data directly; the integrity of the data is protected indirectly by only allowing trusted users to operate on the objects. The access control decisions of these models depend on either the identity of the user or the attributes of the process the user can execute, together with the attributes of the objects. Adoption of these sophisticated models has been slow; this is likely due to the enormous complexity of specifying controls over a large file system and the need for system administrators to learn a new paradigm for file protection. We propose a new security model: the file system firewall. It adapts the familiar network firewall protection model, used to control the data that flows between networked computers, to file system protection. This model can support access control decisions based on any system-generated attributes of the access requests, e.g., time of day. The access control decisions are not based on a single entity, such as the account in traditional discretionary access control or the domain name in DTE. In the file system firewall, access decisions are made on situations involving multiple entities. A situation is programmable with predicates on the attributes of the subject, the object, and the system, and the file system firewall specifies the appropriate actions for these situations. We implemented a prototype of the file system firewall on SUSE Linux. Preliminary performance tests on the prototype indicate that the runtime overhead is acceptable. We compared the file system firewall with TE in SELinux to show that the firewall model can accommodate many other access control models. Finally, we show the ease of use of the firewall model. When the firewall is restricted to a specified part of the system, all other resources are unaffected, which enables a relatively smooth adoption. This, and the fact that it is a familiar model to system administrators, will facilitate adoption and correct use. The user study we conducted on traditional UNIX access control, SELinux, and the file system firewall confirmed this: beginner users found the firewall easier to use and faster to learn than the traditional UNIX access control scheme and SELinux.
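A user-space illustration of the rule style described above (not the SUSE Linux kernel prototype): each rule is a predicate over subject, object, and system attributes, and the first matching rule decides. The attribute names and example rules are assumptions:

```python
# User-space sketch of firewall-style file access rules; attribute names and
# the rules themselves are invented for illustration.
from datetime import datetime

class Rule:
    def __init__(self, predicate, action):
        self.predicate = predicate       # callable(subject, obj, system) -> bool
        self.action = action             # "ALLOW" or "DENY"

RULES = [
    # Deny writes under /etc outside working hours, whoever asks.
    Rule(lambda s, o, sys: o["path"].startswith("/etc")
         and s["op"] == "write"
         and not (9 <= sys["hour"] < 17), "DENY"),
    # Allow owners to operate on their own files.
    Rule(lambda s, o, sys: s["uid"] == o["owner_uid"], "ALLOW"),
]
DEFAULT_ACTION = "DENY"

def check_access(subject, obj, system):
    """Return the action of the first rule whose situation matches."""
    for rule in RULES:
        if rule.predicate(subject, obj, system):
            return rule.action
    return DEFAULT_ACTION

if __name__ == "__main__":
    system = {"hour": datetime.now().hour}
    subject = {"uid": 1000, "op": "write"}
    obj = {"path": "/etc/passwd", "owner_uid": 0}
    print(check_access(subject, obj, system))
```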
Abstract:
BACKGROUND: Wheezing disorders in childhood vary widely in clinical presentation and disease course. In recent years, several ways of classifying wheezing children into different disease phenotypes have been proposed and are increasingly used for clinical guidance, but validation of these hypothetical entities is difficult. METHODOLOGY/PRINCIPAL FINDINGS: The aim of this study was to develop a testable disease model which reflects the full spectrum of wheezing illness in preschool children. We performed a qualitative study among a panel of 7 experienced clinicians from 4 European countries working in primary, secondary, and tertiary paediatric care. In a series of questionnaire surveys and structured discussions, we found a general consensus that preschool wheezing disorders consist of several phenotypes, with great heterogeneity of specific disease concepts between clinicians. Initially, 24 disease entities were described by the 7 physicians. In structured discussions, these could be narrowed down to three entities linked to proposed mechanisms: a) allergic wheeze, b) non-allergic wheeze due to structural airway narrowing, and c) non-allergic wheeze due to an increased immune response to viral infections. This disease model will serve to create an artificial dataset that allows the validation of data-driven multidimensional methods, such as cluster analysis, which have been proposed for the identification of wheezing phenotypes in children. CONCLUSIONS/SIGNIFICANCE: While there appears to be wide agreement among clinicians that wheezing disorders consist of several diseases, there is less agreement regarding their number and nature. A great diversity of disease concepts exists, but a unified phenotype classification reflecting underlying disease mechanisms is lacking. We propose a disease model which may help guide future research so that proposed mechanisms are measured at the right time and their role in disease heterogeneity can be studied.
Abstract:
Object-oriented modelling languages such as EMOF are often used to specify domain-specific meta-models. However, these modelling languages lack the ability to describe behavior or operational semantics. Several approaches have used a subset of Java mixed with OCL as an executable meta-language. In this experience report we show how we use Smalltalk as an executable meta-language in the context of the Moose reengineering environment. We present how we implemented EMOF and its behavioral aspects. Over the last decade we validated this approach by incrementally building a meta-described reengineering environment. Such an approach bridges the gap between a code-oriented view and a meta-model driven one. It avoids the creation of yet another language and reuses the infrastructure and run-time of the underlying implementation language. It offers a uniform way of letting developers focus on their tasks while at the same time allowing them to meta-describe their domain model. The advantage of our approach is that developers use the same tools and environment they use for their regular tasks. Still, the approach is not Smalltalk-specific but can be applied to any language offering an introspective API, such as Ruby, Python, CLOS, Java, and C#.
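Since the authors note that the approach carries over to any language with an introspective API, here is a hedged Python analogue (not the Moose/Smalltalk implementation): the same runtime holds an ordinary domain class and an EMOF-like meta-description built from it by introspection. The class and member names are invented:

```python
# Illustrative analogue only: meta-describing a domain class via the host
# language's introspection, so model and meta-model share one runtime.
import inspect

class Method:                       # tiny EMOF-like meta-elements
    def __init__(self, name, parameters):
        self.name, self.parameters = name, parameters

class MetaClass:
    def __init__(self, name, methods):
        self.name, self.methods = name, methods

def describe(cls):
    """Build a meta-description of a plain class using introspection."""
    methods = []
    for name, func in inspect.getmembers(cls, inspect.isfunction):
        params = [p for p in inspect.signature(func).parameters if p != "self"]
        methods.append(Method(name, params))
    return MetaClass(cls.__name__, methods)

class Invoice:                      # an ordinary domain class
    def add_line(self, product, quantity): ...
    def total(self): ...

if __name__ == "__main__":
    meta = describe(Invoice)
    for m in meta.methods:
        print(f"{meta.name}.{m.name}({', '.join(m.parameters)})")
```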
Abstract:
Enterprise applications are complex software systems that manipulate large amounts of persistent data and interact with the user through a vast and complex user interface. In particular, applications written for the Java 2 Platform, Enterprise Edition (J2EE) are composed using various technologies such as Enterprise Java Beans (EJB) or Java Server Pages (JSP), which in turn rely on languages other than Java, such as XML or SQL. In this heterogeneous context, applying existing reverse engineering and quality assurance techniques developed for object-oriented systems is not enough. Because those techniques were created to measure quality or provide information about a single aspect of J2EE applications, they cannot properly measure the quality of the entire system. We intend to devise techniques and metrics that measure quality in J2EE applications across all of their aspects and aid their evolution. Using software visualization, we also intend to inspect the structure of J2EE applications and any other aspects that can be investigated through this technique. To do so, we also need to create a unified meta-model including all the elements composing a J2EE application.
Abstract:
Can one observe an increasing level of individual lack of orientation because of rapid social change in modern societies? This question is examined using data from a representative longitudinal survey in Germany conducted in 2002–04. The study examines the role of education, age, sex, region (east/west), and political orientation for the explanation of anomia and its development. First we present the different sources of anomie in modern societies, based on the theoretical foundations of Durkheim and Merton, and introduce the different definitions of anomia, including our own cognitive version. Then we deduce several hypotheses from the theory, which we test by means of longitudinal data for the period 2002–04 in Germany using the latent growth curve model as our statistical method. The empirical findings show that all the sociodemographic variables, including political orientation, are strong predictors of the initial level of anomia. Regarding the development of anomia over time (2002–04), only the region (west) has a significant impact. In particular, the results of a multi-group analysis show that western German people with a right-wing political orientation become more anomic over this period. The article concludes with some theoretical implications.
Abstract:
The report examines the relationship between day care institutions, schools, and so-called “parents unfamiliar with education”, as well as the relationships between the institutions. Within Danish public and professional discourse, concepts like “parents unfamiliar with education” usually refer to environments, parents, or families with either no or only very limited experience of education beyond basic school (folkeskole). The “grand old man” of Danish educational research, Prof. Em. Erik Jørgen Hansen, defines the concept as follows: parents who are distant from or not familiar with education are parents without a tradition of education and are therefore unable to contribute constructively to supporting their own children during their education. Many teachers and pedagogues are not used to that term; they prefer concepts like “socially exposed” or “socially disadvantaged” parents, social classes, or strata. The report does not only focus on parents who are incapable of supporting the school achievements of their children, since a low level of education is usually connected with social disadvantage. Such parents are often not capable of understanding and meeting the demands of the school when sending their children to school; they lack the necessary competencies or capacity for action. At present, much attention is being paid by the Ministries of Education and Social Affairs (recently renamed the Ministry of Welfare) to creating equal opportunities for all children. Many kinds of experts (directorates, councils, researchers, etc.) have been more than eager to promote recommendations aimed at the ambitious goal that by 2015, 95% of all young people should complete a full education (classes 10-12). Research results point out the importance of increased parental participation; in other words, the agenda is set for 'parents' education'. It seems necessary to underline that Danish welfare policy has been changing rather radically. The classic model understood welfare as social assurance and/or social distribution, based on social solidarity. The modern model sees welfare as social service and/or social investment. This means that citizens are changing roles, from user and/or citizen to consumer and/or investor. The Danish state, in line with decisions taken by the government, is investing in a national future shaped by global competition. The new models of welfare, “service” and “investment”, imply far-reaching changes in hitherto familiar concepts of family life, the relationship between parents and children, and so on. As an example, the investment model points to a new configuration of the relationship between social rights and the rights of freedom. The service model has demonstrated the weakness that access to qualified services in the fields of health or education is becoming more and more dependent on private purchasing power. The weakness of the investment model is that it represents a sort of “the winner takes it all”, since a political majority is enabled to set agendas in societal fields formerly protected by tripartite power and the citizens' rights of freedom. The outcome of the Danish development seems to be the establishment of a politically governed public service industry which, on the one hand, is capable of competing on market conditions and, on the other, can be governed by contracts. This represents a new form of close linking of politics, economy, and professional work.
Attempts to control education, pedagogy, and thereby the population are not a recent invention; European history offers several such experiments. What is new is the linking of political priorities to the exercise of public activities through economic incentives. By defining visible goals for public servants, introducing the measurement of achievements and effects, and implementing a new wage policy dependent on achievements and/or effects, a new system of accountability is manufactured. The consequences are already perceptible. The government decides on special interventions concerning parents, children, or young people; public servants at the municipal level are instructed to carry out their services by following a manual; and parents are no longer protected by privacy. Protection of privacy and of minorities is no longer a valid argument against further interventions in people's lives (health, food, school, etc.). Citizens are becoming objects of investment, which also implies that people are investing in their own health, education, and family. This means that investments in lifestyle changes and in the development of competences go hand in hand. The programmes mentioned below are conditioned by this shift.
Abstract:
The three-step test is central to the regulation of copyright limitations at the international level. Delineating the room for exemptions with abstract criteria, the three-step test is by far the most important and comprehensive basis for the introduction of national use privileges. It is an essential, flexible element in the international limitation infrastructure that allows national law makers to satisfy domestic social, cultural, and economic needs. Given the universal field of application that follows from the test’s open-ended wording, the provision creates much more breathing space than the more specific exceptions recognized in international copyright law. EC copyright legislation, however, fails to take advantage of the flexibility inherent in the three-step test. Instead of using the international provision as a means to open up the closed EC catalogue of permissible exceptions, offer sufficient breathing space for social, cultural, and economic needs, and enable EC copyright law to keep pace with the rapid development of the Internet, the Copyright Directive 2001/29/EC encourages the application of the three-step test to further restrict statutory exceptions that are often defined narrowly in national legislation anyway. In the current online environment, however, enhanced flexibility in the field of copyright limitations is indispensable. From a social and cultural perspective, the web 2.0 promotes and enhances freedom of expression and information with its advanced search engine services, interactive platforms, and various forms of user-generated content. From an economic perspective, it creates a parallel universe of traditional content providers relying on copyright protection, and emerging Internet industries whose further development depends on robust copyright limitations. In particular, the newcomers in the online market – social networking sites, video forums, and virtual worlds – promise a remarkable potential for economic growth that has already attracted the attention of the OECD. Against this background, the time is ripe to debate the introduction of an EC fair use doctrine on the basis of the three-step test. Otherwise, EC copyright law is likely to frustrate important opportunities for cultural, social, and economic development. To lay the groundwork for the debate, the differences between the continental European and the Anglo-American approach to copyright limitations (section 1), and the specific merits of these two distinct approaches (section 2), will be discussed first. An analysis of current problems that have arisen under the present dysfunctional EC system (section 3) will then serve as a starting point for proposing an EC fair use doctrine based on the three-step test (section 4). In drawing conclusions, the international dimension of this fair use proposal will be considered (section 5).
Abstract:
This article outlines some of the issues involved in developing partnerships between service users, practitioners, and researchers. It discusses these in light of experience in Oslo as part of a national-level agreement (HUSK) to improve social services in Norway through research and knowledge development. It begins with a review of the main concepts and debates involved in developing collaborative partnerships for practice-based research, particularly in the social services arena. The HUSK program is then described. The article then traces some specific developments and challenges in negotiating partnership relations, as discussed by program participants (users, practitioners, and researchers) in a series of workshops designed to elicit the issues directly from their experience.