284 results for Application program interfaces (Software)
Abstract:
Stereotypes of salespeople are common currency in US media outlets, and research suggests that these stereotypes are uniformly negative. However, there is no reason to expect stereotypes to be consistent across cultures. The present paper provides the first empirical examination of salesperson stereotypes in an Asian country, specifically Taiwan. Using accepted psychological methods, Taiwanese salesperson stereotypes are found to be twofold: a negative stereotype quite congruent with existing US stereotypes, and a positive stereotype that may be related to the specific culture of Taiwan.
Abstract:
This paper describes an experiment undertaken to investigate intuitive interaction, particularly in older adults. Previous work has shown that intuitive interaction relies on past experience, and has also suggested that older people demonstrate fewer intuitive uses and slower completion times on set tasks with various devices. Similarly, this experiment showed that past experience with relevant products allowed people to use the interfaces of two different microwaves more quickly and intuitively. It also revealed that certain aspects of cognitive decline related to aging, such as central executive function, have more impact on time, correct uses and intuitive uses than chronological age does. Implications of these results are discussed.
Abstract:
Bronfenbrenner's Bioecological Model, expressed as the developmental equation D = f(PPCT), is the theoretical framework for two studies that bring together diverse strands of psychology to study the work-life interface of working adults. Occupational and organizational psychology focuses on the demands and resources of work and family, without emphasising the individual in detail. Health and personality psychology examine the individual, but without emphasis on the individual's work and family roles. The current research used Bronfenbrenner's theoretical framework to combine individual differences, work and family to understand how these factors influence the working adult's psychological functioning. Competent development has been defined as high well-being (measured as life satisfaction and psychological well-being) and high work engagement (as work vigour, work dedication and absorption in work), together with the absence of mental illness (as depression, anxiety and stress) and the absence of burnout (as emotional exhaustion, cynicism and professional efficacy). Studies 1 and 2 were linked, with Study 1 a cross-sectional survey and Study 2 a prospective panel study that followed on from the data used in Study 1. Participants were recruited from a university and from a large public hospital to take part in a 3-wave online study in which they completed identical surveys at 3-4 month intervals (N = 470 at Time 1 and N = 198 at Time 3). In Study 1, hierarchical multiple regressions were used to assess the effects of individual differences (Block 1, e.g. dispositional optimism, coping self-efficacy, perceived control of time, humour), work and family variables (Block 2, e.g. affective commitment, skill discretion, work hours, children, marital status, family demands) and the work-life interface (Block 3, e.g. direction and quality of spillover between roles, work-life balance) on the outcomes. There was a mosaic of predictors of the outcomes, with a group of seven that were the most frequent significant predictors and which represented the individual (dispositional optimism and coping self-efficacy), the workplace (skill discretion, affective commitment and job autonomy) and the work-life interface (negative work-to-family spillover and negative family-to-work spillover). Interestingly, gender and working hours were not important predictors. The effects of job social support (generally and for work-life issues), perceived control of time and egalitarian gender roles on the outcomes were mediated by negative work-to-family spillover, particularly for emotional exhaustion. Further, the effect of negative spillover on depression, anxiety and work engagement was moderated by the individual's personal and workplace resources. Study 2 modelled the longitudinal relationships between the group of the seven most frequent predictors and the outcomes. Using a set of non-nested models, the relative influences of concurrent functioning, stability and change over time were assessed. The modelling began with models at Time 1, which formed the basis for confirmatory factor analysis (CFA) to establish the underlying relationships between the variables and to calculate the composite variables for the longitudinal models. The CFAs were well fitting, requiring few modifications to ensure good fit. However, using burnout and work engagement together required additional analyses to resolve poor fit, with one factor (representing a continuum from burnout to work engagement) being the only acceptable solution.
Five different longitudinal models were investigated (the Well-Being, Mental Distress, Well-Being-Mental Health, Work Engagement and Integrated models), using differing combinations of the outcomes. The best fitting model for each was a reciprocal model trimmed of trivial paths. The strongest paths were the synchronous correlations and the paths within variables over time. The reciprocal paths were more variable, with weak to mild effects. There was evidence of gain and loss spirals between the variables over time, with a slight net gain in resources that may provide the mechanism for the accumulation of psychological advantage over a lifetime. The longitudinal models also showed that there are leverage points at which personal, psychological and managerial interventions can be targeted to bolster the individual and provide supportive workplace conditions that also minimise negative spillover. Bronfenbrenner's developmental equation was a useful framework for the current research, showing the importance of the person as central to the individual's experience of the work-life interface. By taking control of their own life, the individual can craft a life path that is most suited to their own needs. Competent developmental outcomes were most likely where the person was optimistic and had high self-efficacy, worked in a job to which they were attached and which allowed them to use their talents, and experienced little negative spillover between their work and family domains. In this way, individuals had greater well-being, better mental health and greater work engagement at any one time and across time.
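For readers less familiar with the blockwise design described in this abstract, the following is a minimal sketch of hierarchical multiple regression in Python. The predictor names mirror the abstract, but the data are synthetic placeholders, not the study's dataset, and the outcome model is an illustrative assumption.

```python
# Minimal sketch of blockwise (hierarchical) regression: predictors are added
# in blocks and the change in R^2 is inspected at each step. Data are synthetic
# placeholders, not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 470  # Time 1 sample size reported in the abstract
df = pd.DataFrame({
    "optimism": rng.normal(size=n),            # Block 1: individual differences
    "coping_self_efficacy": rng.normal(size=n),
    "skill_discretion": rng.normal(size=n),    # Block 2: work and family
    "affective_commitment": rng.normal(size=n),
    "neg_work_to_family": rng.normal(size=n),  # Block 3: work-life interface
    "neg_family_to_work": rng.normal(size=n),
})
# Illustrative outcome (e.g. life satisfaction) loosely driven by the blocks.
df["life_satisfaction"] = (0.4 * df["optimism"]
                           + 0.3 * df["skill_discretion"]
                           - 0.3 * df["neg_work_to_family"]
                           + rng.normal(scale=0.8, size=n))

blocks = [
    ["optimism", "coping_self_efficacy"],
    ["skill_discretion", "affective_commitment"],
    ["neg_work_to_family", "neg_family_to_work"],
]
predictors, prev_r2 = [], 0.0
for i, block in enumerate(blocks, start=1):
    predictors += block
    model = sm.OLS(df["life_satisfaction"],
                   sm.add_constant(df[predictors])).fit()
    print(f"Block {i}: R^2 = {model.rsquared:.3f} "
          f"(delta = {model.rsquared - prev_r2:.3f})")
    prev_r2 = model.rsquared
```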
Abstract:
This conference celebrates the passing of 40 years since the establishment of the Internet (dating this, presumably, to the first connection between two nodes on ARPANET in October 1969). For a gathering of media scholars such as this, however, it may be just as important not only to mark the first testing of the core technologies upon which much of our present‐day Net continues to build, but also to reflect on another recent milestone: the 20th anniversary of what is today arguably the chief interface through which billions around the world access and experience the Internet – the World Wide Web, launched by Tim Berners‐Lee in 1989.
Abstract:
Agile ridesharing aims to use the capabilities of social networks and mobile phones to help people share vehicles and travel in real time. However, the application of social networking technologies in local communities to address issues of personal transport faces significant design challenges. In this paper we describe an iterative design-based approach to exploring this problem and discuss findings from the use of an early prototype. The findings focus upon interaction, privacy and profiling. Our early results suggest that explicitly entering information such as ride data and personal profile data into formal fields for explicit computation of matches, as is done in many systems, may not be the best strategy. It might be preferable to support informal communication and negotiation with text search techniques.
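As an illustration of the text-search alternative the findings point toward, here is a minimal sketch of matching informal ride messages by token overlap. The messages, the Jaccard scoring rule and the function names are illustrative assumptions, not the prototype described in the paper.

```python
# Sketch: match free-text ride requests to offers by token overlap rather than
# by rigid form fields. Messages and scoring rule are illustrative only.
def tokens(text: str) -> set[str]:
    return {w.strip(".,!?").lower() for w in text.split() if w}

def match_score(request: str, offer: str) -> float:
    """Jaccard similarity between two informal ride messages."""
    a, b = tokens(request), tokens(offer)
    return len(a & b) / len(a | b) if a | b else 0.0

offers = [
    "Driving to the city campus tomorrow about 8am, 2 seats free",
    "Heading to the airport Friday evening, happy to share fuel",
]
request = "Anyone going to the city campus tomorrow morning?"
for offer in sorted(offers, key=lambda o: match_score(request, o), reverse=True):
    print(f"{match_score(request, offer):.2f}  {offer}")
```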
Abstract:
This paper investigates how software designers use their knowledge during the design process. The research is based on the analysis of the observational and verbal data from three software design teams generated during the conceptual stage of the design process. The knowledge captured from the analysis of the mapped design team data is utilized to generate descriptive models of novice and expert designers. These models contribute to a better understanding of the connections between, and integration of, designer variables, and to a better understanding of software design expertise and its development. The models are transferable to other domains.
Abstract:
"This column is distinguished from previous Impact columns in that it concerns the development tightrope between research and commercial take-up and the role of the LGPL in an open source workflow toolkit produced in a University environment. Many ubiquitous systems have followed this route, (Apache, BSD Unix, ...), and the lessons this Service Oriented Architecture produces cast yet more light on how software diffuses out to impact us all." Michiel van Genuchten and Les Hatton Workflow management systems support the design, execution and analysis of business processes. A workflow management system needs to guarantee that work is conducted at the right time, by the right person or software application, through the execution of a workflow process model. Traditionally, there has been a lack of broad support for a workflow modeling standard. Standardization efforts proposed by the Workflow Management Coalition in the late nineties suffered from limited support for routing constructs. In fact, as later demonstrated by the Workflow Patterns Initiative (www.workflowpatterns.com), a much wider range of constructs is required when modeling realistic workflows in practice. YAWL (Yet Another Workflow Language) is a workflow language that was developed to show that comprehensive support for the workflow patterns is achievable. Soon after its inception in 2002, a prototype system was built to demonstrate that it was possible to have a system support such a complex language. From that initial prototype, YAWL has grown into a fully-fledged, open source workflow management system and support environment
Abstract:
We have used a scanning tunneling microscope to manipulate heteroleptic phthalocyaninato, naphthalocyaninato, porphyrinato double-decker molecules at the liquid/solid interface between 1-phenyloctane solvent and graphite. We employed nano-grafting of phthalocyanines with eight octyl chains to place these molecules into a matrix of heteroleptic double-decker molecules; the overlayer structure is epitaxial on graphite. We have also used nano-grafting to place double-decker molecules in matrices of single-layer phthalocyanines with octyl chains. Rectangular scans with a scanning tunneling microscope at low bias voltage removed the adsorbed double-decker molecular layer and replaced the double-decker molecules with bilayer-stacked phthalocyanines from the phenyloctane solution. Single heteroleptic double-decker molecules with lutetium sandwiched between naphthalocyanine and octaethylporphyrin were decomposed with voltage pulses from the probe tip; the top octaethylporphyrin ligand was removed and the bottom naphthalocyanine ligand remained on the surface. A domain of decomposed molecules was formed within the double-decker molecular domain, and the boundary of the decomposed molecular domain self-cured to become rectangular. We demonstrated a molecular “sliding block puzzle” with cascades of double-decker molecules on the graphite surface.
Abstract:
This full-day workshop invites participants to consider the nexus where the interests of game design, the expectations of play and HCI meet: the game interface. Game interfaces seem different from the interfaces of other software, and a number of observations have been made about this difference. Shneiderman famously noticed that while most software designers are intent on following the tenets of the “invisible computer” and making access easy for the user, game interfaces are made for players: they embed challenge. Schell discusses a “strange” relationship between the player and the game, enabled by the interface, and user interface designers frequently opine that much can be learned from the design of game interfaces. So where does the game interface actually sit? Even more interesting is the question of whether the history of the relationship and subsequent expectations are now limiting the potential of game design as an expressive form. Recent innovations in I/O design, such as Nintendo’s Wii, Sony’s Move and Microsoft's Kinect, seem to usher in an age of physical player-enabled interaction, experience and embodied, engaged design. This workshop intends to cast light on this often mentioned and sporadically examined area and to establish a platform for new and innovative design in the field.
Abstract:
The self-assembling behavior and microscopic structure of zinc oxide nanoparticle Langmuir-Blodgett monolayer films were investigated for the case of zinc oxide nanoparticles coated with a hydrophobic layer of dodecanethiol. Evolution of nanoparticle film structure as a function of surface pressure (π) at the air-water interface was monitored in situ using Brewster’s angle microscopy, where it was determined that π=16 mN/m produced near-defect-free monolayer films. Transmission electron micrographs of drop-cast and Langmuir-Schaefer deposited films of the dodecanethiol-coated zinc oxide nanoparticles revealed that the nanoparticle preparation method yielded a microscopic structure that consisted of one-dimensional rodlike assemblies of nanoparticles with typical dimensions of 25 x 400 nm, encased in the organic dodecanethiol layer. These nanoparticle-containing rodlike micelles were aligned into ordered arrangements of parallel rods using the Langmuir-Blodgett technique.
Abstract:
A Simulink/Matlab control system model of a heavy vehicle (HV) suspension has been developed, providing a working model that can be used for future research. A working computer model is easier and cheaper to re-configure than a HV axle group installed on a truck; it presents less risk should something go wrong and allows more scope for variation and sensitivity analysis before embarking on further "real-world" testing. Empirical data recorded as the input and output signals of a HV suspension were used to develop the parameters for computer simulation of a linear time invariant system described by a second-order differential equation of the form a2*y''(t) + a1*y'(t) + a0*y(t) = u(t) (i.e. a "2nd-order" system). Feeding the recorded input signals into the computer model allowed its simulated output to be validated against the recorded output. The errors ranged from less than 1% to approximately 3% for any parameter, when comparing like-for-like inputs and outputs. The model is presented along with the results of the validation. This model will be used in future research in the QUT/Main Roads project Heavy vehicle suspensions – testing and analysis, particularly for a theoretical model of a multi-axle HV suspension with varying values of dynamic load sharing. Allowance will need to be made for the errors noted when using the computer models in this future work.
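As a rough illustration of the validation workflow described above, the sketch below simulates a generic second-order LTI system and compares its output to a (here synthetic) recorded signal. scipy stands in for Simulink, and all coefficients are placeholders rather than the paper's identified suspension parameters.

```python
# Sketch: simulate a generic 2nd-order LTI system and compare its output to a
# recorded signal. Coefficients and forcing are illustrative placeholders.
import numpy as np
from scipy import signal

# a2*y'' + a1*y' + a0*y = u  ->  transfer function 1 / (a2*s^2 + a1*s + a0)
a2, a1, a0 = 1.0, 0.8, 4.0
system = signal.TransferFunction([1.0], [a2, a1, a0])

t = np.linspace(0, 10, 1000)
u = np.where(t > 1.0, 1.0, 0.0)  # step input standing in for axle forcing
t_out, y_model, _ = signal.lsim(system, U=u, T=t)

# With real data, y_measured would be the recorded suspension output signal.
y_measured = y_model + np.random.default_rng(1).normal(scale=0.002,
                                                       size=y_model.shape)
rel_error = np.abs(y_model - y_measured).max() / np.abs(y_measured).max()
print(f"worst-case relative error: {rel_error:.1%}")
```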
Abstract:
A Geant4-based simulation tool has been developed to perform Monte Carlo modelling of a 6 MV Varian™ iX Clinac. The computer-aided design interface of Geant4 was used to accurately model the linac components, including the Millennium multi-leaf collimators (MLCs). The simulation tool was verified via simulation of standard commissioning dosimetry data acquired with an ionisation chamber in a water phantom. Verification of the MLC model was achieved by simulation of leaf leakage measurements performed using Gafchromic™ film in a solid water phantom. An absolute dose calibration capability was added by including a virtual monitor chamber in the simulation. Furthermore, a DICOM-RT interface was integrated with the application to allow the simulation of radiotherapy treatment plans. The ability of the simulation tool to accurately model leaf movements and doses at each control point was verified by simulation of a widely used intensity-modulated radiation therapy (IMRT) quality assurance (QA) technique, the chair test.
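As a sketch of the kind of plan data a DICOM-RT interface has to traverse, the snippet below reads MLC leaf positions at each control point of an RT plan using pydicom. The file path is a placeholder, and this stands apart from the Geant4 tool itself, which is written against Geant4's C++ API.

```python
# Sketch: walk a DICOM-RT plan and print MLC leaf positions per control point.
# The file path is a placeholder; attribute names follow the DICOM RT Plan
# module as exposed by pydicom.
import pydicom

plan = pydicom.dcmread("rtplan.dcm")  # placeholder path to an RT Plan file
for beam in plan.BeamSequence:
    print(f"Beam: {beam.BeamName}")
    for cp in beam.ControlPointSequence:
        weight = cp.CumulativeMetersetWeight
        for dev in getattr(cp, "BeamLimitingDevicePositionSequence", []):
            if dev.RTBeamLimitingDeviceType.startswith("MLC"):
                leaves = dev.LeafJawPositions  # paired bank A/B positions (mm)
                print(f"  CP weight {weight}: {len(leaves)} leaf positions")
```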
Abstract:
This paper considers the problem of building a software architecture for a human-robot team. The objective of the team is to build a multi-attribute map of the world by performing information fusion. A decentralized approach to information fusion is adopted to achieve the system properties of scalability and survivability. Decentralization imposes constraints on the design of the architecture and its implementation. We show how a Component-Based Software Engineering approach can address these constraints. The architecture is implemented using Orca – a component-based software framework for robotic systems. Experimental results from a deployed system comprising an unmanned air vehicle, a ground vehicle, and two human operators are presented. A section on lessons learned, which may be applicable to other distributed systems with complex algorithms, is included. We also compare Orca to the Player software framework in the context of distributed systems.
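The appeal of decentralized information fusion can be sketched in a few lines: in information (inverse-covariance) form, fusing independent estimates is simple addition, so no central fuser is required. The following is an illustrative sketch of that general idea, not Orca's API or the deployed system's fusion algorithm.

```python
# Sketch: information-form fusion decentralises naturally because, under
# independence assumptions, combining estimates is just summing (Y, y) pairs.
import numpy as np

def to_information(mean, cov):
    Y = np.linalg.inv(cov)   # information matrix
    return Y, Y @ mean       # information vector

def from_information(Y, y):
    cov = np.linalg.inv(Y)
    return cov @ y, cov

# Two platforms (e.g. a UAV and a ground vehicle) each estimate a 2-D feature.
est_uav    = to_information(np.array([10.2, 4.9]), np.diag([0.5, 0.5]))
est_ground = to_information(np.array([ 9.8, 5.3]), np.diag([0.2, 0.8]))

# Decentralised fusion: each node simply sums the information contributions.
Y_fused = est_uav[0] + est_ground[0]
y_fused = est_uav[1] + est_ground[1]
mean, cov = from_information(Y_fused, y_fused)
print("fused mean:", mean.round(3))
print("fused cov diag:", np.diag(cov).round(3))
```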
Abstract:
A significant proportion of the cost of software development is due to software testing and maintenance. This is in part the result of the inevitable imperfections due to human error, lack of quality during the design and coding of software, and the increasing need to reduce faults to improve customer satisfaction in a competitive marketplace. Given the cost and importance of removing errors, improvements in fault detection and removal can be of significant benefit. The earlier in the development process faults can be found, the less it costs to correct them and the less likely other faults are to develop. This research aims to make the testing process more efficient and effective by identifying those software modules most likely to contain faults, allowing testing efforts to be carefully targeted. This is done with machine learning algorithms, which use examples of fault-prone and not fault-prone modules to develop predictive models of quality. In order to learn the numerical mapping between module and classification, a module is represented in terms of software metrics. A difficulty in this sort of problem is sourcing software engineering data of adequate quality. In this work, data is obtained from two sources: the NASA Metrics Data Program and the open source Eclipse project. Feature selection is applied before learning, and a number of different feature selection methods are compared to find which work best. Two machine learning algorithms are applied to the data, Naive Bayes and the Support Vector Machine, and predictive results are compared to those of previous efforts and found to be superior on selected data sets and comparable on others. In addition, a new classification method is proposed, Rank Sum, in which a ranking abstraction is laid over bin densities for each class, and a classification is determined based on the sum of ranks over features. A novel extension of this method is also described, based on an observed polarising of points by class when Rank Sum is applied to training data to convert it into 2D rank sum space. SVM is applied to this transformed data to produce models whose parameters can be set according to trade-off curves to obtain a particular performance trade-off.
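The standard pipeline the thesis builds on (module metrics as features, feature selection, then Naive Bayes and SVM) can be sketched as follows. The data here are synthetic placeholders rather than the NASA MDP or Eclipse sets, and the novel Rank Sum classifier is not reproduced.

```python
# Sketch of the standard fault-proneness pipeline: software metrics as
# features, filter-based feature selection, then Naive Bayes and SVM.
# Synthetic data stands in for the NASA MDP / Eclipse metric sets.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for module-level metrics (LOC, complexity, coupling, ...);
# fault-prone modules are the rare class, as in real defect data.
X, y = make_classification(n_samples=600, n_features=20, n_informative=6,
                           weights=[0.85, 0.15], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for name, clf in [("Naive Bayes", GaussianNB()),
                  ("SVM", SVC(class_weight="balanced"))]:
    model = make_pipeline(StandardScaler(),
                          SelectKBest(f_classif, k=8),  # simple filter method
                          clf)
    model.fit(X_tr, y_tr)
    score = balanced_accuracy_score(y_te, model.predict(X_te))
    print(f"{name}: balanced accuracy = {score:.3f}")
```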