464 results for Textron, Inc.


Relevance:

10.00%

Publisher:

Abstract:

Design as seen from the designer's perspective is a series of amazing imaginative jumps or creative leaps. But design as seen by the design historian is a smooth progression or evolution of ideas that seem self-evident and inevitable after the event. Yet the next step is anything but obvious for the artist/creator/inventor/designer stuck at the point just before the creative leap. They know where they have come from and have a general sense of where they are going, but often do not have a precise target or goal. This is why it is misleading to talk of design as a problem-solving activity; it is better defined as a problem-finding activity. This has been very frustrating for those trying to assist the design process with computer-based, problem-solving techniques. By the time the problem has been defined, it has been solved. Indeed, the solution is often the very definition of the problem. Design must be creative, or it is mere imitation. But since this crucial creative leap seems inevitable after the event, the question must arise: can we find some way of searching the space ahead? Of course there are serious problems of knowing what we are looking for and of the vastness of the search space. It may be better to discard altogether the term "searching" in the context of the design process. Conceptual analogies such as search, search spaces and fitness landscapes aim to elucidate the design process. However, the vastness of the multidimensional spaces involved makes these analogies misguided, and they thereby further confound the issue. The term search becomes a misnomer, since it carries connotations that imply it is possible to find what you are looking for. In such vast spaces the term search must be discarded. Thus, any attempt at searching for the highest peak in the fitness landscape as an optimal solution is also meaningless. Furthermore, even the very existence of a fitness landscape is fallacious. Although alternatives in the same region of the vast space can be compared to one another, distant alternatives will stem from radically different roots and will therefore not be comparable in any straightforward manner (Janssen 2000). Nevertheless, we still have the tantalizing possibility that if a creative idea seems inevitable after the event, then the process might somehow be reversed. This may be as improbable as attempting to reverse time. A more helpful analogy is from nature, where it is generally assumed that the process of evolution is not long-term goal directed or teleological. Dennett points out a common misunderstanding of Darwinism: the idea that evolution by natural selection is a procedure for producing human beings. Evolution can have produced humankind by an algorithmic process without its being true that evolution is an algorithm for producing us. If we were to wind the tape of life back and run this algorithm again, the likelihood of "us" being created again is infinitesimally small (Gould 1989; Dennett 1995). Nevertheless, Mother Nature has proved a remarkably successful, resourceful, and imaginative inventor, generating a constant flow of incredible new design ideas to fire our imagination. Hence the current interest in the potential of the evolutionary paradigm in design. These evolutionary methods are frequently based on techniques such as evolutionary algorithms, which are usually thought of as search algorithms.
It is necessary to abandon such connections with searching and to see the evolutionary algorithm as a direct analogy with the evolutionary processes of nature. The process of natural selection can generate a wealth of alternative experiments, and the better ones survive. There is no one solution, there is no optimal solution, but there is continuous experiment. Nature is profligate with her prototyping and ruthless in her elimination of less successful experiments. Most importantly, nature has all the time in the world. As designers we cannot afford such profligate prototyping and ruthless experimentation, nor can we operate on the time scale of the natural design process. Instead we can use the computer to compress space and time and to perform virtual prototyping and evaluation before committing ourselves to actual prototypes. This is the hypothesis underlying the evolutionary paradigm in design (1992, 1995).
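
The generate-evaluate-select loop described above can be made concrete in a few lines of code. The Python sketch below is only an illustration of that loop under assumed details: the parameter-vector encoding of a "design", the placeholder evaluate() function and all constants are invented here and are not the authors' method.

```python
import random

POPULATION = 20
GENERATIONS = 50
MUTATION_RATE = 0.1

def random_design():
    # A candidate design reduced, for illustration, to a vector of parameters.
    return [random.uniform(-1.0, 1.0) for _ in range(8)]

def evaluate(design):
    # Placeholder: in practice this would be a virtual prototype assessed
    # against whatever criteria the designer currently cares about.
    return -sum(x * x for x in design)

def mutate(design):
    return [x + random.gauss(0, 0.1) if random.random() < MUTATION_RATE else x
            for x in design]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

population = [random_design() for _ in range(POPULATION)]
for _ in range(GENERATIONS):
    ranked = sorted(population, key=evaluate, reverse=True)
    survivors = ranked[: POPULATION // 2]          # ruthless elimination
    offspring = [mutate(crossover(random.choice(survivors),
                                  random.choice(survivors)))
                 for _ in range(POPULATION - len(survivors))]
    population = survivors + offspring             # continuous experiment
```

Note that the loop never declares a single optimum; it simply keeps a population of competing virtual prototypes alive and replaces the less successful ones, which is the sense in which the paragraph above prefers "continuous experiment" to "search".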

Relevance:

10.00%

Publisher:

Abstract:

Growth rods are commonly used for the treatment of scoliosis in the immature spine. Many variations have been proposed, but breakage of implants is a common problem. Growth rod insertion commonly involves large exposures at initial insertion followed by multiple smaller procedures for lengthening. We present our early experience using a percutaneous technique of insertion of a new titanium mobile-bearing implant (Medtronic Inc). The implant allows some rotatory motion in the middle of the construct, thereby reducing construct stresses and possibly the risk of rod breakage. Based on this small initial series with 12 months of follow-up, percutaneous insertion of growth rods using the new implant is a safe and reliable technique, although the infection rate in our sample was of note. This may be related to the titanium wear and inflammation seen in the soft tissues at the time of operation and visualised on histology. No implants have required removal due to infection, and all infections were treated with debridement at the next lengthening and suppressive antibiotics. Propionibacterium is one of the commonest infections seen with spinal implants and sometimes does not respond to simple antibiotic suppression. The technique allows preservation of the soft tissues until definitive fusion is needed and may lead to a decrease in hospital stay. The implant is low profile and seems to offer advantages over other systems on the market. Further follow-up is needed to look at longer-term outcomes with this new implant type.

Relevance:

10.00%

Publisher:

Abstract:

The measurement of Cobb angles from radiographs is routine practice in spinal clinics. The technique relies on the use and availability of specialist equipment such as a goniometer, cobbometer or protractor. The aim of this study was to validate the use of the iPhone (Apple Inc) combined with Tilt Meter Pro software, as compared to a protractor, in the measurement of Cobb angles. Between November 2008 and December 2008, 20 patients were selected at random from the Paediatric Spine Research Group's database. A power calculation indicated that, with n=240 measurements, the study had a 96% chance of detecting a 5 degree difference between groups. All patients had idiopathic scoliosis with a range of curve types and severities. The study found that the iPhone combined with Tilt Meter Pro software offers a faster alternative to the traditional method of Cobb angle measurement. The use of the iPhone offers a more convenient way of measuring Cobb angles in the outpatient setting. The intra-observer repeatability of the iPhone is equivalent to that of the protractor in the measurement of Cobb angles.
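
The abstract does not state how the two tilt readings are combined into a Cobb angle, so the sketch below should be read only as one common convention, assumed here for illustration: each endplate tilt is taken as a signed angle from the horizontal, and the Cobb angle is the angle between the two endplate lines.

```python
def cobb_angle(upper_endplate_tilt_deg: float, lower_endplate_tilt_deg: float) -> float:
    """Illustrative combination of two inclinometer readings into a Cobb angle.

    Assumes signed tilts measured from the horizontal, e.g. an upper endplate
    tilted +12.0 degrees and a lower endplate tilted -31.5 degrees.
    """
    return abs(upper_endplate_tilt_deg - lower_endplate_tilt_deg)

print(cobb_angle(12.0, -31.5))  # 43.5 degrees
```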

Relevance:

10.00%

Publisher:

Abstract:

This research explores gestures used in the context of activities in the workplace and in everyday life, in order to understand requirements and devise concepts for the design of gestural information appliances. A collaborative method of video interaction analysis devised to suit design explorations, the Video Card Game, was used to capture and analyse how gesture is used in the context of six different domains: the dentist's office; PDA and mobile phone use; the experimental biologist's laboratory; a city ferry service; a video cassette player repair shop; and a factory flowmeter assembly station. Findings are presented in the form of gestural themes, derived from the tradition of qualitative analysis but bearing some similarity to Alexandrian patterns. Implications for the design of gestural devices are discussed.

Relevance:

10.00%

Publisher:

Abstract:

This paper considers how the Internet can be used to leverage commercial sponsorships to enhance audience attitudes toward the sponsor. Definitions are offered that distinguish the terms leverage and activation with respect to sponsorship-linked marketing; leveraging encompasses all marketing communications collateral to the sponsorship investment, whereas activation relates to those communications that encourage interaction with the sponsor. Although activation in many instances may be limited to the immediate event-based audience, leveraging sponsorships via sponsors' Web sites enables activation at the mass-media audience level. Results of a Web site navigation experiment demonstrate that activational sponsor Web sites promote more favorable attitudes than do nonactivational Web sites. It is also shown that sponsor-sponsee congruence effects generalize to the online environment, and that the effects of sponsorship articulation on audience attitudes are moderated by the commerciality of the explanation for the sponsor-sponsee relationship. Importantly, the study reveals that attitudinal effects associated with variations in leveraging, congruence, and orientation of articulation may be sustained across time.

Relevance:

10.00%

Publisher:

Abstract:

Designers need to develop good observational skills in order to conduct user studies that reveal the subtleties of human interactions and adequately inform design activity. In this paper we describe a game format that we have used, in concert with wiki-web technology, to engage our IT and Information Environments students in developing much sharper observational skills. The Video Card Game is a method of video analysis that is suited to design practitioners as well as to researchers. It uses the familiar format of a card game, similar to "Happy Families", to help students develop themes of interactions from watching video clips. Students then post their interaction themes on wiki-web pages, which allows the teaching team and other students to edit and comment on them. We found that the tangible (cards), game, role-playing and sharing aspects of this method led to a much larger amount of interaction and discussion between student groups and between students and the teaching team than we have achieved using our traditional teaching methods, while taking no more time on the part of the teaching staff. The quality of the resulting interaction themes indicates that this method fosters the development of observational skills. In the paper we describe the motivations, method and results in full. We also describe the research context in which we collected the videotape data, and how this method relates to state-of-the-art research methods in interaction design for ubiquitous computing technology.

Relevance:

10.00%

Publisher:

Abstract:

Designers and artists have integrated recent advances in interactive, tangible and ubiquitous computing technologies to create new forms of interactive environments in the domains of work, recreation, culture and leisure. Many designs of technology systems begin with the workplace in mind, with function, ease of use, and efficiency high on the list of priorities [1]. These priorities do not fit well with works designed for an interactive art environment, where the aims are many, and where utility and functionality serve to support a playful, ambiguous or even experimental experience for the participants. Evaluating such works requires an integration of art-criticism techniques with more recent Human-Computer Interaction (HCI) methods, and an understanding of the different nature of engagement in these environments. This paper begins a process of mapping a set of priorities for amplifying engagement in interactive art installations. I first define the concept of ludic engagement and its usefulness as a lens for both design and evaluation in these settings. I then detail two fieldwork evaluations I conducted within two exhibitions of interactive artworks, and discuss their outcomes and the future directions of this research.

Relevance:

10.00%

Publisher:

Abstract:

Objective: To compare the effectiveness of the STRATIFY falls tool with nurses' clinical judgments in predicting patient falls. Study Design and Setting: A prospective cohort study was conducted among the inpatients of an acute tertiary hospital. Participants were patients over 65 years of age admitted to any hospital unit. Sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of the instrument and of nurses' clinical judgments in predicting falls were calculated. Results: Seven hundred and eighty-eight patients were screened and followed up during the study period. The fall prevalence was 9.2%. Of the 335 patients classified as being "at risk" of falling using the STRATIFY tool, 59 (17.6%) did sustain a fall (sensitivity = 0.82, specificity = 0.61, PPV = 0.18, NPV = 0.97). Nurses judged that 501 patients were at risk of falling and, of these, 60 (12.0%) fell (sensitivity = 0.84, specificity = 0.38, PPV = 0.12, NPV = 0.96). The STRATIFY tool correctly identified significantly more patients as either fallers or nonfallers than the nurses (P = 0.027). Conclusion: Considering the poor specificity and high rates of false-positive results for both the STRATIFY tool and nurses' clinical judgments, we conclude that neither of these approaches is useful for the screening of falls in acute hospital settings.
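
For readers who want to see how the reported figures fit together, the sketch below recomputes the four screening metrics from a 2x2 contingency table. The cell counts used are back-calculated from the abstract's figures (788 screened, about 9.2% fall prevalence, 335 flagged at risk, 59 true positives) and are therefore approximate assumptions, not data taken from the paper.

```python
def screening_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard 2x2 screening-test metrics."""
    return {
        "sensitivity": tp / (tp + fn),   # fallers correctly flagged
        "specificity": tn / (tn + fp),   # non-fallers correctly cleared
        "ppv": tp / (tp + fp),           # flagged patients who actually fell
        "npv": tn / (tn + fn),           # cleared patients who did not fall
    }

# Approximate cell counts back-calculated from the abstract's STRATIFY figures.
stratify = screening_metrics(tp=59, fp=276, fn=13, tn=440)
print(stratify)  # roughly sensitivity 0.82, specificity 0.61, PPV 0.18, NPV 0.97
```

Running the sketch reproduces the reported STRATIFY values to within rounding.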

Relevance:

10.00%

Publisher:

Abstract:

Sequences of two chloroplast photosystem genes, psaA and psbB, together comprising about 3,500 bp, were obtained for all five major groups of extant seed plants and several outgroups among other vascular plants. Strongly supported, but significantly conflicting, phylogenetic signals were obtained in parsimony analyses from partitions of the data into first and second codon positions versus third positions. In the former, both genes agreed on monophyletic gymnosperms, with Gnetales closely related to certain conifers. In the latter, Gnetales are inferred to be the sister group of all other seed plants, with gymnosperms paraphyletic. None of the data supported the modern "anthophyte hypothesis," which places Gnetales as the sister group of flowering plants. A series of simulation studies were undertaken to examine the error rate for parsimony inference. Three kinds of errors were examined: random error, systematic bias (both properties of finite data sets), and statistical inconsistency owing to long-branch attraction (an asymptotic property). Parsimony reconstructions were extremely biased for third-position data for psbB. Regardless of the true underlying tree, a tree in which Gnetales are sister to all other seed plants was likely to be reconstructed for these data. None of the combinations of genes or partitions permits the anthophyte tree to be reconstructed with high probability. Simulations of progressively larger data sets indicate the existence of long-branch attraction (statistical inconsistency) for third-position psbB data if either the anthophyte tree or the gymnosperm tree is correct. This is also true for the anthophyte tree using either psaA third positions or psbB first and second positions. A factor contributing to bias and inconsistency is extremely short branches at the base of the seed plant radiation, coupled with extremely high rates in Gnetales and non-seed-plant outgroups. (M. J. Sanderson, M. F. Wojciechowski, J.-M. Hu, T. Sher Khan, and S. G. Brady)
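
The parsimony analyses referred to above minimise the number of character-state changes required on a tree. As a purely illustrative aside (not the authors' pipeline, which would have used full phylogenetic software), the sketch below implements Fitch's small-parsimony count for a single aligned site on a fixed, rooted binary tree; the toy tree and character states are invented.

```python
# Fitch small-parsimony: minimum number of state changes needed to explain
# the observed states at the tips of a fixed, rooted binary tree.

def fitch(node, states, changes=None):
    """node is either a leaf name (str) or a (left, right) tuple of subtrees."""
    if changes is None:
        changes = [0]
    if isinstance(node, str):                 # leaf: its state set is fixed
        return {states[node]}, changes
    left_set, _ = fitch(node[0], states, changes)
    right_set, _ = fitch(node[1], states, changes)
    common = left_set & right_set
    if common:                                # intersection: no change needed here
        return common, changes
    changes[0] += 1                           # union: one additional change
    return left_set | right_set, changes

# Toy example: tree ((A,B),(C,D)) with one nucleotide site per taxon.
tree = (("A", "B"), ("C", "D"))
site = {"A": "G", "B": "G", "C": "T", "D": "G"}
_, changes = fitch(tree, site)
print(changes[0])  # 1 change suffices for this site on this tree
```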

Relevance:

10.00%

Publisher:

Abstract:

The security of strong designated verifier (SDV) signature schemes has thus far been analyzed only in a two-user setting. We observe that security in a two-user setting does not necessarily imply the same in a multi-user setting for SDV signatures. Moreover, we show that existing security notions do not adequately model the security of SDV signatures even in a two-user setting. We then propose revised notions of security in a multi-user setting and show that no existing scheme satisfies these notions. A new SDV signature scheme is then presented and proven secure under the revised notions in the standard model. For the purpose of constructing the SDV signature scheme, we propose a one-pass key establishment protocol in the standard model, which is of independent interest in itself.
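
To make the setting concrete, the sketch below gives one possible interface for a strong designated verifier signature scheme. It captures only the shape of the primitive (verification requires the designated verifier's secret key, and the verifier can simulate signatures), and is an assumed illustration, not the scheme proposed in the paper.

```python
from abc import ABC, abstractmethod
from typing import Tuple

class SDVSignatureScheme(ABC):
    """Interface sketch for a strong designated verifier (SDV) signature.

    Illustrative only: the signer designates one verifier per signature, the
    designated verifier's secret key is needed to verify, and the verifier can
    simulate signatures indistinguishable from real ones (non-transferability).
    """

    @abstractmethod
    def keygen(self) -> Tuple[bytes, bytes]:
        """Return a (public_key, secret_key) pair for a user."""

    @abstractmethod
    def sign(self, signer_sk: bytes, verifier_pk: bytes, message: bytes) -> bytes:
        """Sign a message for one designated verifier."""

    @abstractmethod
    def verify(self, verifier_sk: bytes, signer_pk: bytes,
               message: bytes, signature: bytes) -> bool:
        """Verification requires the designated verifier's secret key."""

    @abstractmethod
    def simulate(self, verifier_sk: bytes, signer_pk: bytes, message: bytes) -> bytes:
        """The verifier can produce signatures indistinguishable from real ones."""
```

Even at this interface level the multi-user concern is visible: sign() binds each signature to one signer-verifier pair, so a security argument that only ever considers a single such pair can miss interactions that arise when many signers and verifiers share the system.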

Relevance:

10.00%

Publisher:

Abstract:

In 1997, business trend analyst Linda Stone proposed the term "continuous partial attention" to characterise the contemporary experience of wanting to be "a live node on the network". She argued that while it can be a positive and functional behaviour, it also has the potential to be disabling, compromising reflective and creative thought. Subsequent studies have explored the ways in which technology has slowly disrupted the idea and experience of a "centred" and "bounded" self. Studies of "Gen Y" show the ease with which young people accommodate this multiplying of the self as they negotiate their partial friendships and networks of interest with family and work. In teaching and learning circles in tertiary education we talk a lot about problems of student "disengagement". In characterising our challenge this way, are we undermining our potential to understand the tendencies of contemporary learners? This paper begins a consideration of how traditional models, frameworks and practices might oppose these partially engaged but continuously connected and interpersonal "dividuals". What questions does this provoke for learning environments that seek to harness, yet counterpoint, the crises students might experience, and to recognise but also integrate students' multiple selves towards what they aim to become through the process of learning?

Relevance:

10.00%

Publisher:

Abstract:

The career development literature published in 2008 is summarized and presented thematically: (a) professional issues, (b) career assessment, (c) career development, (d) career theory and concepts, (e) career interventions, (f) advances in technology, (g) employment, (h) international perspectives, and (i) research design and methodology. Traditional and emerging theories and practices are robust and vibrant.

Relevance:

10.00%

Publisher:

Abstract:

In 2003, the "ICT Curriculum Integration Performance Measurement Instrument" was developed from an extensive review of the contemporary international and Australian research pertaining to the definition and measurement of ICT curriculum integration in classrooms (Proctor, Watson, & Finger, 2003). The 45-item instrument that resulted was based on theories and methodologies identified by the literature review. This paper describes psychometric results from the subsequent large-scale evaluation of the instrument, as recommended by Proctor, Watson, and Finger (2003). The resultant 20-item, two-factor instrument, now called "Learning with ICTs: Measuring ICT Use in the Curriculum," is both statistically and theoretically robust. This paper should be read in association with the original paper published in Computers in the Schools (Proctor, Watson, & Finger, 2003), which described in detail the theoretical framework underpinning the development of the instrument.
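
The abstract does not specify which psychometric statistics were reported, so the sketch below is only a generic example of the kind of internal-consistency check applied when reducing and validating a multi-item, multi-factor instrument; the function name, the simulated responses and the 10/10 item split are assumptions for illustration.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    n_items = item_scores.shape[1]
    sum_item_variances = item_scores.var(axis=0, ddof=1).sum()
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (n_items / (n_items - 1)) * (1 - sum_item_variances / total_variance)

# Hypothetical responses to a 20-item instrument split into two 10-item factors.
# Random data is used only so the sketch runs; real item responses would be
# correlated within each factor and give substantially higher alphas.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(200, 20))        # Likert-style 1-5
factor_1, factor_2 = responses[:, :10], responses[:, 10:]
print(cronbach_alpha(factor_1), cronbach_alpha(factor_2))
```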

Relevance:

10.00%

Publisher:

Abstract:

Contemporary debates on the role of journalism in society continue the tradition of downplaying the role of proactive journalism, generally situated under the catchphrase of the Fourth Estate, in public policy making. This paper puts the case for the retention of a notion of a proactive form of journalism, which can be broadly described as "investigative", because it is important to the public policy process in modern democracies. It argues that critiques that downplay the potential of this form of journalism are flawed and overly deterministic. Finally, it seeks to illustrate how journalists can proactively inquire in ways that are relevant to the lives of people in a range of settings, and that question elite sources in the interests of those people.