992 results for 48-403
Abstract:
Germline mutations within the cyclin-dependent kinase inhibitor 2A (CDKN2A) gene and one of its targets, the cyclin-dependent kinase 4 (CDK4) gene, have been identified in a proportion of melanoma kindreds. In the case of CDK4, only one specific mutation, resulting in the substitution of a cysteine for an arginine at codon 24 (R24C), has been found to be associated with melanoma. We have previously reported the identification of germline CDKN2A mutations in 7/18 Australian melanoma kindreds and the absence of the R24C CDK4 mutation in 21 families lacking evidence of a CDKN2A mutation. The current study represents an expansion of these efforts and includes a total of 48 melanoma families from Australia. All of these families have now been screened for mutations within CDKN2A and CDK4, as well as for mutations within the CDKN2A homolog and 9p21 neighbor, the CDKN2B gene, and the alternative exon 1 (E1beta) of CDKN2A. Families lacking CDKN2A mutations, but positive for a polymorphism(s) within this gene, were further evaluated to determine whether their disease was associated with transcriptional silencing of one CDKN2A allele. Overall, CDKN2A mutations were detected in 3/30 (10%) of the new kindreds. Two of these mutations have been observed previously: a 24 bp duplication at the 5' end of the gene and a G to C transversion in exon 2 resulting in an M53I substitution. A novel G to A transition in exon 2, resulting in a D108N substitution, was also detected. Combined with our previous findings, we have now detected germline CDKN2A mutations in 10/48 (21%) of our melanoma kindreds. In none of the 'CDKN2A-negative' families was melanoma found to segregate with an untranscribed CDKN2A allele, an R24C CDK4 mutation, a CDKN2B mutation, or an E1beta mutation. The last three observations suggest that these other cell cycle control genes (or alternative gene products) are either not involved at all, or not to any great extent, in melanoma predisposition.
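To make the mutation notation above concrete, the following minimal sketch (an illustration only, not analysis code from the study; the codons shown are standard genetic-code codons assumed for illustration, not verified CDKN2A codon usage) shows how a G to A transition can convert an aspartate codon into an asparagine codon (D108N) and how a G to C transversion can convert a methionine codon into an isoleucine codon (M53I).

```python
# Minimal, illustrative sketch of how the point mutations named in the abstract
# read at the codon level. Standard-code codons; the specific codons used by
# CDKN2A are an assumption made for illustration only.

CODON_TABLE = {
    "GAC": "Asp (D)", "GAT": "Asp (D)",
    "AAC": "Asn (N)", "AAT": "Asn (N)",
    "ATG": "Met (M)",
    "ATC": "Ile (I)", "ATA": "Ile (I)", "ATT": "Ile (I)",
}

def mutate(codon: str, pos: int, base: str) -> str:
    """Return the codon with the base at 0-based position pos replaced."""
    return codon[:pos] + base + codon[pos + 1:]

# D108N: a G-to-A transition at the first codon position (GAC -> AAC).
assert CODON_TABLE[mutate("GAC", 0, "A")] == "Asn (N)"

# M53I: a G-to-C transversion at the third codon position (ATG -> ATC).
assert CODON_TABLE[mutate("ATG", 2, "C")] == "Ile (I)"

print("GAC ->", mutate("GAC", 0, "A"), "| ATG ->", mutate("ATG", 2, "C"))
```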
Abstract:
The 48-hour game making challenge started out in 2007 as a creative community event. We have run the event each year since and have seen over 120 games made. 2011 was the most remarkable in that each of the 20 teams made a playable game – the shape of the challenge has changed …. We have invested in a process of reflective practice and action research, with the event forming part of a sweep of programs that inform this research. Each year gives us fresh insights into the creative practice and into the essential concerns, processes and trends of the independent games industry creative community, which we then respond to in our curatorial development of the subsequent programming. The 2011 48-hour challenge research project focused on the people and the site. We were specifically interested in the manner in which the community occupied the creative space.
Abstract:
We have always felt that “something very special” was happening in the 48hr and other similar game jams. This “something” is more than the intensity and challenge of the experience, although this certainly has appeal for the participants. We had an intuition that these intense 48 hour game jams exposed something pertinent to the changing shape of the Australian games industry, where we see the demise of the late 20th century large studio (the “Night Elf” model) and the growth of the small independent model. There are a large number of wider economic and cultural factors around this evolution, but our interest is specifically in the change from “industry” to “creative industry” and the growth of games as a cultural media and art practice. If we are correct in our intuition, then illuminating this something also has important ramifications for those courses which teach game and interaction design and development. Rather than undertake a formal ethnomethodological approach, we decided to track as many of the actors in the event as possible. We documented the experience (Keith Novak’s beautiful B&W photography), the individual and their technology (IOGraph mouse tracking), the teams as a group (time-lapse photography) and movement throughout the whole space (Bluetooth phone tracking). The raw data collected has given us the opportunity to start a commentary on the “something special” happening in the 48hr.
Abstract:
Persistent digital hyperthermia, presumably due to vasodilation, occurs during the developmental and acute stages of insulin-induced laminitis. The objective of this study was to determine whether persistent digital hyperthermia is the principal pathogenic mechanism responsible for the development of laminitis. The potent vasodilator ATP-MgCl2 was infused continuously into the distal phalanx of the left forefoot of six Standardbred racehorses for 48 h via intra-osseous infusion to promote persistent digital hyperthermia. The right forefoot was infused with saline solution and acted as an internal control. Clinical signs of lameness at the walk were not detected at 0 h, 24 h or 48 h post-infusion. Mean ± SE hoof wall temperatures of the left forefoot (29.4 ± 0.25 °C) were higher (P < 0.05) than those of the right (27.5 ± 0.38 °C). Serum insulin (15.0 ± 2.89 μIU/mL) and blood glucose (5.4 ± 0.22 mM) concentrations remained unchanged during the experiment. Histopathological evidence of laminitis was not detected in any horse. The results demonstrated that digital vasodilation up to 30 °C for a period of 48 h does not trigger laminitis in the absence of hyperinsulinaemia. Thus, although digital hyperthermia may play a role in the pathogenesis of laminitis, it is not the sole mechanism involved.
Abstract:
Background: Right-to-left shunting via a patent foramen ovale (PFO) has a recognized association with embolic events in younger patients. The use of agitated saline contrast imaging (ASCi) for detecting atrial shunting is well documented; however, the optimal technique is not well described. The purpose of this study is to assess the efficacy and safety of ASCi via TTE for assessment of right-to-left atrial communication in a large cohort of patients. Method: A retrospective review was undertaken of 1162 consecutive transthoracic (TTE) ASCi studies, of which 195 had also undergone clinically indicated transesophageal echocardiography (TEE). ASCi shunt results were compared with color flow imaging (CFI) and the role of provocative maneuvers (PM) assessed. Results: 403 TTE studies (35%) had paradoxical shunting seen during ASCi. Of these, 48% were positive with PM only. There was strong agreement between TTE ASCi and reported TEE findings (99% sensitivity, 85% specificity), with six false positive and two false negative results. In hindsight, the latter were likely due to suboptimal right atrial opacification, and the former due to transpulmonary shunting. TTE CFI was found to be insensitive (22%) for the detection of a PFO compared with TTE ASCi. Conclusions: TTE ASCi is minimally invasive and highly accurate for the detection of right-to-left atrial communication when PM are used. TTE CFI was found to be insensitive for PFO screening. It is recommended that TTE ASCi be considered the initial diagnostic tool for the detection of PFO in clinical practice. A dedicated protocol should be followed to ensure adequate agitated saline contrast delivery and performance of provocative maneuvers.
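As a reminder of how the agreement figures above are defined, the sketch below computes sensitivity and specificity from a 2x2 comparison against the TEE reference; the counts used are hypothetical placeholders chosen for illustration, not the study's actual breakdown.

```python
# Minimal sketch of how sensitivity and specificity are derived when TTE ASCi
# is compared against TEE as the reference standard. The counts below are
# hypothetical placeholders, not the study's actual 2x2 table.

def sensitivity(tp: int, fn: int) -> float:
    """True positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

# Hypothetical example counts
tp, fn, tn, fp = 199, 2, 34, 6

print(f"sensitivity = {sensitivity(tp, fn):.2f}")
print(f"specificity = {specificity(tn, fp):.2f}")
```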
Abstract:
The average structure (C1̄) of a volcanic plagioclase megacryst with composition An48, from the Hogarth Ranges, Australia, has been determined using three-dimensional, single-crystal neutron and X-ray diffraction data. Least-squares refinements, incorporating anisotropic thermal motion of all atoms and an extinction correction, resulted in weighted R factors (based on intensities) of 0.076 and 0.056, respectively, for the neutron and X-ray data. Very weak e reflections could be detected in long-exposure X-ray and electron diffraction photographs of this crystal, but the refined average structure is believed to be unaffected by the presence of such a weak superstructure. The ratio of the scattering power of Na to that of Ca is different for X-ray and neutron radiation, and this radiation dependence of scattering power has been used to determine the distribution of Na and Ca over a split-atom M site (two sites designated M' and M'') in this An48 plagioclase. Relative peak-height ratios M'/M'', revealed in difference Fourier sections calculated from neutron and X-ray data, formed the basis for the cation-distribution analysis. As neutron and X-ray data sets were directly compared in this analysis, it was important that systematic bias between refined neutron and X-ray positional parameters could be demonstrated to be absent. In summary, with an M-site model constrained only by the electron-microprobe-determined bulk composition of the crystal, the following values were obtained for the M-site occupancies: Na(M') = 0.29(7), Na(M'') = 0.23(7), Ca(M') = 0.15(4), and Ca(M'') = 0.33(4). These results indicate that restrictive assumptions about M sites, on which previous plagioclase refinements have been based, are not applicable to this An48, and possibly not to the entire compositional range. T-site ordering determined from (T-O) bond-length variation (t1o = 0.51(1), t1m = t2o = t2m = 0.32(1)) is weak, as might be expected from the volcanic origin of this megacryst.
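As a quick worked check (an illustration added here, not part of the original abstract), the refined split-site occupancies quoted above sum to a bulk composition of An48 and to full occupancy of the M site, consistent with the electron-microprobe constraint:

```python
# Worked check (illustration only) that the M-site occupancies quoted above
# are consistent with the electron-microprobe bulk composition (An48).

occ = {
    ("Na", "M'"):  0.29,
    ("Na", "M''"): 0.23,
    ("Ca", "M'"):  0.15,
    ("Ca", "M''"): 0.33,
}

na_total = occ[("Na", "M'")] + occ[("Na", "M''")]   # 0.52
ca_total = occ[("Ca", "M'")] + occ[("Ca", "M''")]   # 0.48 -> An48
m_prime  = occ[("Na", "M'")] + occ[("Ca", "M'")]    # 0.44
m_dprime = occ[("Na", "M''")] + occ[("Ca", "M''")]  # 0.56

print(f"Na total = {na_total:.2f}, Ca total = {ca_total:.2f} (An{ca_total*100:.0f})")
print(f"M' = {m_prime:.2f}, M'' = {m_dprime:.2f}, sum = {m_prime + m_dprime:.2f}")
```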
Abstract:
Game jams provide design researchers with an extraordinary opportunity to watch creative teams in action, and recent years have seen a number of projects which seek to illuminate the design process as seen in these events. For example, Gaydos, Harris and Martinez discuss the opportunity of the jam to expose students to principles of design process and design spaces (2011). Rouse muses on the game jam ‘as radical practice’ and a ‘corrective to game creation as it is normally practiced’. His observations about his own experience in a jam emphasise the same artistic endeavour foregrounded earlier, where the experience is about creation that is divorced from the instrumental motivations of commercial game design (Rouse 2011) and where the focus is on process over product. Other participants remark on the social milieu of the event as a critical factor and the collaborative opportunity as a rich site to engage participants in design processes (Shin et al, 2012). Shin et al are particularly interested in the notion of the site of the process and the ramifications of participants being in the same location. They applaud the more localized event where there is an emphasis on local participation and collaboration. For other commentators, it is specifically the social experience in the place of the jam that is the most important aspect (see Keogh 2011): not the material site but rather the physical, embodied experience of ‘being there’ and being part of the event. Participants talk about game jams they have attended in a similar manner to the observations made by Dourish, where the experience is layered on top of the physical space of the event (Dourish 2006). It is as if the event has taken on qualities of place, where we find echoes of Tuan’s description of a particular site having an aura of history that makes it a very different place, redolent and evocative (Tuan 1977). Re-presenting the experience in place has become the goal of the data visualisation project that is now the focus of our own curated 48hr game jam. Taking our cue from the work of Tim Ingold on embodied practice, we have now established the 48hr game making challenge as a site for data visualisation research in place.
Abstract:
In contemporary game development circles the ‘game making jam’ has become an important rite of passage and baptism event, an exploration space and a central indie lifestyle affirmation and community event. Game jams have recently become a focus for design researchers interested in the creative process. In this paper we tell the story of an established local game jam and our various documentation and data collection methods. We present the beginnings of the current project, which seeks to map the creative teams and their process in the space of the challenge, and which aims to enable participants to be more than the objects of the data collection. A perceived issue is that typical documentation approaches are ‘about’ the event as opposed to ‘made by’ the participants, and are thus both at odds with the spirit of the jam as a phenomenon and unable to really access the rich playful potential of participant experience. In the data collection and visualisation projects described here, we focus on using collected data to re-include the participants in telling stories about their experiences of the event as a place-based experience. Our goal is to find a means to encourage production of ‘anecdata’ - data based on individual storytelling that is subjective, malleable, and resists collection via formal mechanisms - and to enable mimesis, or active narrating, on the part of the participants. We present a concept design for data as game based on the logic of early medieval maps and we reflect on how we could enable participation in the data collection itself.
Abstract:
The 48 hour game making challenge has been running since 2007. In recent years, we have not only been running a 'game jam' for the local community but have also been exploring the way in which the event itself, and the place of the event, has the potential to create its own stories. Game jams are the creative festivals of the game development community, and a game jam is very much an event or performance; its stories are those of subjective experience. Participants return year after year and recount personal stories from previous challenges; arrival in the 48hr location typically inspires instances of individual memory and narration more in keeping with those of a music festival or an oft-frequented holiday destination. Since its inception, the 48hr has been heavily documented, from the photo-blogging of our first jam and the Twitter streams of more recent events to more formal interviews and documentaries (see Anderson, 2012). We have even had our own moments of Gonzo journalism, with an on-site press room one year and an ‘embedded’ journalist another year (Keogh, 2011). In the last two years of the 48hr we have started to explore ways and means to collect more abstract data during the event, that is, empirical data about movement and activity. The intent behind this form of data collection was to explore graphic and computer-generated visualisations of the event, not for the purpose of formal analysis but in the service of further storytelling. [Excerpt from truna aka j.turner, Thomas & Owen, 2013] See: truna aka j.turner, Thomas & Owen (2013) Living the indie life: mapping creative teams in a 48 hour game jam and playing with data, Proceedings of the 9th Australasian Conference on Interactive Entertainment, IE'2013, September 30 - October 01, 2013, Melbourne, VIC, Australia.
Abstract:
The 48 hour game making challenge has been running since 2007. In recent years, we have not only been running a 'game jam' for the local community but have also been exploring the way in which the event itself, and the place of the event, has the potential to create its own stories. The 2014 challenge is part of a series of data collection opportunities focussed on the game jam itself and the meaning making that the participants engage in about the event. We continued the data collection commenced in 2012: "Game jams are the creative festivals of the game development community, and a game jam is very much an event or performance; its stories are those of subjective experience. Participants return year after year and recount personal stories from previous challenges; arrival in the 48hr location typically inspires instances of individual memory and narration more in keeping with those of a music festival or an oft-frequented holiday destination. Since its inception, the 48hr has been heavily documented, from the photo-blogging of our first jam and the Twitter streams of more recent events to more formal interviews and documentaries (see Anderson, 2012). We have even had our own moments of Gonzo journalism, with an on-site press room one year and an ‘embedded’ journalist another year (Keogh, 2011). In the last two years of the 48hr we have started to explore ways and means to collect more abstract data during the event, that is, empirical data about movement and activity. The intent behind this form of data collection was to explore graphic and computer-generated visualisations of the event, not for the purpose of formal analysis but in the service of further storytelling." [Excerpt from truna aka j.turner, Thomas & Owen, 2013] See: truna aka j.turner, Thomas & Owen (2013) Living the indie life: mapping creative teams in a 48 hour game jam and playing with data, Proceedings of the 9th Australasian Conference on Interactive Entertainment, IE'2013, September 30 - October 01, 2013, Melbourne, VIC, Australia.
Abstract:
Over the past decades there has been considerable development in the modeling of car-following (CF) behavior as a result of research undertaken by both traffic engineers and traffic psychologists. While traffic engineers seek to understand the behavior of a traffic stream, traffic psychologists seek to describe the human abilities and errors involved in the driving process. This paper provides a comprehensive review of these two research streams. It is necessary to consider human factors in CF modeling for a more realistic representation of CF behavior in complex driving situations (for example, in traffic breakdowns, crash-prone situations, and adverse weather conditions), to improve traffic safety and to better understand widely reported puzzling traffic flow phenomena, such as capacity drop, stop-and-go oscillations, and traffic hysteresis. While there are some excellent reviews of CF models available in the literature, none of these specifically focuses on the human factors in these models. This paper addresses this gap by reviewing the available literature with a specific focus on the latest advances in car-following models from both the engineering and human behavior points of view. In so doing, it analyses the benefits and limitations of various models and highlights future research needs in the area.
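For concreteness, the sketch below implements one widely used engineering CF model, the Intelligent Driver Model; it is offered only as an illustration of the engineering research stream discussed above, and the model choice and parameter defaults are assumptions, not values taken from this review.

```python
import math

# Illustrative sketch of a well-known engineering car-following model, the
# Intelligent Driver Model (IDM). Parameter values are typical defaults used
# for illustration, not calibrated figures from the review.

def idm_acceleration(v, dv, s,
                     v0=33.3,   # desired speed (m/s)
                     T=1.5,     # desired time headway (s)
                     a=1.0,     # maximum acceleration (m/s^2)
                     b=2.0,     # comfortable deceleration (m/s^2)
                     s0=2.0,    # minimum gap (m)
                     delta=4.0):
    """Follower acceleration given speed v, approach rate dv (v - v_lead),
    and bumper-to-bumper gap s."""
    s_star = s0 + max(0.0, v * T + v * dv / (2.0 * math.sqrt(a * b)))
    return a * (1.0 - (v / v0) ** delta - (s_star / s) ** 2)

# Example: closing on a slower leader with a 30 m gap.
print(idm_acceleration(v=25.0, dv=5.0, s=30.0))
```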
Abstract:
In this paper we analyse two variants of the SIMON family of lightweight block ciphers against variants of linear cryptanalysis and present the best linear cryptanalytic results on these variants of reduced-round SIMON to date. We propose a time-memory trade-off method that finds differential/linear trails for any permutation allowing low Hamming weight differential/linear trails. Our method combines low Hamming weight trails found by the correlation matrix representing the target permutation with heavy Hamming weight trails found using a Mixed Integer Programming model representing the target differential/linear trail. Our method enables us to find a 17-round linear approximation for SIMON-48, which is the best current linear approximation for SIMON-48. Using only the correlation matrix method, we are able to find a 14-round linear approximation for SIMON-32, which is also the current best linear approximation for SIMON-32. The presented linear approximations allow us to mount a 23-round key recovery attack on SIMON-32 and a 24-round key recovery attack on SIMON-48/96, which are the current best results on SIMON-32 and SIMON-48. In addition, we have an attack on 24 rounds of SIMON-32 with marginal complexity.
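For context, the sketch below implements the publicly specified SIMON round function at the 16-bit word size used by SIMON-32 (SIMON-48 uses 24-bit words); it illustrates the permutation whose linear trails and correlation matrix are analysed above, and is not the authors' trail-search code.

```python
# Illustrative sketch of the SIMON round function for the 16-bit word size of
# SIMON-32. This is the publicly specified round, not the trail-search code
# described in the abstract.

WORD = 16
MASK = (1 << WORD) - 1

def rol(x: int, r: int) -> int:
    """Rotate a WORD-bit value left by r bits."""
    return ((x << r) | (x >> (WORD - r))) & MASK

def simon_round(left: int, right: int, k: int) -> tuple:
    """One Feistel round: f(x) = (x<<<1 & x<<<8) ^ (x<<<2)."""
    f = (rol(left, 1) & rol(left, 8)) ^ rol(left, 2)
    return (right ^ f ^ k) & MASK, left

# Example: apply one round to an arbitrary 32-bit state with round key 0x1234.
print(simon_round(0xDEAD, 0xBEEF, 0x1234))
```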
Abstract:
A chitooligosaccharide-specific lectin (Luffa acutangula agglutinin) has been purified from the exudate of ridge gourd fruits by affinity chromatography on soybean agglutinin glycopeptides coupled to Sepharose-6B. The affinity-purified lectin was found to be homogeneous by polyacrylamide gel electrophoresis, in sodium dodecyl sulphate-polyacrylamide gels, by gel filtration on Sephadex G-100 and by sedimentation velocity experiments. The relative molecular weight of this lectin was determined to be 48,000 ± 1,000 by gel chromatography and sedimentation equilibrium experiments. The sedimentation coefficient (s20,w) was found to be 4.06 S. The Stokes’ radius of the protein was found to be 2.9 nm by gel filtration. In sodium dodecyl sulphate-polyacrylamide gel electrophoresis the lectin gave a molecular weight of 24,000 in the presence as well as absence of 2-mercaptoethanol. The subunits in this dimeric lectin are therefore held together by non-covalent interactions alone. The lectin is not a glycoprotein, and circular dichroism spectral studies indicate that it has 31% α-helix and no β-sheet. The lectin is found to bind specifically to chitooligosaccharides, and the affinity of the lectin increases with increasing oligosaccharide chain length, as monitored by near-ultraviolet circular dichroism and intrinsic fluorescence titration. The values of ΔG, ΔΗ and ΔS for the binding process showed a pronounced dependence on the size of the oligosaccharide. The values for both ΔΗ and ΔS show a significant increase with increasing oligosaccharide chain length, showing that the binding of higher oligomers is progressively more favoured thermodynamically than that of chitobiose itself. The thermodynamic data are consistent with an extended binding site in the lectin which accommodates a tetrasaccharide. Based on the thermodynamic data, blue shifts and fluorescence enhancement, the spatial orientation of chitooligosaccharides in the combining site of the lectin is assigned.
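To relate the thermodynamic quantities quoted above, the following generic sketch applies ΔG = −RT ln Ka and ΔG = ΔH − TΔS; the association constants and enthalpies used are hypothetical illustrative values, not data reported for this lectin.

```python
import math

# Generic illustration of the thermodynamic relations behind the binding data
# discussed above. The association constants and enthalpies below are
# hypothetical numbers, not values reported for Luffa acutangula agglutinin.

R = 8.314    # gas constant, J mol^-1 K^-1
T = 298.15   # temperature, K

def binding_thermodynamics(k_assoc: float, delta_h: float):
    """Return (delta_G, delta_S) in J/mol and J/(mol K) for an association
    constant (M^-1) and a measured enthalpy change (J/mol)."""
    delta_g = -R * T * math.log(k_assoc)   # ΔG = -RT ln Ka
    delta_s = (delta_h - delta_g) / T      # from ΔG = ΔH - TΔS
    return delta_g, delta_s

# Hypothetical comparison: a longer chitooligosaccharide binding more tightly.
for name, ka, dh in [("chitobiose", 1.0e3, -20e3), ("chitotetraose", 5.0e4, -45e3)]:
    dg, ds = binding_thermodynamics(ka, dh)
    print(f"{name}: ΔG = {dg/1000:.1f} kJ/mol, ΔS = {ds:.1f} J/(mol·K)")
```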