971 results for Ricks, Lawrence
Abstract:
We consider online prediction problems where the loss between the prediction and the outcome is measured by the squared Euclidean distance and its generalization, the squared Mahalanobis distance. We derive the minimax solutions for the case where the prediction and action spaces are the simplex (this setup is sometimes called the Brier game) and the \ell_2 ball (this setup is related to Gaussian density estimation). We show that in both cases the value of each sub-game is a quadratic function of a simple statistic of the state, with coefficients that can be efficiently computed using an explicit recurrence relation. The resulting deterministic minimax strategy and randomized maximin strategy are linear functions of the statistic.
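The two loss functions named in the abstract are standard; as a hypothetical illustration (not the paper's code), the squared Euclidean distance and its generalization, the squared Mahalanobis distance d_A(p, y) = (p − y)ᵀA(p − y), can be computed as follows:

```python
# Illustrative sketch of the two losses from the abstract; the example
# vectors below are placeholders, not data from the paper.

def squared_euclidean(p, y):
    # Squared Euclidean distance between prediction p and outcome y.
    return sum((pi - yi) ** 2 for pi, yi in zip(p, y))

def squared_mahalanobis(p, y, A):
    # (p - y)^T A (p - y) for a positive-definite matrix A.
    d = [pi - yi for pi, yi in zip(p, y)]
    n = len(d)
    return sum(d[i] * A[i][j] * d[j] for i in range(n) for j in range(n))

prediction = [0.5, 0.3, 0.2]   # a point on the simplex (Brier game setting)
outcome    = [1.0, 0.0, 0.0]   # an observed outcome (a simplex vertex)
identity   = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

# With A = I the Mahalanobis loss reduces to the Euclidean one.
assert squared_mahalanobis(prediction, outcome, identity) == \
       squared_euclidean(prediction, outcome)
```

With A equal to the identity matrix the two losses coincide, which is the sense in which the Mahalanobis distance generalizes the Euclidean one.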
Abstract:
The wind speed at a site can be measured by installing anemometers on top of meteorological (met) towers. In addition to other sources of error, accelerated airflow, or speed-up, around the top of met towers can cause incorrect anemometer measurements. We consider a particular example where an anemometer was located only 2 tower diameters above the met tower. Using a standard computational fluid-dynamics package, we found the maximum error for this configuration to be 2% of the wind speed. We conclude that a top-mounted anemometer should be located at the windward side of its met tower, raised 5 diameters above the top. This will reduce speed-up error to less than 1%.
Abstract:
An evolving meditation upon the complex, periodic processes that mark Australia’s seasonality, and our increasing ability to disturb them. By amplifying and shining light upon a myriad of mysterious lives lived in blackness, the work presents a sensuous, deep engagement with the rich, irregular spectra of seasonal forms, whilst hinting at a far less comforting background increasingly framed by anthropogenic climate change. ‘Temporal’ uses custom interactive systems, illusionary techniques and real-time spatial audio processes that draw upon a rich array of media, including seasonal, nocturnal field recordings sourced in the Bundaberg region and detailed observations of foliage & flowering phases from that region. By drawing inspiration from the subtle transitions between what Europeans once named ‘Summer’ and ‘Autumn’ and the multiple seasons recognised by other cultures, whilst also including bodily disturbances within the work, ‘Temporal’ creates a compellingly immersive environment that wraps audiences in luscious yet ominous atmospheres beyond sight and hearing. This work completes a two-year project of dynamic mediated installations presented in Sydney, Beijing, Cairns and Bundanon, each somehow choreographed by environmental cycles; alluding to a new framework for making works that we named ‘Seasonal’. These powerful, responsive & experiential works each draw attention to that which will disappear when biodiverse worlds have descended into an era of permanent darkness – an ‘extinction of human experience’. By tapping into the deeply interlocking seasonal cycles of environments that are themselves intimately linked with social, geographical & political concerns, participating audiences are challenged to see the night, their locality & ecologies in new ways, extending their personal limits of perception, imagery & comprehension.
Abstract:
Black Nectar is a site-specific light & sound installation that asks audiences to take slow, sensory walks through the inky blackness of Bundanon’s forests at night, charting personal courses through seasons of change, animality and imagination – far beyond the blinding lights and howling tones of our contemporary existence. Gathering during a time Europeans once named ‘spring’, audiences will leave the comfy lights and sounds of Bundanon’s homestead area to take powerful, personal, silent journeys into the long darks of night, heading ultimately towards the place of ‘Black Nectar’. This most unusual of walks begins with impending darkness, and yet ultimately ends with the faintest, sweetest of glimmers – an en-lightening, re-sounding of our seasonal futures?
Abstract:
Background The benefits associated with some cancer treatments do not come without risk. A serious side effect of some common cancer treatments is cardiotoxicity. Increased recognition of the public health implications of cancer treatment-induced cardiotoxicity has resulted in a proliferation of systematic reviews in this field to guide practice. Quality appraisal of these reviews is likely to limit the influence of biased conclusions from systematic reviews that have used poor methodology related to clinical decision-making. The aim of this meta-review is to appraise and synthesise evidence from only high quality systematic reviews focused on the prevention, detection or management of cancer treatment-induced cardiotoxicity. Methods Using Cochrane methodology, we searched databases, citations and hand-searched bibliographies. Two reviewers independently appraised reviews and extracted findings. A total of 18 high quality systematic reviews were subsequently analysed, 67% (n = 12) of these comprised meta-analyses. Results One systematic review concluded that there is insufficient evidence regarding the utility of cardiac biomarkers for the detection of cardiotoxicity. The following strategies might reduce the risk of cardiotoxicity: 1) The concomitant administration of dexrazoxane with anthracyclines; 2) The avoidance of anthracyclines where possible; 3) The continuous administration of anthracyclines (>6 h) rather than bolus dosing; and 4) The administration of anthracycline derivatives such as epirubicin or liposomal-encapsulated doxorubicin instead of doxorubicin. In terms of management, one review focused on medical interventions for treating anthracycline-induced cardiotoxicity during or after treatment of childhood cancer. Neither intervention (enalapril and phosphocreatine) was associated with statistically significant improvement in ejection fraction or mortality.
Conclusion This review highlights the lack of high level evidence to guide clinical decision-making with respect to the detection and management of cancer treatment-associated cardiotoxicity. There is more evidence with respect to the prevention of this adverse effect of cancer treatment. This evidence, however, only applies to anthracycline-based chemotherapy in a predominantly adult population. There is no high-level evidence to guide clinical decision-making regarding the prevention, detection or management of radiation-induced cardiotoxicity.
Abstract:
In the United States, there has been fierce debate over state, federal and international efforts to engage in genetically modified food labelling (GM food labelling). A grassroots coalition of consumers, environmentalists, organic farmers, and the food movement has pushed for law reform in respect of GM food labelling. The Just Label It campaign has encouraged United States consumers to send comments to the United States Food and Drug Administration to label genetically modified foods. This Chapter explores the various justifications made in respect of genetically modified food labelling. There has been a considerable effort to portray the issue of GM food labelling as one of consumer rights as part of ‘the right to know’. There has been a significant battle amongst farmers over GM food labelling – with organic farmers and biotechnology companies fighting for precedence. There has also been a significant discussion about the use of GM food labelling as a form of environmental legislation. The prescriptions in GM food labelling regulations may serve to promote eco-labelling, and deter greenwashing. There has been a significant debate over whether GM food labelling may serve to regulate corporations – particularly from the food, agriculture, and biotechnology industries. There are significant issues about the interaction between intellectual property laws – particularly in respect of trade mark law and consumer protection – and regulatory proposals focused upon biotechnology. There has been a lack of international harmonization in respect of GM food labelling. As such, there has been a major use of comparative arguments about regulatory models in respect of food labelling. There has also been a discussion about international law, particularly with the emergence of sweeping regional trade proposals, such as the Trans-Pacific Partnership, and the Trans-Atlantic Trade and Investment Partnership.
This Chapter considers the United States debates over genetically modified food labelling – at state, federal, and international levels. The battles often involved the use of citizen-initiated referenda. The conflicts have been policy-centric disputes – pitting organic farmers, consumers, and environmentalists against the food industry and biotechnology industry. Such battles have raised questions about consumer rights, public health, freedom of speech, and corporate rights. The disputes highlighted larger issues about lobbying, fund-raising, and political influence. The role of money in the United States has been a prominent concern of Lawrence Lessig in his recent academic and policy work with the group, Rootstrikers. Part 1 considers the debate in California over Proposition 37. Part 2 explores other key state initiatives in respect of GM food labelling. Part 3 examines the Federal debate in the United States over GM food labelling. Part 4 explores whether regional trade agreements – such as the Trans-Pacific Partnership (TPP) and the Trans-Atlantic Trade and Investment Partnership (TTIP) – will impact upon
Abstract:
The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in neuroscience, genetics, and medicine, ENIGMA studies have analyzed neuroimaging data from 12,826 subjects. In addition, data from 12,171 individuals were provided by the CHARGE consortium for replication of findings, in a total of 24,997 subjects. By meta-analyzing results from many sites, ENIGMA has detected factors that affect the brain that no individual site could detect on its own, and that require larger numbers of subjects than any individual neuroimaging study has currently collected. ENIGMA's first project was a genome-wide association study identifying common variants in the genome associated with hippocampal volume or intracranial volume. Continuing work is exploring genetic associations with subcortical volumes (ENIGMA2) and white matter microstructure (ENIGMA-DTI). Working groups also focus on understanding how schizophrenia, bipolar illness, major depression and attention deficit/hyperactivity disorder (ADHD) affect the brain. We review the current progress of the ENIGMA Consortium, along with challenges and unexpected discoveries made on the way.
Abstract:
Engineers and asset managers must often make decisions on how to best allocate limited resources amongst different interrelated activities, including repair, renewal, inspection, and procurement of new assets. The presence of project interdependencies and the lack of sufficient information on the true value of an activity often produce complex problems and leave the decision maker guessing about the quality and robustness of their decision. In this paper, a decision support framework for uncertain interrelated activities is presented. The framework employs a methodology for multi-criteria ranking in the presence of uncertainty, detailing the effect that uncertain valuations may have on the priority of a particular activity. The framework employs semi-quantitative risk measures that can be tailored to an organisation and enable a transparent and simple-to-use uncertainty specification by the decision maker. The framework is then demonstrated on a real-world project set from a major Australian utility provider.
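As a hypothetical sketch only (the activity names, intervals, and scoring scheme below are illustrative placeholders, not the framework from the abstract), ranking interrelated activities under uncertain valuations might look like scoring each activity over an interval per criterion and ordering by a weighted pessimistic bound:

```python
# Illustrative sketch: multi-criteria ranking under uncertainty.
# Each activity carries an uncertain value per criterion given as a
# (low, high) interval; activities are ranked by a weighted sum of
# the pessimistic (lower-bound) values. Placeholder data throughout.

def pessimistic_score(intervals, weights):
    # Weighted sum of the interval lower bounds.
    return sum(w * lo for (lo, _hi), w in zip(intervals, weights))

activities = {
    "repair bridge deck": [(0.6, 0.9), (0.4, 0.7)],  # (risk reduction, cost-benefit)
    "renew pump station": [(0.5, 0.6), (0.6, 0.8)],
    "inspect substation": [(0.2, 0.9), (0.3, 0.9)],  # wide intervals = high uncertainty
}
weights = [0.7, 0.3]  # criterion weights, tailored to the organisation

ranked = sorted(activities,
                key=lambda a: pessimistic_score(activities[a], weights),
                reverse=True)
print(ranked)  # highest pessimistic score first
```

Ranking by the lower bound makes the effect of uncertainty visible: an activity with a high best case but a wide interval (like the inspection above) drops in priority until its valuation is better known.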
Abstract:
This research studied the prevalence and impact of workplace cyberbullying as perceived by public servants working in government organisations across Australia. Using Social Information Processing theory, this research found employees reported task- and person-related cyberbullying that was associated with increased workplace stress, diminished job satisfaction and performance, and reduced confidence in their organisations' anti-bullying intervention and protection strategies. Furthermore, workplace cyberbullying can create a concealed, online work culture that undermines employee and organisational productivity. These results are significant for employers' duty-of-care obligations, and represent a cogent argument for improved workplace cultures in support of Australia's future organisational and economic performance.
Abstract:
Background Schizophrenia is associated with lower pre-morbid intelligence (IQ) in addition to (pre-morbid) cognitive decline. Both schizophrenia and IQ are highly heritable traits. Therefore, we hypothesized that genetic variants associated with schizophrenia, including copy number variants (CNVs) and a polygenic schizophrenia (risk) score (PSS), may influence intelligence. Method IQ was estimated with the Wechsler Adult Intelligence Scale (WAIS). CNVs were determined from single nucleotide polymorphism (SNP) data using the QuantiSNP and PennCNV algorithms. For the PSS, odds ratios for genome-wide SNP data were calculated in a sample collected by the Psychiatric Genome-Wide Association Study (GWAS) Consortium (8690 schizophrenia patients and 11,831 controls). These were used to calculate individual PSSs in our independent sample of 350 schizophrenia patients and 322 healthy controls. Results Although significantly more genes were disrupted by deletions in schizophrenia patients compared to controls (p = 0.009), there was no effect of CNV measures on IQ. The PSS was associated with disease status (R^2 = 0.055, p = 2.1 × 10^-7) and with IQ in the entire sample (R^2 = 0.018, p = 0.0008) but the effect on IQ disappeared after correction for disease status. Conclusions Our data suggest that rare and common schizophrenia-associated variants do not explain the variation in IQ in healthy subjects or in schizophrenia patients. Thus, reductions in IQ in schizophrenia patients may be secondary to other processes related to schizophrenia risk. © Cambridge University Press 2013.
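A polygenic score of the kind described is conventionally computed as a sum of an individual's risk-allele counts weighted by the log odds ratios from an independent discovery GWAS. A minimal sketch (the SNP identifiers and odds ratios below are illustrative placeholders, not the study's data):

```python
import math

# Minimal sketch of a polygenic (risk) score: for each SNP, multiply
# the individual's risk-allele count (0, 1 or 2) by the log odds ratio
# estimated in an independent discovery sample, then sum over SNPs.
# SNP names and odds ratios are placeholders for illustration.

discovery_odds_ratios = {"rs0000001": 1.10, "rs0000002": 0.95, "rs0000003": 1.20}

def polygenic_score(allele_counts, odds_ratios):
    return sum(count * math.log(odds_ratios[snp])
               for snp, count in allele_counts.items())

individual = {"rs0000001": 2, "rs0000002": 1, "rs0000003": 0}
score = polygenic_score(individual, discovery_odds_ratios)
```

Using log odds ratios as weights means protective alleles (OR < 1) subtract from the score while risk alleles (OR > 1) add to it, which is why the weights come from a discovery sample independent of the target sample being scored.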
Abstract:
We examined the role of common genetic variation in schizophrenia in a genome-wide association study of substantial size: a stage 1 discovery sample of 21,856 individuals of European ancestry and a stage 2 replication sample of 29,839 independent subjects. The combined stage 1 and 2 analysis yielded genome-wide significant associations with schizophrenia for seven loci, five of which are new (1p21.3, 2q32.3, 8p23.2, 8q21.3 and 10q24.32-q24.33) and two of which have been previously implicated (6p21.32-p22.1 and 18q21.2). The strongest new finding (P = 1.6 × 10^-11) was with rs1625579 within an intron of a putative primary transcript for MIR137 (microRNA 137), a known regulator of neuronal development. Four other schizophrenia loci achieving genome-wide significance contain predicted targets of MIR137, suggesting MIR137-mediated dysregulation as a previously unknown etiologic mechanism in schizophrenia. In a joint analysis with a bipolar disorder sample (16,374 affected individuals and 14,044 controls), three loci reached genome-wide significance: CACNA1C (rs4765905, P = 7.0 × 10^-9), ANK3 (rs10994359, P = 2.5 × 10^-8) and the ITIH3-ITIH4 region (rs2239547, P = 7.8 × 10^-9).
Abstract:
Objective. The heritability of RA has been estimated to be ∼55%, of which the MHC contributes about one-third. HLA-DRB1 alleles are strongly associated with RA, but it is likely that significant non-DRB1 MHC genetic susceptibility factors are involved. Previously, we identified two three-marker haplotypes in a 106-kb region in the MHC class III region immediately centromeric to TNF, which are strongly associated with RA on HLA-DRB1*0404 haplotypes. In the present study, we aimed to refine these associations further using a combination of genotyping and gene expression studies. Methods. Thirty-nine single nucleotide polymorphisms (SNPs) were genotyped in 95 DRB1*0404-carrying unrelated RA cases, 125 DRB1*0404-carrying healthy controls and 87 parent-case trio RA families in which the affected child carried HLA-DRB1*04. Quantitative RT-PCR was used to assess the expression of the positional candidate MHC class III genes APOM, BAT2, BAT3, BAT4, BAT5, AIF1, C6orf47, CSNK2β and LY6G5C, and the housekeeper genes, hypoxanthine-guanine phosphoribosyltransferase (HPRT) and β2-microglobulin (B2M), in 31 RA cases and 21 ethnically, age- and sex-matched healthy controls. Synovial membrane specimens from RA, PsA and OA cases were stained by an indirect immunoperoxidase technique using a mouse anti-human AIF1 monoclonal antibody. Results. Association was observed between RA and single markers or two-marker haplotypes involving AIF1, BAT3 and CSNK2β. AIF1 was also significantly overexpressed in RA mononuclear cells (1.5- to 1.9-fold difference, P = 0.02 vs HPRT, P = 0.002 vs B2M). AIF1 protein was clearly expressed by synovial macrophages in all the inflammatory synovial samples in contrast to the non-inflammatory OA samples. Conclusions. The results of the genotyping and expression studies presented here suggest a role for AIF1 in both the aetiology and pathogenesis of RA.
Abstract:
In recent years there has been considerable interest in developing new types of gelators of organic solvents.[1] Despite the recent advances, a priori design of a gelator for gelling a given solvent has remained a challenging task. Various noncovalent interactions like hydrogen bonding,[2] metal coordination,[3] etc. have been used as the driving force for the gelation process. A special class of cholesterol-based gelators was reported by Weiss[4] and by Shinkai.[5] Gels derived from these molecules have been used for chiral recognition/sensing,[6] for studying photo- and metal-responsive functions,[7] and as templates to make hollow fiber silica.[8] Other types of organogels have been used for designing polymerized[9] and reverse aerogels,[10] and in molecular imprinting.[11] Hanabusa’s group has recently reported organogels with a bile acid derivative.[12] This has prompted us to disclose our results on a novel electron donor–acceptor (EDA) interaction mediated two-component[13] gelator system based on the bile acid[14] backbone.
Abstract:
This thesis is a comparative case study in Japanese video game localization for the video games Sairen, Sairen 2 and Sairen Nyûtoransurêshon, and English-language localized versions of the same games as published in Scandinavia and Australia/New Zealand. All games are developed by Sony Computer Entertainment Inc. and published exclusively for PlayStation 2 and PlayStation 3 consoles. The fictional world of the Sairen games draws much influence from Japanese history, as well as from popular and contemporary culture, and in doing so caters mainly to a Japanese audience. For localization, i.e. the adaptation of a product to make it accessible to users outside the market it was originally intended for, this is a challenging issue. Video games are media of entertainment, and therefore localization practice must preserve the games’ effects on the players’ emotions. Further, video games are digital products that are composed of a multitude of distinct elements, some of which are part of the game world, while others regulate the connection between the player as part of the real world and the game as digital medium. As a result, video game localization is also a practice that has to cope with the technical restrictions that are inherent to the medium. The main theory used throughout the thesis is Anthony Pym’s framework for localization studies that considers the user of the localized product as a defining part of the localization process. This concept presupposes that localization is an adaptation that is performed to make a product better suited for use during a specific reception situation. Pym also addresses the factor that certain products may resist distribution into certain reception situations because of their content, and that certain aspects of localization aim to reduce this resistance through significant alterations of the original product.
While Pym developed his ideas with mainly regular software in mind, they can also be adapted well to study video games from a localization angle. Since modern video games are highly complex entities that often switch between interactive and non-interactive modes, Pym’s ideas are adapted throughout the thesis to suit the particular elements being studied. Instances analyzed in this thesis include menu screens, video clips, in-game action and websites. The main research questions focus on how the games’ rules influence localization, and how the games’ fictional domain influences localization. Because there are so many peculiarities inherent to the medium of the video game, other theories are introduced as well to complement the research at hand. These include Lawrence Venuti’s discussions of foreignizing and domesticating translation methods for literary translation, and Jesper Juul’s definition of games. Additionally, knowledge gathered from interviews with video game localization professionals in Japan during September and October 2009 is also utilized for this study. Apart from answering the aforementioned research questions, one of this thesis’ aims is to enrich the still rather small field of game localization studies, and the study of Japanese video games in particular, one of Japan’s most successful cultural exports.