975 results for COMMON MARKETS
Abstract:
Background: Glutamate is the principal excitatory neurotransmitter in the central nervous system, acting through either ionotropic (AMPA, NMDA and kainate) receptors or G-protein-coupled metabotropic receptors. Glutamate is widely accepted to play a major role in the pathophysiology of migraine, as implicated by data from animal and human studies. Genes involved in the synthesis, metabolism and regulation of both glutamate and its receptors could therefore be considered potential candidates for causing or predisposing to migraine when mutated. Methods: The association of polymorphic variants of the GRIA1-GRIA4 genes, which encode the four subunits (GluR1-GluR4) of the alpha-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid (AMPA) receptor for glutamate, was tested in migraineurs with and without aura (MA and MO) and healthy controls. Results: Two variants in the regulatory regions of the GRIA1 (rs2195450) and GRIA3 (rs3761555) genes were strongly associated with MA (P = 0.00002 and P = 0.0001, respectively) but not with MO, suggesting a role in cortical spreading depression. In contrast, the rs548294 variant in the GRIA1 gene showed association primarily with the MO phenotype, supporting the hypothesis that the MA and MO phenotypes could be genetically related. These variants modify transcription-factor binding sites, altering the expression of the GRIA1 and GRIA3 genes under different conditions. Conclusions: This study represents the first genetic evidence of a link between glutamate receptors and migraine.
A genome-wide scan provides evidence for loci influencing a severe heritable form of common migraine
Abstract:
Migraine is a prevalent neurovascular disease with a significant genetic component. Linkage studies have so far identified migraine susceptibility loci on chromosomes 1, 4, 6, 11, 14, 19 and X. We performed a genome-wide scan of 92 Australian pedigrees phenotyped for migraine with and without aura and for a more heritable form of “severe” migraine. Multipoint non-parametric linkage analysis revealed suggestive linkage on chromosome 18p11 for the severe migraine phenotype (LOD*=2.32, P=0.0006) and chromosome 3q (LOD*=2.28, P=0.0006). Excess allele sharing was also observed at multiple different chromosomal regions, some of which overlap with, or are directly adjacent to, previously implicated migraine susceptibility regions. We have provided evidence for two loci involved in severe migraine susceptibility and conclude that dissection of the “migraine” phenotype may be helpful for identifying susceptibility genes that influence the more heritable clinical (symptom) profiles in affected pedigrees. We also conclude that the genetic aetiology of the common (International Headache Society) forms of the disease probably comprises a number of low- to moderate-effect susceptibility genes, perhaps acting synergistically, an effect not easily detected by traditional single-locus linkage analyses of large samples of affected pedigrees.
Abstract:
Credence goods markets suffer from inefficiencies caused by sellers' superior information about the surplus-maximizing quality. While standard theory predicts that equal mark-up prices solve the credence goods problem if customers can verify the quality received, experimental evidence indicates the opposite. We identify a lack of robustness of institutional design with respect to heterogeneity in distributional preferences as a possible cause and design new experiments that allow for parsimonious identification of sellers' distributional types. Our results indicate that less than a fourth of the subjects behave according to standard theory's assumptions, while the rest behave either in line with non-standard selfish preferences or in accordance with non-trivial other-regarding preferences. We discuss the consequences of our findings for institutional design and agent selection.
Abstract:
Background & Aims: Peroxisome proliferator-activated receptor (PPAR) γ is a transcription factor, highly expressed in colonic epithelial cells, adipose tissue and macrophages, with an important role in the regulation of inflammatory pathways. The common PPARγ variants C161T and Pro12Ala have recently been associated with Ulcerative Colitis (UC) and an extensive UC phenotype respectively, in a Chinese population. PPARγ Pro12Ala variant homozygotes appear to be protected from the development of Crohn's disease (CD) in European Caucasians. Methods: A case-control study was performed for both variants (CD n=575, UC n=306, Controls n=360) using a polymerase chain reaction (PCR)-restriction fragment length polymorphism analysis in an Australian IBD cohort. A transmission disequilibrium test was also performed using CD trios for the PPARγ C161T variant. Genotype-phenotype analyses were also undertaken. Results: There was no significant difference in genotype distribution data or allele frequency between CD and UC patients and controls. There was no difference in allele transmission for the C161T variant. No significant relationship between the variants and disease location was observed. Conclusions: We were unable to replicate in a Caucasian cohort the recent association between PPARγ C161T and UC or between PPARγ Pro12Ala and an extensive UC phenotype in a Chinese population. There are significant ethnic differences in genetic susceptibility to IBD and its phenotypic expression.
Abstract:
Cloud computing is an emerging computing paradigm in which IT resources are provided over the Internet as a service to users. One such service offered through the Cloud is Software as a Service, or SaaS. SaaS can be delivered in a composite form, consisting of a set of application and data components that work together to deliver higher-level functional software. SaaS is receiving substantial attention today from both software providers and users, and analyst firms predict positive future markets for it. This raises new challenges for providers managing SaaS, especially in large-scale data centres such as the Cloud. One of these challenges is managing Cloud resources for SaaS in a way that maintains SaaS performance while optimising resource use. Extensive research on the resource optimisation of Cloud services has not yet addressed the challenges of managing resources for composite SaaS. This research addresses this gap by focusing on three new problems of composite SaaS: placement, clustering and scalability. The overall aim is to develop efficient and scalable mechanisms that facilitate the delivery of high-performance composite SaaS for users while optimising the resources used. All three problems are characterised as highly constrained, large-scale and complex combinatorial optimisation problems; therefore, evolutionary algorithms are adopted as the main technique for solving them. The first research problem concerns how a composite SaaS is placed onto Cloud servers to optimise its performance while satisfying the SaaS resource and response-time constraints. Existing research on this problem often ignores the dependencies between components and considers placement of a homogeneous type of component only. A precise formulation of the composite SaaS placement problem is presented. A classical genetic algorithm and two versions of cooperative co-evolutionary algorithms are designed to manage the placement of heterogeneous types of SaaS components together with their dependencies, requirements and constraints. Experimental results demonstrate the efficiency and scalability of these new algorithms. In the second problem, SaaS components are assumed to be already running on Cloud virtual machines (VMs). However, due to the dynamic nature of the Cloud environment, the current placement may need to be modified. Existing techniques have focused mostly on the infrastructure level rather than the application level. This research addresses the problem at the application level by clustering suitable components onto VMs to optimise resource use while maintaining SaaS performance. Two versions of grouping genetic algorithms (GGAs) are designed to cater for the structural grouping of a composite SaaS. The first GGA uses a repair-based method and the second a penalty-based method to handle the problem constraints. The experimental results confirm that the GGAs always produce a better reconfiguration placement plan than a common heuristic for clustering problems. The third research problem deals with the replication or deletion of SaaS instances to cope with the SaaS workload. Determining a scaling plan that minimises the resources used while maintaining SaaS performance is a critical task. Additionally, the problem involves constraints and interdependencies between components, making solutions even more difficult to find.
A hybrid genetic algorithm (HGA) was developed to solve this problem by exploring the problem's search space through its genetic operators and fitness function to determine the SaaS scaling plan. The HGA also uses the problem's domain knowledge to ensure that solutions meet the problem's constraints and achieve its objectives. The experimental results demonstrate that the HGA consistently outperforms a heuristic algorithm by achieving a low-cost scaling and placement plan. This research has identified three significant new problems for composite SaaS in the Cloud, and various types of evolutionary algorithms have been developed to address them, contributing to the field of evolutionary computation. The algorithms provide solutions for efficient resource management of composite SaaS in the Cloud, resulting in a low total cost of ownership for users while guaranteeing SaaS performance.
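As a rough illustration of the evolutionary approach this thesis describes, the sketch below shows a minimal genetic-algorithm loop for assigning SaaS components to servers. The component demands, server capacities, operators and fitness terms are invented assumptions for illustration; they are not the thesis's actual formulation, which also handles component dependencies and response-time constraints.

```python
# Minimal, illustrative genetic-algorithm sketch for placing SaaS components on
# servers. Component/server data and the fitness term are assumptions only.
import random

COMPONENT_CPU = [2, 4, 1, 3, 2]        # hypothetical CPU demand per component
SERVER_CPU = [8, 8, 6]                 # hypothetical CPU capacity per server
N_COMPONENTS, N_SERVERS = len(COMPONENT_CPU), len(SERVER_CPU)

def fitness(placement):
    """Lower is better: total CPU over-subscription across servers."""
    load = [0] * N_SERVERS
    for comp, server in enumerate(placement):
        load[server] += COMPONENT_CPU[comp]
    return sum(max(0, load[s] - SERVER_CPU[s]) for s in range(N_SERVERS))

def crossover(a, b):
    """One-point crossover of two placement chromosomes."""
    cut = random.randrange(1, N_COMPONENTS)
    return a[:cut] + b[cut:]

def mutate(placement, rate=0.1):
    """Randomly reassign each component with a small probability."""
    return [random.randrange(N_SERVERS) if random.random() < rate else s
            for s in placement]

def run_ga(pop_size=30, generations=50):
    pop = [[random.randrange(N_SERVERS) for _ in range(N_COMPONENTS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]              # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=fitness)

if __name__ == "__main__":
    best = run_ga()
    print("best placement:", best, "capacity violation:", fitness(best))
```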
Abstract:
This study resulted in the development of a decision-making tool for engineering consultancies looking to diversify into new markets. It reviewed existing decision tools used by contractors entering new markets in order to develop a bespoke tool for engineering consultants, establishing more rigour around the decision-making process rather than relying purely on the intuition of company executives. The tool can be used for developing medium- and long-term company strategies or as a quick and efficient way to assess the viability of new market opportunities when they arise. A combination of Delphi and the Analytic Hierarchy Process was selected as the basis of the decision theory.
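For context on the second component of that decision theory, here is a minimal sketch of how the Analytic Hierarchy Process derives criterion weights from a Saaty-style pairwise-comparison matrix. The criteria, judgments and consistency threshold are invented for illustration and do not come from the study.

```python
# Minimal sketch of Analytic Hierarchy Process (AHP) weight derivation.
# The criteria and pairwise judgments below are illustrative assumptions only.
import numpy as np

criteria = ["market size", "competition", "entry cost"]
# Saaty-scale pairwise comparison matrix: A[i, j] = importance of i over j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# The principal eigenvector of A gives the criterion weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (CR below ~0.1 is conventionally acceptable).
n = len(criteria)
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58                      # random-consistency index for n = 3

for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")
print(f"consistency ratio: {cr:.3f}")
```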
Abstract:
A key question in neuroscience is how memory is selectively allocated to neural networks in the brain. This question remains a significant research challenge, in rodent models and humans alike, because of the inherent difficulty in tracking and deciphering the large, high-dimensional neuronal ensembles that support memory (i.e., the engram). In a previous study we showed that consolidation of a new fear memory is allocated to a common topography of amygdala neurons. When a consolidated memory is retrieved, it may enter a labile state, requiring reconsolidation for it to persist. What is not known is whether the original spatial allocation of a consolidated memory changes during reconsolidation. Knowledge about the spatial allocation of a memory, during consolidation and reconsolidation, provides fundamental insight into its core physical structure (i.e., the engram). Using design-based stereology, we operationally define reconsolidation by showing a nearly identical quantity of neurons in the dorsolateral amygdala (LAd) expressing a plasticity-related protein, phosphorylated mitogen-activated protein kinase, following both memory acquisition and retrieval. Next, we confirm that Pavlovian fear conditioning recruits a stable, topographically organized population of activated neurons in the LAd. When the stored fear memory was briefly reactivated in the presence of the relevant conditioned stimulus, a similar topography of activated neurons was uncovered. In addition, we found evidence for activated neurons allocated to new regions of the LAd. These findings provide the first insight into the spatial allocation of a fear engram in the LAd during its consolidation and reconsolidation phases.
Abstract:
Understanding the physical encoding of a memory (the engram) is a fundamental question in neuroscience. Although it has been established that the lateral amygdala is a key site for encoding associative fear memory, it is currently unclear whether the spatial distribution of neurons encoding a given memory is random or stable. Here we used spatial principal components analysis to quantify the topography of activated neurons, in a select region of the lateral amygdala, from rat brains encoding a Pavlovian conditioned fear memory. Our results demonstrate that a stable, spatially patterned organization of amygdala neurons is activated during the formation of a Pavlovian conditioned fear memory. We suggest that this stable neuronal assembly constitutes a spatial dimension of the engram.
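By way of illustration of the kind of analysis named in this abstract, the sketch below applies principal components analysis to the 2-D coordinates of activated neurons to summarize their spatial topography. The coordinates are simulated and the pipeline is a simplification; it is not the study's actual stereology or statistical workflow.

```python
# Illustrative sketch: PCA on the (x, y) coordinates of "activated" neurons to
# summarize their spatial topography. Coordinates are simulated, not real data.
import numpy as np

rng = np.random.default_rng(0)
# Simulate 200 activated-neuron positions stretched along a diagonal band.
coords = rng.normal(0, 1, size=(200, 2)) @ np.array([[3.0, 0.0], [1.5, 0.5]])

centered = coords - coords.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)          # eigh returns ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
print("principal axes (columns):\n", eigvecs)
print("variance explained:", np.round(explained, 3))
# A stable topography across animals would show similar principal axes and
# variance ratios for each brain's set of activated neurons.
```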
Abstract:
"The financial system is a key influencer of the health and efficiency of an economy. The role of the financial system is to gather money from people and businesses that currently have more money than they need and transfer it to those that can use it for either business or consumer expenditures. This flow of funds through financial markets and institutions in the Australian economy is huge (in the billions of dollars), affecting business profits, the rate of inflation, interest rates and the production of goods and services. In general, the larger the flow of funds and the more efficient the financial system, the greater the economic output and welfare in the economy. It is not possible to have a modern, complex economy such as that in Australia, without an efficient and sound financial system. The global financial crisis (GFC) of late 2007–09 (and the ensuing European debt crisis), where the global financial market was on the brink of collapse with only significant government intervention stopping a catastrophic global failure of the market, illustrated the importance of the financial system. Financial Markets, Institutions and Money 3rd edition introduces students to the financial system, its operations, and participants. The text offers a fresh, succinct analysis of the financial markets and discusses how the many participants in the financial system interrelate. This includes coverage of regulators, regulations and the role of the Reserve Bank of Australia, that ensure the system’s smooth running, which is essential to a modern economy. The text has been significantly revised to take into account changes in the financial world."---publisher website Table of Contents 1. The financial system - an overview 2. The Monetary Authorities 3. The Reserve Bank of Australia and interest rates 4. The level of interest rates 5. Mathematics of finance 6. Bond Prices and interest rate risk 7. The Structure of Interest Rates 8. Money Markets 9. Bond Markets 10. Equity Markets
Abstract:
The utility of a novel technique for determining the ignition delay in a compression ignition engine has been demonstrated. The method uses statistical modelling in the Bayesian paradigm to accurately resolve the start of combustion from a band-pass filtered in-cylinder pressure signal. The technique was applied to neat diesel and six biofuels, including four fractionations of palm oil of varying carbon chain length and degree of unsaturation, and the relationships between ignition delay, cetane number and oxygen content were explored. The expected negative relationship between ignition delay and cetane number held, as did the positive relationship between ignition delay and oxygen content. The degree of unsaturation was also identified as a potential factor influencing the ignition delay.
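As a rough illustration of working with a band-pass filtered in-cylinder pressure signal, the sketch below filters a simulated pressure trace and marks the first sustained threshold crossing as a crude proxy for the start of combustion. The signal parameters, filter band and threshold are invented, and the simple threshold rule stands in for, rather than reproduces, the paper's Bayesian model.

```python
# Illustrative sketch only: band-pass filter a simulated in-cylinder pressure
# trace and flag the first sustained threshold crossing as a crude stand-in
# for the start of combustion. All signal parameters are invented.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100_000                                   # Hz, assumed sampling rate
t = np.arange(0, 0.02, 1 / fs)
# Simulated pressure: slow compression "ramp" plus a high-frequency burst
# after a hypothetical ignition event near t = 9 ms, plus broadband noise.
ramp = 30e5 * np.sin(np.pi * t / 0.02) ** 2
burst = 2e5 * np.exp(-((t - 0.009) / 0.002) ** 2) * np.sin(2 * np.pi * 8_000 * t)
noise = 1e4 * np.random.default_rng(1).normal(size=t.size)
pressure = ramp + burst + noise

# Band-pass filter to isolate combustion-related pressure oscillations.
b, a = butter(4, [4_000, 20_000], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, pressure)

# Crude start-of-combustion estimate: first point (after an early quiet
# window) where the filtered signal exceeds a multiple of its noise level.
noise_level = np.std(filtered[t < 0.004])
start_idx = np.argmax((np.abs(filtered) > 5 * noise_level) & (t > 0.004))
print(f"estimated start of combustion at t = {t[start_idx] * 1e3:.2f} ms")
```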
Abstract:
The purpose of this scoping paper is to offer an overview of the literature and to assess developments to date in residential real estate agency academic and career education with respect to Foreign Direct Investment (FDI) transactions and their implications in Australia. The paper reviews studies on foreign real estate ownership and FDI in Australian real estate markets to develop an understanding of the current state of knowledge on residential real estate agency practice, career education and real estate licensing requirements in Australia. The distinction between education for the real estate profession and that for other professions, such as accounting, law and finance, lies in the intensity of professional career training before or after formal academic training. Real estate education could be delivered to higher standards in terms of licensing requirements and career and academic education. Because FDI in the Australian real estate market is a complex globalisation and economic phenomenon, simple residential real estate training and education content may not develop the capacity needed to deal properly with foreign residential property market transactions. This preliminary summary of the literature on residential real estate agency education, and on current and emerging licensing requirements, focuses on their role, effectiveness and impact in the residential real estate market. Particular focus is directed to FDI-related residential real estate agency transactions and practices, which have been strongly influenced by current residential real estate market conditions and agency practices. Taken together, there are many opportunities for future research to extend our understanding and to improve residential real estate agency education and training on Foreign Direct Investment in the Australian residential real estate sector.
Abstract:
BACKGROUND: An examination of melanoma incidence according to anatomical region may be one method of monitoring the impact of public health initiatives. OBJECTIVES: To examine melanoma incidence trends by body site, sex and age at diagnosis, or body site and morphology, in a population at high risk. MATERIALS AND METHODS: Population-based data on invasive melanoma cases (n = 51,473) diagnosed between 1982 and 2008 were extracted from the Queensland Cancer Registry. Age-standardized incidence rates were calculated using the direct method (2000 world standard population), and joinpoint regression models were used to fit trend lines. RESULTS: Significantly decreasing trends for melanomas on the trunk and upper limbs/shoulders were observed during recent years for both sexes under the age of 40 years and among males aged 40-59 years. However, in the 60 and over age group, the incidence of melanoma is continuing to increase at all sites (apart from the trunk) for males, and on the scalp/neck and upper limbs/shoulders for females. Rates of nodular melanoma are currently decreasing on the trunk and lower limbs. In contrast, superficial spreading melanoma is significantly increasing on the scalp/neck and lower limbs, along with substantial increases in lentigo maligna melanoma since the late 1990s at all sites apart from the lower limbs. CONCLUSIONS: In this large study we have observed significant decreases in rates of invasive melanoma in the younger age groups on less frequently exposed body sites. These results may provide some indirect evidence of the impact of long-running primary prevention campaigns.
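To make the "direct method" mentioned in the methods concrete, here is a minimal sketch of a directly age-standardised rate calculation. The age bands, case counts, person-years and standard-population weights are invented for illustration and are not Queensland Cancer Registry figures.

```python
# Minimal sketch of directly age-standardised incidence, as used in registry
# reports. All counts, person-years and weights below are invented.
age_groups   = ["0-39", "40-59", "60+"]
cases        = [120, 340, 610]             # hypothetical melanoma case counts
person_years = [2_500_000, 1_400_000, 900_000]
std_weights  = [0.61, 0.25, 0.14]          # assumed standard-population weights

# Age-specific rates per 100,000 person-years.
rates = [c / py * 100_000 for c, py in zip(cases, person_years)]

# Direct method: weight each age-specific rate by the standard population.
asr = sum(r * w for r, w in zip(rates, std_weights))

for group, rate in zip(age_groups, rates):
    print(f"{group}: {rate:.1f} per 100,000")
print(f"age-standardised rate: {asr:.1f} per 100,000")
```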
Abstract:
Twitter and other social media have become increasingly important tools for maintaining the relationships between fans and their idols across a range of activities, from politics and the arts to celebrity and sports culture. Twitter, Inc. itself has initiated several strategic approaches, especially to entertainment and sporting organisations; late in 2012, for example, a Twitter, Inc. delegation toured Australia in order to develop formal relationships with a number of key sporting bodies covering popular sports such as Australian Rules Football, A-League football (soccer), and V8 touring car racing, as well as to strengthen its connections with key Australian broadcasters and news organisations (Jackson & Christensen, 2012). Similarly, there has been a concerted effort between Twitter Germany and the German Bundesliga clubs and football association to coordinate the presence of German football on Twitter ahead of the 2012–2013 season: the Twitter accounts of almost all first-division teams now bear the official Twitter verification mark, and a system of ‘official’ hashtags for tweeting about individual games (combining the abbreviations of the two teams, e.g. #H96FCB) has also been instituted (Twitter auf Deutsch, 2012).
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a 'noisy' environment such as contemporary social media, is to collect the pertinent information: be that information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or information which can give an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the datasets collected and analyzed are preformed, that is, built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as an authoritative source (see the sketch after this abstract). Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportspeople create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators and fantasy sports players. However, whilst such services are still niche operations, much of the value of the information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis.
A strategy for responding to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art. In this paper we describe strategies for the creation of a social media archive, specifically of tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement's presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme amongst these papers is that of constructing a dataset, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
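The sketch below illustrates the kind of content-analysis and user-profiling filter described for the first panel paper: incoming tweets are scored for topical relevance and urgency, authoritative accounts receive a bonus, and only high-scoring tweets are surfaced for responders. The keywords, weights, account handles and threshold are invented for illustration and are not from the panel papers.

```python
# Minimal sketch of a content-analysis + user-profiling filter for crisis
# tweets. All keywords, weights, handles and the threshold are invented.
TOPIC_KEYWORDS = {"flood": 2, "evacuate": 3, "#flood": 3, "bridge": 1}
URGENT_WORDS = {"help", "trapped", "urgent", "now"}
TRUSTED_AUTHORS = {"emergency_agency", "city_council"}   # hypothetical handles

def score_tweet(author: str, text: str) -> int:
    """Return a relevance/urgency score for a single tweet."""
    words = set(text.lower().split())
    score = sum(weight for kw, weight in TOPIC_KEYWORDS.items() if kw in words)
    score += 2 * len(URGENT_WORDS & words)
    if author in TRUSTED_AUTHORS:
        score += 3                         # authoritative-source bonus
    return score

def filter_stream(tweets, threshold=5):
    """Yield only tweets that meet the hand-off threshold for responders."""
    for author, text in tweets:
        if score_tweet(author, text) >= threshold:
            yield author, text

if __name__ == "__main__":
    sample = [("emergency_agency", "Evacuate now #flood bridge closed"),
              ("some_user", "lovely weather today")]
    for author, text in filter_stream(sample):
        print(f"flag for responders: @{author}: {text}")
```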
Abstract:
This paper focuses on Australian development firms in the console and mobile games industry in order to understand how small firms in a geographically remote and marginal position in the global industry are able to relate to global firms and capture revenue share. This paper shows that, while technological change in the games industry has resulted in the emergence of new industry segments based on transactional rather than relational forms of economic coordination, in which we might therefore expect less asymmetrical power relations, lead firms retain a position of power in the global games entertainment industry relative to remote developers. This has been possible because lead firms in the emerging mobile devices market have developed and sustained bottlenecks in their segment of the industry through platform competition and the development of an intensely competitive ecosystem of developers. Our research shows the critical role of platform competition and bottlenecks in influencing power asymmetries within global markets.