Abstract:
Lung cancer is the most important cause of cancer-related mortality. Resectability and eligibility for treatment with adjuvant chemotherapy are determined by staging according to the TNM classification. Other determinants of tumour behaviour that predict disease outcome, such as molecular markers, may improve decision-making. Activation of the gene encoding human telomerase reverse transcriptase (hTERT) is implicated in the pathogenesis of lung cancer, and consequently detection of hTERT mRNA might have prognostic value for patients with early-stage lung cancer. A cohort of patients who underwent a complete resection for early-stage lung cancer was recruited as part of the European Early Lung Cancer (EUELC) project. In 166 patients, expression of hTERT mRNA in tumour tissue was determined by quantitative real-time RT-PCR and related to that of a housekeeping gene (PBGD). For a subgroup of 130 patients, tumour-distant normal tissue was additionally available for hTERT mRNA analysis. The correlation between hTERT levels of surgical samples and disease-free survival was determined using a Fine and Gray hazard model. Although hTERT mRNA positivity in tumour tissue was significantly associated with clinical stage (Fisher's exact test, p=0.016), neither hTERT mRNA detectability nor hTERT mRNA levels in tumour tissue were associated with clinical outcome. Conversely, hTERT positivity in adjacent normal samples was associated with progressive disease: 28% of patients with progressive disease, versus 7.5% of disease-free patients, had detectable hTERT mRNA in normal tissue [adjusted HR: 3.60 (1.64-7.94), p=0.0015]. hTERT mRNA level in tumour tissue therefore has no prognostic value for patients with early-stage lung cancer. However, detection of hTERT mRNA expression in tumour-distant normal lung tissue may indicate an increased risk of progressive disease.
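As an illustration of the kind of association test reported above, here is a minimal sketch using scipy's Fisher's exact test. The counts are hypothetical, chosen only to mirror the reported proportions (28% vs 7.5%); they are not the EUELC study data.

    # Minimal sketch: Fisher's exact test for an association between hTERT mRNA
    # positivity in normal tissue and progressive disease.
    # NOTE: counts below are hypothetical, chosen only to mirror the reported
    # proportions (28% vs 7.5%); they are NOT the study's data.
    from scipy.stats import fisher_exact

    #        hTERT+  hTERT-
    table = [[14, 36],   # progressive disease (hypothetical n=50, ~28% positive)
             [6, 74]]    # disease-free        (hypothetical n=80, ~7.5% positive)

    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")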
Abstract:
Particles of carrot red leaf virus (CRLV; luteovirus group) purified from chervil (Anthriscus cerefolium) contain a single ssRNA species of mol. wt. about 1.8 x 10^6 and a major protein of mol. wt. about 25,000. CRLV acts as a helper for aphid transmission of carrot mottle virus (CMotV; ungrouped) from mixedly infected plants. Virus preparations purified from such plants possess the infectivity of both viruses but contain particles indistinguishable from those of CRLV; some of the particles are therefore thought to consist of CMotV RNA packaged in CRLV coat protein. When RNA from such preparations was electrophoresed in agarose/polyacrylamide gels, CMotV infectivity was associated with an RNA band that migrated ahead of the CRLV RNA band and had an estimated mol. wt. of about 1.5 x 10^6, similar to that previously found for the infective ssRNA extracted directly from Nicotiana clevelandii leaves infected with CMotV alone. Preparations of dsRNA from CMotV-infected N. clevelandii leaves contained two species: one of mol. wt. about 3.2 x 10^6, presumably the replicative form of the infective ssRNA, and the other, of mol. wt. about 0.9 x 10^6, of unknown origin and function. The infective agent in buffer extracts of CMotV-infected N. clevelandii was resistant to RNase (although the enzyme acted as a reversible inhibitor of infection at high concentrations) and is therefore not unprotected RNA. It may be protected within the approximately 52 nm enveloped structures previously reported.
Abstract:
Modern health information systems can generate several exabytes of patient data, the so-called "Health Big Data", per year. Many health managers and experts believe that with these data it is possible to discover useful knowledge to improve health policies, increase patient safety and eliminate redundancies and unnecessary costs. The objective of this paper is to discuss the characteristics of Health Big Data as well as the challenges and solutions for Health Big Data Analytics (BDA), the process of extracting knowledge from sets of Health Big Data, and to design and evaluate a pipelined framework for use as a guideline/reference in health BDA.
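A minimal sketch of how such a pipelined analytics flow might be expressed in code. The stage names and logic (acquire, preprocess, analyse, deploy) are hypothetical placeholders, not the framework proposed in the paper.

    # Minimal sketch of a pipelined analytics flow; stage names and logic are
    # hypothetical placeholders, not the paper's framework.
    from typing import Callable, Iterable, List

    def acquire() -> List[dict]:
        # Stand-in for pulling records from heterogeneous health data sources.
        return [{"patient_id": 1, "readmitted": True}, {"patient_id": 2, "readmitted": False}]

    def preprocess(records: List[dict]) -> List[dict]:
        # Stand-in for cleaning, de-duplication and anonymisation.
        return [r for r in records if "patient_id" in r]

    def analyse(records: List[dict]) -> dict:
        # Stand-in for the knowledge-extraction step (here: a simple rate).
        n = len(records)
        return {"readmission_rate": sum(r["readmitted"] for r in records) / n if n else 0.0}

    def deploy(result: dict) -> None:
        # Stand-in for feeding results back to decision makers.
        print(result)

    def run_pipeline(stages: Iterable[Callable]) -> None:
        data = None
        for stage in stages:
            data = stage() if data is None else stage(data)

    run_pipeline([acquire, preprocess, analyse, deploy])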
Abstract:
The 2008 NASA Astrobiology Roadmap provides one way of theorising this developing field, a way which has become the normative model for the discipline: science- and scholarship-driven funding for space. By contrast, a novel re-evaluation of funding policies is undertaken in this article to reframe astrobiology, terraforming and associated space travel and research. Textual visualisation, discourse and numeric analytical methods, and value theory are applied to historical data and contemporary sources to re-investigate significant drivers of and constraints on the mechanisms enabling space exploration. Two data sets are identified and compared: the business objectives and outcomes of major 15th-17th century European joint-stock exploration and trading companies, and a case study of a current entrepreneurial space-industry company. Comparison of these analyses suggests that viable funding policy drivers can exist outside the normative science- and scholarship-driven roadmap. The two drivers identified in this study are (1) the intrinsic value of space as a territory to be experienced and enjoyed, not just studied, and (2) the instrumental, commercial value of exploiting these experiences by developing infrastructure and retail revenues. Filtering of these results also offers an investment rationale for companies operating in, or about to enter, the space business marketplace.
Abstract:
Road networks are a national critical infrastructure. Road assets need to be monitored and maintained efficiently as their conditions deteriorate over time. The condition of one such asset, road pavement, plays a major role in road network maintenance programmes. Pavement conditions depend upon many factors, such as pavement type, traffic and environmental conditions. This paper presents a data analytics case study for assessing the factors affecting the pavement deflection values measured by the traffic speed deflectometer (TSD) device. The analytics process includes acquisition and integration of data from multiple sources, data pre-processing, mining useful information from the data and utilising data mining outputs for knowledge deployment. Data mining techniques are able to show how TSD outputs vary across different road, traffic and environmental conditions. The generated data mining models map the TSD outputs to a set of classes and define correction factors for each class.
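A rough sketch of the kind of classification model described, using scikit-learn with hypothetical feature and class names (pavement type, traffic level, temperature mapped to a deflection class). This is illustrative only, not the models built in the case study.

    # Illustrative only: a decision tree mapping road/traffic/environment features
    # to a deflection class, with a hypothetical correction factor per class.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    # Hypothetical training data: [pavement_type, traffic_level, surface_temp_C]
    X = np.array([[0, 1, 15], [0, 3, 30], [1, 2, 10], [1, 3, 25], [2, 1, 5], [2, 2, 20]])
    y = np.array(["low", "high", "low", "high", "medium", "medium"])  # deflection class

    model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

    # Hypothetical correction factors attached to each predicted class.
    correction = {"low": 1.00, "medium": 1.05, "high": 1.12}
    pred = model.predict(np.array([[1, 3, 28]]))[0]
    print(pred, correction[pred])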
Abstract:
This paper surveys the practical benefits and drawbacks of several identity-based encryption schemes based on bilinear pairings. After providing some background on identity-based cryptography, we classify the known constructions into a handful of general approaches. We then describe efficient and fully secure IBE and IBKEM instantiations of each approach, with reducibility to practice as the main design parameter. Finally, we catalogue the strengths and weaknesses of each construction according to a few theoretical and many applied comparison criteria.
Abstract:
We present two unconditionally secure protocols for private set disjointness tests. In order to provide intuition for our protocols, we give a naive example that applies Sylvester matrices. Unfortunately, this simple construction is insecure as it reveals information about the intersection cardinality; more specifically, it discloses its lower bound. By using Lagrange interpolation, we provide a protocol for the honest-but-curious case that does not reveal any additional information. Finally, we describe a protocol that is secure against malicious adversaries. In this protocol, a verification test is applied to detect misbehaving participants. Both protocols require O(1) rounds of communication. Our protocols are more efficient than previous protocols in terms of communication and computation overhead. Unlike previous protocols, whose security relies on computational assumptions, our protocols provide information-theoretic security. To our knowledge, our protocols are the first that have been designed without a generic secure function evaluation. More importantly, they are the most efficient protocols for private disjointness tests in the malicious adversary case.
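For intuition on the interpolation step, here is a minimal sketch of Lagrange interpolation over a prime field, the arithmetic building block the honest-but-curious protocol relies on. This is not the disjointness protocol itself, and the modulus and points are arbitrary examples.

    # Minimal sketch: Lagrange interpolation of a polynomial's value at x = 0 over GF(p).
    # This is only the arithmetic building block, not the disjointness protocol itself.
    P = 2_147_483_647  # an arbitrary prime modulus for illustration

    def lagrange_at_zero(points, p=P):
        # points: list of (x, y) shares; returns the polynomial evaluated at 0.
        total = 0
        for i, (xi, yi) in enumerate(points):
            num, den = 1, 1
            for j, (xj, _) in enumerate(points):
                if i != j:
                    num = (num * (-xj)) % p
                    den = (den * (xi - xj)) % p
            total = (total + yi * num * pow(den, -1, p)) % p
        return total

    # Example: shares of f(x) = 5 + 3x + 2x^2, so f(0) = 5.
    shares = [(1, (5 + 3*1 + 2*1) % P), (2, (5 + 3*2 + 2*4) % P), (3, (5 + 3*3 + 2*9) % P)]
    print(lagrange_at_zero(shares))  # -> 5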
Abstract:
Accurate and detailed measurement of an individual's physical activity is a key requirement for helping researchers understand the relationship between physical activity and health. Accelerometers have become the method of choice for measuring physical activity due to their small size, low cost, convenience and their ability to provide objective information about physical activity. However, interpreting accelerometer data once it has been collected can be challenging. In this work, we applied machine learning algorithms to the task of physical activity recognition from triaxial accelerometer data. We employed a simple but effective approach of dividing the accelerometer data into short non-overlapping windows, converting each window into a feature vector, and treating each feature vector as an i.i.d. training instance for a supervised learning algorithm. In addition, we improved on this simple approach with a multi-scale ensemble method that did not need to commit to a single window size and was able to leverage the fact that physical activities produce time series with repetitive patterns and that discriminative features for physical activity occur at different temporal scales.
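A compact sketch of the windowing-and-features approach described above. The window length, features, labels and classifier choice below are illustrative assumptions, not the paper's exact configuration.

    # Illustrative sketch: split a triaxial accelerometer stream into non-overlapping
    # windows, turn each window into a feature vector, and train a supervised classifier.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def windows_to_features(signal, window=128):
        # signal: array of shape (n_samples, 3) for the x/y/z axes.
        n = (len(signal) // window) * window
        chunks = signal[:n].reshape(-1, window, 3)
        # Simple per-axis summary statistics as features.
        return np.hstack([chunks.mean(axis=1), chunks.std(axis=1),
                          np.abs(np.diff(chunks, axis=1)).mean(axis=1)])

    rng = np.random.default_rng(0)
    walking = rng.normal(0, 1.0, size=(128 * 20, 3))   # stand-in for labelled recordings
    sitting = rng.normal(0, 0.1, size=(128 * 20, 3))

    X = np.vstack([windows_to_features(walking), windows_to_features(sitting)])
    y = np.array(["walking"] * 20 + ["sitting"] * 20)

    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
    print(clf.predict(windows_to_features(rng.normal(0, 0.1, size=(128 * 5, 3)))))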
Abstract:
This paper aims to develop a comprehensive approach to innovating urban policymaking and planning so as to successfully deliver the knowledge-based agenda. The paper first examines the concept of knowledge-based urban development, which has become a popular urban development policy and strategy in recent years, through a comprehensive review of the literature. It then introduces and discusses a novel methodological approach for an effective policymaking and planning mechanism to deliver the knowledge-based agenda of cities. With the proposed methodology, the paper brings together urban policymaking and planning approaches, and introduces a novel way to assess the knowledge-based urban development achievements and potential of emerging and prosperous knowledge cities. The paper thus provides an invaluable instrument to inform local and regional decision- and plan-making mechanisms, helping them deliver their knowledge-based agendas and move towards building sustainable knowledge cities.
Abstract:
This paper examines the use of Twitter for long-term discussions around Australian politics, at national and state levels, tracking two hashtags during 2012: #auspol, denoting national political topics, and #wapol, which provides a case study of state politics (representing Western Australia). The long-term data collection provides the opportunity to analyse how the Twitter audience responds to Australian politics: which themes attract the most attention and which accounts act as focal points for these discussions. The paper highlights differences in the coverage of state and national politics. For #auspol, a small number of accounts are responsible for the majority of tweets, with politicians invoked but not directly contributing to the discussion. In contrast, #wapol stimulates a much lower level of tweeting. This example also demonstrates that, in addition to citizen accounts, traditional participants within political debate, such as politicians and journalists, are among the active contributors to state-oriented discussions on Twitter.
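A small sketch of the kind of per-account aggregation behind such findings, using pandas on a hypothetical tweet table; the column names and values are assumptions, not the study's dataset.

    # Hypothetical example: count tweets per account within a hashtag collection
    # to see which accounts dominate the discussion. Column names are assumed.
    import pandas as pd

    tweets = pd.DataFrame({
        "hashtag": ["#auspol", "#auspol", "#auspol", "#wapol", "#wapol"],
        "account": ["user_a", "user_a", "user_b", "user_c", "user_a"],
    })

    counts = (tweets.groupby(["hashtag", "account"]).size()
                    .rename("n_tweets").reset_index()
                    .sort_values(["hashtag", "n_tweets"], ascending=[True, False]))
    print(counts)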
Abstract:
Security protocols are designed to provide security properties (goals). They achieve their goals using cryptographic primitives such as key agreement or hash functions. Security analysis tools are used to verify whether a security protocol achieves its goals or not. The properties analysed by special-purpose tools are predefined properties such as secrecy (confidentiality), authentication or non-repudiation. However, in systems with security requirements there are also security goals that are defined by the user. Analysis of these properties is possible with general-purpose analysis tools such as Coloured Petri Nets (CPN). This research analyses two security properties defined in a protocol that is based on the Trusted Platform Module (TPM). The analysed protocol was proposed by Delaune to use TPM capabilities and secrets in order to open only one of two submitted secrets to a recipient.
Abstract:
While data quality has been identified as a critical factor associated with enterprise resource planning (ERP) failure, the relationship between ERP stakeholders, the information they require and its relationship to ERP outcomes continues to be poorly understood. Applying stakeholder theory to the problem of ERP performance, we put forward a framework articulating the fundamental differences in the way users differentiate between ERP data quality and utility. We argue that the failure of ERPs to produce significant organisational outcomes can be attributed to conflict between stakeholder groups over whether the data contained within an ERP is of adequate ‘quality’. The framework provides guidance on how to manage data flows between stakeholders, offering insight into each of their specific data requirements. It also provides support for the idea that stakeholder affiliation dictates the assumptions and core values held by individuals, driving their data needs and their perceptions of data quality and utility.
Abstract:
This study investigates the price linkage among the major US energy sources, considering structural breaks in the time series, to provide information for diversifying US energy sources. We find that only a weak linkage is sustained among crude oil, gasoline, heating oil, coal, natural gas, uranium and ethanol futures prices. This implies that the major US energy source markets are not integrated into one primary energy market. Our tests also reveal that uranium and ethanol futures prices have very weak linkages with the other major energy source prices. This indicates that the US energy market is still at a stage where none of the probable alternative energy source markets play the role of substitute or complement markets for the fossil fuel energy markets.
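As a simplified stand-in for the linkage tests discussed, here is a pairwise Engle-Granger cointegration test with statsmodels on simulated price series. The actual study accounts for structural breaks, which this sketch does not, and the series names are placeholders.

    # Simplified illustration: pairwise cointegration test between two simulated
    # "futures price" series. The study uses tests allowing for structural breaks;
    # this sketch does not.
    import numpy as np
    from statsmodels.tsa.stattools import coint

    rng = np.random.default_rng(1)
    n = 500
    crude_oil = np.cumsum(rng.normal(size=n))                   # random-walk stand-in
    gasoline = 0.8 * crude_oil + rng.normal(scale=0.5, size=n)  # linked series
    ethanol = np.cumsum(rng.normal(size=n))                     # unrelated random walk

    for name, series in [("gasoline", gasoline), ("ethanol", ethanol)]:
        t_stat, p_value, _ = coint(crude_oil, series)
        print(f"crude oil vs {name}: p = {p_value:.3f}")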
Abstract:
China is an emerging and leading world economy. The pace of economic change has been tremendously rapid since the beginning of economic reforms. Despite the importance of the Environmental Kuznets Curve (EKC) and of environmental problems in China, no previous study has tested the EKC in China because of the difficulty in obtaining data and the need to adjust the data. The focus of this paper is to test the EKC in China using province-level data over the period 1992-2003. This study applies non-parametric techniques to estimate the relationship between income and environmental quality as measured by wastewater, air pollution and solid waste.
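A minimal sketch of the type of non-parametric income-pollution fit involved, using a LOWESS smoother on simulated data. The inverted-U shape below is generated for illustration and is not estimated from Chinese provincial data.

    # Illustration only: a non-parametric (LOWESS) fit of a pollution indicator on
    # income, on simulated data with a built-in inverted-U (EKC-like) shape.
    import numpy as np
    from statsmodels.nonparametric.smoothers_lowess import lowess

    rng = np.random.default_rng(2)
    income = rng.uniform(1, 10, size=300)   # per-capita income (arbitrary units)
    pollution = 5 * income - 0.5 * income**2 + rng.normal(scale=1.0, size=300)

    fitted = lowess(pollution, income, frac=0.3)   # columns: sorted income, smoothed pollution
    turning_point = fitted[np.argmax(fitted[:, 1]), 0]
    print(f"estimated turning point at income of about {turning_point:.1f}")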
Abstract:
We identify determinants of plant dynamics and find that they differ before, during, and after the Asian financial crisis. The results show that the distinction of the crisis period is important and that the effects of the crisis do not seem to persist after 1998. Furthermore, we reject Gibrat's law as the right functional form to describe plant growth. With the exception of the crisis period, we are not able to support empirically the theoretical result that smaller, efficient plants tend to grow faster than larger, inefficient plants. The results suggest that there was a trickle-down effect of economic development.
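A stylised sketch of a Gibrat's law test of the kind referenced: regress plant growth on initial (log) size and test whether the size coefficient is zero. The data and specification here are invented for illustration, not the study's estimates.

    # Stylised illustration of a Gibrat's law test: regress log growth on initial
    # log size; under Gibrat's law the size coefficient should be zero.
    # Data and specification here are invented for illustration.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    log_size_t0 = rng.normal(4, 1, size=1000)   # initial plant size (log employment)
    # Simulate growth in which smaller plants grow faster (violating Gibrat's law).
    log_growth = 0.10 - 0.03 * log_size_t0 + rng.normal(scale=0.05, size=1000)

    X = sm.add_constant(log_size_t0)
    fit = sm.OLS(log_growth, X).fit()
    print(fit.params)        # constant and size coefficient
    print(fit.pvalues[1])    # H0: coefficient on initial size is zero (Gibrat's law)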