531 results for Value Stream Mapping
Abstract:
Using a sample of publicly listed firms in Korea from 2002 to 2006, this article examines the impact of board monitoring on firm value and productivity. We use outside directors' attendance at board meetings as a proxy for board monitoring. Consistent with the commitment hypothesis, we find that outside directors' attendance rate increases firm value, suggesting that attending board meetings is itself a strong signal of outside directors' intention to monitor insiders. While ownership by controlling shareholders negatively affects firm value, this relationship is not moderated by increased monitoring by outsiders. Our findings provide further evidence that the outside director system is less effective in chaebol-affiliated firms. Results also indicate that the effect of outside directors' board monitoring activity on investors' valuation of the firm is greater than its effect on the firm's productivity improvement. Our conclusions are robust to possible endogeneity in the relationship between firm value and board attendance by outside directors.
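To make the empirical design concrete, a minimal sketch of this kind of firm-value regression is given below, assuming a pandas DataFrame with hypothetical column names (tobins_q as the firm-value measure, attendance_rate, controlling_ownership, and basic controls); the paper's actual specification, controls, and endogeneity corrections are not reproduced here.

```python
# A minimal sketch, assuming hypothetical column names; not the paper's
# actual specification or endogeneity treatment.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("korea_listed_firms_2002_2006.csv")  # hypothetical file

# Firm value (e.g. Tobin's Q) on outside directors' attendance rate and
# controlling-shareholder ownership, plus illustrative controls; standard
# errors clustered by firm.
model = smf.ols(
    "tobins_q ~ attendance_rate + controlling_ownership"
    " + attendance_rate:controlling_ownership + firm_size + leverage",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["firm_id"]})
print(model.summary())
```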
Abstract:
Non-small cell lung cancer samples from the European Early Lung Cancer biobank were analysed to assess the prognostic significance of mutations in the TP53, KRAS and EGFR genes. The series included 11 never-smokers, 86 former smokers, 152 current smokers and one patient without informed smoking status. There were 110 squamous cell carcinomas (SCCs), 133 adenocarcinomas (ADCs) and seven large cell carcinomas or mixed histologies. Expression of p53 was analysed by immunohistochemistry. DNA was extracted from frozen tumour tissues. TP53 mutations were detected in 48.8% of cases and were more frequent in SCCs than in ADCs (p<0.0001). TP53 mutation status was not associated with prognosis. G to T transversions, known to be associated with smoking, were marginally more common among patients who developed a second primary lung cancer or recurrence/metastasis (progressive disease). EGFR mutations were found almost exclusively in never-smoking females (p=0.0067). KRAS mutations were detected in 18.5% of cases, mainly in ADCs (p<0.0001), and showed a tendency toward association with progressive disease status. These results suggest that mutations are good markers of different aetiologies and histopathological forms of lung cancer but have little prognostic value, with the exception of KRAS mutation, which may have prognostic value in ADC.
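As an illustration of the kind of association test reported here (mutation frequency by histology), a minimal sketch using Fisher's exact test is shown below; the 2x2 counts are invented placeholders, not the study's data.

```python
# Illustrative Fisher's exact test of TP53 mutation frequency by
# histology; the 2x2 counts below are invented placeholders.
from scipy.stats import fisher_exact

#            TP53 mutant   TP53 wild-type
table = [[70, 40],   # squamous cell carcinomas (hypothetical counts)
         [45, 88]]   # adenocarcinomas          (hypothetical counts)

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4g}")
```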
Abstract:
Lung cancer is the most important cause of cancer-related mortality. Resectability and eligibility for treatment with adjuvant chemotherapy are determined by staging according to the TNM classification. Other determinants of tumour behaviour that predict disease outcome, such as molecular markers, may improve decision-making. Activation of the gene encoding human telomerase reverse transcriptase (hTERT) is implicated in the pathogenesis of lung cancer, and consequently detection of hTERT mRNA might have prognostic value for patients with early stage lung cancer. A cohort of patients who underwent a complete resection for early stage lung cancer was recruited as part of the European Early Lung Cancer (EUELC) project. In 166 patients, expression of hTERT mRNA was determined in tumour tissue by quantitative real-time RT-PCR and related to that of a housekeeping gene (PBGD). For a subgroup of 130 patients, tumour-distant normal tissue was also available for hTERT mRNA analysis. The correlation between hTERT levels of surgical samples and disease-free survival was determined using a Fine and Gray hazard model. Although hTERT mRNA positivity in tumour tissue was significantly associated with clinical stage (Fisher's exact test, p=0.016), neither hTERT mRNA detectability nor hTERT mRNA levels in tumour tissue were associated with clinical outcome. Conversely, hTERT positivity in adjacent normal samples was associated with progressive disease: 28% of patients with progressive disease versus 7.5% of disease-free patients had detectable hTERT mRNA in normal tissue [adjusted HR: 3.60 (1.64-7.94), p=0.0015]. hTERT mRNA level in tumour tissue thus has no prognostic value for patients with early stage lung cancer. However, detection of hTERT mRNA expression in tumour-distant normal lung tissue may indicate an increased risk of progressive disease.
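A minimal sketch of relative quantification against a housekeeping gene is given below, assuming the common 2^-ΔCt approach; the study's exact normalisation to PBGD is not described in the abstract, so this is illustrative only.

```python
# A minimal sketch of relative quantification against a housekeeping
# gene using the common 2^-delta-Ct rule; illustrative only.

def relative_expression(ct_target: float, ct_reference: float) -> float:
    """Relative hTERT mRNA level, normalised to PBGD."""
    delta_ct = ct_target - ct_reference
    return 2.0 ** -delta_ct

# Hypothetical cycle-threshold values for one tumour sample.
print(relative_expression(ct_target=31.2, ct_reference=25.4))
```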
Abstract:
Forward genetic screens have identified numerous genes involved in development and metabolism, and remain a cornerstone of biological research. However, to locate a causal mutation, the practice of crossing to a polymorphic background to generate a mapping population can be problematic if the mutant phenotype is difficult to recognize in the hybrid F2 progeny, or depends on parent-specific traits. Here, in a screen for leaf hyponasty mutants, we have performed a single backcross of an ethyl methanesulfonate (EMS)-generated hyponastic mutant to its parent. Whole-genome deep sequencing of a bulked homozygous F2 population and analysis via the Next Generation EMS mutation mapping pipeline (NGM) unambiguously determined the causal mutation to be a single nucleotide polymorphism (SNP) residing in HASTY, a previously characterized gene involved in microRNA biogenesis. We evaluated the feasibility of this backcross approach using three additional SNP mapping pipelines: SHOREmap, the GATK pipeline and the samtools pipeline. Although there was variance in the identification of EMS SNPs, all returned the same outcome, clearly identifying the causal mutation in HASTY. The simplicity of performing a single parental backcross and genome sequencing a small pool of segregating mutants holds great promise for identifying mutations that may be difficult to map using conventional approaches.
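The intuition behind this bulked-segregant approach can be sketched in a few lines: in a pooled homozygous mutant F2 population, the causal EMS-induced SNP and SNPs tightly linked to it approach an alternate-allele frequency of 1.0, while unlinked SNPs segregate near 0.5. The sketch below assumes a hypothetical CSV of pooled SNP calls; it is not the NGM, SHOREmap, GATK, or samtools pipeline itself.

```python
# Toy scan for near-fixed SNPs in a pooled homozygous F2 population;
# input file and columns are hypothetical.
import csv

with open("pooled_f2_snps.csv") as fh:  # chrom,pos,ref_reads,alt_reads
    for row in csv.DictReader(fh):
        ref, alt = int(row["ref_reads"]), int(row["alt_reads"])
        depth = ref + alt
        if depth < 20:                 # skip poorly covered positions
            continue
        freq = alt / depth
        if freq > 0.95:                # candidate causal/linked region
            print(row["chrom"], row["pos"], f"alt freq = {freq:.2f}")
```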
Abstract:
This thesis presents a novel approach to mobile robot navigation using visual information, towards the goal of long-term autonomy. The concept of a continuous appearance-based trajectory is proposed to overcome the limitations of previous robot navigation systems, and two new algorithms for mobile robots, CAT-SLAM and CAT-Graph, are presented and evaluated. These algorithms outperform state-of-the-art methods on public benchmark datasets and in large-scale real-world environments, and will help enable widespread use of mobile robots in everyday applications.
Abstract:
The Smart Fields programme has been active in Shell over the last decade and has delivered substantial benefits. To understand this value and to underpin strategies for the future implementation programme, a study was carried out to quantify the benefits to date. The study focused on value actually achieved, through increased production or lower costs, and provided an estimate of the total value realised to date. Future benefits, such as increased reserves or continued production gains, were recorded separately. The paper describes the process followed in the benefits quantification. It identifies the key solutions and technologies and describes the mechanism used to understand the relation between solutions and value. Examples are given of value from various assets around the world, in both existing fields and greenfields. Finally, the study provided a methodology for tracking value, which helps Shell estimate and track the benefits of the Smart Fields programme at company scale.
Abstract:
Significant problems confront our child protection out-of-home care system, including high costs; increasing numbers of children and young people entering care and remaining in care longer; frequent placement movement; and negative whole-of-life outcomes for children and young people who have exited care. National policy and research agendas recognise the importance of enhancing the evidence base in out-of-home care to inform the development of policy, programs and practice, and to improve longitudinal outcomes for children and young people. The authors discuss the concept of placement trajectory as a framework for research and systems analysis in the out-of-home care context. While not without limitations, the concept of placement trajectory is particularly useful in understanding the factors influencing placement movement and stability. Increasing the evidence base in this area can serve to improve outcomes across the lifespan for children and young people in the out-of-home care system.
Abstract:
The ability to measure surface temperature and represent it on a metrically accurate 3D model has proven applications in many areas, such as medical imaging, building energy auditing, and search and rescue. A system is proposed that enables this task to be performed with a handheld sensor, with results that can, for the first time, be visualized and analyzed in real time. A device comprising a thermal-infrared camera and a range sensor is calibrated geometrically and used for data capture. The device is localized using a combination of iterative closest point (ICP) registration and video-based pose estimation from the thermal-infrared video footage, which is shown to reduce the occurrence of failure modes. Furthermore, the problem of misregistration, which can introduce severe distortions in assigned surface temperatures, is avoided through the use of a risk-averse neighborhood weighting mechanism. Results demonstrate that the system is more stable and accurate than previous approaches, and can be used to accurately model complex objects and environments for practical tasks.
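The abstract does not spell out the weighting mechanism, but a hedged sketch of one plausible realisation is shown below: temperature samples observed near depth discontinuities, where misregistration is most damaging, are down-weighted before fusion. The function name and falloff parameter are assumptions, not the paper's formulation.

```python
# One plausible "risk-averse" weighting: samples near depth edges carry
# high misregistration risk and receive weights near zero. Illustrative
# only; not the paper's actual mechanism.
import numpy as np

def fuse_temperature(temps, edge_dist, sigma=5.0):
    """Risk-averse fusion of temperature samples for one surface point.

    temps     : per-frame temperature samples (deg C) seeing the point
    edge_dist : per-sample pixel distance to the nearest depth edge
    sigma     : assumed risk falloff in pixels (tuning parameter)
    """
    temps = np.asarray(temps, dtype=float)
    edge_dist = np.asarray(edge_dist, dtype=float)
    weights = 1.0 - np.exp(-(edge_dist ** 2) / (2.0 * sigma ** 2))
    weights = np.maximum(weights, 1e-6)   # keep at least some support
    return float(np.average(temps, weights=weights))

# The 30 degC outlier sits on a depth edge and is effectively ignored.
print(fuse_temperature([21.5, 22.0, 30.0], [40.0, 35.0, 1.0]))
```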
Abstract:
Monitoring stream networks through time provides important ecological information. The sampling design problem is to choose the locations where measurements are taken so as to maximise the information gathered about physicochemical and biological variables on the stream network. This paper uses a pseudo-Bayesian approach, averaging a utility function over a prior distribution, to find a design which maximises the average utility. We use models for the correlations of observations on the stream network that are based on stream network distances and described by moving average error models. The utility functions used reflect the needs of the experimenter, such as prediction of location values or estimation of parameters. We propose an algorithmic approach to design in which the mean utility of a design is estimated using Monte Carlo techniques and an exchange algorithm is used to search for optimal sampling designs. In particular, we focus on the problems of finding an optimal design from a set of fixed designs and of finding an optimal subset of a given set of sampling locations. As there are many different variables to measure, such as chemical, physical and biological measurements at each location, designs are derived from models based on different types of response variables: continuous, counts and proportions. We apply the methodology to a synthetic example and to the Lake Eacham stream network on the Atherton Tablelands in Queensland, Australia. We show that the optimal designs depend very much on the choice of utility function, varying from space-filling to clustered designs and mixtures of these, but that, given the utility function, designs are relatively robust to the type of response variable.
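A generic sketch of the proposed search is given below: starting from a random subset of candidate sites, one site at a time is exchanged, and a swap is kept whenever the Monte Carlo estimate of expected utility improves. The prior and utility here are toy placeholders (a simple prediction-of-location utility), not the paper's stream-network models.

```python
# Exchange-algorithm sketch with Monte Carlo utility estimation.
# draw_prior() and utility() are toy placeholders.
import random

def draw_prior():
    """Placeholder: draw an unobserved location of interest."""
    return random.uniform(0.0, 10.0)

def utility(design, theta):
    """Toy prediction utility: higher when some sampled site lies
    close to the location theta we wish to predict at."""
    return -min(abs(site - theta) for site in design)

def mc_utility(design, n_draws=200):
    """Monte Carlo estimate of the expected utility of a design."""
    return sum(utility(design, draw_prior()) for _ in range(n_draws)) / n_draws

def exchange_search(candidates, k, n_iter=300):
    """Exchange algorithm: swap one site at a time, keep improving swaps."""
    design = random.sample(candidates, k)
    best = mc_utility(design)
    for _ in range(n_iter):
        out = random.choice(design)
        new = random.choice([c for c in candidates if c not in design])
        trial = [c for c in design if c != out] + [new]
        u = mc_utility(trial)
        if u > best:
            design, best = trial, u
    return design, best

candidate_sites = [i / 10 for i in range(100)]   # hypothetical stream locations
design, u = exchange_search(candidate_sites, k=8)
print(sorted(design), round(u, 3))
```

Under this toy utility the search tends toward a space-filling design, consistent with the paper's observation that the chosen utility function drives the design's character.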
Abstract:
Stream ciphers are symmetric key cryptosystems commonly used to provide confidentiality for a wide range of applications, such as mobile phone, pay TV and Internet data transmissions. This research examines the features and properties of the initialisation processes of existing stream ciphers to identify flaws and weaknesses, and presents recommendations to improve the security of future cipher designs. It investigates three well-known stream ciphers: A5/1, Sfinks and the Common Scrambling Algorithm Stream Cipher (CSA-SC), focusing on the security of the initialisation process. The recommendations given are based on both results in the literature and the work in this thesis.
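Although the specific designs differ, the toy sketch below illustrates the generic structure being analysed: key and IV bits are mixed into an LFSR-based state during a loading phase, after which the state is clocked for a number of blank rounds before any keystream is output. The taps and register length are arbitrary; this is deliberately not A5/1, Sfinks or CSA-SC. Flaws hunted for in such processes include, for example, distinct key/IV pairs that initialise to identical states.

```python
# A toy, generic LFSR-style initialisation; illustrative only.

def toy_init(key_bits, iv_bits, length=19, blank_rounds=100):
    state = [0] * length
    taps = (18, 17, 16, 13)            # arbitrary feedback taps for the toy
    for bit in key_bits + iv_bits:     # loading phase: fold inputs into feedback
        fb = bit ^ state[taps[0]] ^ state[taps[1]] ^ state[taps[2]] ^ state[taps[3]]
        state = [fb] + state[:-1]
    for _ in range(blank_rounds):      # diffusion phase before any output
        fb = state[taps[0]] ^ state[taps[1]] ^ state[taps[2]] ^ state[taps[3]]
        state = [fb] + state[:-1]
    return state

print(toy_init([1, 0, 1, 1] * 16, [0, 1] * 11))  # 64-bit key, 22-bit IV
```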
Abstract:
Technological advances have led to an influx of affordable hardware that supports sensing, computation and communication. This hardware is increasingly deployed in public and private spaces, tracking and aggregating a wealth of real-time environmental data. Although these technologies are the focus of several research areas, there is a lack of research dealing with the problem of making these capabilities accessible to everyday users. This thesis represents a first step towards developing systems that allow users to leverage the available infrastructure and create custom-tailored solutions. It explores how this notion can be applied in the context of energy monitoring to improve on conventional approaches. The project adopted a user-centered design process to inform the development of a flexible system for real-time data stream composition and visualization. The system features an extensible architecture and defines a unified API for heterogeneous data streams. Rather than displaying the data in a predetermined fashion, it makes this information available as building blocks that can be combined and shared. It is based on the insight that individual users have diverse information needs and presentation preferences, and therefore allows users to compose rich information displays, incorporating personally relevant data from an extensive information ecosystem. The prototype was evaluated in an exploratory study to observe its natural use in a real-world setting, gathering empirical usage statistics and conducting semi-structured interviews. The results show that a high degree of customization does not by itself guarantee sustained usage. Other factors were identified, yielding recommendations for increasing the impact on energy consumption.
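A minimal sketch of what a unified API over heterogeneous streams might look like follows; the class and field names are assumptions for illustration, not the thesis's actual interface.

```python
# Hypothetical unified stream interface: each sensor adapter yields
# Reading values, so streams compose as interchangeable building blocks.
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Iterator, List

@dataclass
class Reading:
    source: str        # e.g. "office.powermeter.3" (illustrative)
    unit: str          # e.g. "W"
    value: float
    timestamp: float

class DataStream(ABC):
    """Common interface each sensor adapter implements."""

    @abstractmethod
    def readings(self) -> Iterator[Reading]: ...

@dataclass
class ReplayStream(DataStream):
    """Trivial adapter that replays a fixed list of readings."""
    items: List[Reading]

    def readings(self) -> Iterator[Reading]:
        return iter(self.items)

def compose(*streams: DataStream) -> Iterator[Reading]:
    """Naive round-robin composition of several streams."""
    iterators = [s.readings() for s in streams]
    while iterators:
        for it in list(iterators):
            try:
                yield next(it)
            except StopIteration:
                iterators.remove(it)

power = ReplayStream([Reading("office.meter", "W", 230.0, 0.0)])
temp = ReplayStream([Reading("lab.thermo", "degC", 21.3, 0.1)])
for r in compose(power, temp):
    print(r)
```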
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information, be that information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or information that gives an advantage to those involved in prediction markets. Often such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the datasets collected and analyzed are pre-formed, that is, they are built around a particular keyword, hashtag or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions: content analysis and user profiling. In the former, aspects of a tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who either serve as amplifiers of information or are known as authoritative sources. Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators and fantasy sports players. However, whilst such services are still niche operations, much of the value of the information is lost by the time it reaches one of them. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect and make the tweets available for exploration and analysis.
A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art. In this paper we describe strategies for the creation of a social media archive, specifically of tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement’s presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme among these papers is that of constructing a dataset, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches can be customized depending on the project stakeholders.
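As a concrete illustration of the first paper's content-analysis scoring, the sketch below assigns each incoming tweet a relevance and urgency score, with an optional boost standing in for the user-profiling signal (authoritative accounts); keyword lists, weights and the example tweets are placeholders.

```python
# Placeholder keyword weights and urgency cues; an authoritative-source
# flag stands in for the user-profiling signal described above.
TOPIC_TERMS = {"flood": 2.0, "evacuate": 3.0, "trapped": 4.0, "#qldflood": 2.5}
URGENT_TERMS = {"help", "urgent", "now", "emergency"}

def score_tweet(text: str, author_is_authority: bool = False) -> float:
    words = [w.strip(".,!?") for w in text.lower().split()]
    relevance = sum(TOPIC_TERMS.get(w, 0.0) for w in words)
    urgency = sum(1.0 for w in words if w in URGENT_TERMS)
    authority = 2.0 if author_is_authority else 0.0
    return relevance + urgency + authority

incoming = [
    "Need help now, family trapped by flood #qldflood",
    "Lovely weather in Brisbane today",
]
for tweet in sorted(incoming, key=score_tweet, reverse=True):  # triage order
    print(f"{score_tweet(tweet):5.1f}  {tweet}")
```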
Abstract:
Enormous progress has been made towards understanding the role of specific factors in the process of epithelial-mesenchymal transition (EMT); however, the complexity of the underlying pathways and the transient nature of the transition continue to present significant challenges. Targeting the tumour cell plasticity underpinning EMT is an attractive strategy to combat metastasis. Global gene expression profiling and high-content analyses are among the strategies employed to identify novel EMT regulators. In this review, we highlight several approaches to systematically interrogate key pathways involved in EMT, with particular emphasis on the features of multiparametric, high-content imaging screening strategies that lend themselves to the systematic discovery of highly significant modulators of tumour cell plasticity.
Abstract:
Reliable robotic perception and planning are critical to performing autonomous actions in uncertain, unstructured environments. In field robotic systems, automation is achieved by interpreting exteroceptive sensor information to infer something about the world. This is then mapped to provide a consistent spatial context, so that actions can be planned around the predicted future interaction of the robot and the world. The whole system is only as reliable as the weakest link in this chain. In this paper, the term mapping is used broadly to describe the transformation of range-based exteroceptive sensor data (such as LIDAR or stereo vision) to a fixed navigation frame, so that it can be used to form an internal representation of the environment. The coordinate transformation from the sensor frame to the navigation frame is analyzed to produce a spatial error model that captures the dominant geometric and temporal sources of mapping error. This allows the mapping accuracy to be calculated at run time. A generic extrinsic calibration method for exteroceptive range-based sensors is then presented to determine the sensor location and orientation. This allows systematic errors in individual sensors to be minimized, and, when multiple sensors are used, it minimizes the systematic contradiction between them to enable reliable multisensor data fusion. The mathematical derivations at the core of this model are not particularly novel or complicated, but the rigorous analysis and its application to field robotics seem to be largely absent from the literature to date. The techniques in this paper are simple to implement, and they offer a significant improvement to the accuracy, precision, and integrity of mapped information. Consequently, they should be employed whenever maps are formed from range-based exteroceptive sensor data.
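The core transformation described above can be written down directly. In the sketch below, a range observation in the sensor frame is mapped through the sensor-to-body extrinsic calibration and then the body-to-navigation vehicle pose; the paper's error model captures how errors in these rotations, translations and their timing propagate into the mapped point. Names and the numeric example are illustrative.

```python
# Sensor-frame point mapped to the navigation frame via two rigid-body
# transforms; frame names and the example values are illustrative.
import numpy as np

def sensor_to_nav(p_sensor, R_bs, t_bs, R_nb, t_nb):
    """p_nav = R_nb @ (R_bs @ p_sensor + t_bs) + t_nb

    R_bs, t_bs : extrinsic calibration (sensor frame -> body frame)
    R_nb, t_nb : vehicle pose          (body frame   -> navigation frame)
    Errors in any of these terms, or in their timestamps, propagate
    directly into the mapped point.
    """
    return R_nb @ (R_bs @ p_sensor + t_bs) + t_nb

# Trivial check: identity pose, sensor mounted 1 m ahead of the body origin.
I = np.eye(3)
print(sensor_to_nav(np.array([10.0, 0.0, 0.0]),
                    I, np.array([1.0, 0.0, 0.0]), I, np.zeros(3)))
```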
Abstract:
This paper uses innovative content analysis techniques to map how the death of Oscar Pistorius' girlfriend, Reeva Steenkamp, was framed in Twitter conversations. Around 1.5 million posts from a two-week timeframe are analyzed with a combination of syntactic and semantic methods. The analysis is grounded in the frame analysis perspective and differs from sentiment analysis: instead of looking for explicit evaluations, such as “he is guilty” or “he is innocent”, we show how opinions can be identified through complex articulations of more implicit symbolic devices, such as repeatedly mentioned examples and metaphors. Different frames are adopted by users as more information about the case is revealed: from a more episodic one, dominant at the very beginning, to more systemic approaches highlighting the association of the event with urban violence, gun control issues, and violence against women. A detailed timeline of the discussions is provided.
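While the paper's syntactic and semantic pipeline is far richer, a dictionary-based sketch of frame coding conveys the basic idea: tweets are tagged by cue words associated with each frame and counted. Frame names and cue words below are placeholders, not the paper's codebook.

```python
# Toy dictionary-based frame coding; cue words are placeholders.
from collections import Counter

FRAME_CUES = {
    "episodic":        {"shot", "bathroom", "door", "girlfriend"},
    "urban_violence":  {"crime", "intruder", "burglar", "safety"},
    "gun_control":     {"gun", "firearm", "licence", "control"},
    "gender_violence": {"femicide", "abuse", "women", "domestic"},
}

def code_frames(tweets):
    counts = Counter()
    for text in tweets:
        words = set(text.lower().split())
        for frame, cues in FRAME_CUES.items():
            if words & cues:           # any cue word present tags the frame
                counts[frame] += 1
    return counts

print(code_frames(["Gun control laws must change",
                   "He shot her through the bathroom door"]))
```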