Abstract:
We introduce a function Z(k) which measures the number of distinct ways in which a number can be expressed as the sum of Fibonacci numbers. Using a binary table and other devices, we explore the values that Z(k) can take and reveal a surprising relationship between the values of Z(k) and the Fibonacci numbers from which they were derived. The article shows how standard spreadsheet functionality makes it possible to reveal quite striking patterns in data.
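The abstract's counting function can be sketched outside a spreadsheet as well. Below is a minimal Python sketch, assuming (as is usual in this literature) that Z(k) counts representations of k as a sum of *distinct* Fibonacci numbers drawn from 1, 2, 3, 5, 8, ...; the function names are illustrative, not from the article.

```python
def fibs_up_to(n):
    # Fibonacci numbers 1, 2, 3, 5, ... not exceeding n (each value once)
    fibs = [1, 2]
    while fibs[-1] + fibs[-2] <= n:
        fibs.append(fibs[-1] + fibs[-2])
    return fibs

def z(n):
    """Count the distinct ways to write n as a sum of distinct
    Fibonacci numbers (an assumed reading of the article's Z(k))."""
    fibs = fibs_up_to(n)

    def count(rem, i):
        if rem == 0:
            return 1          # exact representation found
        if i < 0 or rem < 0:
            return 0          # ran out of Fibonacci numbers or overshot
        # either use fibs[i] or skip it
        return count(rem - fibs[i], i - 1) + count(rem, i - 1)

    return count(n, len(fibs) - 1)
```

For example, z(8) is 3, from 8, 5+3 and 5+2+1, which matches the "binary table" idea: each representation is a 0/1 choice over the available Fibonacci numbers.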
Abstract:
The television quiz program Letters and Numbers, broadcast on the SBS network, has recently become quite popular in Australia. This paper considers an implementation in Excel 2010 and its potential as a vehicle to showcase a range of mathematical and computing concepts and principles.
Abstract:
The television quiz program Letters and Numbers, broadcast on the SBS network, has recently become quite popular in Australia. This paper explores the potential of this game to engage student interest and to illustrate a range of fundamental concepts of computer science and mathematics. The Numbers Game in particular has a rich mathematical structure whose analysis and solution involve concepts of counting and problem size, discrete (tree) structures, language theory, recurrences, computational complexity, and even advanced memory management. This paper presents an analysis of these games and their teaching applications, and reports some initial results of their use in student assignments.
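The tree structure and counting arguments mentioned above come from the Numbers Game's search space: repeatedly pick two numbers, combine them with an arithmetic operation, and recurse on the smaller multiset. A minimal brute-force sketch (not the paper's implementation; names and rules-as-coded are assumptions, following the usual convention that intermediate results must be positive integers):

```python
def solve(numbers, target):
    """Exhaustively search the combination tree of the Numbers Game,
    returning (closest_value, expression_string)."""
    def search(vals):
        best = min(vals, key=lambda v: abs(v[0] - target))
        if best[0] == target:
            return best
        for i in range(len(vals)):
            for j in range(len(vals)):
                if i == j:
                    continue
                (a, ea), (b, eb) = vals[i], vals[j]
                rest = [vals[k] for k in range(len(vals)) if k not in (i, j)]
                # legal combinations: keep intermediates positive integers
                candidates = [(a + b, f"({ea}+{eb})"), (a * b, f"({ea}*{eb})")]
                if a > b:
                    candidates.append((a - b, f"({ea}-{eb})"))
                if a % b == 0:
                    candidates.append((a // b, f"({ea}/{eb})"))
                for v, e in candidates:
                    found = search(rest + [(v, e)])
                    if found[0] == target:
                        return found
                    if abs(found[0] - target) < abs(best[0] - target):
                        best = found
        return best
    return search([(n, str(n)) for n in numbers])
```

Even this naive version makes the complexity discussion concrete: the branching factor grows with the square of the remaining numbers, which is why memoisation and careful memory management become relevant for the full six-number game.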
Abstract:
Aim To establish the suitability of multiplex tandem polymerase chain reaction (MT-PCR) for rapid identification of oestrogen receptor (ER) and Her-2 status using a single, formalin-fixed, paraffin-embedded (FFPE) breast tumour section. Methods Tissue sections from 29 breast tumours were analysed by immunohistochemistry (IHC) and fluorescence in situ hybridisation (FISH). RNA extracted from 10 μm FFPE breast tumour sections from 24 of the 29 tumours (14 ER positive and 5 Her-2 positive) was analysed by MT-PCR. After establishing a correlation between IHC and/or FISH and MT-PCR results, the ER/Her-2 status of a further 32 randomly selected archival breast tumour specimens was established by MT-PCR in a blinded fashion and compared to IHC/FISH results. Results MT-PCR levels of ER and Her-2 showed good concordance with IHC and FISH results. Furthermore, among the ER positive tumours, MT-PCR provided a quantitative score with a high dynamic range. Threshold values obtained from this data set, applied to the 32 archival tumour specimens, showed that tumours strongly positive for ER and/or Her-2 expression were easily identified by MT-PCR. Conclusion MT-PCR can provide rapid, sensitive and cost-effective analysis of FFPE material and may prove useful as a triage tool to identify patients suited to endocrine or trastuzumab (Herceptin) treatment.
Abstract:
This is an update of an earlier paper, rewritten for Excel 2007. A series of Excel 2007 models is described. The more advanced versions allow solution of f(x) = 0 by examining changes of sign in function values. The function is graphed, and a change of sign is easily detected by a change of colour. The relevant features of Excel 2007 used are Names, Scatter Chart and Conditional Formatting. Several sample Excel 2007 models are available for download, and the paper is intended to be used as a lesson plan for students with some familiarity with derivatives. For comparison and reference purposes, the paper also presents a brief outline of several common equation-solving strategies as an Appendix.
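The sign-change idea underlying those spreadsheet models translates directly into code: tabulate f on a grid (the spreadsheet column), flag intervals where the sign flips (the conditional-formatting colour change), then refine each bracketed root. A minimal sketch, assuming uniform grid spacing and simple bisection as the refinement step (the paper's Appendix covers several such strategies):

```python
def sign_change_roots(f, a, b, n=1000, tol=1e-10):
    """Locate roots of f on [a, b] by grid scan for sign changes,
    then refine each bracketing interval by bisection."""
    roots = []
    xs = [a + (b - a) * i / n for i in range(n + 1)]
    for x0, x1 in zip(xs, xs[1:]):
        if f(x0) == 0:
            roots.append(x0)          # root lands exactly on a grid point
        elif f(x0) * f(x1) < 0:
            lo, hi = x0, x1           # sign change: a root is bracketed
            while hi - lo > tol:
                mid = (lo + hi) / 2
                if f(lo) * f(mid) <= 0:
                    hi = mid
                else:
                    lo = mid
            roots.append((lo + hi) / 2)
    return roots
```

As in the spreadsheet models, the grid scan only finds roots where f actually crosses zero; tangential roots (no sign change) are missed, which is a useful discussion point for students.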
Abstract:
Ever since Cox et al. published their paper, “A Secure, Robust Watermark for Multimedia”, in 1996 [6], there has been tremendous progress in multimedia watermarking. The same pattern re-emerged when Agrawal and Kiernan published their work “Watermarking Relational Databases” in 2001 [1]. However, little attention has been given to primitive data collections, with only a handful of research works known to the authors [11, 10]. This is primarily due to the absence of an attribute that differentiates marked items from unmarked items during the insertion and detection processes. This paper presents a distribution-independent watermarking model that is secure against secondary watermarking in addition to conventional attacks such as data addition, deletion and distortion. Low false positives and high capacity provide additional strength to the scheme. These claims are backed by experimental results provided in the paper.
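The "no distinguishing attribute" problem described above can be illustrated with a generic keyed-selection sketch. This is emphatically not the authors' scheme: the key, marking fraction and LSB embedding below are all illustrative assumptions. The idea it demonstrates is that, lacking a primary-key attribute, selection of which items to mark must be derived from the data itself via a keyed hash, in a way embedding cannot disturb:

```python
import hashlib
import hmac

SECRET_KEY = b"owner-secret"   # hypothetical key, not from the paper
FRACTION = 8                   # mark roughly one item in eight (assumed)

def _selected(value, key):
    # Keyed pseudorandom choice over the item's stable bits: the LSB is
    # masked out, so embedding the mark cannot change the selection.
    digest = hmac.new(key, str(value >> 1).encode(), hashlib.sha256).digest()
    return digest[0] % FRACTION == 0

def embed(items, key=SECRET_KEY):
    # Force the least significant bit of every selected integer to 1.
    return [v | 1 if _selected(v, key) else v for v in items]

def detect(items, key=SECRET_KEY):
    # Fraction of selected items carrying the mark: close to 1.0 on
    # watermarked data, close to 0.5 on typical unmarked data.
    sel = [v for v in items if _selected(v, key)]
    return sum(v & 1 for v in sel) / len(sel) if sel else 0.0
```

Distortion is bounded (each marked item changes by at most 1), and without the key an attacker cannot tell which items were candidates for marking; the paper's actual model additionally addresses secondary watermarking and distribution independence.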
Abstract:
The aim of this paper is to determine the strain-rate-dependent mechanical behavior of living and fixed osteocytes and chondrocytes in vitro. Firstly, Atomic Force Microscopy (AFM) was used to obtain the force-indentation curves of these single cells at four different strain rates. These results were then employed in inverse finite element analysis (FEA) using a Modified Standard neo-Hookean Solid (MSnHS) idealization of these cells to determine their mechanical properties. In addition, a FEA model with a newly developed spring element was employed to accurately simulate the AFM evaluation in this study. We report that both the cytoskeleton (CSK) and the intracellular fluid govern the strain-rate-dependent mechanical properties of living cells, whereas the intracellular fluid plays a predominant role in the behavior of fixed cells. The comparisons also show that osteocytes are stiffer than chondrocytes at all strain rates tested, indicating that these cells could serve as biomarkers of their tissue origin. Finally, we report that the MSnHS is able to capture the strain-rate-dependent mechanical behavior of both living and fixed osteocytes and chondrocytes, and we therefore conclude that it is a good model for exploring the mechanical deformation responses of these single cells. This study could open a new avenue for the analysis of the mechanical behavior of osteocytes and chondrocytes, as well as other similar cell types.
Abstract:
This study analyses and compares the cost efficiency of Japanese steam power generation companies using the fixed and random Bayesian frontier models. We show that it is essential to account for heterogeneity in modelling the performance of energy companies. Results from the model estimation also indicate that restricting CO2 emissions can lead to a decrease in total cost. The study finally discusses the efficiency variations between the energy companies under analysis, and elaborates on the managerial and policy implications of the results.
Abstract:
This paper examines how ideas and practices of accounting come together in turning the abstract concept of climate change into a new non-financial performance measure in a large energy company in the UK. It develops the notion of ‘governmental management’ to explain how the firm’s carbon dioxide emissions were transformed into a new organisational object that could be made quantifiable, measurable and ultimately manageable because of the modern power of accounting in tying disciplinary subjectivities and objectivities together while operating simultaneously at the level of the individual and the organisation. Examining these interrelations highlights the constitutive nature of accounting in creating not just new categories for accounting’s attention but, in turn, new organisational knowledge and knowledge experts in the making up of accounting for climate change. Significantly, these new knowledge experts appear no longer to be accountants, which may help explain accounting’s evolution into ever more spheres of influence as we increasingly choose to manage our world ‘by the numbers’.
Abstract:
This book focuses on how evolutionary computing techniques benefit engineering research and development tasks by converting practical problems of growing complexities into simple formulations, thus largely reducing development efforts. This book begins with an overview of the optimization theory and modern evolutionary computing techniques, and goes on to cover specific applications of evolutionary computing to power system optimization and control problems.
Abstract:
Researchers spend an average of 38 working days preparing an NHMRC Project Grant proposal, but with success rates of just 15%, over 500 years of researcher time went into failed applications in 2014. This time would likely have been better spent on actual research. Many applications are non-competitive and could be culled early, saving time for both researchers and funding agencies. Our analysis of the major health and medical scheme in Australia estimated that 61% of applications were never likely to be funded...
Abstract:
Purpose of this paper This research aims to examine the effects of inadequate documentation on the cost management and tendering processes in Managing Contractor Contracts, using Fixed Lump Sum contracts as a benchmark. Design/methodology/approach A questionnaire survey was conducted with industry practitioners to solicit their views on documentation quality issues in the construction industry, followed by a series of semi-structured interviews to validate the survey findings. Findings and value The results show that documentation quality remains a significant issue, contributing to the industry's inefficiency and poor reputation. The level of satisfaction with individual attributes of documentation quality varies. Attributes that do appear to be affected by the choice of procurement method include coordination, buildability, efficiency, completeness and delivery time. Similarly, the use and effectiveness of risk mitigation techniques appear to vary between the methods, based on factors such as documentation completeness, early involvement and fast tracking. Originality/value of paper This research fills a gap in the existing body of knowledge, where there have been few studies on whether the choice of project procurement system influences documentation quality and the level of its impact. Conclusions Ultimately, the research concludes that the entire project team, including the client and designers, should carefully consider the individual project's requirements and compare them to the trade-offs associated with documentation quality and the procurement method. While documentation quality is certainly an area for improvement, identifying a project's performance requirements allows a procurement method to be chosen that maximises the likelihood that those requirements will be met. This allows the aspects of documentation quality considered most important to the individual project to be managed appropriately.
Abstract:
This paper discusses three different ways of applying a single-objective binary genetic algorithm to wind farm design. The different applications are obtained by altering the binary encoding method in the GA code. The first encoding method is the traditional one, with fixed wind turbine positions. The second varies the initial positions obtained from the first method, using binary digits to represent the coordinate of a wind turbine on the X or Y axis. The third mixes the first encoding method with another in which four additional binary digits represent one of the unavailable plots. The goal of this paper is to demonstrate how the single-objective binary algorithm can be applied and how the wind turbines are distributed under various conditions with the best fitness. The discussion focuses mainly on the scenario of wind direction varying from 0° to 45°. Results show that choosing appropriate positions for the wind turbines is more significant than choosing the number of wind turbines, since the former has a greater influence on the overall farm fitness than the latter. The farm achieves its best fitness values, farm efficiency and total power for wind directions between 20° and 30°.
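The two main encodings contrasted above can be sketched concretely. A minimal illustration, assuming a square candidate grid (the grid size and bit widths are illustrative, not taken from the paper): the first method uses one bit per fixed candidate cell, while the second represents a coordinate itself as binary digits so that positions can evolve.

```python
import random

GRID = 10  # hypothetical 10x10 layout grid of candidate positions

def random_individual():
    # First encoding: one bit per grid cell; bit = 1 means a turbine
    # occupies that fixed candidate position.
    return [random.randint(0, 1) for _ in range(GRID * GRID)]

def decode(bits):
    # Map the flat bitstring back to (row, col) turbine positions.
    return [(i // GRID, i % GRID) for i, b in enumerate(bits) if b]

def coord_bits(x, width=4):
    # Second encoding: binary digits represent a turbine's coordinate
    # on the X or Y axis, most significant bit first.
    return [(x >> k) & 1 for k in reversed(range(width))]

def coord_from_bits(bits):
    # Inverse mapping used when evaluating an individual's fitness.
    v = 0
    for b in bits:
        v = (v << 1) | b
    return v
```

The third method in the paper extends such a chromosome with a few extra bits per individual (four, for one of the unavailable plots); the GA machinery itself (selection, crossover, mutation) is unchanged across all three encodings, which is what makes the comparison clean.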
Abstract:
This paper presents a visual SLAM method for temporary satellite dropout navigation, here applied to fixed-wing aircraft. It is designed for flight altitudes beyond typical stereo ranges but within the range of distance measurement sensors. The proposed visual SLAM method consists of a common localization step with monocular camera resectioning, and a mapping step which incorporates radar altimeter data for absolute scale estimation. As a result, there is no scale drift in the map or the estimated flight path. The method does not require simplifications such as known landmarks and is thus suitable for unknown and nearly arbitrary terrain. The method is tested with sensor datasets from a manned Cessna 172 aircraft. With a 5% absolute scale error from the radar measurements causing approximately 2-6% accumulated error over the flown distance, stable positioning is achieved over several minutes of flight time. The main limitations are flight altitudes above the radar range of 750 m, where the monocular method will suffer from scale drift, and, depending on the flight speed, flights below 50 m, where image processing becomes difficult with a downward-looking camera due to the high optical flow rates and low image overlap.
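The role of the radar altimeter above is to fix the one degree of freedom a monocular reconstruction cannot observe: its global scale. A minimal sketch of that idea, not the paper's pipeline, assuming a downward-looking camera so that the radar altitude corresponds to the (scale-free) monocular depth of the terrain directly below:

```python
def absolute_scale(radar_altitudes, monocular_depths):
    """Estimate the global map scale as the ratio of metric radar
    altimeter readings to matched monocular (up-to-scale) depths,
    averaged over the samples."""
    ratios = [r / d for r, d in zip(radar_altitudes, monocular_depths) if d > 0]
    return sum(ratios) / len(ratios)

def rescale_path(path, scale):
    # Apply the recovered scale to an up-to-scale monocular trajectory,
    # yielding a metric flight path with no scale drift.
    return [(scale * x, scale * y, scale * z) for x, y, z in path]
```

Because the scale is anchored to an absolute sensor rather than propagated frame to frame, an error in it (the 5% quoted above) produces a bounded, roughly proportional position error over the flown distance instead of unbounded drift.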