172 results for fixed point formulae
Abstract:
MapReduce is a computation model for processing large data sets in parallel on large clusters of machines, in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation in cloud computing. From the computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new heuristic algorithm for the mappers/reducers placement problem in cloud computing and evaluate it by comparing it with several other heuristics on solution quality and computation time, using a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics and can obtain a better solution in a reasonable time. Furthermore, we verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional mapper/reducer placement that puts a fixed number of mappers/reducers on each machine. The comparison results show that the computation using our mapper/reducer placement is much cheaper than the computation using the conventional placement while still satisfying the computation deadline.
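The abstract does not detail the proposed heuristic; as a rough illustration of the bin-packing view of the placement problem, a minimal first-fit-decreasing sketch (a standard bin-packing heuristic, not the authors' algorithm; the task sizes and machine capacity below are hypothetical) could look as follows.

# Illustrative first-fit-decreasing placement of tasks onto machines.
# This is a generic bin-packing heuristic, not the paper's algorithm;
# task sizes and machine capacity are made-up example values.

def first_fit_decreasing(task_sizes, machine_capacity):
    """Assign each task to the first machine with enough spare capacity."""
    machines = []  # each entry: [remaining_capacity, [task indices]]
    order = sorted(range(len(task_sizes)), key=lambda i: -task_sizes[i])
    for i in order:
        size = task_sizes[i]
        for m in machines:
            if m[0] >= size:
                m[0] -= size
                m[1].append(i)
                break
        else:
            machines.append([machine_capacity - size, [i]])
    return [m[1] for m in machines]

# Example: place 8 mapper/reducer tasks with given resource demands
placement = first_fit_decreasing([4, 3, 3, 2, 2, 2, 1, 1], machine_capacity=6)
print(placement)  # [[0, 3], [1, 2], [4, 5, 6, 7]]

In practice the paper's heuristic also accounts for computation cost and deadlines, which this sketch ignores.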
Abstract:
Rigid lenses, which were originally made from glass (between 1888 and 1940) and later from polymethyl methacrylate or silicone acrylate materials, are uncomfortable to wear and are now seldom fitted to new patients. Contact lenses became a popular mode of ophthalmic refractive error correction following the discovery of the first hydrogel material – hydroxyethyl methacrylate – by Czech chemist Otto Wichterle in 1960. To satisfy the requirements for ocular biocompatibility, contact lenses must be transparent and optically stable (for clear vision), have a low elastic modulus (for good comfort), have a hydrophilic surface (for good wettability), and be permeable to certain metabolites, especially oxygen, to allow for normal corneal metabolism and respiration during lens wear. A major breakthrough in respect of the last of these requirements was the development of silicone hydrogel soft lenses in 1999 and techniques for making the surface hydrophilic. The vast majority of contact lenses distributed worldwide are mass-produced using cast molding, although spin casting is also used. These advanced mass-production techniques have facilitated the frequent disposal of contact lenses, leading to improvements in ocular health and fewer complications. More than one-third of all soft contact lenses sold today are designed to be discarded daily (i.e., ‘daily disposable’ lenses).
Abstract:
We present a technique for delegating a short lattice basis that has the advantage of keeping the lattice dimension unchanged upon delegation. Building on this result, we construct two new hierarchical identity-based encryption (HIBE) schemes, with and without random oracles. The resulting systems are very different from earlier lattice-based HIBEs and in some cases result in shorter ciphertexts and private keys. We prove security from classic lattice hardness assumptions.
Abstract:
Loop detectors are the oldest and most widely used traffic data source. On urban arterials, they are mainly installed for signal control. Recently, state-of-the-art Bluetooth MAC Scanners (BMS) have captured significant interest from stakeholders for area-wide traffic monitoring. Loop detectors provide flow, a fundamental traffic parameter, whereas BMS provide individual vehicle travel times between BMS stations. Hence, these two data sources complement each other and, if integrated, should increase the accuracy and reliability of traffic state estimation. This paper proposes a model that integrates loop and BMS data for seamless travel time and density estimation on urban signalised networks. The proposed model is validated using both real and simulated data, and the results indicate that its accuracy is over 90%.
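As a rough illustration of how the two sources complement each other, the sketch below derives a density estimate from loop flow and BMS travel time via the steady-state relation density = flow / speed; this is only an assumed simplification, not the paper's estimation model, and the link length and values are invented.

# Illustrative fusion of loop flow and BMS travel time into a density estimate
# via the relation density = flow / speed. A simplified steady-state sketch,
# not the paper's model; all values are hypothetical.

def density_from_loop_and_bms(flow_veh_per_h, link_length_km, travel_time_s):
    """Estimate density (veh/km) on a link from loop flow and BMS travel time."""
    speed_km_per_h = link_length_km / (travel_time_s / 3600.0)
    return flow_veh_per_h / speed_km_per_h

# Example: 900 veh/h on a 0.5 km link traversed in 60 s -> 30 km/h -> 30 veh/km
print(density_from_loop_and_bms(900, 0.5, 60))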
Abstract:
Existing multi-model approaches for image set classification extract local models by clustering each image set individually only once, with fixed clusters used for matching with other image sets. However, this may result in the two closest clusters representing different characteristics of an object, due to different undesirable environmental conditions (such as variations in illumination and pose). To address this problem, we propose to constrain the clustering of each query image set by forcing the clusters to have resemblance to the clusters in the gallery image sets. We first define a Frobenius norm distance between subspaces over Grassmann manifolds based on reconstruction error. We then extract local linear subspaces from a gallery image set via sparse representation. For each local linear subspace, we adaptively construct the corresponding closest subspace from the samples of a probe image set by joint sparse representation. We show that by minimising the sparse representation reconstruction error, we approach the nearest point on a Grassmann manifold. Experiments on the Honda, ETH-80 and Cambridge-Gesture datasets show that the proposed method consistently outperforms several other recent techniques, such as Affine Hull based Image Set Distance (AHISD), Sparse Approximated Nearest Points (SANP) and Manifold Discriminant Analysis (MDA).
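For concreteness, the sketch below computes one standard Frobenius-norm (projection) distance between two subspaces on a Grassmann manifold; it is shown only to make the notion of a subspace distance tangible and is not claimed to be the paper's reconstruction-error formulation.

# Illustrative projection (Frobenius-norm) distance between two linear
# subspaces on a Grassmann manifold. A standard chordal distance, shown
# only as an example; not the paper's exact formulation.
import numpy as np

def grassmann_projection_distance(X, Y):
    """X, Y: (d x k) matrices whose columns span the two subspaces."""
    Qx, _ = np.linalg.qr(X)          # orthonormal basis of span(X)
    Qy, _ = np.linalg.qr(Y)          # orthonormal basis of span(Y)
    Px = Qx @ Qx.T                   # projection matrix onto span(X)
    Py = Qy @ Qy.T
    return np.linalg.norm(Px - Py, 'fro') / np.sqrt(2)

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5))
B = A + 0.1 * rng.standard_normal((50, 5))   # nearby subspace
print(grassmann_projection_distance(A, B))    # small value for similar subspaces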
Abstract:
The effects of crack depth (a/W) and specimen width W on the fracture toughness and ductile–brittle transition have been investigated using three-point bend specimens. Finite element analysis is employed to obtain the stress-strain fields ahead of the crack tip. The results show that both normalized crack depth (a/W) and specimen width (W) affect the fracture toughness and ductile–brittle fracture transition. The measured crack tip opening displacement decreases and ductile–brittle transition occurs with increasing crack depth (a/W) from 0.1 to 0.2 and 0.3. At a fixed a/W (0.2 or 0.3), all specimens fail by cleavage prior to ductile tearing when specimen width W increases from 25 to 40 and 50 mm. The lower bound fracture toughness is not sensitive to crack depth and specimen width. Finite element analysis shows that the opening stress in the remaining ligament is elevated with increasing crack depth or specimen width due to the increase of in-plane constraint. The average local cleavage stress is dependent on both crack depth and specimen width, but its lower bound value is not sensitive to constraint level. No fixed distance can be found from the cleavage initiation site to the crack tip, and this distance increases gradually with decreasing in-plane constraint.
Abstract:
Aim: To establish the suitability of multiplex tandem polymerase chain reaction (MT-PCR) for rapid identification of oestrogen receptor (ER) and Her-2 status using a single, formalin-fixed, paraffin-embedded (FFPE) breast tumour section. Methods: Tissue sections from 29 breast tumours were analysed by immunohistochemistry (IHC) and fluorescence in situ hybridisation (FISH). RNA extracted from 10 μm FFPE breast tumour sections from 24 of the 29 tumours (14 ER positive and 5 Her-2 positive) was analysed by MT-PCR. After establishing a correlation between IHC and/or FISH and MT-PCR results, the ER/Her-2 status of a further 32 randomly selected, archival breast tumour specimens was established by MT-PCR in a blinded fashion and compared to IHC/FISH results. Results: MT-PCR levels of ER and Her-2 showed good concordance with IHC and FISH results. Furthermore, among the ER positive tumours, MT-PCR provided a quantitative score with a high dynamic range. Threshold values obtained from this data set and applied to the 32 archival tumour specimens showed that tumours strongly positive for ER and/or Her-2 expression were easily identified by MT-PCR. Conclusion: MT-PCR can provide rapid, sensitive and cost-effective analysis of FFPE material and may prove useful as a triage tool to identify patients suited to endocrine or trastuzumab (Herceptin) treatment.
Abstract:
This paper evaluates the performance of prediction intervals generated from alternative time series models, in the context of tourism forecasting. The forecasting methods considered include the autoregressive (AR) model, the AR model using the bias-corrected bootstrap, seasonal ARIMA models, innovations state space models for exponential smoothing, and Harvey's structural time series models. We use thirteen monthly time series for the number of tourist arrivals to Hong Kong and Australia. The mean coverage rates and widths of the alternative prediction intervals are evaluated in an empirical setting. It is found that all models produce satisfactory prediction intervals, except for the autoregressive model. In particular, those based on the bias-corrected bootstrap perform best in general, providing tight intervals with accurate coverage rates, especially when the forecast horizon is long.
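As an illustration of the general bootstrap idea, the following sketch builds a prediction interval for a plain AR(1) model by resampling residuals; the paper's bias-corrected bootstrap and its richer seasonal models are not reproduced here, and all names and values are illustrative.

# Illustrative bootstrap prediction interval for an AR(1) model.
# A simplified sketch of the general idea; the paper uses a bias-corrected
# bootstrap and richer models, which this does not reproduce.
import numpy as np

def ar1_bootstrap_interval(y, horizon=1, n_boot=2000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)
    # Fit AR(1) by least squares: y_t = c + phi * y_{t-1} + e_t
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    c, phi = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    resid = y[1:] - (c + phi * y[:-1])
    forecasts = np.empty(n_boot)
    for b in range(n_boot):
        level = y[-1]
        for _ in range(horizon):                 # iterate the fitted recursion
            level = c + phi * level + rng.choice(resid)
        forecasts[b] = level
    return np.quantile(forecasts, [alpha / 2, 1 - alpha / 2])

series = np.cumsum(np.random.default_rng(1).standard_normal(200)) + 100
print(ar1_bootstrap_interval(series, horizon=3))   # [lower, upper] bounds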
Abstract:
The aim of this paper is to determine the strain-rate-dependent mechanical behavior of living and fixed osteocytes and chondrocytes in vitro. Firstly, Atomic Force Microscopy (AFM) was used to obtain the force-indentation curves of these single cells at four different strain-rates. These results were then employed in inverse finite element analysis (FEA), using a Modified Standard neo-Hookean Solid (MSnHS) idealization of these cells, to determine their mechanical properties. In addition, an FEA model with a newly developed spring element was employed to accurately simulate the AFM evaluation in this study. We report that both the cytoskeleton (CSK) and the intracellular fluid govern the strain-rate-dependent mechanical properties of living cells, whereas the intracellular fluid plays a predominant role in the behavior of fixed cells. In addition, the comparisons show that osteocytes are stiffer than chondrocytes at all strain-rates tested, indicating that these cells could serve as biomarkers of their tissue origin. Finally, we report that MSnHS is able to capture the strain-rate-dependent mechanical behavior of both living and fixed osteocytes and chondrocytes. We therefore conclude that MSnHS is a good model for exploring the mechanical deformation responses of single osteocytes and chondrocytes. This study could open a new avenue for the analysis of the mechanical behavior of osteocytes and chondrocytes, as well as other similar cell types.
Abstract:
BRAF represents one of the most frequently mutated protein kinase genes in human tumours. The mutation is commonly tested in pathology practice. BRAF mutation is seen in melanoma, papillary thyroid carcinoma (including papillary thyroid carcinoma arising from ovarian teratoma), ovarian serous tumours, colorectal carcinoma, gliomas, hepatobiliary carcinomas and hairy cell leukaemia. In these cancers, various genetic aberrations of the BRAF proto-oncogene, such as different point mutations and chromosomal rearrangements, have been reported. The most common mutation, BRAF V600E, can be detected by DNA sequencing and immunohistochemistry on formalin-fixed, paraffin-embedded tumour tissue. Detection of the BRAF V600E mutation has potential clinical use as a diagnostic and prognostic marker. In addition, a great deal of research effort has been spent on strategies to inhibit its activity. Indeed, recent clinical trials involving BRAF-selective inhibitors exhibited promising response rates in metastatic melanoma patients, and clinical trials are underway for other cancers. However, cutaneous side effects of treatment have been reported, and the therapeutic response is often short-lived owing to the emergence of several resistance mechanisms. In this review, we give an update on the clinical and pathological relevance of BRAF mutation in cancer. It is hoped that the review will help direct future research and assist in more effective use of the knowledge of BRAF mutation in clinical practice.
Abstract:
This study reports on the use of the Manchester Driver Behaviour Questionnaire (DBQ) to examine the self-reported driving behaviours of a large sample of Australian fleet drivers (N = 3414). Surveys were completed by employees before they commenced a one-day safety workshop intervention. Factor analysis identified a three-factor solution similar to previous research, comprising: (a) errors, (b) highway-code violations and (c) aggressive driving violations. Two items traditionally associated with highway-code violations were found to be associated with aggressive driving behaviours in the current sample. Multivariate analyses revealed that exposure to the road, errors and self-reported offences predicted crashes at work in the last 12 months, while gender, highway violations and crashes predicted offences incurred while at work. Importantly, those who received more fines at work were at an increased risk of crashing the work vehicle. However, overall, the DBQ demonstrated limited efficacy at predicting these two outcomes. This paper outlines the major findings of the study in regard to identifying and predicting aberrant driving behaviours, and highlights implications for the future use of the DBQ within fleet settings.
Abstract:
We investigate whether framing effects of voluntary contributions are significant in a provision point mechanism. Our results show that framing significantly affects individuals of the same type: cooperative individuals appear to be more cooperative in the public bads game than in the public goods game, whereas individualistic subjects appear to be less cooperative in the public bads game than in the public goods game. At the aggregate level, pooling all individuals, the data suggest that framing effects are negligible, which contrasts with the established result.
Abstract:
This study analyses and compares the cost efficiency of Japanese steam power generation companies using the fixed and random Bayesian frontier models. We show that it is essential to account for heterogeneity in modelling the performance of energy companies. Results from the model estimation also indicate that restricting CO2 emissions can lead to a decrease in total cost. The study finally discusses the efficiency variations between the energy companies under analysis, and elaborates on the managerial and policy implications of the results.
Abstract:
Spatial data are now prevalent in a wide range of fields, including environmental and health science. This has led to the development of a range of approaches for analysing patterns in these data. In this paper, we compare several Bayesian hierarchical models for analysing point-based data based on the discretization of the study region, resulting in grid-based spatial data. The approaches considered include two parametric models and a semiparametric model. We highlight the methodology and computation for each approach. Two simulation studies are undertaken to compare the performance of these models for various structures of simulated point-based data which resemble environmental data. A case study of a real dataset is also conducted to demonstrate a practical application of the modelling approaches. Goodness-of-fit statistics are computed to compare estimates of the intensity functions. The deviance information criterion is also considered as an alternative model evaluation criterion. The results suggest that the adaptive Gaussian Markov random field model performs well for highly sparse point-based data where there are large variations or clustering across the space, whereas the discretized log Gaussian Cox process produces a good fit for dense and clustered point-based data. One should generally consider the nature and structure of the point-based data in order to choose the appropriate method for modelling discretized spatial point-based data.
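As a small illustration of the shared preprocessing step, the sketch below bins point locations into grid-cell counts, the discretized form on which such grid-based models operate; the grid resolution, coordinates and data are invented for the example.

# Illustrative discretization of point-based spatial data onto a regular grid,
# the preprocessing step underlying the grid-based models compared in the paper.
# Grid resolution and point locations are made-up example values.
import numpy as np

def grid_counts(x, y, x_edges, y_edges):
    """Count points falling in each cell of a regular grid."""
    counts, _, _ = np.histogram2d(x, y, bins=[x_edges, y_edges])
    return counts   # shape: (len(x_edges)-1, len(y_edges)-1)

rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=500)          # hypothetical point locations
y = rng.uniform(0, 10, size=500)
edges = np.linspace(0, 10, 11)            # 10 x 10 grid of unit cells
counts = grid_counts(x, y, edges, edges)
print(counts.sum(), counts.shape)          # 500.0 (10, 10)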