969 results for attori, concorrenza, COOP, Akka, benchmark
Abstract:
Mortality models used for forecasting are predominantly based on the statistical properties of time series and do not generally incorporate an understanding of the forces driving secular trends. This paper addresses three research questions: Can the factors found in stochastic mortality-forecasting models be associated with real-world trends in health-related variables? Does inclusion of health-related factors in models improve forecasts? Do resulting models give better forecasts than existing stochastic mortality models? We consider whether the space spanned by the latent factor structure in mortality data can be adequately described by developments in gross domestic product, health expenditure and lifestyle-related risk factors, using statistical techniques developed in macroeconomics and finance. These covariates are then shown to improve forecasts when incorporated into a Bayesian hierarchical model. Results are comparable to, or better than, those of benchmark stochastic mortality models.
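A minimal sketch of the general idea, assuming synthetic data and a Lee-Carter-style single latent factor (the paper's actual Bayesian hierarchical model is not reproduced here): extract a latent period factor from log death rates and regress it on macro covariates to gauge how much of its variation they span.

    # Hypothetical sketch, not the paper's model: SVD-based latent mortality factor
    # regressed on stand-in covariates (GDP, health expenditure, lifestyle).
    import numpy as np

    rng = np.random.default_rng(0)
    n_years, n_ages = 40, 20
    log_mx = rng.normal(-4.0, 0.5, size=(n_years, n_ages))   # stand-in log death rates
    covariates = rng.normal(size=(n_years, 3))                # stand-in macro covariates

    # Lee-Carter style decomposition: centre by age profile, take first singular vector
    centred = log_mx - log_mx.mean(axis=0)
    U, S, Vt = np.linalg.svd(centred, full_matrices=False)
    kappa = U[:, 0] * S[0]                                    # latent period factor

    # Ordinary least squares of the latent factor on the covariates
    X = np.column_stack([np.ones(n_years), covariates])
    beta, *_ = np.linalg.lstsq(X, kappa, rcond=None)
    fitted = X @ beta
    r2 = 1 - np.sum((kappa - fitted) ** 2) / np.sum((kappa - kappa.mean()) ** 2)
    print("share of latent-factor variance explained by covariates:", round(r2, 3))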
Abstract:
Shape memory NiTi alloys have been used extensively for medical device applications such as orthopedic, dental, vascular and cardiovascular devices on account of their unique shape memory effect (SME) and super-elasticity (SE). Laser welding is found to be the most suitable method for fabricating NiTi-based medical components. However, the performance of laser-welded NiTi alloys under corrosive environments is not fully understood, and a specific focus on understanding their corrosion fatigue behaviour is not evident in the literature. This study presents a comparison of the corrosion fatigue behaviour of laser-welded and bare NiTi alloys using a bending rotation fatigue (BRF) test integrated with a specifically designed corrosion cell. The testing environment was Hanks' solution (simulated body fluid) at 37.5 °C. Electrochemical impedance spectroscopy (EIS) measurements were carried out to monitor the change in corrosion resistance at different periods during the BRF test. Experiments indicate that the laser-welded NiTi alloy is more susceptible to corrosion fatigue attack than the bare NiTi alloy. This finding can serve as a benchmark for product designers and engineers to determine the factor of safety of NiTi medical devices fabricated using laser welding.
Abstract:
Self-injurious and aggressive behaviours have often been identified as the cause of students' lack of academic progress, parental distress, health risks and teachers' low satisfaction levels. Functional analysis has been identified in the research literature as the benchmark of effective treatments for disruptive and/or inappropriate behaviours. The present study was completed with a girl diagnosed with ASD. An experimental functional analysis was conducted, identifying the function of self-injurious behaviours and tantrums as escape from tasks. A treatment package was consequently put in place, integrating several components aimed at reducing overall levels of inappropriate behaviours. Results showed a clear and meaningful improvement in the student's overall health and academic progress, as well as in parental involvement, teachers' satisfaction and school inclusion. These outcomes are discussed in the light of evidence-based experimental procedures grounded in applied behaviour analysis and, more specifically, in the functional-analytic literature, which, if put in place consistently, can bring valuable positive changes to the quality of life of individuals with ASD.
Abstract:
The photophysics of the green fluorescent protein is governed by the electronic structure of the chromophore at the heart of its β-barrel protein structure. We present the first two-color, resonance-enhanced, multiphoton ionization spectrum of the isolated neutral chromophore in vacuo with supporting electronic structure calculations. We find the absorption maximum to be 3.65 ± 0.05 eV (340 ± 5 nm), which is blue-shifted by 0.5 eV (55 nm) from the absorption maximum of the protein in its neutral form. Our results show that interactions between the chromophore and the protein have a significant influence on the electronic structure of the neutral chromophore during photoabsorption and provide a benchmark for the rational design of novel chromophores as fluorescent markers or photomanipulators.
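As a quick consistency check of the quoted numbers, using the standard photon-energy conversion λ(nm) ≈ 1239.84 / E(eV):

    # Arithmetic check only: the 3.65 eV absorption maximum and the 0.5 eV blue-shift
    # quoted above correspond to ~340 nm and a ~55 nm wavelength shift.
    hc_ev_nm = 1239.84
    gas_phase_nm = hc_ev_nm / 3.65            # ≈ 339.7 nm, consistent with 340 ± 5 nm
    protein_nm = hc_ev_nm / (3.65 - 0.5)      # ≈ 393.6 nm for the neutral protein form
    print(round(gas_phase_nm, 1), round(protein_nm - gas_phase_nm, 1))   # 339.7, 53.9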
Abstract:
The development of smart grid technologies and appropriate charging strategies are key to accommodating large numbers of Electric Vehicles (EV) charging on the grid. In this paper, a general framework is presented for formulating the EV charging optimization problem, and three different charging strategies are investigated and compared from the perspective of charging fairness while taking into account power system constraints. Two strategies are based on distributed algorithms, namely Additive Increase and Multiplicative Decrease (AIMD) and Distributed Price-Feedback (DPF), while the third is an ideal centralized solution used to benchmark performance. The algorithms are evaluated using a simulation of a typical residential low voltage distribution network with 50% EV penetration.
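An illustrative sketch of the AIMD idea only (the capacity limit, gains and per-charger cap below are invented values, not the paper's parameters): each EV ramps its charging rate additively until a shared congestion signal is raised, then backs off multiplicatively.

    # Hedged sketch of additive-increase / multiplicative-decrease charging.
    import random

    n_evs, capacity = 10, 33.0        # kW, hypothetical feeder limit
    alpha, beta = 0.5, 0.5            # additive increase (kW) and multiplicative decrease
    rates = [0.0] * n_evs

    for step in range(200):
        congested = sum(rates) > capacity          # shared congestion signal
        for i in range(n_evs):
            if congested:
                rates[i] *= beta * random.uniform(0.9, 1.1)  # decentralised back-off
            else:
                rates[i] = min(rates[i] + alpha, 7.4)        # per-EV charger limit

    print("total demand after adaptation:", round(sum(rates), 2), "kW")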
Abstract:
Increasingly, semiconductor manufacturers are exploring opportunities for virtual metrology (VM) enabled process monitoring and control as a means of reducing non-value-added metrology and achieving ever more demanding wafer fabrication tolerances. However, developing robust, reliable and interpretable VM models can be very challenging due to the highly correlated input space often associated with the underpinning data sets. A particularly pertinent example is etch rate prediction of plasma etch processes from multichannel optical emission spectroscopy data. This paper proposes a novel input-clustering based forward stepwise regression methodology for VM model building in such highly correlated input spaces. Max Separation Clustering (MSC) is employed as a pre-processing step to identify a reduced set of well-conditioned, representative variables that can then be used as inputs to state-of-the-art model building techniques such as Forward Selection Regression (FSR), Ridge Regression, LASSO and Forward Selection Ridge Regression (FSRR). The methodology is validated on a benchmark semiconductor plasma etch dataset and the results obtained are compared with those achieved when the state-of-the-art approaches are applied directly to the data without the MSC pre-processing step. Significant performance improvements are observed when MSC is combined with FSR (13%) and FSRR (8.5%), but not with Ridge Regression (-1%) or LASSO (-32%). The optimal VM results are obtained using the MSC-FSR and MSC-FSRR generated models.
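A rough sketch of the two-stage idea, with scikit-learn's FeatureAgglomeration standing in for the paper's Max Separation Clustering step and synthetic data in place of the optical emission spectra:

    # Sketch only: cluster highly correlated input channels, keep one representative
    # per cluster, then run a simple greedy forward selection over the representatives.
    import numpy as np
    from sklearn.cluster import FeatureAgglomeration
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 50))
    X[:, 1::2] = X[:, ::2] + 0.05 * rng.normal(size=(200, 25))   # build in strong correlation
    y = X[:, 0] - 2 * X[:, 10] + rng.normal(scale=0.1, size=200)

    # Step 1: reduce the correlated input space to a few representative columns
    agglo = FeatureAgglomeration(n_clusters=10).fit(X)
    reps = [np.where(agglo.labels_ == k)[0][0] for k in range(10)]

    # Step 2: greedy forward selection over the representatives
    selected = []
    while len(selected) < 3:
        best = max((c for c in reps if c not in selected),
                   key=lambda c: cross_val_score(
                       LinearRegression(), X[:, selected + [c]], y, cv=5).mean())
        selected.append(best)
    print("selected representative channels:", selected)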
Abstract:
In this paper a multiple classifier machine learning methodology for Predictive Maintenance (PdM) is presented. PdM is a prominent strategy for dealing with maintenance issues given the increasing need to minimize downtime and associated costs. One of the challenges with PdM is generating so-called 'health factors', or quantitative indicators of the status of a system associated with a given maintenance issue, and determining their relationship to operating costs and failure risk. The proposed PdM methodology allows dynamic decision rules to be adopted for maintenance management and can be used with high-dimensional and censored data problems. This is achieved by training multiple classification modules with different prediction horizons to provide different performance trade-offs in terms of the frequency of unexpected breaks and unexploited lifetime, and then employing this information in an operating-cost-based maintenance decision system to minimise expected costs. The effectiveness of the methodology is demonstrated using a simulated example and a benchmark semiconductor manufacturing maintenance problem.
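A hedged sketch of the multi-horizon idea (the data, the costs and the random-forest classifier are all stand-ins, not the paper's modules): train one failure/no-failure classifier per prediction horizon and pick the horizon whose false-alarm versus missed-failure trade-off minimises an expected operating cost.

    # Illustrative sketch under invented assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(2)
    X = rng.normal(size=(2000, 8))                    # stand-in sensor features
    time_to_failure = rng.exponential(50, size=2000)  # stand-in remaining useful life

    c_unexpected_break, c_wasted_life_per_step = 100.0, 1.0
    best = None
    for horizon in (10, 20, 40):
        y = (time_to_failure < horizon).astype(int)   # "will fail within the horizon"
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)
        pred = clf.predict(X_te)
        misses = np.sum((pred == 0) & (y_te == 1))    # unexpected breaks
        alarms = np.sum(pred == 1)                    # maintenance triggered early
        cost = misses * c_unexpected_break + alarms * horizon * c_wasted_life_per_step
        if best is None or cost < best[1]:
            best = (horizon, cost)
    print("lowest expected cost at horizon:", best[0])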
Abstract:
Virtual metrology (VM) aims to predict metrology values using sensor data from production equipment and physical metrology values of preceding samples. VM is a promising technology for the semiconductor manufacturing industry as it can reduce the frequency of in-line metrology operations and provide supportive information for other operations such as fault detection, predictive maintenance and run-to-run control. The prediction models for VM can be drawn from a large variety of linear and nonlinear regression methods, and the selection of a proper regression method for a specific VM problem is not straightforward, especially when the candidate predictor set is high-dimensional, correlated and noisy. Using process data from a benchmark semiconductor manufacturing process, this paper evaluates the performance of four typical regression methods for VM: multiple linear regression (MLR), least absolute shrinkage and selection operator (LASSO), neural networks (NN) and Gaussian process regression (GPR). It is observed that GPR performs the best among the four methods and that, remarkably, the performance of linear regression approaches that of GPR as the subset of selected input variables is increased. The observed competitiveness of high-dimensional linear regression models, which does not hold true in general, is explained in the context of extreme learning machines and functional link neural networks.
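A minimal sketch of this kind of comparison on synthetic data (the neural network is omitted for brevity, and the hyperparameters are arbitrary choices, not those used in the paper):

    # Sketch: fit MLR, LASSO and GPR on the same inputs and compare held-out error.
    import numpy as np
    from sklearn.linear_model import LinearRegression, Lasso
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(3)
    X = rng.normal(size=(300, 20))                       # stand-in equipment sensor data
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.normal(size=300)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    models = {
        "MLR": LinearRegression(),
        "LASSO": Lasso(alpha=0.01),
        "GPR": GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True),
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        print(name, round(mean_squared_error(y_te, model.predict(X_te)), 4))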
Abstract:
Objective: Molecular pathology relies on identifying anomalies using PCR or analysis of DNA/RNA. This is important in solid tumours, where molecular stratification of patients defines targeted treatment. These molecular biomarkers rely on examination of tumour, annotation for possible macro-dissection/tumour cell enrichment and the estimation of % tumour. Manually marking up tumour is error-prone. Method: We have developed a method for automated tumour mark-up and % cell calculations using image analysis, called TissueMark®, based on texture analysis for lung, colorectal and breast cancer (cases = 245, 100 and 100 respectively). Pathologists marked slides for tumour and reviewed the automated analysis. A subset of slides was manually counted for tumour cells to provide a benchmark for automated image analysis. Results: There was a strong concordance between pathological and automated mark-up (100% acceptance rate for macro-dissection). We also showed a strong concordance between manually and automatically drawn boundaries (median exclusion/inclusion error of 91.70%/89%). EGFR mutation analysis was identical for manual and automated annotation-based macro-dissection. The annotation accuracy rates in breast and colorectal cancer were 83% and 80% respectively. Finally, region-based estimations of tumour percentage using image analysis showed significant correlation with actual cell counts. Conclusion: Image analysis can be used for macro-dissection to (i) annotate tissue for tumour and (ii) estimate the % tumour cells, and represents an approach to standardising/improving molecular diagnostics.
Abstract:
The original goals of the JET ITER-like wall included the study of the impact of an all-W divertor on plasma operation (Coenen et al 2013 Nucl. Fusion 53 073043) and fuel retention (Brezinsek et al 2013 Nucl. Fusion 53 083023). ITER has recently decided to install a full-tungsten (W) divertor from the start of operations. One of the key inputs required in support of this decision was the study of the possibility of W melting and melt splashing during transients. Damage of this type can modify surface topology, which could increase disruption frequency or compromise subsequent plasma operation. Although every effort will be made to avoid leading edges, ITER plasma stored energies are sufficient that transients can drive shallow melting on the top surfaces of components. JET is able to produce ELMs large enough to allow access to transient melting in a regime of relevance to ITER.
Transient W melt experiments were performed in JET using a dedicated divertor module and a sequence of I_p = 3.0 MA / B_T = 2.9 T H-mode pulses with an input power of P_IN = 23 MW, a stored energy of ~6 MJ and regular type I ELMs at ΔW_ELM = 0.3 MJ and f_ELM ~ 30 Hz. By moving the outer strike point onto a dedicated leading edge in the W divertor, the base temperature was raised within ~1 s to a level allowing transient, ELM-driven melting during the subsequent 0.5 s. Such ELMs (ΔW ~ 300 kJ per ELM) are comparable to mitigated ELMs expected in ITER (Pitts et al 2011 J. Nucl. Mater. 415 (Suppl.) S957-64).
Although significant material losses in terms of ejections into the plasma were not observed, there is indirect evidence that some small droplets (~80 μm) were released. Almost 1 mm (~6 mm³) of W was moved by ~150 ELMs within 7 subsequent discharges. The impact on the main plasma parameters was minor and no disruptions occurred. The W melt gradually moved along the leading edge towards the high-field side, driven by j × B forces. The evaporation rate determined from spectroscopy is 100 times less than expected from steady-state melting and is thus consistent only with transient melting during the individual ELMs. Analysis of IR data and spectroscopy, together with modelling using the MEMOS code (Bazylev et al 2009 J. Nucl. Mater. 390-391 810-13), points to transient melting as the main process. 3D MEMOS simulations of the consequences of multiple ELMs for damage to tungsten castellated armour have been performed.
These experiments provide the first experimental evidence for the absence of significant melt splashing at transient events resembling mitigated ELMs on ITER and establish a key experimental benchmark for the MEMOS code.
Abstract:
Fully Homomorphic Encryption (FHE) is a recently developed cryptographic technique which allows computations on encrypted data. There are many interesting applications for this encryption method, especially within cloud computing. However, the computational complexity is such that it is not yet practical for real-time applications. This work proposes optimised hardware architectures of the encryption step of an integer-based FHE scheme with the aim of improving its practicality. A low-area design and a high-speed parallel design are proposed and implemented on a Xilinx Virtex-7 FPGA, targeting the available DSP slices, which offer high-speed multiplication and accumulation. Both use the Comba multiplication scheduling method to manage the large multiplications required with unevenly sized multiplicands and to minimise the number of read and write operations to RAM. Results show that speed-up factors of 3.6 and 10.4 can be achieved for the encryption step with medium-sized security parameters for the low-area and parallel designs respectively, compared to the benchmark software implementation on an Intel Core2 Duo E8400 platform running at 3 GHz.
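For readers unfamiliar with the scheduling method mentioned above, a software illustration of Comba (column-wise) multi-precision multiplication; the 32-bit limb width and the operand values are arbitrary choices for the sketch, not the hardware design's parameters.

    # Comba multiplication processes one output column at a time, accumulating all
    # partial products for that column before emitting a single limb and a carry.
    BASE = 1 << 32                                  # 32-bit limbs

    def comba_multiply(a, b):
        """Multiply two little-endian limb arrays column by column."""
        result = [0] * (len(a) + len(b))
        acc = 0                                     # running column accumulator with carries
        for col in range(len(a) + len(b) - 1):
            for i in range(max(0, col - len(b) + 1), min(col + 1, len(a))):
                acc += a[i] * b[col - i]            # every partial product for this column
            result[col] = acc % BASE                # write one output limb per column
            acc //= BASE                            # carry the rest into the next column
        result[-1] = acc
        return result

    # Cross-check against Python's built-in big integers
    x, y = 0x123456789ABCDEF0FEDCBA98, 0x0F0E0D0C0B0A09080706
    to_limbs = lambda n: [(n >> (32 * i)) & 0xFFFFFFFF for i in range((n.bit_length() + 31) // 32)]
    limbs = comba_multiply(to_limbs(x), to_limbs(y))
    assert sum(l << (32 * i) for i, l in enumerate(limbs)) == x * y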
Abstract:
For some time, the satisfiability formulae that have been the most difficult to solve for their size have been crafted to be unsatisfiable by the use of cardinality constraints. Recent solvers have introduced explicit checking of such constraints, rendering previously difficult formulae trivial to solve. A family of unsatisfiable formulae is described that is derived from the sgen4 family but cannot be solved using cardinality constraints detection and reasoning alone. These formulae were found to be the most difficult during the SAT2014 competition by a significant margin and include the shortest unsolved benchmark in the competition, sgen6-1200-5-1.cnf.
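As a minimal illustration of the kind of hidden structure involved (this is not the sgen construction itself), an "at most one" cardinality constraint encoded as pairwise CNF clauses, the sort of pattern newer solvers detect and reason about explicitly:

    # Encode "at most one of the given variables is true" as pairwise clauses.
    from itertools import combinations

    def at_most_one(variables):
        """Return CNF clauses (lists of literals) forbidding any two variables being true."""
        return [[-a, -b] for a, b in combinations(variables, 2)]

    print(at_most_one([1, 2, 3, 4]))   # [[-1, -2], [-1, -3], [-1, -4], [-2, -3], [-2, -4], [-3, -4]]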
Abstract:
One of the most widely used techniques in computer vision for foreground detection is to model each background pixel as a Mixture of Gaussians (MoG). While this is effective for a static camera with a fixed or slowly varying background, it fails to handle any fast, dynamic movement in the background. In this paper, we propose a generalised framework, called region-based MoG (RMoG), that takes into consideration neighbouring pixels while generating the model of the observed scene. The model equations are derived from Expectation Maximisation theory for batch mode, and stochastic approximation is used for online mode updates. We evaluate our region-based approach against ten sequences containing dynamic backgrounds and show that the region-based approach provides a performance improvement over the traditional single-pixel MoG. For feature and region sizes that are equal, the effect of increasing the learning rate is to reduce both true and false positives. Comparison with four state-of-the-art approaches shows that RMoG outperforms the others in reducing false positives whilst still maintaining reasonable foreground definition. Lastly, using the ChangeDetection (CDNet 2014) benchmark, we evaluated RMoG against numerous surveillance scenes and found it to be amongst the leading performers for dynamic background scenes, whilst providing comparable performance for other commonly occurring surveillance scenes.
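For context, a sketch of the standard per-pixel MoG baseline that RMoG generalises, using OpenCV's MOG2 implementation on a hypothetical input video (the region-based extension itself is not reproduced here, and the parameter values are illustrative only):

    # Per-pixel Mixture-of-Gaussians background subtraction baseline.
    import cv2

    cap = cv2.VideoCapture("surveillance.mp4")        # hypothetical input sequence
    mog = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                             detectShadows=False)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        foreground = mog.apply(frame, learningRate=0.005)   # higher rate adapts faster
        # ... downstream processing of the binary foreground mask would go here
    cap.release()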
Abstract:
Comet C/2012 S1 (ISON) is unique in that it is a dynamically new comet derived from the Oort cloud reservoir of comets with a sun-grazing orbit. Infrared (IR) and visible wavelength observing campaigns were planned on NASA's Stratospheric Observatory For Infrared Astronomy (SOFIA) and on the National Solar Observatory Dunn (DST) and McMath-Pierce Solar Telescopes, respectively. We highlight our early results. SOFIA (+FORCAST [1]) mid- to far-IR images and spectroscopy (~5-35 μm) of the dust in the coma of ISON are to be obtained by the ISON-SOFIA Team during a flight window 2013 Oct 21-23 UT (r_h≈1.18 AU). Dust characteristics, identified through the 10 μm silicate emission feature and its strength [2], as well as spectral features from cometary crystalline silicates (Forsterite) at 11.05-11.2 μm, and near 16, 19, 23.5, 27.5, and 33 μm are compared with other Oort cloud comets that span the range of small and/or highly porous grains (e.g., C/1995 O1 (Hale-Bopp) [3,4,5] and C/2001 Q4 (NEAT) [6]) to large and/or compact grains (e.g., C/2007 N4 (Lulin) [7] and C/2006 P1 (McNaught) [8]). Measurement of the crystalline peaks in contrast to the broad 10 and 20 μm amorphous silicate features yields the cometary silicate crystalline mass fraction [9], which is a benchmark for radial transport in our protoplanetary disk [10]. The central wavelength positions, relative intensities, and feature asymmetries for the crystalline peaks may constrain the shapes of the crystals [11]. Only SOFIA can look for cometary organics in the 5-8 μm region. Spatially resolved measurements of atoms and simple molecules from when comet ISON is near the Sun (r_h<0.4 AU, near Nov-20--Dec-03 UT) were proposed by the ISON-DST Team. Comet ISON is the first comet since comet Ikeya-Seki (1965f) [12,13] suitable for studying the alkali metals Na and K and the atoms specifically attributed to dust grains, including Mg, Si, Fe, as well as Ca. DST's Horizontal Grating Spectrometer (HGS) measures 4 settings: Na I, K, C2 to sample cometary organics (along with Mg I), and [O I] as a proxy for activity from water [14] (along with Si I and Fe I). State-of-the-art instruments that will also be employed include IBIS [15], which is a Fabry-Perot spectral imaging system that concurrently measures lines of Na, K, Ca II, or Fe, and ROSA (CSUN/QUB) [16], which is a rapid imager that simultaneously monitors Ca II or CN. From McMath-Pierce, the Solar-Stellar Spectrograph also will target ISON (320-900 nm, R~21,000, r_h
Abstract:
Energy consumption has recently become an important area of research. With the advent of new manycore processors, situations have arisen where not all the processors need to be active to reach an optimal balance between performance and energy usage. In this paper, a study of the power and energy usage of a series of benchmarks, the PARSEC and SPLASH-2X Benchmark Suites, on the Intel Xeon Phi for different thread configurations is presented. To carry out this study, a tool was designed to monitor and record the power usage in real time during execution and afterwards to compare the r
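A hedged sketch of the measurement idea only (the benchmark command and the power-sensor read below are placeholders; the abstract does not detail the actual tool or the Xeon Phi power interface): sample instantaneous power while the benchmark runs, then integrate the samples over time to obtain energy.

    # Sample power during a benchmark run and integrate to energy (joules).
    import subprocess, time

    def read_power_watts():
        # Placeholder: a real tool would query the platform's power sensor here;
        # a constant is returned purely to keep the sketch self-contained.
        return 150.0

    proc = subprocess.Popen(["./benchmark", "--threads", "60"])   # hypothetical command
    samples, times = [], []
    while proc.poll() is None:
        samples.append(read_power_watts())
        times.append(time.time())
        time.sleep(0.1)                                           # 10 Hz sampling

    # Trapezoidal integration of power over time gives energy
    energy = sum((samples[i] + samples[i + 1]) / 2 * (times[i + 1] - times[i])
                 for i in range(len(samples) - 1))
    print(f"estimated energy: {energy:.1f} J")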