357 results for Proximal Point Algorithm
Abstract:
Background: Heatwaves can cause excess deaths ranging from tens to thousands within a few weeks in a local area. The excess mortality due to a special event (e.g., a heatwave or an epidemic outbreak) is estimated by subtracting the mortality expected under ‘normal’ conditions from the historical daily mortality records. Calculating excess mortality is a scientific challenge because of the stochastic temporal pattern of daily mortality data, which is characterised by (a) long-term changes in mean level (i.e., non-stationarity) and (b) a non-linear temperature-mortality association. The Hilbert-Huang Transform (HHT) algorithm is a novel method originally developed for analysing non-linear and non-stationary time series in the field of signal processing; however, it has not been applied in public health research. This paper aims to demonstrate the applicability and strength of the HHT algorithm in analysing health data. Methods: Special R functions were developed to implement the HHT algorithm, decomposing the daily mortality time series into trend and non-trend components in terms of the underlying physical mechanism. The excess mortality is calculated directly from the resulting non-trend component series. Results: Daily mortality time series from Brisbane (Queensland, Australia) and Chicago (United States) were used to calculate the excess mortality associated with heatwaves. The HHT algorithm estimated 62 excess deaths related to the February 2004 Brisbane heatwave. To calculate the excess mortality associated with the July 1995 Chicago heatwave, the HHT algorithm needed to handle the mode-mixing issue; it estimated 510 excess deaths for that event.
To exemplify potential applications, the HHT decomposition results were used as input for a subsequent regression analysis of the Brisbane data to investigate the association between excess mortality and different risk factors. Conclusions: The HHT algorithm is a novel and powerful analytical tool for time series analysis. It has real potential for a wide range of applications in public health research because of its ability to decompose a non-linear and non-stationary time series into trend and non-trend components consistently and efficiently.
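Once the series has been decomposed, the excess-mortality calculation described above reduces to summing the non-trend component over the event window. A minimal sketch with synthetic data (the flat baseline stands in for the HHT-derived trend; the function name and numbers are illustrative, not from the paper):

```python
import numpy as np

def excess_mortality(daily_deaths, trend, event_window):
    """Excess deaths over an event = sum of the non-trend component there."""
    non_trend = np.asarray(daily_deaths) - np.asarray(trend)
    return float(non_trend[event_window].sum())

# Synthetic example: a flat baseline of 20 deaths/day plus a 5-day spike
# of 12 extra deaths/day; the flat trend stands in for the HHT trend.
deaths = np.full(60, 20.0)
deaths[30:35] += 12.0
trend = np.full(60, 20.0)
print(excess_mortality(deaths, trend, slice(30, 35)))  # 60.0
```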
Abstract:
This paper evaluates the performance of prediction intervals generated from alternative time series models in the context of tourism forecasting. The forecasting methods considered include the autoregressive (AR) model, the AR model using the bias-corrected bootstrap, seasonal ARIMA models, innovations state space models for exponential smoothing, and Harvey’s structural time series models. We use thirteen monthly time series for the number of tourist arrivals to Hong Kong and Australia. The mean coverage rates and widths of the alternative prediction intervals are evaluated in an empirical setting. It is found that all models produce satisfactory prediction intervals except the autoregressive model. In particular, intervals based on the bias-corrected bootstrap perform best in general, providing tight intervals with accurate coverage rates, especially when the forecast horizon is long.
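The two evaluation criteria named above, mean coverage rate and mean interval width, can be sketched as follows; the data are made up for illustration:

```python
import numpy as np

def interval_metrics(actuals, lower, upper):
    """Mean coverage rate and mean width of a set of prediction intervals."""
    actuals, lower, upper = map(np.asarray, (actuals, lower, upper))
    coverage = np.mean((actuals >= lower) & (actuals <= upper))
    width = np.mean(upper - lower)
    return float(coverage), float(width)

# Five out-of-sample actuals against five forecast intervals (toy numbers).
actuals = [102, 98, 110, 95, 120]
lower   = [ 90, 95, 100, 90, 100]
upper   = [110, 105, 115, 105, 115]
cov, w = interval_metrics(actuals, lower, upper)
print(cov, w)  # 0.8 15.0 -- the last actual falls outside its interval
```

A good method keeps `cov` close to the nominal level (e.g. 0.95) while keeping `w` small; the paper's comparison is exactly this trade-off across models.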
Abstract:
To ensure the safe operation of Web-based systems in Web environments, we propose an SSPA (Server-based SHA-1 Page-digest Algorithm) to verify the integrity of Web contents before the server issues an HTTP response to a user request. In addition to standard security measures, our Java implementation of the SSPA, called the Dynamic Security Surveillance Agent (DSSA), provides further security in terms of content integrity for Web-based systems. Its function is to prevent the display of Web contents that have been altered through the malicious acts of attackers and intruders on client machines. This protects the reputation of organisations from cyber-attacks and ensures the safe operation of Web systems by dynamically monitoring the integrity of a Web site's content on demand. We discuss our findings in terms of the applicability and practicality of the proposed system. We also discuss its time metrics, specifically its computational overhead at the Web server and the overall latency from the clients' point of view, using different Internet access methods. The SSPA, our DSSA implementation, some experimental results and related work are all discussed.
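The core of the SSPA check amounts to comparing a freshly computed SHA-1 digest of the page against a stored reference digest before serving the response. A minimal sketch (function names are hypothetical, not from the paper; the original is implemented in Java):

```python
import hashlib

def page_digest(content: bytes) -> str:
    """SHA-1 digest of the page content, as stored at publication time."""
    return hashlib.sha1(content).hexdigest()

def verify_page(content: bytes, stored_digest: str) -> bool:
    """Serve the page only if its current digest matches the stored one."""
    return page_digest(content) == stored_digest

original = b"<html><body>Welcome</body></html>"
stored = page_digest(original)
print(verify_page(original, stored))                 # True
print(verify_page(b"<html>defaced</html>", stored))  # False: altered content
```

The computational overhead the paper measures comes from recomputing this digest on every request.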
Abstract:
One-time proxy signatures are one-time signatures for which a primary signer can delegate his or her signing capability to a proxy signer. In this work we propose two one-time proxy signature schemes with different security properties. Unlike other existing one-time proxy signatures, which are constructed from public key cryptography, our proposed schemes are based on one-way functions without trapdoors, and so they inherit the communication and computation efficiency of traditional one-time signatures. Although from a verifier's point of view signatures generated by the proxy are indistinguishable from those created by the primary signer, a trusted authority can be equipped with an algorithm that allows the authority to settle disputes between the signers. In our constructions, we use a combination of one-time signatures, oblivious transfer protocols and certain combinatorial objects. We characterise these new combinatorial objects and present constructions for them.
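As background, the traditional one-way-function-based one-time signatures the schemes build on can be sketched as a Lamport-style construction: one secret preimage pair per message bit, with the public key being their hashes. This is not the paper's construction, which adds proxy delegation and oblivious transfer on top:

```python
import hashlib
import secrets

H = lambda data: hashlib.sha256(data).digest()

def keygen(nbits=8):
    """One secret preimage pair per message bit; a key must be used only once."""
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(nbits)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(bits, sk):
    """Reveal the preimage matching each message bit."""
    return [sk[i][b] for i, b in enumerate(bits)]

def verify(bits, sig, pk):
    """Check each revealed preimage hashes to the matching public-key half."""
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(bits))

sk, pk = keygen()
bits = [1, 0, 1, 1, 0, 0, 1, 0]
sig = sign(bits, sk)
print(verify(bits, sig, pk))  # True
```

Security rests only on the one-way function, which is why such schemes are computationally cheap compared with public-key constructions.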
Abstract:
The Common Scrambling Algorithm Stream Cipher (CSA-SC) is a shift register based stream cipher designed to encrypt digital video broadcast. CSA-SC produces a pseudo-random binary sequence that is used to mask the contents of the transmission. In this paper, we analyse the initialisation process of the CSA-SC keystream generator and demonstrate weaknesses which lead to state convergence, slid pairs and shifted keystreams. As a result, the cipher may be vulnerable to distinguishing attacks, time-memory-data trade-off attacks or slide attacks.
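As background on shift-register keystream generation and masking, a toy Fibonacci LFSR can be sketched as follows. This is not the actual CSA-SC design (whose registers, taps and combining logic are different); the state and taps here are arbitrary:

```python
def lfsr_bits(state, taps, n):
    """Fibonacci LFSR: output the last bit, feed back the XOR of tapped bits.
    A toy register, not the actual CSA-SC generator."""
    out = []
    for _ in range(n):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
    return out

def mask(data, keystream_bits):
    """XOR the data with the keystream byte by byte; masking is an involution."""
    out = bytearray()
    for i, d in enumerate(data):
        k = 0
        for b in keystream_bits[8 * i: 8 * i + 8]:
            k = (k << 1) | b
        out.append(d ^ k)
    return bytes(out)

plaintext = b"MPEG-TS payload"
ks = lfsr_bits([1, 0, 0, 1, 0, 1, 1, 0], taps=[0, 2, 4, 7], n=8 * len(plaintext))
ciphertext = mask(plaintext, ks)
print(mask(ciphertext, ks) == plaintext)  # True: applying the mask twice recovers
```

The attacks discussed in the paper exploit the initialisation of such generators: if two keys or IVs lead to the same internal state (state convergence) or to shifted copies of one keystream, the masking sequence is no longer unpredictable.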
Abstract:
Organisations are constantly seeking new ways to improve operational efficiencies. This research study investigates a novel way to identify potential efficiency gains in business operations by observing how they were carried out in the past and then exploring better ways of executing them, taking into account trade-offs between time, cost and resource utilisation. This paper demonstrates how such trade-offs can be incorporated into the assessment of alternative process execution scenarios by making use of a cost environment. A genetic algorithm-based approach is proposed to explore and assess alternative process execution scenarios, where the objective function is represented by a comprehensive cost structure that captures different process dimensions. Experiments conducted with different variants of the genetic algorithm evaluate the approach's feasibility. The findings demonstrate that a genetic algorithm-based approach is able to use cost reduction as a way to identify improved execution scenarios in terms of reduced case durations and increased resource utilisation. The ultimate aim is to utilise cost-related insights gained from such improved scenarios to put forward recommendations for reducing process-related cost within organisations.
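The search idea, a genetic algorithm minimising a cost objective over candidate execution scenarios, can be sketched with a toy two-dimensional scenario encoding. The cost weights, gene encoding and GA parameters below are illustrative assumptions, not the paper's cost structure:

```python
import random

def cost(scenario):
    """Hypothetical cost: weighted sum of case duration and idle-resource time."""
    duration, idle = scenario
    return 3 * duration + 2 * idle

def ga_minimise(pop_size=20, generations=50, seed=1):
    rng = random.Random(seed)
    # Each individual is a (duration, idle) scenario; both genes are >= 1.
    pop = [(rng.uniform(1, 10), rng.uniform(1, 10)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[: pop_size // 2]                 # selection: keep fittest half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)   # crossover: averaging
            child = tuple(max(1.0, g + rng.gauss(0, 0.3)) for g in child)  # mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

best = ga_minimise()
print(cost(best))  # approaches the lower bound 3*1 + 2*1 = 5
```

In the paper's setting the genes would encode resource assignments and orderings mined from past executions, and the objective would be the comprehensive cost structure rather than this toy weighted sum.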
Abstract:
This study reports on the use of the Manchester Driver Behaviour Questionnaire (DBQ) to examine the self-reported driving behaviours of a large sample of Australian fleet drivers (N = 3414). Surveys were completed by employees before they commenced a one-day safety workshop intervention. Factor analysis techniques identified a three-factor solution similar to previous research, comprising: (a) errors, (b) highway-code violations and (c) aggressive driving violations. Two items traditionally associated with highway-code violations were found to be associated with aggressive driving behaviours in the current sample. Multivariate analyses revealed that exposure to the road, errors and self-reported offences predicted crashes at work in the last 12 months, while gender, highway violations and crashes predicted offences incurred while at work. Importantly, those who received more fines at work were at an increased risk of crashing the work vehicle. Overall, however, the DBQ demonstrated limited efficacy at predicting these two outcomes. This paper outlines the major findings of the study with regard to identifying and predicting aberrant driving behaviours and also highlights implications for the future use of the DBQ within fleet settings.
Abstract:
Ubiquitination involves the attachment of ubiquitin to lysine residues on substrate proteins or itself, which can result in protein monoubiquitination or polyubiquitination. Ubiquitin attachment to different lysine residues can generate diverse substrate-ubiquitin structures, targeting proteins to different fates. The mechanisms of lysine selection are not well understood. Ubiquitination by the largest group of E3 ligases, the RING-family E3s, is catalyzed through co-operation between the non-catalytic ubiquitin-ligase (E3) and the ubiquitin-conjugating enzyme (E2), where the RING E3 binds the substrate and the E2 catalyzes ubiquitin transfer. Previous studies suggest that ubiquitination sites are selected by E3-mediated positioning of the lysine toward the E2 active site. Ultimately, at a catalytic level, ubiquitination of lysine residues within the substrate or ubiquitin occurs by nucleophilic attack of the lysine residue on the thioester bond linking the E2 catalytic cysteine to ubiquitin. One of the best studied RING E3/E2 complexes is the Skp1/Cul1/F box protein complex, SCF(Cdc4), and its cognate E2, Cdc34, which target the CDK inhibitor Sic1 for K48-linked polyubiquitination, leading to its proteasomal degradation. Our recent studies of this model system demonstrated that residues surrounding Sic1 lysines or lysine 48 in ubiquitin are critical for ubiquitination. This sequence-dependence is linked to evolutionarily conserved key residues in the catalytic region of Cdc34 and can determine if Sic1 is mono- or poly-ubiquitinated. Our studies indicate that amino acid determinants in the Cdc34 catalytic region and their compatibility with those surrounding acceptor lysine residues play important roles in lysine selection. This may represent a general mechanism in directing the mode of ubiquitination in E2s.
Abstract:
Thin plate spline finite element methods are used to fit a surface to an irregularly scattered dataset [S. Roberts, M. Hegland, and I. Altas. Approximation of a Thin Plate Spline Smoother using Continuous Piecewise Polynomial Functions. SIAM, 1:208--234, 2003]. The computational bottleneck for this algorithm is the solution of large, ill-conditioned systems of linear equations at each step of a generalised cross validation algorithm. Preconditioning techniques are investigated to accelerate the convergence of the solution of these systems using Krylov subspace methods. The preconditioners under consideration are block diagonal, block triangular and constraint preconditioners [M. Benzi, G. H. Golub, and J. Liesen. Numerical solution of saddle point problems. Acta Numer., 14:1--137, 2005]. The effectiveness of each of these preconditioners is examined on a sample dataset taken from a known surface. From our numerical investigation, constraint preconditioners appear to provide improved convergence for this surface fitting problem compared to block preconditioners.
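As an illustration of the preconditioned Krylov iteration underlying this comparison, a sketch of preconditioned conjugate gradients with a simple Jacobi (diagonal) preconditioner is shown below. The paper's block diagonal, block triangular and constraint preconditioners for the saddle-point system are more involved; the diagonal preconditioner here only stands in for them:

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, maxit=200):
    """Preconditioned conjugate gradients for SPD A; M_inv applies the
    preconditioner (an approximation of A's inverse) to a residual."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for it in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            return x, it + 1
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxit

# Ill-conditioned SPD test matrix (diagonal spread of 1e4, plus a rank-one
# coupling term); the Jacobi preconditioner rescales by the diagonal.
n = 50
A = np.diag(np.linspace(1, 1e4, n)) + 0.1 * np.ones((n, n))
b = np.ones(n)
x, iters = pcg(A, b, lambda r: r / np.diag(A))
print(np.allclose(A @ x, b, atol=1e-6), iters)
```

The paper's question is precisely which preconditioner drives `iters` down the most for the ill-conditioned systems arising in generalised cross validation.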
Abstract:
Both environmental economists and policy makers have shown a great deal of interest in the effect of pollution abatement on environmental efficiency. Despite the modern computational resources available, however, little use has been made in environmental economics of Markov chain Monte Carlo (MCMC) methods, which simulate from the distribution of a Markov chain by running the chain until it approaches its equilibrium distribution. The resulting posterior density estimates offer advantages over classical statistical methods, namely simultaneous inference and the incorporation of any prior information on all model parameters. This paper addresses that gap by applying MCMC to data from China, the largest developing country, which has experienced rapid economic growth and serious environmental pollution in recent years. The variables cover economic output and pollution abatement cost from 1992 to 2003. We test the causal direction between pollution abatement cost and environmental efficiency with MCMC simulation, and find that pollution abatement cost causes an increase in environmental efficiency, which suggests that environmental policy makers should take more substantial measures to reduce pollution in the near future.
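The MCMC idea invoked above, run a Markov chain until it approaches equilibrium and then summarise the draws, can be sketched with a random-walk Metropolis sampler on a toy target. The actual efficiency-model posterior is not specified in the abstract, so a standard normal log-density stands in for it:

```python
import math
import random

def metropolis(log_post, x0, n=20000, step=0.5, seed=0):
    """Random-walk Metropolis: propose a local move, accept with
    probability min(1, post(proposal) / post(current))."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n):
        prop = x + rng.gauss(0, step)
        if math.log(rng.random()) < log_post(prop) - log_post(x):
            x = prop
        samples.append(x)
    return samples[n // 2:]   # discard the first half as burn-in

# Toy target: standard normal log-density (stands in for the paper's
# efficiency-model posterior, which the abstract does not specify).
log_post = lambda x: -0.5 * x * x
draws = metropolis(log_post, x0=3.0)
print(sum(draws) / len(draws))  # should settle near 0, the posterior mean
```

Inference then proceeds directly from the draws, which is what allows simultaneous inference on all model parameters at once.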
Abstract:
We investigate whether framing effects on voluntary contributions are significant in a provision point mechanism. Our results show that framing significantly affects individuals of the same type: cooperative individuals appear to be more cooperative in the public bads game than in the public goods game, whereas individualistic subjects appear to be less cooperative in the public bads game than in the public goods game. At the aggregate level, pooling all individuals, the data suggest that framing effects are negligible, in contrast with the established result.
Abstract:
This paper explores the possibility of connecting two Wind Turbine Generators (WTG) to the grid using a single three-level inverter. In the proposed system the rectified output of one WTG is connected across the upper dc-link capacitor of a standard diode-clamped three-level inverter. Similarly, the rectified output of the other WTG is connected across the lower capacitor. This particular combination has several advantages, such as direct connection to the grid, reduced parts count, improved reliability and high power capacity. However, the major problem in the proposed system is the imminent imbalance of dc-link voltages. Under such conditions conventional modulation methods fail to produce the desired voltage and current waveforms. A detailed analysis of this issue and a novel space vector modulation method, as the solution, are proposed in this paper. To track the maximum power point of each WTG a power sharing algorithm is proposed. Simulation results are presented to demonstrate the efficacy of the proposed system.
Abstract:
This study compared proximal femoral morphology in patients living in soft and hard water regions. The proximal femoral morphology of two groups of 70 patients living in hard and soft water regions, with a mean age of 72.3 years (range 50 to 87), was measured using an antero-posterior radiograph of the non-operated hip with magnification adjusted. The medullary canal diameter at the level of the lesser trochanter (LT) was significantly wider in patients living in the hard water region (mean width 1.9 mm wider; p = 0.003). No statistically significant difference was found in the medullary canal width at 10 cm below the level of the LT, the Dorr index, or the Canal Bone Ratio (CBR). In conclusion, proximal femoral morphology does differ between patients living in soft and hard water areas. These results may have an important clinical bearing in patients undergoing total hip replacement surgery. Further research is needed to determine whether implant survivorship is affected in patients living in hard and soft water regions.
Abstract:
Live migration of multiple Virtual Machines (VMs) has become an integral management activity in data centers for power saving, load balancing and system maintenance. While state-of-the-art live migration techniques focus on improving the migration performance of an independent single VM, little attention has been paid to the live migration of multiple interacting VMs. Live migration is strongly influenced by network bandwidth, and arbitrarily migrating a VM that has data inter-dependencies with other VMs may increase bandwidth consumption and adversely affect the performance of subsequent migrations. In this paper, we propose a Random Key Genetic Algorithm (RKGA) that efficiently schedules the migration of a given set of VMs, accounting for both inter-VM dependencies and the data center communication network. The experimental results show that the RKGA can schedule the migration of multiple VMs with significantly shorter total migration time and total downtime compared to a heuristic algorithm.
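The random-key encoding at the heart of an RKGA maps a vector of real-valued keys to a migration order by sorting the VM indices on their keys, which keeps crossover and mutation simple. A toy sketch follows; the VM sizes, dependency pairs and penalty model are illustrative assumptions, not the paper's cost model:

```python
import random

# Hypothetical VM sizes (GB) and dependency pairs (a, b): migrating a
# before b doubles a's transfer cost due to dirtied shared state.
sizes = [8, 4, 16, 2]
deps = {(0, 2), (1, 3)}
BANDWIDTH = 1.0  # GB/s

def decode(keys):
    """Random-key decoding: sort VM indices by their key -> migration order."""
    return sorted(range(len(keys)), key=lambda i: keys[i])

def total_migration_time(order):
    pos = {vm: p for p, vm in enumerate(order)}
    t = 0.0
    for vm in order:
        penalised = any(a == vm and pos[a] < pos[b] for a, b in deps)
        t += (2.0 if penalised else 1.0) * sizes[vm] / BANDWIDTH
    return t

def rkga(n_vms=4, pop=30, gens=60, seed=2):
    rng = random.Random(seed)
    keys = [[rng.random() for _ in range(n_vms)] for _ in range(pop)]
    fitness = lambda k: total_migration_time(decode(k))
    for _ in range(gens):
        keys.sort(key=fitness)
        next_gen = keys[: pop // 3]                        # elitism
        while len(next_gen) < pop:
            a, b = rng.sample(keys[: pop // 2], 2)
            child = [x if rng.random() < 0.5 else y for x, y in zip(a, b)]
            if rng.random() < 0.2:                         # mutation
                child[rng.randrange(n_vms)] = rng.random()
            next_gen.append(child)
        keys = next_gen
    return decode(min(keys, key=fitness))

best = rkga()
print(best, total_migration_time(best))
```

Because any key vector decodes to a valid permutation, the GA never produces an infeasible schedule, which is the main attraction of the random-key representation for ordering problems like this one.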