955 results for Quadratic, sieve, CUDA, OpenMP, SOC, Tegra K1
Abstract:
The Environmental Kuznets Curve (EKC) hypothesises an inverse U-shaped relationship between a measure of environmental pollution and per capita income levels. In this study, we apply non-parametric estimation of local polynomial regression (local quadratic fitting) to allow more flexibility in local estimation. This study uses a larger and globally representative sample of many local and global pollutants and natural resources including Biological Oxygen Demand (BOD) emission, CO2 emission, CO2 damage, energy use, energy depletion, mineral depletion, improved water source, PM10, particulate emission damage, forest area and net forest depletion. Copyright © 2009 Inderscience Enterprises Ltd.
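As a rough illustration of the local quadratic fitting mentioned above, the sketch below estimates an inverse U-shaped relationship by kernel-weighted least squares at a grid of income values; the tricube kernel, bandwidth and simulated data are illustrative assumptions, not the study's estimator.

```python
# Minimal sketch of local quadratic fitting (local polynomial regression),
# assuming a tricube kernel and a fixed bandwidth; illustrative only.
import numpy as np

def local_quadratic(x, y, x0, bandwidth):
    """Fit y ~ b0 + b1*(x-x0) + b2*(x-x0)^2 by weighted least squares
    and return the fitted value at x0 (i.e. b0)."""
    u = (x - x0) / bandwidth
    w = np.where(np.abs(u) < 1, (1 - np.abs(u) ** 3) ** 3, 0.0)  # tricube weights
    X = np.column_stack([np.ones_like(x), x - x0, (x - x0) ** 2])
    W = np.diag(w)
    beta, *_ = np.linalg.lstsq(X.T @ W @ X, X.T @ W @ y, rcond=None)
    return beta[0]

# Toy example: an inverse U-shaped pollution/income relationship plus noise.
rng = np.random.default_rng(0)
income = np.linspace(0, 10, 200)
pollution = -(income - 5) ** 2 + 25 + rng.normal(0, 1, income.size)
grid = np.linspace(1, 9, 9)
fit = [local_quadratic(income, pollution, g, bandwidth=2.0) for g in grid]
print(np.round(fit, 2))
```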
Abstract:
The number of students in special schools has increased at a rapid rate in some Australian states, due in part to increased enrolment under the categories of emotional disturbance (ED) and behaviour disorder (BD). Nonetheless, diagnostic distinctions between ED and BD are unclear. Moreover, despite international findings that students with particular backgrounds are over-represented in special schools, little is known about the backgrounds of students entering such settings in Australia. This study examined the government school enrolment data from New South Wales, the most populous of the Australian states. Linear and quadratic trends were used to describe the numbers and ages of students enrolled in special schools in the ED and BD categories. Changes between 1997 and 2007 were observed. Results showed an over-representation of boys that increased across the decade and a different pattern across age for boys and girls. Consistent with international findings, these results indicate that trends in special school placements are unrelated to disability prevalence in the population. Rather, it is suggested that schools act to preserve time and resources for others by removing their more challenging students: most typically, boys.
Abstract:
We compare three alternative methods for eliciting retrospective confidence in the context of a simple perceptual task: the Simple Confidence Rating (a direct report on a numerical scale), the Quadratic Scoring Rule (a post-wagering procedure), and the Matching Probability (MP; a generalization of the no-loss gambling method). We systematically compare the results obtained with these three rules to the theoretical confidence levels that can be inferred from performance in the perceptual task using Signal Detection Theory (SDT). We find that the MP provides better results in that respect. We conclude that MP is particularly well suited for studies of confidence that use SDT as a theoretical framework.
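For context, here is a minimal sketch of a quadratic scoring rule payoff of the kind referred to above (a Brier-type rule); the payoff scale is an assumption for illustration and is not the exact post-wagering procedure used in the study.

```python
# Minimal sketch of a quadratic (Brier-type) scoring rule used to elicit a
# confidence report; payoff parameters are illustrative assumptions.
def quadratic_score(reported_confidence: float, correct: bool) -> float:
    """Payoff = 1 - (outcome - reported probability)^2, which is maximized
    in expectation by reporting one's true subjective probability."""
    outcome = 1.0 if correct else 0.0
    return 1.0 - (outcome - reported_confidence) ** 2

# A truthful report of 0.8 confidence yields expected payoff
# 0.8*(1-0.04) + 0.2*(1-0.64) = 0.84, higher than any misreport.
print(quadratic_score(0.8, True), quadratic_score(0.8, False))
```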
Abstract:
Energy-efficient embedded computing enables new application scenarios in mobile devices like software-defined radio and video processing. The hierarchical multiprocessor considered in this work may contain dozens or hundreds of resource-efficient VLIW CPUs. Programming this number of CPU cores is a complex task requiring compiler support. The stream programming paradigm provides beneficial properties that help to support automatic partitioning. This work describes a compiler for streaming applications targeting the self-built hierarchical CoreVA-MPSoC multiprocessor platform. The compiler is supported by a programming model that is tailored to fit the stream programming paradigm. We present a novel simulated-annealing (SA) based partitioning algorithm, called Smart SA. The overall speedup of Smart SA is 12.84 for an MPSoC with 16 CPU cores compared to a single CPU implementation. Comparison with a state-of-the-art partitioning algorithm shows an average performance improvement of 34.07%.
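As a point of reference for the simulated-annealing approach described above, here is a generic SA partitioning sketch that assigns stream tasks to CPU cores so as to minimize the load of the most loaded core; the cost model, move set and cooling schedule are illustrative assumptions, not the Smart SA algorithm itself.

```python
# Generic simulated-annealing partitioning sketch: assign streaming tasks to
# CPU cores so the most loaded core (the throughput bottleneck) is minimized.
# Illustrative baseline only, not the paper's Smart SA algorithm.
import math
import random

def sa_partition(task_costs, n_cores, steps=20000, t0=1.0, alpha=0.9995, seed=1):
    rng = random.Random(seed)
    assign = [rng.randrange(n_cores) for _ in task_costs]

    def bottleneck(a):
        loads = [0.0] * n_cores
        for task, core in enumerate(a):
            loads[core] += task_costs[task]
        return max(loads)

    best = list(assign)
    cur_cost = best_cost = bottleneck(assign)
    t = t0
    for _ in range(steps):
        task = rng.randrange(len(task_costs))
        old_core = assign[task]
        assign[task] = rng.randrange(n_cores)          # propose a random move
        new_cost = bottleneck(assign)
        if new_cost <= cur_cost or rng.random() < math.exp((cur_cost - new_cost) / t):
            cur_cost = new_cost                        # accept the move
            if new_cost < best_cost:
                best, best_cost = list(assign), new_cost
        else:
            assign[task] = old_core                    # reject the move
        t *= alpha                                     # cool down
    return best, best_cost

costs = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9, 3]
print(sa_partition(costs, n_cores=4))
```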
Abstract:
Current developments in gene medicine and vaccination studies are utilizing plasmid DNA (pDNA) as the vector. For this reason, there has been an increasing trend towards larger and larger doses of pDNA utilized in human trials: from 100-1000 μg in 2002 to 500-5000 μg in 2005. The increasing demand for pDNA has created the need to revolutionise current production levels under optimum economy. In this work, different standard media (LB, TB and SOC) for culturing recombinant Escherichia coli DH5α harbouring pUC19 were compared to a medium optimised for pDNA production. Lab-scale fermentations using the standard media showed that the highest pDNA volumetric and specific yields were for TB (11.4 μg/ml and 6.3 μg/mg dry cell mass respectively) and the lowest were for LB (2.8 μg/ml and 3.3 μg/mg dry cell mass respectively). A fourth medium, PDMR, designed by modifying a stoichiometrically-formulated medium with an optimised carbon source concentration and carbon-to-nitrogen ratio, displayed pDNA volumetric and specific yields of 23.8 μg/ml and 11.2 μg/mg dry cell mass respectively. However, it is the economic advantages of the optimised medium that make it so attractive. Keeping all variables constant except the medium, and using LB as a base scenario (100 medium cost [MC] units/mg pDNA), the optimised PDMR medium yielded pDNA at a cost of only 27 MC units/mg pDNA. These results show that greater amounts of pDNA can be obtained more economically, with minimal extra effort, simply by using a medium optimised for pDNA production.
Abstract:
The degradation efficiencies and behaviors of caffeic acid (CaA), p-coumaric acid (pCoA) and ferulic acid (FeA) in aqueous sucrose solutions containing a mixture of these hydroxycinnamic acids (HCAs) were studied using the Fenton oxidation process. Central composite design and multi-response surface methodology were used to evaluate and optimize the interactive effects of process parameters. Four quadratic polynomial models were developed: one for the degradation of each individual acid in the mixture and one for the total HCAs degraded. Sucrose was the most influential parameter, significantly affecting the total amount of HCA degraded. Under the conditions studied there was <0.01% loss of sucrose in all reactions. At the optimal values of the process parameters, degradation of a 200 mg/L HCA mixture was 77% in water (pH 4.73, 25.15 °C) and 57% in sucrose solution (13 mass%, pH 5.39, 35.98 °C). Regression analysis showed goodness of fit between the experimental results and the predicted values. The degradation behavior of CaA differed from those of pCoA and FeA, in that further CaA degradation was observed with increasing sucrose concentration and decreasing solution pH. The differences (established using UV/Vis and ATR-FTIR spectroscopy) arose because, unlike the other acids, CaA formed a complex with Fe(III), or with Fe(III) hydrogen-bonded to sucrose, and coprecipitated with lepidocrocite, an iron oxyhydroxide.
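A minimal sketch of fitting a second-order (quadratic) response-surface model of the kind produced by a central composite design; the factors, coded ranges and simulated response are hypothetical, not the study's data.

```python
# Minimal sketch of a second-order response-surface fit by ordinary least
# squares; factor names and data are hypothetical illustrations.
import numpy as np

def quadratic_design_matrix(X):
    """Columns: 1, x_i, x_i^2, and all pairwise interactions x_i*x_j."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    return np.column_stack(cols)

rng = np.random.default_rng(42)
# Hypothetical coded factors, e.g. sucrose level, pH and temperature
X = rng.uniform(-1, 1, size=(30, 3))
y = 60 + 8 * X[:, 0] - 5 * X[:, 1] - 3 * X[:, 0] ** 2 + rng.normal(0, 1, 30)  # % degraded
D = quadratic_design_matrix(X)
coef, *_ = np.linalg.lstsq(D, y, rcond=None)
print(np.round(coef, 2))
```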
Abstract:
We consider online prediction problems where the loss between the prediction and the outcome is measured by the squared Euclidean distance and its generalization, the squared Mahalanobis distance. We derive the minimax solutions for the case where the prediction and action spaces are the simplex (this setup is sometimes called the Brier game) and the ℓ2 ball (this setup is related to Gaussian density estimation). We show that in both cases the value of each sub-game is a quadratic function of a simple statistic of the state, with coefficients that can be efficiently computed using an explicit recurrence relation. The resulting deterministic minimax strategy and randomized maximin strategy are linear functions of the statistic.
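For concreteness, the two losses referred to above are written out as a small sketch; the weighting matrix and vectors are arbitrary illustrations, not the paper's minimax construction.

```python
# Squared Euclidean distance and its generalization, the squared Mahalanobis
# distance (p - y)^T A (p - y) for a positive-definite matrix A.
import numpy as np

def squared_euclidean(p, y):
    d = p - y
    return float(d @ d)

def squared_mahalanobis(p, y, A):
    d = p - y
    return float(d @ A @ d)

p = np.array([0.5, 0.3, 0.2])        # prediction on the simplex (Brier game)
y = np.array([0.0, 1.0, 0.0])        # realized outcome (a vertex)
A = np.diag([1.0, 2.0, 1.0])         # positive-definite weighting matrix
print(squared_euclidean(p, y), squared_mahalanobis(p, y, A))
```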
Abstract:
The expansion of creative and cultural industries has provided a rich source for theoretical claims and commentary. Much of this reproduces and extends the idea that autonomy is the defining feature of both enterprises and workers. Drawing on evidence from research into Australian development studios in the global digital games industry, the article interrogates claims concerning autonomy and related issues of insecurity and intensity, skill and specialisation, work–play boundaries, identity and attachments. In seeking to reconnect changes in creative labour to the wider production environment and political economy, an argument is advanced that autonomy is deeply contextual and contested as a dimension of the processes of capturing value for firms and workers.
Abstract:
Smart Card Automated Fare Collection (AFC) data has been extensively exploited to understand passenger behavior, passenger segments and trip purposes, and to improve transit planning through spatial travel pattern analysis. The literature has evolved from simpler to more sophisticated methods, for example from aggregated to individual travel pattern analysis, and from stop-to-stop to flexible stop aggregation. However, high computational complexity has limited these methods in practical applications. This paper proposes a new algorithm named Weighted Stop Density Based Scanning Algorithm with Noise (WS-DBSCAN), based on the classical Density Based Scanning Algorithm with Noise (DBSCAN) algorithm, to detect and update daily changes in travel patterns. WS-DBSCAN converts the classical DBSCAN, which has quadratic computational complexity, into a problem of sub-quadratic complexity. A numerical experiment using real AFC data from South East Queensland, Australia, shows that the algorithm requires only 0.45% of the computation time of the classical DBSCAN while providing the same clustering results.
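For context, here is a classical DBSCAN baseline on hypothetical stop data, with per-point weights standing in for stop usage; this illustrates only the starting point, not the WS-DBSCAN algorithm proposed in the paper.

```python
# Baseline sketch: classical DBSCAN clustering of boarding stops, with
# scikit-learn's per-point sample weights as a stand-in for weighting stops
# by usage. Coordinates and parameters are illustrative assumptions.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(7)
# Hypothetical stop coordinates (projected metres), two activity centres
stops = np.vstack([
    rng.normal([0, 0], 50, size=(40, 2)),
    rng.normal([500, 500], 50, size=(40, 2)),
])
weights = rng.integers(1, 20, size=len(stops))  # e.g. boardings per stop

labels = DBSCAN(eps=80, min_samples=30).fit(stops, sample_weight=weights).labels_
print(np.unique(labels, return_counts=True))
```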
Abstract:
The objective of this study was to determine the influence of lactose carrier size on drug dispersion of salmeterol xinafoate (SX) from interactive mixtures. SX dispersion was measured using the fine particle fractions determined by a twin-stage impinger attached to a Rotahaler. The particle size of the lactose carrier in the SX interactive mixtures was varied using a range of commercial inhalation-grade lactoses. In addition, differing size fractions of individual lactose samples were obtained by dry sieving. The dispersion of SX appeared to increase as the particle size of the lactose carrier decreased, both for mixtures prepared from commercial lactose samples of different particle size and for those prepared from different sieve fractions of the same lactose. Fine particles of lactose (<5 μm) associated with the lactose carrier were removed from the carrier surface by a wet decantation process to produce lactose samples with low but similar concentrations of fine lactose particles. The fine particle fractions of SX in mixtures prepared with the decanted lactose decreased significantly (analysis of variance, p<0.001) and the degree of dispersion became independent of the volume mean diameter of the carriers (analysis of variance, p<0.05). The dispersion behavior is therefore associated with the presence of fine adhered particles on the carriers, and the inherent size of the carrier itself has little influence on dispersion.
Abstract:
There is an increasing demand for Unmanned Aerial Systems (UAS) to carry suspended loads, as this can provide significant benefits to several applications in agriculture, law enforcement and construction. The load's impact on the underlying system dynamics should not be neglected, as significant feedback forces may be induced on the vehicle during certain flight manoeuvres. The constant variation in operating point induced by the slung load also causes conventional controllers to demand increased control effort. Much research has focused on standard multi-rotor position and attitude control with and without a slung load. However, predictive control schemes, such as Nonlinear Model Predictive Control (NMPC), have not yet been fully explored. To this end, we present a novel controller for safe and precise operation of multi-rotors with a heavy slung load in three dimensions. The paper describes a System Dynamics and Control Simulation Toolbox for use with MATLAB/SIMULINK, which includes a detailed simulation of the multi-rotor and slung load as well as a predictive controller that manages the nonlinear dynamics while accounting for system constraints. It is demonstrated that the controller simultaneously tracks specified waypoints and actively damps large slung load oscillations. A linear-quadratic regulator (LQR) is derived and its control performance is compared. Results show the improved performance of the predictive controller over a larger flight envelope, including aggressive manoeuvres and large slung load displacements. The computational cost remains relatively small and is amenable to practical implementation.
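As a baseline illustration of the LQR mentioned above, the sketch below solves the continuous-time algebraic Riccati equation for a toy double-integrator model; the dynamics and weights are assumptions, not the paper's multi-rotor and slung-load model.

```python
# Minimal LQR sketch: solve the continuous-time algebraic Riccati equation
# for a double integrator and form the state-feedback gain u = -K x.
import numpy as np
from scipy.linalg import solve_continuous_are

# Double integrator: state = [position, velocity], input = acceleration
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])   # state weighting
R = np.array([[0.1]])      # control-effort weighting

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.inv(R) @ B.T @ P   # u = -K x minimizes the quadratic cost
print(np.round(K, 3))
```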
Abstract:
In this paper, we use an experimental design to compare the performance of elicitation rules for subjective beliefs. Contrary to previous work in which elicited beliefs are compared to an objective benchmark, we consider a purely subjective belief framework (confidence in one’s own performance in a cognitive task and a perceptual task). The performance of the different elicitation rules is assessed according to the accuracy of stated beliefs in predicting success. We measure this accuracy using two main factors: calibration and discrimination. For each of them, we propose two statistical indexes and compare the rules’ performances on each measure. The matching probability method provides more accurate beliefs in terms of discrimination, the quadratic scoring rule reduces overconfidence, and the free rule, a simple rule with no incentives, succeeds in eliciting accurate beliefs. Nevertheless, the matching probability appears to be the best mechanism for eliciting beliefs, due to its performance in terms of calibration and discrimination, as well as its ability to elicit consistent beliefs across measures and across tasks, and its empirical and theoretical properties.
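One common way to quantify calibration and discrimination is a Murphy-style decomposition of the Brier score; the sketch below is an illustrative stand-in and not necessarily the two indexes proposed in the paper.

```python
# Illustrative calibration/discrimination measures from binned beliefs
# (Murphy-style decomposition); bins and data are hypothetical.
import numpy as np

def brier_decomposition(confidence, correct, n_bins=10):
    confidence = np.asarray(confidence, float)
    correct = np.asarray(correct, float)
    base_rate = correct.mean()
    bins = np.minimum((confidence * n_bins).astype(int), n_bins - 1)
    calibration = discrimination = 0.0
    for b in range(n_bins):
        mask = bins == b
        if mask.any():
            p_bar = confidence[mask].mean()     # mean stated belief in bin
            o_bar = correct[mask].mean()        # observed success rate in bin
            w = mask.mean()                     # fraction of trials in bin
            calibration += w * (p_bar - o_bar) ** 2         # lower is better
            discrimination += w * (o_bar - base_rate) ** 2  # higher is better
    return calibration, discrimination

rng = np.random.default_rng(3)
conf = rng.uniform(0.5, 1.0, 500)
outcome = rng.random(500) < conf          # beliefs that track success well
print(brier_decomposition(conf, outcome))
```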
Abstract:
Index tracking is an investment approach whose primary objective is to keep the portfolio return as close as possible to that of a target index without purchasing all of the index components. The main purpose is to minimize the tracking error between the returns of the selected portfolio and the benchmark. In this paper, quadratic as well as linear models are presented for minimizing the tracking error. Uncertainty in the input data is handled using a tractable robust framework that controls the level of conservatism while maintaining linearity. The linearity of the proposed robust optimization models allows a simple implementation in an ordinary optimization software package to find the optimal robust solution. The proposed model employs the Morgan Stanley Capital International Index as the target index, and results are reported for six national indices: Japan, the USA, the UK, Germany, Switzerland and France. The performance of the proposed models is evaluated using several financial criteria, e.g. the information ratio, market ratio, Sharpe ratio and Treynor ratio. The preliminary results demonstrate that the proposed model lowers the tracking error while raising the values of the portfolio performance measures.
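A minimal sketch of the quadratic index-tracking formulation described above (minimizing squared tracking error under fully-invested, long-only weights); the simulated returns and the plain, non-robust formulation are illustrative assumptions.

```python
# Minimal quadratic index-tracking sketch: choose portfolio weights that
# minimize squared tracking error against a benchmark, using simulated data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(11)
T, n = 250, 6                              # trading days, candidate assets
asset_returns = rng.normal(0.0004, 0.01, size=(T, n))
index_returns = asset_returns @ np.full(n, 1 / n) + rng.normal(0, 0.001, T)

def tracking_error(w):
    diff = asset_returns @ w - index_returns
    return float(diff @ diff)              # squared tracking error

w0 = np.full(n, 1 / n)
res = minimize(
    tracking_error, w0, method="SLSQP",
    bounds=[(0.0, 1.0)] * n,                                       # long-only
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],  # fully invested
)
print(np.round(res.x, 3), res.fun)
```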