168 results for "data movement problem"


Relevance: 30.00%

Abstract:

A composite SaaS (Software as a Service) is software composed of several software components and data components. The composite SaaS placement problem is to determine where each of the components should be deployed in a cloud computing environment so that the performance of the composite SaaS is optimal. From a computational point of view, the composite SaaS placement problem is a large-scale combinatorial optimization problem, and an Iterative Cooperative Co-evolutionary Genetic Algorithm (ICCGA) was previously proposed for it. The ICCGA can find solutions of reasonable quality, but its computation time is noticeably long. Aiming to improve the computation time, we propose an unsynchronized Parallel Cooperative Co-evolutionary Genetic Algorithm (PCCGA) in this paper. Experimental results show that the PCCGA not only has a shorter computation time, but also generates better-quality solutions than the ICCGA.
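The cooperative co-evolutionary idea behind both algorithms (decomposing the placement into sub-populations, each evaluated against a collaborator drawn from the other sub-population) can be sketched as follows. This is a minimal illustrative toy with a made-up instance, cost function and GA parameters, not the ICCGA/PCCGA themselves:

```python
import random

random.seed(42)

# Toy instance (invented): place 4 software and 4 data components onto 3 servers.
# The cost counts software/data pairs that end up on different servers, so the
# optimum is 0 when every pair is co-located.
N_SOFT, N_DATA, N_SERVERS = 4, 4, 3

def cost(soft, data):
    return sum(s != d for s, d in zip(soft, data))

def random_individual(n):
    return [random.randrange(N_SERVERS) for _ in range(n)]

def mutate(ind):
    child = ind[:]
    child[random.randrange(len(child))] = random.randrange(N_SERVERS)
    return child

def evolve(pop, fitness):
    """One generation: keep the better half, refill with mutated copies."""
    ranked = sorted(pop, key=fitness)
    elite = ranked[: len(pop) // 2]
    return elite + [mutate(random.choice(elite)) for _ in range(len(pop) - len(elite))]

soft_pop = [random_individual(N_SOFT) for _ in range(10)]
data_pop = [random_individual(N_DATA) for _ in range(10)]

# Cooperative co-evolution: each sub-population is scored against the best
# collaborator currently available in the other sub-population.
for _ in range(50):
    best_data = min(data_pop, key=lambda d: cost(soft_pop[0], d))
    soft_pop = evolve(soft_pop, lambda s: cost(s, best_data))
    best_soft = min(soft_pop, key=lambda s: cost(s, data_pop[0]))
    data_pop = evolve(data_pop, lambda d: cost(best_soft, d))

best_cost = min(cost(s, d) for s in soft_pop for d in data_pop)
print(best_cost)
```

In the parallel (PCCGA) variant, the two `evolve` calls would run unsynchronized on separate workers, exchanging best collaborators whenever each finishes a generation.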

Relevance: 30.00%

Abstract:

Background: When large-scale trials are investigating the effects of interventions on appetite, it is paramount to efficiently monitor large amounts of human data. The original hand-held Electronic Appetite Ratings System (EARS) was designed to facilitate the administration and data management of visual analogue scales (VAS) of subjective appetite sensations. The purpose of this study was to validate a novel hand-held method (EARS II (HP® iPAQ)) against the standard Pen and Paper (P&P) method and the previously validated EARS. Methods: Twelve participants (5 male, 7 female, aged 18-40) were involved in a fully repeated measures design. Participants were randomly assigned, in a crossover design, to either high fat (>48% fat) or low fat (<28% fat) meal days, one week apart, and completed ratings using the three data capture methods ordered according to a Latin square. The first set of appetite sensations was completed in a fasted state, immediately before a fixed breakfast. Thereafter, appetite sensations were completed every thirty minutes for 4 h. An ad libitum lunch was provided immediately before completing a final set of appetite sensations. Results: Repeated measures ANOVAs were conducted for ratings of hunger, fullness and desire to eat. There were no significant differences between P&P and either EARS or EARS II (p > 0.05). Correlation coefficients between P&P and EARS II, controlling for age and gender, were computed on Area Under the Curve ratings. R2 values for Hunger (0.89), Fullness (0.96) and Desire to Eat (0.95) were statistically significant (p < 0.05). Conclusions: EARS II was sensitive to the impact of a meal and the recovery of appetite during the postprandial period, and is therefore an effective device for monitoring appetite sensations. This study provides evidence and support for further validation of the novel EARS II method for monitoring appetite sensations during large-scale studies.
The added versatility means that future uses of the system provide the potential to monitor a range of other behavioural and physiological measures that are often important in clinical and free-living trials.
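The Area Under the Curve ratings used in the correlation analysis are typically computed with the trapezoidal rule over the sampled time points. A minimal sketch (the times and ratings below are invented, not study data):

```python
def trapezoid_auc(times, ratings):
    """Area under the curve of VAS ratings sampled at the given times (trapezoidal rule)."""
    return sum((t2 - t1) * (r1 + r2) / 2
               for (t1, r1), (t2, r2) in zip(zip(times, ratings),
                                             zip(times[1:], ratings[1:])))

# Hypothetical hunger ratings (0-100 mm VAS) every 30 min over 4 h after breakfast.
times_min = [0, 30, 60, 90, 120, 150, 180, 210, 240]
hunger = [80, 30, 35, 42, 50, 58, 65, 72, 78]
print(trapezoid_auc(times_min, hunger))  # 12930.0 (mm·min)
```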

Relevance: 30.00%

Abstract:

The discovery of protein variation is an important strategy in disease diagnosis within the biological sciences. The current benchmark for elucidating information from multiple biological variables is the so-called "omics" disciplines of the biological sciences. Such variability is uncovered by implementing multivariable data mining techniques, which fall into two primary categories: machine learning strategies and statistically based approaches. Typically, proteomic studies can produce hundreds or thousands of variables, p, per observation, n, depending on the analytical platform or method employed to generate the data. Many classification methods are limited by an n ≪ p constraint, and as such require pre-treatment to reduce the dimensionality prior to classification. Recently, machine learning techniques have gained popularity in the field for their ability to successfully classify unknown samples. One limitation of such methods is the lack of a functional model allowing meaningful interpretation of results in terms of the features used for classification. This problem might be solved using a statistical model-based approach in which not only is the importance of each individual protein explicit, but the proteins are combined into a readily interpretable classification rule without relying on a black-box approach. Here we incorporate the statistical dimension reduction techniques Partial Least Squares (PLS) and Principal Components Analysis (PCA), followed by both statistical and machine learning classification methods, and compare them to a popular machine learning technique, Support Vector Machines (SVM). Both PLS and SVM demonstrate strong utility for proteomic classification problems.
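A rough illustration of the dimension-reduce-then-classify workflow for n ≪ p data can be given in pure Python: project onto the first principal component (found by power iteration) and classify with the nearest class centroid. This is only a sketch under simplifying assumptions (synthetic data, a single component, resubstitution accuracy), not the PLS/PCA/SVM pipelines compared in the abstract:

```python
import math
import random

random.seed(0)

# Synthetic stand-in for a proteomics matrix: n = 20 samples, p = 100 variables.
# Class-1 samples have elevated intensity on the first 10 "proteins" (invented).
n, p = 20, 100
X, y = [], []
for i in range(n):
    label = i % 2
    X.append([random.gauss(0, 1) + (3.0 if label and j < 10 else 0.0)
              for j in range(p)])
    y.append(label)

def first_pc(rows, iters=200):
    """Leading principal component of centered data via power iteration."""
    m, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / m for j in range(d)]
    xc = [[r[j] - means[j] for j in range(d)] for r in rows]
    v = [1.0] * d
    for _ in range(iters):
        s = [sum(r[j] * v[j] for j in range(d)) for r in xc]            # Xc v
        v = [sum(s[i] * xc[i][j] for i in range(m)) for j in range(d)]  # Xc^T (Xc v)
        norm = math.sqrt(sum(x * x for x in v))
        v = [x / norm for x in v]
    return means, v

means, v = first_pc(X)
scores = [sum((r[j] - means[j]) * v[j] for j in range(p)) for r in X]

# Nearest-centroid rule on the single reduced dimension (a stand-in for the
# statistical and machine learning classifiers discussed above).
c0 = sum(s for s, t in zip(scores, y) if t == 0) / y.count(0)
c1 = sum(s for s, t in zip(scores, y) if t == 1) / y.count(1)
pred = [int(abs(s - c1) < abs(s - c0)) for s in scores]
accuracy = sum(a == b for a, b in zip(pred, y)) / n
print(accuracy)
```

Note that the classification rule stays interpretable: the loading vector `v` shows which variables drive the component, in contrast to a black-box decision function.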

Relevance: 30.00%

Abstract:

Structural health monitoring (SHM) refers to the procedure used to assess the condition of structures so that their performance can be monitored and any damage detected early. Early detection of damage and appropriate retrofitting help prevent failure of the structure, save money on maintenance or replacement, and ensure the structure operates safely and efficiently throughout its intended life. Although visual inspection and other techniques, such as vibration-based methods, are available for SHM of structures such as bridges, the acoustic emission (AE) technique is an attractive option and its use is increasing. AE waves are high-frequency stress waves generated by the rapid release of energy from localised sources within a material, such as crack initiation and growth. The AE technique involves recording these waves by means of sensors attached to the surface and then analysing the signals to extract information about the nature of the source. High sensitivity to crack growth, the ability to locate the source, its passive nature (no energy needs to be supplied from outside, as energy from the damage source itself is utilised) and the possibility of real-time monitoring (detecting a crack as it occurs or grows) are some of the attractive features of the AE technique. In spite of these advantages, challenges remain in using the AE technique for monitoring applications, especially in the analysis of recorded AE data, as large volumes of data are usually generated during monitoring. The need for effective data analysis can be linked to three main aims of monitoring: (a) accurately locating the source of damage; (b) identifying and discriminating signals from different sources of acoustic emission; and (c) quantifying the level of damage of an AE source for severity assessment. In the AE technique, the location of the emission source is usually calculated using the arrival times and velocities of the AE signals recorded by a number of sensors.
Complications arise, however, because AE waves can travel through a structure in a number of different modes with different velocities and frequencies. Hence, to accurately locate a source it is necessary to identify the modes recorded by the sensors. This study proposed and tested the use of time-frequency analysis tools, such as the short-time Fourier transform, to identify the modes, and the use of the velocities of these modes to achieve very accurate results. Further, this study explored the possibility of reducing the number of sensors needed for data capture by using the velocities of modes captured by a single sensor for source localization. A major problem in the practical use of the AE technique is the presence of AE sources other than cracks, such as rubbing and impacts between different components of a structure. These spurious AE signals often mask the signals from crack activity; hence, discriminating signals to identify their sources is very important. This work developed a model that uses different signal processing tools, such as cross-correlation, magnitude squared coherence and energy distribution in different frequency bands, as well as modal analysis (comparing amplitudes of identified modes), to accurately differentiate signals from different simulated AE sources. Tools for quantifying the severity of damage sources are highly desirable in practical applications. Though different damage quantification methods have been proposed for the AE technique, not all have achieved universal approval or been shown suitable for all situations. The b-value analysis, which involves studying the distribution of amplitudes of AE signals, and its modified form (known as improved b-value analysis) were investigated for their suitability for damage quantification in ductile materials such as steel.
These were found to give encouraging results for the analysis of laboratory data, extending the possibility of their use for real-life structures. By addressing these primary issues, this thesis has helped improve the effectiveness of the AE technique for structural health monitoring of civil infrastructure such as bridges.
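In the simplest one-dimensional setting, the arrival-time-based source location described above reduces to a closed-form expression. The sketch below assumes a source lying between two sensors and a single known mode velocity (all numbers are invented for illustration):

```python
def locate_1d(x1, t1, x2, t2, v):
    """Source position between two sensors from the arrival-time difference.

    Model: t_i = t0 + |x - x_i| / v for sensors at x1 <= x <= x2, so
    t1 - t2 = (2x - x1 - x2) / v and x = (x1 + x2)/2 + v*(t1 - t2)/2.
    The unknown emission time t0 cancels out of the difference.
    """
    return (x1 + x2) / 2 + v * (t1 - t2) / 2

# Hypothetical numbers: sensors 2 m apart on a steel member, assumed mode
# velocity 5000 m/s, arrival times in seconds.
x = locate_1d(0.0, 0.00024, 2.0, 0.00008, 5000.0)
print(x)  # source position in metres (about 1.4 m from sensor 1)
```

Because each mode travels at a different velocity, misidentifying the mode feeds the wrong `v` into this formula, which is why the mode identification step above matters for location accuracy.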

Relevance: 30.00%

Abstract:

Server consolidation using virtualization technology has become an important means of improving the energy efficiency of data centers, and virtual machine placement is the key step in server consolidation. In the past few years, many approaches to virtual machine placement have been proposed. However, existing approaches to the virtual machine placement problem consider only the energy consumed by the physical machines in a data center, not the energy consumed by the data center's communication network. The energy consumption of the communication network in a data center is not trivial, and should therefore be considered in virtual machine placement in order to make the data center more energy-efficient. In this paper, we propose a genetic algorithm for a new virtual machine placement problem that considers the energy consumption of both the servers and the communication network in the data center. Experimental results show that the genetic algorithm performs well when tackling test problems of different kinds, and scales up well as the problem size increases.

Relevance: 30.00%

Abstract:

Here we present a sequential Monte Carlo approach to Bayesian sequential design that incorporates model uncertainty. The methodology is demonstrated through the development and implementation of two model discrimination utilities, mutual information and total separation, but it can also be applied more generally to other experimental aims. A sequential Monte Carlo algorithm is run for each rival model (in parallel), providing a convenient estimate of each model's marginal likelihood given the data, which can be used for model comparison and in the evaluation of utility functions. A major benefit of this approach is that it requires very little problem-specific tuning and is computationally efficient compared with full Markov chain Monte Carlo approaches. This research is motivated by applications in drug development and chemical engineering.
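The role of the marginal likelihood in comparing rival models can be illustrated with a deliberately degenerate special case: prior particles whose weights are updated sequentially, with no resampling or move steps (plain importance sampling, not the full SMC algorithm of the paper). The models and data below are invented:

```python
import math
import random

random.seed(1)

# Invented rival models for a sequence of binary trial outcomes:
#   M1: success probability theta ~ Uniform(0, 1), approximated by prior particles
#   M2: theta fixed at 0.5
data = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]  # 8 successes in 10 trials

N = 5000
particles = [random.random() for _ in range(N)]  # prior draws for M1
weights = [1.0] * N

log_evidence_m1 = 0.0
log_evidence_m2 = 0.0
for y in data:
    # Incremental evidence p(y_t | y_1:t-1): weighted mean of particle likelihoods.
    lik = [th if y else 1.0 - th for th in particles]
    incr = sum(w * l for w, l in zip(weights, lik)) / sum(weights)
    log_evidence_m1 += math.log(incr)
    weights = [w * l for w, l in zip(weights, lik)]   # sequential weight update
    log_evidence_m2 += math.log(0.5)                  # M2 has no free parameter

bayes_factor = math.exp(log_evidence_m1 - log_evidence_m2)
print(bayes_factor)  # > 1 favours M1 for these data
```

The product of the incremental terms telescopes into an estimate of the full marginal likelihood p(y_1:n), which is what makes the running SMC approximation convenient for model comparison; the full algorithm adds resampling and move steps to keep the particle set healthy.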

Relevance: 30.00%

Abstract:

Client owners usually need an estimate or forecast of their likely building costs in advance of detailed design in order to confirm the financial feasibility of their projects. Because of their timing in the project life cycle, these early stage forecasts are characterized by the minimal amount of information available concerning the new (target) project to the point that often only its size and type are known. One approach is to use the mean contract sum of a sample, or base group, of previous projects of a similar type and size to the project for which the estimate is needed. Bernoulli’s law of large numbers implies that this base group should be as large as possible. However, increasing the size of the base group inevitably involves including projects that are less and less similar to the target project. Deciding on the optimal number of base group projects is known as the homogeneity or pooling problem. A method of solving the homogeneity problem is described involving the use of closed form equations to compare three different sampling arrangements of previous projects for their simulated forecasting ability by a cross-validation method, where a series of targets are extracted, with replacement, from the groups and compared with the mean value of the projects in the base groups. The procedure is then demonstrated with 450 Hong Kong projects (with different project types: Residential, Commercial centre, Car parking, Social community centre, School, Office, Hotel, Industrial, University and Hospital) clustered into base groups according to their type and size.
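The pooling trade-off (a larger base group lowers variance but pulls in less-similar projects) can be explored with a leave-one-out cross-validation of the mean-based forecast, along the lines described. A toy sketch with invented costs, using size rank as the similarity measure:

```python
# Hypothetical contract sums (in $M) for one building type, ordered by size.
costs = [10.2, 11.0, 12.1, 12.8, 13.5, 14.9, 16.0, 17.2, 19.0, 22.5]

def loo_error(costs, k):
    """Mean absolute error when each project is forecast as the mean of its
    k nearest neighbours by size rank, leaving the target project out."""
    errors = []
    for i, actual in enumerate(costs):
        # Nearest first; index 0 is the target itself, so skip it.
        neighbours = sorted(range(len(costs)), key=lambda j: abs(j - i))[1:k + 1]
        forecast = sum(costs[j] for j in neighbours) / k
        errors.append(abs(forecast - actual))
    return sum(errors) / len(costs)

# Scan base-group sizes to find the one with the best simulated forecasts.
best_k = min(range(1, len(costs)), key=lambda k: loo_error(costs, k))
print(best_k, round(loo_error(costs, best_k), 3))
```

Sampling with replacement, as in the study, would repeat this comparison over many resampled target sets rather than one pass; the optimal group size balances the law-of-large-numbers benefit against the dissimilarity of the added projects.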

Relevance: 30.00%

Abstract:

Problem-based learning (PBL) has been used successfully in disciplines such as medicine, nursing, law and engineering. However, a review of the literature shows that there has been little use of this approach to learning in accounting. This paper extends the research in accounting education by reporting the findings of a case study of the development and implementation of PBL at the Queensland University of Technology (QUT) in a new Accountancy Capstone unit that began in 2006. The fundamentals of the PBL approach were adhered to. However, one of the essential elements of the approach adopted was to highlight the importance of questioning as a means of gathering the necessary information on which decisions are based. This approach can be contrasted with the typical 'give all the facts' case studies that are commonly used. Another feature was that students worked together in the same group for an entire semester (similar to how teams in the workplace operate), so there was an intended focus on teamwork in solving the unstructured, real-world accounting problems presented to students. Based on quantitative and qualitative data collected from student questionnaires over seven semesters, it was found that students perceived PBL to be effective, especially in terms of developing the skills of questioning, teamwork and problem solving. The effectiveness of questioning is very important, as this is a skill that is rarely the focus of development in accounting education. The successful implementation of PBL in accounting through 'learning by doing' could be the catalyst for change to bring about better learning outcomes for accounting graduates.

Relevance: 30.00%

Abstract:

A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM), where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment by the use of non-deterministic, readily-available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as the Network Time Protocol (NTP)). As a result, the clock synchronisation between robots can be out by tens to hundreds of milliseconds, making correlation of data difficult and preventing the units from performing synchronised actions such as triggering cameras or intricate swarm manoeuvres. In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also to timestamp events that occur on General Purpose Input/Output (GPIO) pins, referencing a submillisecond-accurate, wirelessly-distributed "global" clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly-synchronised, distributed image acquisition system over a large geographic area; a real-world application of this functionality is the creation of a platform to facilitate wirelessly-distributed 3D stereoscopic vision. A 'best-practice' design methodology is adopted within the project to ensure the final system operates according to its requirements.
Initially, requirements are generated, from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements. The BabelFuse system comprises a single Grand Master unit, which is responsible for maintaining the absolute value of the "global" clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for wirelessly synchronising the clocks between the boards makes use of specific hardware and a firmware protocol based on elements of the IEEE-1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results. Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement; the clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of Slaves to the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond.
The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 µs, well below the target of 1 ms.
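The clock offset and path delay recovered from an IEEE-1588-style two-way message exchange follow a standard formula, assuming a symmetric network path. A sketch with invented timestamps (this is the textbook PTP calculation, not BabelFuse's firmware):

```python
def ptp_offset_delay(t1, t2, t3, t4):
    """IEEE-1588 two-way exchange:
    t1: master sends Sync, t2: slave receives it,
    t3: slave sends Delay_Req, t4: master receives it.
    Assumes the path delay is the same in both directions."""
    offset = ((t2 - t1) - (t4 - t3)) / 2  # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2   # one-way path delay
    return offset, delay

# Hypothetical exchange (times in microseconds): the slave clock runs 40 us
# ahead of the master, and the one-way path delay is 15 us.
offset, delay = ptp_offset_delay(1000.0, 1055.0, 2000.0, 1975.0)
print(offset, delay)  # 40.0 15.0
```

Once the slave knows its offset, it can correct local timestamps to the "global" clock; asymmetric path delay is the main residual error source, which is one reason hardware-level timestamping of the synchronisation messages helps.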

Relevance: 30.00%

Abstract:

Background: Previous studies have shown that fundamental movement skills (FMS) and physical activity are related. Specifically, earlier studies have demonstrated that the ability to perform a variety of FMS increases the likelihood of children participating in a range of physical activities throughout their lives. To date, however, there have not been studies focused on the development of, or the relationship between, these variables through junior high school (that is, between the ages of 13 and 15). Such studies might provide important insights into the relationships between FMS and physical activity during adolescence, and suggest ways to design more effective physical education programmes for adolescents. Purpose: The main purposes of the study are: (1) to investigate the development of the students' self-reported physical activity and FMS from Grade 7 to Grade 9; (2) to analyse the associations among the students' FMS and self-reported physical activity through junior high school; and (3) to analyse whether there are gender differences in research tasks one and/or two. Participants and setting: The participants in the study were 152 Finnish students, aged 13 and enrolled in Grade 7 at the commencement of the study. The sample included 66 girls and 86 boys who were drawn from three junior high schools in Middle Finland. Research design and data collection: Both the FMS tests and questionnaires pertaining to self-reported physical activity were completed annually during a 3-year period: in August (when the participants were in Grade 7), January (Grade 8), and May (Grade 9). Data analysis: Repeated measures multivariate analyses of variance (MANOVAs) were used to analyse the interaction between gender and time (three measurement points) in FMS test sumscores and self-reported physical activity scores.
The relationships between self-reported physical activity scores and fundamental movement skill sumscores through junior high school were analysed using Structural Equation Modelling (SEM) with LISREL 8.80 software. Findings: The MANOVA for self-reported physical activity demonstrated that both genders' physical activity decreased through junior high school. The MANOVA for the FMS revealed that the boys' FMS sumscore increased whereas the girls' skills decreased through junior high school. The SEM and squared multiple correlations revealed that FMS in Grades 7 and 8, as well as physical activity in Grade 9, explained FMS in Grade 9. The proportion of prediction was 69% for the girls and 55% for the boys. Additionally, physical activity measured in Grade 7 and FMS measured in Grade 9 explained physical activity in Grade 9. The proportion of prediction was 12% for the girls and 29% for the boys. In the boys' group, three additional paths were found: FMS in Grade 7 explained physical activity in Grade 9, physical activity in Grade 7 explained FMS in Grade 8, and physical activity in Grade 7 explained physical activity in Grade 8. Conclusions: The study suggests that FMS and physical activity are correlated, and that when considering combined scores there is a greater likelihood of healthy lifelong outcomes. Therefore, the conclusion can be drawn that an FMS curriculum in school-based PE is a plausible way to promote good lifelong outcomes. Earlier studies support the view that school physical education plays an important role in developing students' FMS and is in a position to thwart the typical decline of physical activity in adolescence. These concepts are particularly important for adolescent girls, as this group reflects the greatest decline in physical activity during the adolescent period.

Relevance: 30.00%

Abstract:

OBJECTIVES: To examine the effect of thermal agents on the range of movement (ROM) and mechanical properties in soft tissue and to discuss their clinical relevance. DATA SOURCES: Electronic databases (Cochrane Central Register of Controlled Trials, MEDLINE, and EMBASE) were searched from their earliest available record up to May 2011 using Medical Subjects Headings and key words. We also undertook related articles searches and read reference lists of all incoming articles. STUDY SELECTION: Studies involving human participants describing the effects of thermal interventions on ROM and/or mechanical properties in soft tissue. Two reviewers independently screened studies against eligibility criteria. DATA EXTRACTION: Data were extracted independently by 2 review authors using a customized form. Methodologic quality was also assessed by 2 authors independently, using the Cochrane risk of bias tool. DATA SYNTHESIS: Thirty-six studies, comprising a total of 1301 healthy participants, satisfied the inclusion criteria. There was a high risk of bias across all studies. Meta-analyses were not undertaken because of clinical heterogeneity; however, effect sizes were calculated. There were conflicting data on the effect of cold on joint ROM, accessory joint movement, and passive stiffness. There was limited evidence to determine whether acute cold applications enhance the effects of stretching, and further evidence is required. There was evidence that heat increases ROM, and a combination of heat and stretching is more effective than stretching alone. CONCLUSIONS: Heat is an effective adjunct to developmental and therapeutic stretching techniques and should be the treatment of choice for enhancing ROM in a clinical or sporting setting. The effects of heat or ice on other important mechanical properties (eg, passive stiffness) remain equivocal and should be the focus of future study.

Relevance: 30.00%

Abstract:

Climate change and land use pressures are making environmental monitoring increasingly important. As environmental health is degrading at an alarming rate, ecologists have tried to tackle the problem by monitoring the composition and condition of the environment. However, traditional monitoring methods relying on experts are manual and expensive; to address this issue, government organisations have designed a simpler and faster surrogate-based assessment technique for consultants, landholders and ordinary citizens. However, it remains complex, subjective and error-prone, which makes the collected data difficult to interpret and compare. In this paper we describe a work-in-progress mobile application designed to address these shortcomings through the use of augmented reality and multimedia smartphone technology.

Relevance: 30.00%

Abstract:

Real-world AI systems have recently been deployed that can automatically analyze the plans and tactics of tennis players. As the game state is updated regularly at short intervals (i.e. point level), a library of a player's successful and unsuccessful plans can be learnt over time. Given the relative strengths and weaknesses of a player's plans, a set of proven plans or tactics that characterize the player can be identified from the library. For low-scoring, continuous team sports like soccer, such analysis for multi-agent teams does not exist, as the game is not segmented into "discretized" plays (i.e. plans), making it difficult to obtain a library that characterizes a team's behavior. Additionally, as player tracking data is costly and difficult to obtain, we have only partial team tracings in the form of ball actions, which makes this problem even more difficult. In this paper, we propose a method to overcome these issues by representing team behavior via play-segments, which are spatio-temporal descriptions of ball movement over fixed windows of time. Using these representations, we can characterize team behavior through entropy maps, which give a measure of the predictability of team behaviors across the field. We show the efficacy and applicability of our method on soccer data from the 2010-2011 English Premier League.
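An entropy map assigns each field zone the Shannon entropy of the play-segments observed there: low entropy means predictable behaviour, high entropy means varied behaviour. A minimal sketch with an invented segment coding (not the paper's actual representation):

```python
import math
from collections import Counter

def shannon_entropy(labels):
    """Entropy (bits) of the empirical distribution of play-segment labels."""
    counts = Counter(labels)
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical play-segments observed in two field zones, coded by the direction
# the ball moved over a fixed time window (F=forward, B=back, L=left, R=right).
predictable_zone = ["F", "F", "F", "F", "F", "F", "B", "F"]
varied_zone = ["F", "B", "L", "R", "F", "L", "B", "R"]
print(round(shannon_entropy(predictable_zone), 3),
      round(shannon_entropy(varied_zone), 3))
```

Computing this per zone over a season of ball actions yields a map of where a team's behaviour is stereotyped versus improvised, which is the kind of team signature the abstract describes.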

Relevance: 30.00%

Abstract:

This report presents a snapshot of work funded by the Queensland Injury Prevention Council in 2010-11, titled "Feasibility of Using Health Data Sources to Inform Product Safety Surveillance in Queensland Children". The project provided an evaluation of the currently available evidence base for the identification and surveillance of product-related injuries in children in Queensland and Australia. A comprehensive 300-page report was produced (available at: http://eprints.qut.edu.au/46518/) and a series of recommendations was made, proposing: improvements in the product safety data system, increased utilisation of health data for proactive and reactive surveillance, enhanced collaboration between the health sector and the product safety sector, and improved ability of health data to meet the needs of product safety surveillance. At the conclusion of the project, a Consumer Product Injury Research Advisory group (CPIRAG) was established as a working party to the Queensland Injury Prevention Council (QIPC), to prioritise and advance these recommendations and to work collaboratively with key stakeholders to promote the role of injury data in supporting product safety policy decisions at the Queensland and national levels. This group continues to meet monthly and comprises the organisations represented on the second page of this report. One of the key priorities of the CPIRAG group for 2012 was to produce a snapshot report highlighting problem areas for potential action arising from the larger report. Subsequent funding to write this snapshot report was provided by the Institute for Health and Biomedical Innovation, Injury Prevention and Rehabilitation Domain at QUT in 2012. This work was undertaken by Dr Kirsten McKenzie and researchers from QUT's Centre for Accident Research and Road Safety - Queensland.
This snapshot report provides an evidence base for potential further action on a range of children’s products that are significantly represented in injury data. Further information regarding injury hazards, safety advice and regulatory responses are available on the Office of Fair Trading (OFT) Queensland website and the Product Safety Australia websites. Links to these resources are provided for each product reviewed.

Relevance: 30.00%

Abstract:

Efficient state asset management is crucial for government departments that rely on the operation of their state assets to fulfil their public functions, including public service provision. These assets may be expensive, extensive and/or complex, and can have a major impact on the ability of governments to perform their functions over extended periods. Governments around the world have increasingly recognised the importance of efficient state asset management laws, policies and practices, as exemplified by the surge in state asset management reform. This phenomenon is evident in Indonesia, in particular through the establishment in 2006 of the Directorate General of State Assets, which was appointed as the ultimate state asset manager of the Republic of Indonesia and the proprietor of state asset management reform. The Directorate General of State Assets has also pledged its adherence to good governance principles within its reformed state asset management laws and policies. However, the degree to which good governance principles are conceptualised is unknown, raising the questions of how, and to what extent, good governance principles are evident within Indonesia's reformed state asset management laws and policies. This study seeks to understand the level at which good governance principles are conceptualised and understood within reformed state asset management policies in Indonesia (as a case study), and to identify the variables that play a role in the implementation of that reform.
Although good governance improvement has been a central tenet of the Indonesian government's agenda, and state asset management reform has risen in priority following findings of neglect and unfavourable audit results, there is ambiguity regarding the extent to which good governance is conceptualised within the reform, how and whether this relationship is understood by state asset managers (i.e. government officials), and what other variables play a supporting and/or impeding role in the reform, and how. Using empirical data from a sample of four Indonesian regional governments and 70 interviews, discrepancies were found in which good governance principles are conceptualised, the level at which they are conceptualised, the stage of state asset management practice at which they are conceptualised, and the level at which they are understood by state asset managers (i.e. government officials). Human resource capacity and capability, the notion of 'needing more time', low legality, the infancy of the reform, and a dysfunctional sense of stewardship were identified as specific variables impeding state asset management reform, whilst the decentralisation and regional autonomy regime, political history, and culture play a consistent undercurrent role in good-governance-related reforms within Indonesia. This study offers insights to Indonesian policy makers interested in ensuring the conceptualisation and full implementation of good governance in all areas of governing, particularly within state asset management practices. Most importantly, this study identifies an asymmetry in good governance understanding, perspectives and assumptions between policy makers (i.e. high-level government officials) and policy implementers (i.e. low-level government officials), to be taken into account in future policy development and/or drafting.
As such, this study suggests the need for a modified perspective on, and approach to, good governance conceptualisation and implementation strategies: one that acknowledges and incorporates a nation's unique characteristics and no longer denies the double-edged nature of simplified assumptions about governance.