894 results for Many-to-many-assignment problem


Relevance: 100.00%

Publisher:

Abstract:

The maintenance of functional physical fitness across the lifespan depends upon the presence or absence of disease, injury, and the level of habitual physical activity. The prevalence of sedentariness rises with increasing age, culminating in 31% of elderly women being classified as leading a sedentary lifestyle. Exercise prescription involving easily accomplished physical activity may help maintain mobility into old age through a reduction in the risk of premature death and disablement from cardiovascular disease and a reduction in the risk of falls and injuries from falls. Short bouts of physical activity may be more appealing to the sedentary and to those in full-time employment than longer bouts, and short bouts of exercise, performed three times per day, may improve physical fitness. The purpose of this study was therefore to examine the problem: do exercise session duration, initial cardiovascular fitness, and age group affect changes in functional physical fitness in sedentary women training for strength, flexibility and aerobic fitness? Twenty-three sedentary women aged between 19 and 54 years, employed at a major metropolitan hospital, undertook six weeks of moderate intensity physical activity in one of two training groups. Participants were randomly allocated to either a short duration (3 x 10 minute) or a long duration (30 minute) exercise group. The 3 x 10 minute group (n=13) participated in three 10 minute sessions per day, separated by at least 2 hours, on 3 days per week. The 30 minute group (n=10) participated in three 30 minute sessions per week. The total amount of work was similar, with an average of 129 and 148 kcal per training day for the 3 x 10 minute and 30 minute groups, respectively. The training program incorporated three walking and stair climbing courses for aerobic conditioning, a series of eleven static stretches for joint flexibility, and isotonic and isometric strength exercises for lower and upper body muscular strength. Measures of functional strength, functional flexibility and cardiovascular fitness were assessed prior to training and immediately following the six week exercise program. A two-way analysis of variance (Group x Time) was used to examine the effect of training and group on the dependent variables. A significance level of 0.05 was adopted for all statistical tests. Mean hand grip strength showed no significant change over time in either the 3 x 10 minute group (30.7 kg to 31.7 kg) or the 30 minute group (30.2 kg to 32.4 kg). Leg strength showed a trend towards improvement (p=0.098) in both the 3 x 10 minute and 30 minute training groups, representing 15% and 18% improvements, respectively. Combined right and left neck rotation improved significantly in the 3 x 10 minute group (82.8° to 92.0°) and the 30 minute group (82.5° to 91.5°). Wrist flexion and extension improved significantly in 3 of the 4 measurements. Left wrist flexion improved significantly, by an average of 7.0% for the 3 x 10 minute group and 4.9% for the 30 minute group. Right and left wrist extension improved significantly in the 3 x 10 minute and 30 minute training groups (5.9% and 6.8%, respectively). Hip and spine flexibility improved 3.4 cm (35.2 cm to 38.6 cm) in the 3 x 10 minute group and 6.6 cm (37.4 cm to 44.0 cm) in the 30 minute group.
There was a significant improvement in cardiovascular fitness for both groups, representing a 22% improvement in the 3 x 10 minute group (27.2 to 33.2 ml·kg⁻¹·min⁻¹) and a 25% improvement in the 30 minute group (27.5 to 34.4 ml·kg⁻¹·min⁻¹). No significant difference was found in the degree of improvement in cardiovascular fitness over six weeks of training between subjects of low and moderate initial aerobic fitness. Grip strength showed no significant change over time for either the younger (19-35 years) or middle-aged (36-54 years) group. Leg strength showed a trend towards improvement (p=0.093) in the younger group (63.5 kg to 71.9 kg) and the middle-aged group (69.3 kg to 85.8 kg). Neck rotation flexibility improved by a similar amount in both the younger and middle-aged groups, representing improvements of 9.9° and 8.0°, respectively. There was a significant improvement in two of the four measures of wrist flexibility. Hip and spine flexibility was significantly greater in the younger group than in the middle-aged group (38.5 cm and 30.7 cm, respectively). There was a significant improvement in hip and spine flexibility over the six week training program, representing an increase in reach of 6.5 cm for the younger group and 4.9 cm for the middle-aged group. The middle-aged subjects had significantly lower cardiovascular fitness than their younger peers, scoring 22.8 and 30.7 ml·kg⁻¹·min⁻¹, respectively. Cardiovascular fitness improved by a similar amount in both age groups, with significant improvements of 23.8% and 28.1% for the younger and middle-aged subjects, respectively. The findings of this study suggest that short bouts of exercise may be as effective as longer bouts for improving the flexibility and cardiovascular components of functional physical fitness in sedentary young and middle-aged women. Additionally, short bouts of exercise may be more attractive than longer bouts for the beginning exerciser, as they may fit more easily into the busy lifestyle encountered by many people in today's society. Sedentary young and middle-aged women should benefit from static flexibility exercises designed to improve and/or maintain functional flexibility, and thus maintain mobility and reduce the incidence of muscular injury. Regular, brisk walking incorporating some stair climbing is likely to be beneficial in improving cardiovascular health, and perhaps also leg strength, thereby helping to improve and maintain functional physical fitness for both young and middle-aged sedentary women.
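As a rough illustration of the Group x Time analysis described above, the following Python sketch runs a two-way ANOVA on synthetic data. The group labels, effect sizes, and the use of statsmodels are assumptions for illustration; this is not the study's data or analysis script, and the repeated-measures structure of the design is ignored here for simplicity.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic (not the study's) VO2max-style scores for a Group x Time design.
rng = np.random.default_rng(0)
rows = []
for group in ["3x10min", "30min"]:
    for time in ["pre", "post"]:
        base = 27.0 if time == "pre" else 33.0          # assumed training effect
        rows += [{"group": group, "time": time, "score": base + rng.normal(0, 2)}
                 for _ in range(12)]
df = pd.DataFrame(rows)

# Two-way ANOVA with Group, Time and their interaction.
model = smf.ols("score ~ C(group) * C(time)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```

In this toy setup the Time effect dominates and the Group and interaction terms are negligible, which mirrors the abstract's finding that both session formats improved fitness to a similar degree.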

Relevance: 100.00%

Publisher:

Abstract:

The recent emergence of intelligent agent technology and advances in information gathering have been important steps towards efficiently managing and using the vast amount of information now available on the Web to make informed decisions. There are, however, still many problems to be overcome in information gathering research before the relevant information required by end users can be delivered. Good decisions cannot be made without sufficient, timely, and correct information. Traditionally it is said that knowledge is power; nowadays, sufficient, timely, and correct information is power. Gathering relevant information to meet user information needs is therefore the crucial step in making good decisions. The ideal goal of information gathering is to obtain only the information that users need (no more and no less). However, the volume of information available, the diversity of information formats, the uncertainty of information, and the distributed locations of information (e.g. the World Wide Web) hinder the process of gathering the right information to meet user needs. Specifically, two fundamental issues regarding the efficiency of information gathering are mismatch and overload. Mismatch means that some information that meets user needs has not been gathered (it is missed), whereas overload means that some gathered information is not what users need. Traditional information retrieval has developed considerably over the past twenty years, and the introduction of the Web has changed people's perceptions of information retrieval. Usually, the task of information retrieval is considered to be leading the user to those documents that are relevant to his or her information needs; a closely related function is filtering out irrelevant documents (information filtering). Research into traditional information retrieval has provided many retrieval models and techniques to represent documents and queries. Nowadays, information is becoming highly distributed and increasingly difficult to gather, and user information needs themselves contain many uncertainties. These factors motivate research into agent-based information gathering, and agent-based information systems have emerged in response. In such systems, intelligent agents obtain commitments from their users and act on the users' behalf to gather the required information. They can retrieve relevant information from highly distributed, uncertain environments because of their intelligence, autonomy, and distribution. Current research on agent-based information gathering systems is divided into single-agent gathering systems and multi-agent gathering systems. In both areas, there are still open problems to be solved before agent-based information gathering systems can retrieve uncertain information effectively from highly distributed environments. The aim of this thesis is to develop a theoretical framework for intelligent agents to gather information from the Web. The research integrates the areas of information retrieval and intelligent agents. The specific research areas in this thesis are the development of an information filtering model for single-agent systems, and the development of a dynamic belief model for information fusion for multi-agent systems.
The research results are also supported by the construction of real information gathering agents (e.g., a Job Agent) for the Internet that help users gather useful information stored on Web sites. In this framework, information gathering agents are able to describe (or learn) user information needs and act on behalf of users to retrieve, filter, and/or fuse information. A rough set based information filtering model is developed to address the problem of overload. The new approach allows users to describe their information needs on user concept spaces rather than on document spaces, and it views a user information need as a rough set over the document space. Rough set decision theory is used to classify new documents into three regions: a positive region, a boundary region, and a negative region. Two experiments are presented to verify this model, and they show that the rough set based model provides an efficient approach to the overload problem. In this research, a dynamic belief model for information fusion in multi-agent environments is also developed. This model has polynomial time complexity, and it is proven that the fusion results are belief (mass) functions. Using this model, a collection fusion algorithm for information gathering agents is presented. The difficult problem for this research is the case where collections may be used by more than one agent; the algorithm uses cooperation between agents to provide a solution to this problem in distributed information retrieval systems. This thesis presents solutions to the theoretical problems in agent-based information gathering systems, including information filtering models, agent belief modelling, and collection fusion. It also presents solutions to some of the technical problems in agent-based information systems, such as document classification, the architecture of agent-based information gathering systems, and decision making in multi-agent environments. Information gathering agents of this kind will gather relevant information from highly distributed, uncertain environments.
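To make the three-region classification idea concrete, here is a minimal Python sketch in the style of decision-theoretic rough sets: documents are placed in the positive, boundary, or negative region by thresholding an estimated probability of relevance. The scoring function and the thresholds (alpha, beta) are illustrative assumptions, not the thesis's actual model.

```python
def relevance_probability(doc_terms, concept_terms):
    """Crude relevance estimate: fraction of the user's concept terms present
    in the document (a stand-in for the thesis's rough-set-based estimate)."""
    return len(doc_terms & concept_terms) / len(concept_terms)

def classify(doc_terms, concept_terms, alpha=0.7, beta=0.3):
    """Three-way decision: accept into the positive region, reject into the
    negative region, or defer to the boundary region."""
    p = relevance_probability(doc_terms, concept_terms)
    if p >= alpha:
        return "positive"
    if p <= beta:
        return "negative"
    return "boundary"

concept = {"agent", "information", "gathering", "web"}
docs = {
    "d1": {"multi", "agent", "information", "gathering", "web", "fusion"},
    "d2": {"soccer", "league", "results"},
    "d3": {"web", "information", "retrieval"},
}
for name, terms in docs.items():
    print(name, classify(terms, concept))   # d1 -> positive, d2 -> negative, d3 -> boundary
```

Only the boundary-region documents would then need further evidence or user feedback, which is what makes the three-way split attractive for filtering out overload.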

Relevance: 100.00%

Publisher:

Abstract:

There is ample evidence that in many countries school science is in difficulty, with declining student attitudes towards science and declining uptake. This presentation argues that a key to addressing the problem lies in transforming teachers' classroom practice, and that pedagogical innovation is best supported within a school context. Evidence for effective change will draw on the School Innovation in Science (SIS) initiative in Victoria, which has developed and evaluated a model to improve science teaching and learning across a school system. The model involves a framework for describing effective teaching and learning, and a strategy that allows schools the flexibility to develop their practice to suit local conditions and to maintain ownership of the change process. SIS has proved successful in improving science teaching and learning in primary and secondary schools. Experience from SIS and related projects, from a national Australian science and literacy project, and from system-wide science initiatives in Europe, will be used to explore the factors that affect the success and the path of innovation in schools.

Relevance: 100.00%

Publisher:

Abstract:

Knowing what to do with the massive amount of data collected has always been an ongoing issue for many organizations. While data mining has been touted as the solution, it has failed to deliver the expected impact despite its successes in many areas. One reason is that data mining algorithms were not designed for the real world: they usually assume a static view of the data and a stable execution environment in which resources are abundant. The reality, however, is that data are constantly changing and the execution environment is dynamic, so it is difficult for data mining to deliver truly timely and relevant results. Recently, the processing of stream data has received much attention. Interestingly, the methodology used to design stream-based algorithms may well be the solution to the above problem. In this entry, we discuss this issue and present an overview of recent work.

Relevance: 100.00%

Publisher:

Abstract:

Variation in the incoming sheet material and fluctuations in the press setup are unavoidable in many stamping plants. These variations can have a large influence on the quality of the final stamping, in particular through unpredictable springback of the sheet when the tooling is removed. While stochastic simulation techniques have been developed to simulate this problem, there has been little research connecting the influence of the noise sources to springback. This paper characterises the effect of material and process variation on the robustness of springback for a semi-cylindrical channel forming operation, which has a cross-section profile similar to that of many automotive structural components. The study was conducted using the specialised sheet metal forming package AutoForm Sigma, in which a series of stochastic simulations was performed with each of the noise sources introduced incrementally. The scatter in effective stress and effective strain at a critical location of the part was examined, and a response window, which indicates the respective process robustness, was defined. The incremental introduction of the noise sources allows the change in size of the stress-strain response window to be tracked. The results showed that changes to process variation parameters, such as blank holder pressure (BHP) and friction coefficient, directly affect the strain component of the stress-strain response window by altering the magnitude of external work applied to the forming system. Material variation, on the other hand, directly affected the stress component of the response window. A relationship between the effective stress-strain response window and the variation in springback was also established.
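As a purely illustrative sketch (not the paper's finite element simulations), the Python snippet below shows the general idea of incrementally switching on noise sources and tracking how the spread, or "response window", of a stress-strain output grows. The closed-form response model, parameter names, and variation ranges are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_forming_response(bhp, friction, yield_stress):
    """Invented closed-form stand-in for the forming simulation: returns an
    (effective stress, effective strain) pair at a critical location."""
    strain = 0.20 + 0.004 * (bhp - 50) + 0.30 * (friction - 0.10)
    stress = yield_stress * (1.0 + 2.0 * strain)
    return stress, strain

def response_window(n=2000, vary_process=False, vary_material=False):
    """Sample whichever noise sources are 'switched on' and report the scatter
    (range) of stress and strain, i.e. the size of the response window."""
    bhp = rng.normal(50, 3, n) if vary_process else np.full(n, 50.0)
    mu = rng.normal(0.10, 0.01, n) if vary_process else np.full(n, 0.10)
    ys = rng.normal(180, 10, n) if vary_material else np.full(n, 180.0)
    stress, strain = toy_forming_response(bhp, mu, ys)
    return np.ptp(stress), np.ptp(strain)

for label, kwargs in [("no variation", {}),
                      ("process only", {"vary_process": True}),
                      ("process + material", {"vary_process": True, "vary_material": True})]:
    print(label, response_window(**kwargs))
```

Even in this toy model the process parameters mostly widen the strain axis of the window while the material parameter widens the stress axis, echoing the qualitative finding reported in the abstract.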

Relevance: 100.00%

Publisher:

Abstract:

Optimizing the broadcasting process in a mobile ad hoc network (MANET) is considered a major challenge due to many problems, such as the broadcast storm problem and the high complexity of finding the optimal tree, which is an NP-hard problem. Straightforward techniques like simple flooding give rise to the broadcast storm problem with high probability. In this work, a genetic algorithm (GA) that searches over a population representing a distinguishable 'structure' is adapted innovatively to suit MANETs. The novelty of the GA technique adopted here lies mainly in the proposed method of searching for the structure of a suitable spanning tree that can be optimized to meet the performance indices related to the broadcasting problem. In other words, the proposed genetic model (GM) evolves the structure of random trees (individuals) that are 'genetically' generated using rules devised specifically to capture MANET behaviour, in order to arrive at a minimal spanning tree that satisfies a given fitness function. The model can also give different solutions depending on the main factor specified, such as 'time' (or speed) in some situations and 'reachability' in others.
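The abstract does not give the GA's encoding or operators, so the following Python sketch only illustrates the general idea under stated assumptions: individuals are random spanning trees of a toy connectivity graph, the fitness combines broadcast delay (tree depth) with the number of forwarding nodes, and a simple mutation-plus-elitism loop searches for a better tree. All rules and parameters here are illustrative, not the paper's genetic model.

```python
import random

def random_spanning_tree(adj, root=0):
    """Grow a random spanning tree of the MANET connectivity graph (randomized BFS)."""
    parent = {root: None}
    frontier = [root]
    while frontier:
        u = frontier.pop(random.randrange(len(frontier)))
        for v in random.sample(adj[u], len(adj[u])):
            if v not in parent:
                parent[v] = u
                frontier.append(v)
    return parent

def ancestors(parent, v):
    """Ancestors of v (excluding v) in the tree given by parent pointers."""
    seen = set()
    while parent[v] is not None:
        v = parent[v]
        seen.add(v)
    return seen

def fitness(parent):
    """Lower is better: broadcast delay (tree depth) plus forwarding nodes
    (internal nodes, a proxy for redundant rebroadcasts)."""
    depth = lambda v: len(ancestors(parent, v))
    return max(depth(v) for v in parent) + len({p for p in parent.values() if p is not None})

def mutate(parent, adj):
    """Re-attach a random non-root node to another neighbour already in the tree,
    rejecting moves that would create a cycle."""
    child = dict(parent)
    v = random.choice([x for x in child if child[x] is not None])
    options = [u for u in adj[v] if u != child[v] and u != v and v not in ancestors(child, u)]
    if options:
        child[v] = random.choice(options)
    return child

def evolve(adj, pop_size=30, generations=200):
    pop = [random_spanning_tree(adj) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]   # elitist truncation selection
        pop = survivors + [mutate(random.choice(survivors), adj)
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=fitness)

# Toy 6-node connectivity graph (node -> neighbours within radio range).
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1, 4, 5], 4: [2, 3, 5], 5: [3, 4]}
best = evolve(adj)
print(best, fitness(best))
```

Reweighting the two fitness terms corresponds to the abstract's remark that the model can favour 'time' (shallower trees) or 'reachability' depending on the factor of interest.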

Relevance: 100.00%

Publisher:

Abstract:

Determining the optimal operating conditions for the injection moulding process has been of special interest to many researchers. To determine the optimal settings, one first has to derive a model of the injection moulding process that maps the relationship between the input process control factors and the output responses. One of the most popular modelling techniques is linear least squares regression, due to its effectiveness and completeness. However, least squares regression is very sensitive to outliers and fails to provide a reliable model when the control variables are highly correlated with each other. To address this problem, a new modelling method based on principal component regression is proposed in this paper. The distinguishing feature of the proposed method is that it considers not only the variance of the covariance matrix of the control variables but also the correlation coefficients between the control variables and the target variables to be optimised. The modelling method has been implemented in a commercial optimisation software package, and field test results demonstrate its performance.
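For readers unfamiliar with principal component regression, here is a minimal Python/NumPy sketch of the general approach: standardize the control factors, project them onto principal components, and (echoing the abstract's emphasis on the target variable) keep the components most correlated with the response before regressing. The moulding variable names and data below are hypothetical, and this is not the paper's actual algorithm.

```python
import numpy as np

def pcr_fit(X, y, k=2):
    """Principal component regression sketch: project standardized controls onto
    principal components, keep the k components most correlated with the target,
    then regress the target on those scores."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize control factors
    ys = y - y.mean()
    U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
    scores = Xs @ Vt.T                               # component scores
    corr = np.array([np.corrcoef(scores[:, j], ys)[0, 1] for j in range(scores.shape[1])])
    keep = np.argsort(-np.abs(corr))[:k]             # rank by |correlation with target|
    beta, *_ = np.linalg.lstsq(scores[:, keep], ys, rcond=None)
    coef = Vt.T[:, keep] @ beta                      # back to standardized-variable coefficients
    return coef, keep

# Hypothetical moulding data: melt temperature, injection speed, and packing
# pressure (deliberately collinear with speed); response = part shrinkage.
rng = np.random.default_rng(0)
speed = rng.normal(size=50)
X = np.column_stack([rng.normal(size=50), speed, speed + 0.05 * rng.normal(size=50)])
y = 0.5 * X[:, 0] - 0.8 * speed + 0.1 * rng.normal(size=50)
coef, keep = pcr_fit(X, y, k=2)
print("kept components:", keep, "coefficients:", coef)
```

Because the regression is performed on a few orthogonal component scores rather than on the raw, collinear controls, the fit stays stable where ordinary least squares would not.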

Relevance: 100.00%

Publisher:

Abstract:

Anonymous web browsing is a hot topic with many potential applications motivated by privacy concerns. The currently dominant strategy for achieving anonymity is packet padding with dummy packets as cover traffic. However, this method introduces extra bandwidth cost and extra delay, making it impractical for anonymous web browsing applications. To solve this problem, we propose using the predicted web pages that users are going to access as cover traffic, rather than dummy packets. Moreover, we define anonymity level as a metric for the degree of anonymity, establish a mathematical model for anonymity systems, and transform the anonymous communication problem into an optimization problem. As a result, users can find trade-offs between anonymity level and cost. With the proposed model, we can describe and compare our proposal and previous schemes in a theoretical manner. Preliminary experiments on a real data set show the considerable potential of the proposed strategy in terms of resource savings.
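The abstract does not specify its anonymity-level metric or optimization procedure, so the following Python sketch only illustrates the flavour of the trade-off: prefetch the cheapest predicted pages until a simple entropy-style anonymity level (log2 of the candidate set size) reaches a target, keeping the bandwidth cost low. The metric, the greedy rule, and the page list are all assumptions for illustration.

```python
import math

def choose_cover_pages(predicted_pages, target_bits):
    """Greedy sketch: prefetch the cheapest predicted pages until the candidate
    set is large enough that an observer's uncertainty (log2 of its size, a
    simple entropy-style anonymity level) reaches the target."""
    chosen, cost = [], 0
    for url, size_kb in sorted(predicted_pages, key=lambda p: p[1]):
        chosen.append(url)
        cost += size_kb
        if math.log2(len(chosen)) >= target_bits:
            break
    return chosen, cost

# Hypothetical prediction output: (url, page size in kB).
predicted = [("/news", 40), ("/sports", 55), ("/mail", 30), ("/weather", 25),
             ("/finance", 60), ("/blog", 35), ("/shop", 80), ("/forum", 45)]
pages, cost = choose_cover_pages(predicted, target_bits=2.0)  # needs >= 4 candidates
print(pages, cost)
```

Raising `target_bits` buys a higher anonymity level at the price of more prefetched bytes, which is exactly the anonymity-versus-cost trade-off the model is meant to expose.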

Relevance: 100.00%

Publisher:

Abstract:

Many plants that are now recognised as weeds are incredibly beautiful, so it is no wonder they have been used to adorn home gardens. Unfortunately, once they have escaped they can naturalise in the environment and cause many problems. Ornamental species make up about two thirds of our environmental weeds. This paper outlines why weeds are a problem and the characteristics that allow weeds to become a problem, and provides a brief glimpse of the modes of introduction of weeds to Australia.

Relevance: 100.00%

Publisher:

Abstract:

Watermarking techniques enable an imperceptible watermark to be hidden in multimedia content for copyright protection. However, in most conventional watermarking schemes the watermark is embedded solely by the seller, and both the seller and the buyer know the watermarked copy, which can leave disputes unresolved at the arbitration phase. To solve this problem, many watermarking protocols have been proposed that use watermarking schemes in the encrypted domain. In this paper, we first discuss a number of security aspects of the encrypted domain, and then propose a new method of homomorphism conversion for probabilistic public key cryptosystems with homomorphic properties. Based on our previous work, a new secure watermarking scheme for watermarking protocols is presented, using a new embedding strategy in the encrypted domain. We employ an ElGamal variant cryptosystem with an additive homomorphic property to reduce the computational overhead of watermark embedding in the encrypted domain, and an RA code to improve the robustness of the watermarked image against many moderate attacks after decryption. Security analysis and experiments demonstrate that the secure watermarking scheme is well suited to implementing existing watermarking protocols.
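To make the additive homomorphic property concrete, here is a toy Python sketch of an exponential ElGamal variant: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so a small integer watermark value can be combined with encrypted content without decrypting it. The tiny parameters and the brute-force discrete log are for illustration only; this is not the paper's scheme.

```python
import random

# Toy parameters for an exponential (additively homomorphic) ElGamal variant.
# Real deployments use large primes or elliptic-curve groups.
p = 467          # small prime modulus
g = 2            # group element used as the base

def keygen():
    x = random.randrange(2, p - 1)        # secret key
    return x, pow(g, x, p)                # (sk, pk = g^x)

def encrypt(pk, m):
    r = random.randrange(2, p - 1)
    return pow(g, r, p), (pow(g, m, p) * pow(pk, r, p)) % p   # (g^r, g^m * pk^r)

def add(ct1, ct2):
    """Component-wise product of ciphertexts encrypts the sum of the plaintexts."""
    return (ct1[0] * ct2[0]) % p, (ct1[1] * ct2[1]) % p

def decrypt(sk, ct, max_m=64):
    gm = (ct[1] * pow(ct[0], p - 1 - sk, p)) % p   # g^m = c2 / c1^sk
    for m in range(max_m):                         # small message space: brute-force the dlog
        if pow(g, m, p) == gm:
            return m
    raise ValueError("message out of range")

sk, pk = keygen()
host, mark = encrypt(pk, 11), encrypt(pk, 3)       # e.g. a host sample and a watermark value
print(decrypt(sk, add(host, mark)))                # -> 14, combined without decrypting the host
```

The additive property is what lets a seller embed a buyer-side watermark into encrypted content, which is the role the ElGamal variant plays in the protocol described above.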

Relevance: 100.00%

Publisher:

Abstract:

Light scattering from small spherical particles has applications in a vast number of disciplines, including astrophysics, meteorology, optics and particle sizing. Mie theory provides an exact analytical characterization of plane wave scattering from spherical dielectric objects. There exist many variants of Mie theory in which fundamental assumptions of the theory have been relaxed to obtain generalizations. Notable extensions are generalized Mie theory, where plane waves are replaced by optical beams; scattering from lossy particles; scattering from layered particles or shells; and scattering of partially coherent (non-classical) light. However, no work has yet been reported in the literature on the modifications required to account for scattering when the particle and the source are in motion relative to each other. This is an important problem with many applications in disciplines involving the size characterization of moving particles. In this paper we propose a novel approach, using special relativity, to address this problem by extending standard Mie theory to scattering by a particle moving at a constant speed, which may be very low, moderate, or comparable to the speed of light. The proposed technique involves transforming the scattering problem to a reference frame co-moving with the particle, applying Mie theory in that frame, and transforming the scattered field back to the reference frame of the observer.
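For reference, the frame-transformation step described above would typically rely on the standard relativistic Doppler and aberration relations. For a particle frame moving with speed v (β = v/c) along the reference axis, an incident wave of angular frequency ω arriving at angle θ in the laboratory frame appears in the co-moving frame as follows (these are the textbook relations, not formulas quoted from the paper):

```latex
\omega' = \gamma\,\omega\,(1 - \beta\cos\theta), \qquad
\cos\theta' = \frac{\cos\theta - \beta}{1 - \beta\cos\theta}, \qquad
\gamma = \frac{1}{\sqrt{1-\beta^{2}}}, \quad \beta = \frac{v}{c}.
```

Mie theory would then be applied at the shifted frequency ω′ in the particle frame, and the inverse transformation (β → −β) maps the scattered field back to the observer's frame.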

Relevance: 100.00%

Publisher:

Abstract:

This paper takes issue with the 'disabling' of students enrolled in teacher education courses, perpetrated by definitions of students' learning disorders and by the structures and pedagogies engaged by teacher educators. Focusing on one case, but with relevance for similarly affected systems, the paper begins by outlining the changed student entry credentials of Australian universities and their faculties of education. These changes are seen as induced by the shift from elite to mass provision of higher education, and by the particular effect on teacher education providers (especially those located in regional institutions) of the politics of government funding and the continuing demand for teachers from education systems. While these changed conditions are often used to argue that the university population of students with learning disorders has increased, the paper suggests that such arguments often have more to do with how student problems are defined by institutions and how these definitions serve to secure additional government funding. More pertinently, the paper argues that such definitions tend to locate the problem in individual students, deferring consideration of teacher educators' pedagogy and the learning arrangements of their institutions. The paper concludes that the place to begin addressing these issues would seem to be with a different conception of knowledge production.

Relevance: 100.00%

Publisher:

Abstract:

Content authenticity and correctness are important challenges in eLearning, as there can be many solutions to one specific problem in cyberspace. We therefore see the need to map problems to solutions using graph partitioning and weighted bipartite matching. This paper presents a novel architecture and methodology for a personal eLearning system, called PELS, that we have developed. We also present an efficient algorithm to partition the question-answer (QA) space and find the best possible solution to a particular problem. Our approach can be applied efficiently to a social eLearning space in which one-to-many and many-to-many relationships exist with a level of bonding. The main advantage of our approach is that we use QA ranking based on edge weights adjusted by subject matter experts (SMEs) or an expert database. Finally, we apply statistical methods, namely confidence intervals and hypothesis tests, to the data to check the reliability and dependability of the quality of the results.
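As a small illustration of weighted bipartite matching between questions and candidate answers, here is a Python sketch using SciPy's assignment solver. The weight matrix stands in for SME-adjusted edge weights and is purely hypothetical; the one-to-one matching shown is the simplest case, and the one-to-many or many-to-many relationships mentioned in the abstract would need extensions such as node replication.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical SME-adjusted edge weights: rows = questions, columns = candidate answers.
weights = np.array([
    [0.9, 0.4, 0.1],
    [0.2, 0.8, 0.5],
    [0.3, 0.6, 0.7],
])

# linear_sum_assignment minimizes cost, so negate the weights to maximize match quality.
q_idx, a_idx = linear_sum_assignment(-weights)
for q, a in zip(q_idx, a_idx):
    print(f"question {q} -> answer {a} (weight {weights[q, a]:.1f})")
print("total matched weight:", weights[q_idx, a_idx].sum())
```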

Relevance: 100.00%

Publisher:

Abstract:

This paper is devoted to a combinatorial problem for incidence semirings, which can be viewed as sets of polynomials over graphs, where the edges are the unknowns and the coefficients are taken from a semiring. The construction of incidence rings is very well known and has many useful applications. The present article is devoted to a novel application of the more general incidence semirings. Recent research on data mining has motivated the investigation of the sets of centroids that have the largest weights in semiring constructions. These sets are valuable for the design of centroid-based classification systems, or classifiers, as well as for the design of multiple classifiers that combine several individual classifiers. Our article gives a complete description of all sets of centroids with the largest weight in incidence semirings.

Relevance: 100.00%

Publisher:

Abstract:

Aim: Although the guaranteed superannuation system is believed by many to provide a safe and adequate source of funds in retirement, some will be unpleasantly surprised. The aim of this paper is to demonstrate the significant effect of the economic cycle on the final accumulated balance in superannuation retirement accounts. Method: A Monte Carlo simulation is used to illustrate the variance in outcomes that can be expected for a hypothetical individual. Results: The expected accumulated superannuation balances for two hypothetical individuals are estimated. The spread of outcomes is used to illustrate the problem of using only the mean of the distribution as a predictor of wealth in the retirement years. Conclusions: Many retirees rely on superannuation to fund their retirement. However, making regular contributions to superannuation does not ensure a predictable outcome, and active management of contributions is required if retirement goals are to be met.
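To illustrate the kind of Monte Carlo exercise the abstract describes, here is a short Python sketch that accumulates regular contributions under randomly drawn annual returns and reports the spread of final balances. The contribution amount, return distribution, and horizon are invented for illustration and are not the paper's assumptions.

```python
import numpy as np

def simulate_balances(years=30, annual_contribution=10_000,
                      mean_return=0.07, sd_return=0.12, n_paths=10_000, seed=1):
    """Monte Carlo sketch: accumulate yearly contributions under random annual
    returns and return the distribution of final balances."""
    rng = np.random.default_rng(seed)
    balances = np.zeros(n_paths)
    for _ in range(years):
        returns = rng.normal(mean_return, sd_return, n_paths)
        balances = (balances + annual_contribution) * (1 + returns)
    return balances

final = simulate_balances()
print(f"mean:     {final.mean():,.0f}")
print(f"5th pct:  {np.percentile(final, 5):,.0f}")   # the wide spread is why the mean alone misleads
print(f"95th pct: {np.percentile(final, 95):,.0f}")
```

Comparing the 5th and 95th percentiles with the mean makes the paper's point directly: two savers making identical contributions can retire with very different balances depending on the sequence of returns they experience.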